Smyth Research Realiser A16
May 26, 2024 at 9:37 PM Post #16,066 of 16,078
I don’t suppose you’ve taken your A16 to your friend’s theatre and measured a PRIR there? Because if you haven’t, how do you expect your A16 to replicate the same room sound, or if not the room sound, the “openness”? I absolutely love my Omega 96’s, but my own speakers are anywhere from 110” to 130” from the MLP, and there’s a big open room space behind. So pure Omega 96 listening is never as “open” for me as my own PRIR. I would also venture to guess that other A16 customers would find listening with my PRIR too “open,” with too much reverb.

I actually have, on multiple separate occasions with the A8 at least, and in multiple different locations. In most of the settings the A8 didn't sound nearly as good as what I could hear in real time with the speakers. The rooms themselves were nowhere near optimal for recording purposes, but even capturing the room "defects" did not translate into how the speakers or room actually sounded in real time, even with crappy acoustics.

I spent close to two hundred hours doing this as well, take after take after take, until I finally landed on something acceptable at the time in each location, and even then they were only ever somewhat convincing (as in tricking me into believing I was listening to the actual speakers) when I had speakers in front of me to look at. The differences between the A8 and A16 aren't significant enough that a recording of a 5.1 or 7.1 setup would sound night-and-day different between the two. There is a difference, and I can only attribute it to the differences in the mics, and possibly some small alterations in how the two units run their captures, but those aren't significant enough to take something from not sounding open to sounding exactly like real speakers.

Even recently I was able to get some PRIRs done in a studio setting with my A8, and it still doesn't sound like the real thing. The positional cues are all fine, and the distances seem pretty close, but it still doesn't sound anywhere near as open as the real thing.

Are your speakers in the same room you normally listen to your A16 in, and are you in the same listening position when using the A16 as you are when you listen to your speakers? That does make a pretty big difference for some. It definitely did for me in one location I used in 2021. I was almost convinced. Then I went home...

Also, you can call what the A8 and A16 do a "render" if for whatever reason you take offense at "algorithm", but it's still producing a virtualization through an algorithm. The A8 actually gave you a lot more control over manipulating your rooms, too, with its reverb/delay/decay/distance and tone settings. The A16 is just a new iteration of what the A8 was already doing, yet with many of the "knobs" removed for whatever reason. They even call it the "SVS algorithm".

Don't get me wrong, I think it's an amazing piece of hardware, but I never have an experience where the headphones simply melt away and I forget that I am wearing them.
 
Last edited:
May 27, 2024 at 12:24 AM Post #16,067 of 16,078
Thanks, I really appreciate the detailed reply.

Yes, my speakers are in the same room as my A16, and I’m in the same listening position when I typically use the A16.

On the other hand, I recall very well the first time I tried a listening room built from my first PRIR. I was stuck with 2.15 firmware on new hardware and already had skewed HPEQs, but figured what the heck, let’s practice doing a PRIR for the eventual arrival of fixed firmware (many months later, as it turned out). Happily, the gain skewing worked in opposite directions for the PRIR vs. the HPEQ. I was sitting on the floor in front of my A16, which is in the right shelving unit next to the TV and the front speakers. I fired up some music and was absolutely amazed that my center speaker was apparently firing at its normal distance in front of me, where of course no speaker was mounted. Likewise for the strong apparent location of my surrounds. I occasionally still sit in that position if I am making A16 adjustments, and the virtualization effect is as strong as in my usual listening position (although I usually have to disable head tracking, which can otherwise lead to very distracting results).

I also demonstrated my A16 to my brother, who lives about 100 miles away, and I was not conscious of a decrease in the quality of the effect. He also perceived a strong 3D effect (similar genetics, obviously, so head and ear shape match pretty closely). I’ve been using some of these speakers for over 25 years, so it may very well be that my brain is strongly wired to accept the virtualized sound as matching the physical speakers. Human perception is amazing. There are even wilder things going on in your visual cortex (look up Edwin Land’s Retinex theory of color vision sometime; his article in Scientific American from the 60s is very readable, and I can testify that I’ve seen the colors in his experiments even when the wavelengths aren’t present).

I don’t necessarily take offense at “algorithm”; I’m simply making the distinction between using an idealized HRTF and room layout, as in the Dolby and Apple formats, and using a BRIR measured with physical speakers and my own head/ears.
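For anyone curious what measured-BRIR virtualization boils down to, here is a toy sketch. This is not Smyth's actual SVS implementation, and the function and variable names are mine; the idea is just that each speaker feed gets convolved with the left- and right-ear impulse responses captured for that speaker, and the results are summed per ear.

```python
import numpy as np

def virtualize(channels, brirs):
    """Render multichannel audio to binaural: convolve each speaker
    feed with its measured left/right BRIR and sum per ear.

    channels: dict of channel name -> mono signal (1-D array)
    brirs:    dict of channel name -> (left_ir, right_ir) 1-D arrays
    Returns a (2, n) array: row 0 = left ear, row 1 = right ear.
    """
    # Output length: longest full convolution across all channels.
    n = max(len(sig) + len(brirs[name][0]) - 1
            for name, sig in channels.items())
    out = np.zeros((2, n))
    for name, sig in channels.items():
        left_ir, right_ir = brirs[name]
        out[0, :len(sig) + len(left_ir) - 1] += np.convolve(sig, left_ir)
        out[1, :len(sig) + len(right_ir) - 1] += np.convolve(sig, right_ir)
    return out
```

With a unit impulse on a "center" channel and trivial one-tap IRs, the output is just those taps, which is an easy way to sanity-check the plumbing before feeding real captures through it.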

We agree it is an amazing piece of hardware, and frustrating hardware as well. I regret that you don’t have the experience of the headphones disappearing, because for me it is amazing.
 
Last edited:
May 27, 2024 at 3:09 AM Post #16,068 of 16,078

When I was in the studio for the first time with John, he commented on just how powerful that something between our ears is, because of an oddity I was experiencing. Even slight shifts with my eyes closed would cause the positional cues to change. They didn't, of course, but I was convinced that they were shifting with my movement relative to where the physical speakers were in front of me. I wasn't looking at them either, because we were testing the accuracy of each speaker after my PRIR capture. This was without head tracking, which I don't use for various reasons. Mainly, it does nothing for me.

At home, after using these kinds of virtualizations over headphones through various devices since 2010, I now essentially have head tracking without a head tracker, and it's quite uncanny at times. I can be watching something, and if dialogue is happening on the screen, I can stand up and rotate ninety degrees and the sound is still coming from where the center channel would be if it were real. I can then rotate another ninety degrees and, yes, the center is now coming directly from behind me, even though it shouldn't have moved at all.
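What a hardware head tracker does is simple in principle: it counter-rotates the virtual speaker layout by your measured head yaw so the channels stay fixed in the room rather than glued to your head. A minimal sketch of that idea (the function name and angle convention are mine, nothing to do with the A16's firmware):

```python
def relative_azimuth(speaker_az_deg, head_yaw_deg):
    """Azimuth at which a room-fixed virtual speaker should be rendered,
    given the listener's current head yaw. Angles in degrees, measured
    counterclockwise from straight ahead (positive = to the left);
    the result is wrapped into [-180, 180)."""
    return (speaker_az_deg - head_yaw_deg + 180) % 360 - 180
```

Turn ninety degrees to your left and the center channel (0°) now renders at -90°, i.e. off your right ear, which is the world-anchored behavior described above; without tracking, the renderer just uses the speaker azimuths as-is.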

Where I currently live, I have a 77'' OLED that I sit six and a half feet away from. Despite having a PRIR measured at a greater distance than that, even with my eyes closed it rarely sounds like it's beyond six and a half feet, until I move myself backwards a few feet, or close my eyes and envision a speaker behind the screen, closer to the wall.

Whenever I go to my friend's house, the speakers are approximately eight and a half feet away. I see that they are that distance, and they sound that distance, but then there's also all of the crossfeed that the speakers produce, which has a very tangible effect on the sound and is not replicated with the Realisers, especially not in a drier setting. A studio is particularly dry, but not so dry that it eliminates crossfeed entirely with real speakers; the diminishment of it in a PRIR is even more pronounced.

That kills the illusion of hearing the front right primarily with my right ear and partially with my left, and the front left primarily with my left ear and partially with my right. But the biggest offender is that when we do PRIR captures, we have earplugs in with the mics at the ends of them. The mics do not record how the sound travels through our ear canals; our hearing doesn't just stop at the opening, yet up to the opening is all that is captured.
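The interaural cues being discussed here are well quantified. As a rough illustration, here is the textbook Woodworth spherical-head approximation for the arrival-time difference between the two ears (the head radius is an assumed population average; this is nothing the Realiser exposes, just a sketch of the physics):

```python
import math

HEAD_RADIUS_M = 0.0875   # assumed average head radius, meters
SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature

def woodworth_itd(azimuth_deg):
    """Approximate interaural time difference (seconds) for a distant
    source at the given azimuth (0 = straight ahead, 90 = fully to one
    side), using the classic Woodworth spherical-head formula:
    ITD = (r / c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))
```

That works out to roughly 0.65 ms for a source at 90°. Time and level differences of this kind are captured fine by blocked-canal mics; the usual argument for measuring at the canal entrance is that the canal's own contribution is approximately direction-independent, though as the post notes, that is an approximation.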
 
Last edited:
May 27, 2024 at 11:11 AM Post #16,069 of 16,078
I can send the file to a 3D-print farm in the country where the HRTF measurement service is provided, to reconstruct the model as a dummy head for HRTF measurement purposes.
A physical model of the head you mean?
I made a similar suggestion in the past, with an additional idea:
Include the ear canal in the model, and build high-quality microphones inside the head, with the membranes at the position of the eardrums. That way you can measure at the position of the eardrums, and without anything in the ear canal.
 
May 27, 2024 at 12:09 PM Post #16,070 of 16,078
Yes, but this might only be achievable with CT or MRI, as far as I know, to accurately depict the structure of the ear canals, which other types of scanners cannot. The dummy head should be solid, not hollow, and the material forming its surface should mimic the elasticity of skin rather than solid plastic, in order to decrease reverberation in the ear canals, theoretically. My job allows me to use CT on a daily basis, and I also know how to reconstruct a 3D model with 3D Slicer. So yes, I believe this approach could possibly work out.
 
May 27, 2024 at 1:02 PM Post #16,071 of 16,078
The visual cues are so strong, and when it comes to estimating directions the eyes win the battle over the ears. When you rotate, you still know where the person speaking is, and hence you still hear the sound from the screen. Accurate distance perception also depends on visual cues: if you see a potential source of the sound, the eyes override the information from the ears.
 
May 27, 2024 at 7:55 PM Post #16,072 of 16,078
There is a paper on simulating the impedance of the ear canal, with a remark about how people tend to find such simulations tilted toward an overly bright FR (something I've seen appear consistently in most attempts to model acoustics at just about any level, even simplified global HRTF models for 3D audio). So I wonder if the source of that is always in the ear canal, or if there is some hidden variable that kicks in in a more global way when we try to model human skin? I'm always interested in reading such papers, but I confess that they usually get into heavy math quite fast, and this brain hardly remembers anything. :sob:

The visual cues are so strong and when it comes to estimating directions the eyes win the battle over the ears. When you rotate you still know where the person is speaking and hence still hear the sound from the screen. Also a accurate distance is depending on the visual cues. If you see a potential source of the sound the eyes overwrite the information from the ears.
+1
For those who care to look into how we identify the positions of sound sources in more detail than some Wikipedia page, you should be able to find a cool paper that summarizes the entire thing by googling "ARL-TR-6016".
The paper is titled "Auditory Spatial Perception: Auditory Localization" by Tomasz R. Letowski and Szymon T. Letowski.
It's a 'simple':sweat_smile: summary in just 80 pages, followed by as many pages for all the cited research.
The human auditory localization ability depends on a number of anatomical and physiological properties of the auditory system as well as on a number of behavioral factors. These properties and behaviors are referred to in the literature as localization cues. These cues are generally classified as binaural, monaural, dynamic, and vision and memory cues.

Those who have played with various recorded rooms in various actual rooms with the A16 have probably gotten some experience of how the room, and seeing actual speakers, can affect the experience. I seem to be a fairly extreme case of that: as I mentioned a few times, I anchor the sound to speakers I can see while using the A16 and my headphone. And big virtual rooms don't feel like big rooms when I'm in a small real one. My brain reduces the perceived space to the actual room, and then it just feels weird having so much reverb.
 
May 27, 2024 at 8:12 PM Post #16,073 of 16,078
In my own personal experience, matching room size and having actual speakers as visual anchors both play major parts in making the virtualization work or not work for me. Also, using the HT setup is critical for me in making the illusion seem real.

As an aside, I wonder if there is a range of experiences for people using BACCH SP. I say this because they have the same in ear microphone limitation as the A16.
 
May 28, 2024 at 10:25 AM Post #16,074 of 16,078
Another paper (I have it somewhere, but where, and what is it called? IDK) suggests that it is normal to be influenced by those very things you list while trying to locate a sound source. It is the amount of impact that seems to change depending on experience or whatever. I mean, I've discussed this with several people who describe non-DSP headphone listening as something bigger than what I get on the A16. That's how far in the other direction some brains seem able to go.
We discussed at some point how several people here don't use the head tracker. To me it seems like an impossible choice, but different people have different experiences, and maybe just different priorities, starting with the very obvious case of people who don't move much, if at all, while watching a movie.


Otherwise, does Gravity really have good audio? I remember seeing it in a theater when it came out and thinking it was a "meh" movie with impressive visuals. I also remember complaining a lot about a bunch of things that seemed super wrong. Yes, I'm that kind of guy. Give me a movie telling me it's a world where up is down and weapons are laser cats you recharge by feeding them Cheetos, and I'll be like, "OK" and accept that happily, so long as nobody shoots too long without reloading with some Cheetos. I don't care about realism; I care about coherence.
But tell me it's something of this world with real physics, and I'll bitch about every single mistake, because they set it up that way and then failed to follow their own rules.
 
May 29, 2024 at 1:39 AM Post #16,075 of 16,078

I barely move at all, and I also have company that uses their own PRIRs, and they don't need head tracking either.

As far as Gravity is concerned: I saw it at the theater upon release in IMAX, and the audio was pretty awful in the theater we saw it in. Mainly because they had the speakers up so loud that you could actually hear them distorting.

Audio-wise, with the disc it's great. It's one of the better ones out there for Atmos, and it has been used as demo material for speaker setups ever since the Diamond Luxe version first became available.

As for rewatchability? Eh. It's mostly another one of those "chamber piece" films, which many have a low tolerance for (Phone Booth, Buried, 10 Cloverfield Lane, Clerks, etc.). I "watch" the film all of the time. Very specific scenes, that is. :yum:

Just get it and watch the first twenty minutes for the audio.

Here's a more recent article on the subject, and I have to agree with pretty much all of their picks. As far as audio is concerned at least. Can't really say I would actually want to sit through the full run time of all of them more than once however.

https://www.stuff.tv/features/the-best-films-to-experience-dolby-atmos/

Pretty much everything on that list is demo-worthy, and a few are some of my favorite films of the past ten years, but as far as outright immersion goes, I'd still have to give it to The Batman. Until Civil War hits physical, anyway.

Oh, and I didn't mention this here previously, but Civil War threw me for an unexpected but very welcome loop at the start, before getting into the film proper. It actually does a channel test of sorts, or at least something that can be used as one. It starts out by playing white/pink noise that pans across most of your speakers. I wasn't actually looking at the GUI of the A16 when this happened, so I'm not sure if it was using all that were available, but it runs the basic gamut of the main 7.1 layer as well as at least a few tops/heights. As soon as that happened I was thinking, "Wow, this is going to be even better than I expected"...and it was.

edit: It runs 7.1.2 three times in a row. I guess this PRIR thingamabob really works, since those were the only speakers I thought I was hearing anyway. :laughing:
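If you want a reusable version of that kind of channel identification pass, it's easy to generate one yourself. Here is a quick sketch; the sample rate, channel names, and ordering are my assumptions for a 7.1.2 bed, not anything taken from the disc:

```python
import numpy as np

FS = 48000  # sample rate in Hz (assumption)
# Assumed 7.1.2 layout: bed channels then the two top/height channels.
CHANNELS = ["L", "R", "C", "LFE", "Lss", "Rss", "Lrs", "Rrs", "Ltm", "Rtm"]

def channel_test(burst_s=0.5, gap_s=0.25, seed=0):
    """Build a multichannel test signal: one white-noise burst per
    channel, stepping through the layout in order, silence elsewhere.
    Returns an array shaped (channels, samples)."""
    rng = np.random.default_rng(seed)
    burst, gap = int(burst_s * FS), int(gap_s * FS)
    step = burst + gap
    out = np.zeros((len(CHANNELS), step * len(CHANNELS)))
    for i in range(len(CHANNELS)):
        # Channel i is active only during its own time slot.
        out[i, i * step : i * step + burst] = 0.25 * rng.standard_normal(burst)
    return out
```

Written out as a 10-channel WAV (with a library such as soundfile, transposing to samples-by-channels first), this gives a pan test you can run through any PRIR whenever you want to check speaker identification.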
 
Last edited:
May 29, 2024 at 10:15 PM Post #16,076 of 16,078
OG A-16 For Sale! Asking 2,900

Hi all, I have finally decided to part with my A16 from the Kickstarter campaign. It is posted on US Audio Mart under the username Bricksalt. I am the second owner; the original purchaser never used it for several years until I bought it from him about a year and a half ago. I would say it has about one year of very light use (less than once a week). John (Litlgi74) patiently and graciously helped me get up to speed with it, and it has been great since. No upgrades have been added to this 16-channel unit, but it does include the 3D soundshop Omega 65 and 95 PRIRs. I have used it with my HD800-S phones to great effect. I am more of a speaker guy at heart, and therefore the A16 has sat unused for the last 6 months. It is in excellent condition and I have had no problems with it. I have all the original packaging and accessories, but would prefer pickup anywhere within a 2-hour drive of Cleveland. I am also selling my HD800-S with Dekoni earpads on US Audio Mart.
Feel free to shoot me a PM if you know anyone who might be interested or have any questions.
I hope you all have a great summer!

Dave
 
May 30, 2024 at 9:47 AM Post #16,077 of 16,078
For those who'd like to use the Realiser with macOS and have lost the ability to connect since Ploytec (the supplier of the USB audio interface) outright stopped producing drivers due to Apple's discontinuation of kext system extensions:

Apparently, they have worked out an intermediary hardware device by which to regain compatibility between the Realiser and newer versions of macOS (beyond 10.15 Catalina, released in 2019).

This might also affect users on Windows, as it's doubtful whether Ploytec can retain compatibility with the newer driver frameworks already in the works for Windows 11 and beyond.

https://www.kickstarter.com/projects/ploytec/ploytec-revival
 
May 31, 2024 at 2:38 PM Post #16,078 of 16,078
:handshake:
I'm totally with you here. I think Gravity is well made and has nice visuals and, afaik, good audio (I only have the 7.1 version, but with the 3D picture), but some scientific facts are wrong. You need to know a little bit about spaceflight and physics, though. I remember telling someone that Hubble has a higher orbit than the ISS, for example, and a normal person doesn't know that. (And I haven't even started on orbital inclination...)
However, for a Hollywood movie it's still fairly accurate.
The best realistic spaceflight movie is still Apollo 13. For me it is a masterpiece!
(And the worst one is probably Armageddon...)
 
