psiclone
405
Dec 12, 2017
I would not waste my money on this. All you need is an HDMI 2.0 cable, which you can get at twice this length for half the price. You don't need anything else; the rest is all marketing BS set up to make you think you need it. You don't.
psiclone
405
Dec 12, 2017
@psiclone Actually, if you go for HDMI 2.0a, it will also handle the HDR function for you. All that plus ARC for $16 vs. what you have here.
shorkorde
127
Keyboard Club Member
Dec 12, 2017
@psiclone You're spot-on that the HDMI 2.0a spec is needed for HDR support, but that requirement falls on your source (playback device) and display (TV), not the cable. The HDMI cable is just a digital pipeline. So, short of an incredibly poorly made cable (which may not work at all anyway) or a really, really long run (where transmission can begin to falter over distance), 99% (okay, like 95%) of all the old HDMI cables will still remain compliant with HDMI 2.0a (and the 2.1 spec as well). It's really just about the cable supporting enough throughput for the increased signal bandwidth.
Rolling back to this mCable for a moment: it doesn't make things HDR, or magically produce an HDR-like effect on a non-HDR supporting television. However, the throughput is enough that it *could* support an HDR compliant signal (again, from an HDR source, to an HDR supported TV). But exactly like you said: so will basically any other $16 HDMI cable that would also support ARC.
What this cable *does* do is a degree of digital signal processing as the image passes through. Basically, it upscales (lots of cheap things do this - doesn't make it special), provides anti-aliasing (to decrease blockiness on edges and across the upscaled pixels to help minimize stair-stepping), and seems to add in some sharpening (which can be a good thing or bad depending upon the source). And what you're really paying for is the microprocessor + algorithms on the chip that are trying to figure out which things to smooth and which to sharpen in realtime.
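To make that processing pipeline concrete, here's a toy sketch in Python of the kind of thing the chip is doing (my own illustration, *not* mCable's actual algorithm): a nearest-neighbour upscale, then a light blur over the upscaled pixels as a crude stand-in for the anti-aliasing pass.

```python
def upscale_2x(img):
    """Nearest-neighbour 2x upscale of a grayscale image (list of rows)."""
    out = []
    for row in img:
        wide = [p for p in row for _ in (0, 1)]  # duplicate each pixel
        out.append(wide)
        out.append(list(wide))                   # duplicate each row
    return out

def smooth(img):
    """3x3 box blur -- a crude stand-in for the edge anti-aliasing pass."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc, n = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if 0 <= y + dy < h and 0 <= x + dx < w:
                        acc += img[y + dy][x + dx]
                        n += 1
            out[y][x] = acc // n
    return out

src = [[0, 255], [255, 0]]        # a hard diagonal edge, 2x2
result = smooth(upscale_2x(src))  # 4x4, with the edge softened
```

The real chip's value is in deciding *where* to smooth and *where* to sharpen; a dumb blur like this softens everything equally, which is exactly what the onboard algorithms are trying to avoid.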
That's a lot of words to say I'm about 50-50 with you: it absolutely changes the image and does so in a way that most people will consider better than a non-computing upscaler in A-B testing. How much, is it worth it, and can people even see a difference when more than 2-3ft away from the screen? That's a highly subjective call. Personally, I'm not bothering with this specific cable - but I may do one of the newer ones someday (if they hit a lower price point). I have many older video sources that could benefit from good AA.
Just worth noting that they aren't completely blowing smoke (er...just don't count the images on their boxes and marketing materials. Those are *clearly* faked up).
[Edit to fix HDMI 2.2 to HDMI 2.1...got confused for a moment with HDCP 2.2] [Edit2: really having to eat some of my words here, bah. I felt that nagging feeling and looked up HDMI 2.1 spec. It increased far further than I had thought - to 48Gbps (from 2.0a 18Gbps) - so I lied. The vast majority of current HDMI cables most likely would need to be upgraded if that bandwidth is fully realized. My mistake.]
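[For anyone who wants a rough sanity check on those bandwidth figures, here's a quick back-of-envelope in Python. This counts active pixels only; real HDMI links also carry blanking intervals and encoding overhead, so actual requirements run higher than these raw numbers.]

```python
def raw_pixel_rate_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in Gbit/s, active pixels only."""
    return width * height * fps * bits_per_pixel / 1e9

# 4K @ 60 Hz, 8-bit RGB (24 bits/pixel) -- SDR
sdr = raw_pixel_rate_gbps(3840, 2160, 60, 24)   # ~11.9 Gbit/s

# 4K @ 60 Hz, 10-bit RGB (30 bits/pixel) -- HDR
hdr = raw_pixel_rate_gbps(3840, 2160, 60, 30)   # ~14.9 Gbit/s

print(f"4K60 SDR: {sdr:.1f} Gbit/s, 4K60 HDR: {hdr:.1f} Gbit/s")
```

So 4K60 HDR already crowds HDMI 2.0a's 18 Gbit/s link once overhead is added (which is why it often drops to reduced chroma), while 2.1's 48 Gbit/s leaves real headroom.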
psiclone
405
Dec 13, 2017
@shorkorde Yeah, I was skeptical when reading your post, but some may not remember that a lot of early 4K TVs weren't actually showing 4K because many purchasers' cables weren't up to spec. You're not entirely wrong by any means, but the spec is there to ensure all the hardware from source to destination is in line. The Xbox One S requires a certain spec, for instance; if you don't get the right cable, you won't see some of what you expect, assuming your TV can handle it. I have a 4K UHD HDR TV (wow, that's a lot of letters). It does me no good with my current cabling between my Xbox and the TV. I don't really care, but it does matter. What I'm getting at with this cable is that it likely won't help much more than getting your cabling up to snuff for far less money. One saving grace for this particular drop is that shipping looks to be free, and I suppose if you only had very low-res stuff to watch it might help; but keep in mind that if you only have a 1080p TV, it can only present 1080p fidelity.
shorkorde
127
Keyboard Club Member
Dec 13, 2017
@psiclone Ah, I think we had a bit of a disconnect and are actually agreeing with one another (this happens to me a lot, so it's probably my fault); but this is a great point to bring up for others looking at the cable, so I'll try to clarify here:
The actual HDMI cable is nothing special. In fact, it's pretty old, and if you have a newer-spec 4K HDR TV (such as yours), it's entirely possible that the cable won't even have the throughput to support all the different modes on the TV (for example: a 4K@60Hz HDR source on a player/TV combo that supports all of this).
In a case such as this? The cable will either under-perform a cheaper cable with better throughput OR will provide exactly the same image. It can only be equal or less than, quality-wise. Never better. Period. (Looking at my previous post, I may have worded that badly and added some confusion here. If so, I apologize.) And exactly like you said: if you have a 1080p TV? This won't give you 4K or even a "like-4K" experience. It'll be capped at 1080p, and very well may not even kick in at all.
So the cable is nothing of note. The thing you're paying for is the microprocessor attached to this cable. It's an upscaler that is designed to take the place of the upscaler that may be in your source (Blu-ray/game system/etc.) or TV set. **but** it only works on lower quality stuff! This is why the cable has the caveat that you need to run your lower resolution source natively.
If you pop a DVD (480p) into your Blu-ray player, and it upscales to 1080p or 4K before sending the signal along to your TV? This cable will do very little (if it sees a 4K source, it'll do nothing; and if your 480p/720p source has already been upscaled to 1080p, this cable tacking on additional upscaling to 4K could in fact make things look worse with artifacts and over-sharpening - since you're basically doing this to the source twice in a row). Alternately, if you have a newer TV that already carries a nice upscaling processor (rare, but becoming more frequent nowadays), you will once again see very little benefit (what your TV does and what this cable does may provide near-identical benefit).
This cable is for people who have lower quality (480p/720p/etc.) sources (or gaming hardware! more on that in just a second), and a TV that doesn't handle upscaling all that well (for example: older upscalers that would look at a 480p signal and just keep duplicating pixels or stretching/distorting the image until it fills the screen). In these cases, the cable's onboard chip will do that lifting for you and bypass the potentially less effective work done by your TV.
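Here's a toy model of when this kind of in-cable processor actually engages, based on the behavior described above (my simplification, not the vendor's documented logic): it only upscales sub-native signals and passes everything else straight through, capped at what the display can show.

```python
def cable_output(source_height, display_height, max_output=2160):
    """Return (output_height, processed_by_cable) for a toy in-cable upscaler."""
    target = min(display_height, max_output)
    if source_height >= target:
        # 4K in, or a signal already at/above the display's resolution:
        # nothing for the chip to do -- plain pass-through.
        return source_height, False
    # Sub-native source: the chip upscales (with AA/sharpening) to the display.
    return target, True

print(cable_output(2160, 2160))  # (2160, False) -- nothing to do
print(cable_output(720, 2160))   # (2160, True)  -- chip does the lifting
print(cable_output(480, 1080))   # (1080, True)  -- capped at the display
```

Note the 1080p-source-into-4K-TV case still gets processed here, which is exactly the double-upscaling scenario above if your player already did a pass.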
So yes, in the case you're referring to with already feeding in nice sources to your TV? This cable will not work any wonders, and you are definitely better off just buying good cabling for less money. But for lower quality sources (say, PS3 games capped at 720p), I do think the gaming edition cable did a great job with adding the right amounts of anti-aliasing and sharpening to the image to provide a better looking image across a 4K display. (Please note: I'm basing that off of the Linus Tech Tips review, not personal experience; and that cable *is not* this cable.)
Last note is that I mentioned hardware, and this is more for the PC gamer crowd. Those with modest gaming hardware but a nice 1080p/4K TV or monitor may find they often have to drop quality settings (resolution and effects) just to get a game running at a respectable frame rate. At that point, things start to look unimpressive on their otherwise impressive display.
A cable such as this might take those lower resolution sources and make them more visually pleasing since it will add in those upscaling tricks like anti-aliasing (which is, traditionally, one of the first things a lot of gamers try shutting off to regain a bit of performance). But again, this only works if the computer is running at the game's native resolution (so if you're running your game at 1366x768, but have your desktop set to 1080p with a 1080p monitor? The cable will once again do nothing. Your desktop will also need to be set to 1366x768 before the cable's AA and sharpening kick in).
Is this as good as having the hardware to play games at your display's native 1080p/1440p/2160p resolution? No way. But it would probably still be visually perceived as better than just tacking that 720p-ish source into your high-res display without any anti-aliasing at all.
That's a whole lot of writing, but hopefully clears up some confusion for a few people still looking to buy. This is definitely a, "for some, but not others," kind of item; and as I stated before: I'm actually steering clear of this one (but may eventually buy the gaming edition for some of my older consoles and/or cinema edition for my DVDs). Just trying to make sure others have the info they might need to make their decision.
Kody11
0
Jan 18, 2018
@shorkorde I agree. I got the new Samsung 8000 series. It says "240", but realistically it's only 120Hz. Anyway, I found 4K@60Hz cables by Amazon Basics for $6 for 6 feet, or $7 for 10 feet. They have a bunch of different sizes, like 15 and 25 feet; the 25-footer is, I think, around $11. Unless I'm missing something, that seems to be a better choice.