Intel originally designed its Light Peak interconnect as an optical technology that would replace all other PC connections, handling everything from LAN to storage devices to monitors. But after the company unveiled it in 2009, PC manufacturers called for a cheaper electrical incarnation – i.e. non-optical – and due to other …
Is it RAND and is it a standard
The obvious questions: will Intel license it under RAND terms, and will it make a proper standard out of it that other vendors can interop with and build to?
None of the news so far says a word.
In any case, if this gets standardised and if someone does some silicon for a switch fabric for this, it will have a number of interesting non-Apple, non-display applications in the supercomputing arena. PCI-Express to a distance of 100m makes for a very tasty interconnect :)
Cost is a non-issue as a long video cable means I don't need a pc under the telly.
HDMI is already here and will easily do 10m for a cable cost of ~£15 from the right place.
... but only a short one
You may read more about the inherent length limitations of HDMI here: http://www.bluejeanscable.com/articles/how-long-can-hdmi-run.htm . It gets messy, or expensive (and messy), if you want to build a proper cinema system where the player is away from the screen, or send picture & audio from a PC which doesn't happen to sit next to the TV. I know, 15m distance from the TV seems a lot, but think where this cable will actually run - bent under the walls and going around obstacles, not in a straight line.
On the other hand, optical has been tried (networks; TOSLINK) and proven to work, and it's not as expensive as vendors want us to think.
Expensive monster cables.
> Cost is a non-issue as a long video cable
> means I don't need a pc under the telly.
Of course cost is an issue. The cable and the ports to plug it into on either end might cost more than the PC you would plug it into.
There is also the tricky issue of getting a cable from point A to point B. Most people have a problem with this. That's why wireless networking is so popular.
Although you can already do "video distribution" of the sort you're talking about with existing cable technology. The big problem (as I said before) is running the cable.
At the risk of turning this into a cable-bashing thread... "expensive monster cables" are pointless. You don't need to pay $200 for a Monster HDMI cable unless you're running 50+ metres (at which point, you'll have to send your data with a wish and a prayer anyway). For your 3-15ft runs, those $0.01+2.99S&H cables from Amazon work perfectly (unless you're unlucky and get the 1-in-20(ish) defective cable, as any production environment churns out the occasional lemon). Likely these will work fine for 15m runs too; just make sure you get the 1.4b-rated cables so, even with a slight defect, you'll still manage a full 1080p if not the 3D it's rated for. (Yes, HDMI will auto-downgrade your quality based on the capability of your link. If the cable can't handle a full 6.4Gb/s, it will step down until it finds a speed that works.)
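The step-down behaviour described above can be sketched as a simple negotiation loop. This is an illustration only: the trial rates and the cable-capacity check are hypothetical stand-ins, not HDMI's actual link signalling.

```python
# Illustrative sketch of a source stepping its link rate down until the
# cable copes. Rates and the cable model are assumptions, not real HDMI.

def best_link_rate(cable_max_gbps, rates_gbps=(10.2, 6.4, 4.95, 2.25)):
    """Return the highest trial rate the cable can carry, or None."""
    for rate in sorted(rates_gbps, reverse=True):
        # Stand-in for a real link-integrity probe over the cable.
        if rate <= cable_max_gbps:
            return rate
    return None

# A slightly defective cable still negotiates a usable (lower) speed.
print(best_link_rate(5.0))   # -> 4.95
print(best_link_rate(1.0))   # -> None (no workable rate at all)
```

The point of the sketch: a marginal cable doesn't fail outright, it just quietly costs you bandwidth, which is why a cheap cable usually "works" even when it isn't meeting its rated spec.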
#Thunderbolt and Lightning!
#Very very frightening!
(Mine's the leather one. No sleeves, doesn't do up at the front...)
I want one
as a low cost, faster alternative to optical FC
Copper makes sense
While the optical version makes sense for communications between *self-powered* items, if Intel really wants to overthrow USB then it needs a connection system which can power whatever is being plugged into it (I am thinking of a Thunderbolt version of today's USB-powered memory sticks and HDD storage).
It may mean that a computer (or other self-powered device) requires both a copper and optical port, but unless Intel can supply power via photons...
physics to the rescue
perhaps some kind of photo...electric... effect... Joking aside, getting electricity from light is pretty easy; the real question is how much energy is going to be lost ensuring the correct voltage at the correct amperage is delivered.
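To put a rough number on those losses: a back-of-envelope budget, where the efficiency figures are illustrative assumptions (not measured values), suggests power-over-fibre wastes most of what you put in before you even reach voltage regulation.

```python
# Rough power-over-fibre budget; the efficiency figures are
# illustrative assumptions, not specs for any real part.
laser_eff = 0.30   # electrical -> optical conversion at the host
pv_eff = 0.40      # optical -> electrical conversion at the device
watts_in = 10.0    # power the host spends driving the laser

watts_delivered = watts_in * laser_eff * pv_eff
print(watts_delivered)  # only ~1.2 W usable, before regulation losses
```

Under those assumptions you get barely an eighth of the input power out the far end, which is why running copper power conductors alongside the fibre is the more practical answer.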
...you can just run the copper power cables down the same line as the optical.
It isn't rocket surgery!
The optical version did have power
As originally envisaged there would be powered cables and unpowered cables. The power would run alongside the optical fibre. Not much different to how it would work now, I expect, except of course the data lines and the power lines need to be shielded from each other.
well of course fiber needs mechanical protection, but apart from that there is no reason to "shield" it from a power cable - there won't be interference!
LaeMing > It isn't rocket surgery
Thnx for that well-mixed metaphor.
He was clearly saying the copper data & power lines need to be shielded from each other, not an optical/copper combo
Don't want optical..?
... but yet the motherboard manufacturers still put SPDIF digital optical ports on motherboards, that are probably used by, at most, 1% of all PC users.
OEMs != Mobo Makers (well, y'know..).
Also it's not about wanting optical or not, it's about the fact that at the kind of quality we're talking about it gets kinda expensive. Think 10GigE fibre costs... it isn't cheap. SPDIF is low bandwidth and quality doesn't matter.
We'd all kill for optical if it was viable at consumer prices.. Say USB3 cable prices.
Ummm for all my pig ignorance on the subject......
I have yet to find, use, seen for sale, seen in service, seen on any website - etc., etc., etc., ANYTHING, that runs via SPDIF....
Seen the connection // port on a few older mobos - but SPDIF is like the mad alcoholic uncle who has wartime flashbacks and thinks everyone is a Japanese soldier - blind drunk, in a blackout, in the jungles of Borneo - carving knife in hand at the Christmas family dinner.
SPDIF - It's talked of extremely infrequently and has never put in an appearance.....
SPDIF is an audio connection, you'll see it on audio equipment. Inputs are generally found on surround amps, you'll see the outputs on most TVs, games consoles, digital STBs etc.
They'll start to die out slowly in consumer gear because HDMI carries the audio channel on modern kit.
"I have yet to find, use, seen for sale, seen in service, seen on any website - etc., etc., etc., ANYTHING, that runs via SPDIF...."
I have SPDIF ports on the following items at home:
Home cinema amp
Pioneer DVD player
Sony LCD TV
So yeah, nothing mainstream uses SPDIF.
Don't confuse your ins and outs.
> I have SPDIF ports on the following items at home:
Yeah, but is that an input or an output?
My latest TV has an SPDIF port but it's only an output, intended to route whatever has been sent to the TV to an external sound system. If you happen to have an SPDIF cable on your device (like a PC), then it's pretty much useless there.
Sound over HDMI is probably the expectation for consumer electronics these days.
> PlayStation 2
> Pioneer DVD player
> Sky+ box
Aren't these all dinosaurs waiting for the extinction event?
Are you sure you actually meant PSX2?
That's positively pre-cambrian.
Downvoted the wrong post.
Now to see if I can add a balancing upvote...
Fail --- because I just did. I'm an idiot
I want optical. I love the idea of wrapping everything up into one.
Imagine my home mini data centre that I have in the roof cavity, all my servers, games PC, whatever. A single optical cable runs from there down 2 storeys to my office whereby my monitor, keyboard and mouse sit, PC 50 metres away, unheard.
Sounds like paradise
Make it happen!
You get the message that the game needs the DVD inserted for DRM checks...
That's what Steam is for. Who uses physical media anymore? :)
I just wanted to login to track back a comment
You can track a forum by clicking on the star icon at the top of the page...
So in other words
It was gimped. I somehow doubt OEMs would actually prefer a slower solution over copper unless the optical was prohibitively expensive, broken or had hit some technical roadblock which delayed it for another year or two.
Intel's roadmap has turned into a bit of a train wreck. Here's hoping Lightpeak appears as originally intended at some point.
What everyone really wants...
Is a long distance, high-speed / high bandwidth, low-cost, connection system, preferably capable of connecting anything and everything.
Unfortunately, while optical looks the most promising, it's caught in Catch-22 limbo as it's 'too expensive' and without ubiquitous adoption won't have economies of scale cost reductions. Until motherboard manufacturers pick it up as standard it won't happen.
There's also the other issue of how easy and cheap it is to plumb infrastructure cabling into households. Unless optical gets as cheap as a kilometre of cat 5/6 and a dozen sockets, and is as easy to wire up, there's going to be resistance to adoption other than for one-off uses (server in the upstairs back room to TV in the front etc) or within businesses.
In the meantime we're stuck with short-but-fast or long-and-slow.
you forgot ....
low latency. And please, explain to all those toslink vendors that we can't really afford their cables and they should look for profit elsewhere. Oh, you mean they are profitable already? How come?
Been there, done that.
> Unless optical gets as cheap as a kilometre of cat
> 5/6 and a dozen sockets, is as easy to wire-up,
> there's going to be resistance to adoption
I have fibre in my house because I used bundled cable. It was much cheaper than dealing with individual cables because my home builder charged by the drop and their pricing for labor made the cost of materials insignificant.
ANY sort of cable retro-fit is going to be an expensive mess. It doesn't matter what kind of cable it is.
The exteriorization of PCI Express
With the advent of the new MacBook Pro with Thunderbolt, I realised that it is just the externalisation of PCI Express, which formerly interlinked only the non-peripheral components of a PC.
The much-anticipated silicon photonics technology harnesses an indium phosphide on-chip laser. Although copper wire is currently used, Thunderbolt, with its ideal of de-peripheralisation, has achieved a rhetorical success.
Cost? What cost?
25 comments in (as I write this) and still no-one is offering estimates of what the cost would actually have been (for fibre) or what it will be (for copper).
The distances being talked about here would allow you to drop that gigabit ethernet card, with none of the leakage problems mentioned here the other day by someone with actual engineering experience of USB3, so the optical version must be *very* pricey. Or maybe the OEMs just haven't thought it through.
But in the absence of actual prices, who can say?
I'm also wondering what the difference in price is, but to be honest, it probably wouldn't have to be much to annoy the OEMs who are chasing some very thin margins.
Gigabit ethernet is pretty well entrenched on RJ-45 on copper, so it probably wouldn't have been able to displace it at least initially.
Have you seen the price of existing optical gear (FC, 10GE, etc) ?
Can this be repurposed?
This might be a _very_ disruptive technology in server rooms.
Apple missed a trick - 2nd Thunderbolt port, please
It was a good idea to bundle MiniDisplayPort into Thunderbolt (future displays won't need to hog a USB port for sound/FaceTime), but they should have included a second port on the MBPs. It's using the same MiniDP connector, but there are no thunderbolt-equipped ACDs (yet), so you plug in your legacy display and unless you get a hub (which Thunderbolt is supposed to obviate), you can't use any other Thunderbolt devices while your display is plugged in.
I also thought Thunderbolt was supposed to be protocol agnostic, so they could have removed the USB ports, added at least one more Thunderbolt port and supplied adaptors for your legacy USB2 devices.