* Posts by auser

32 publicly visible posts • joined 12 Apr 2007

Ethernet — a networking protocol name for the ages

auser
Happy

energy network

Just one little remark on the energy network idea: he is talking about actively switched and buffered energy transmission, sometimes called a smart grid. It works by accounting for every amount of energy put into and taken out of the system and directing the required amount from the available sources to the destinations that need it, while buffering any excess for later use. This works like a big distributed battery, which can be filled when we have energy and drained when there is a need. It would allow distributed renewable sources to be used without unbalancing the network and also allows energy reuse, like regenerative braking in electric vehicles. I think this is a good idea.
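
To make the accounting idea concrete, here is a rough sketch in plain c, with made-up numbers: every tick the metered supply and demand are compared, surplus goes into the buffer and deficits are drained from it.

```c
#include <stdio.h>

/* Minimal sketch of the accounting idea: meter every source and load,
 * push any surplus into a shared buffer ("battery") and drain it when
 * demand exceeds supply. All numbers are made up for illustration. */

typedef struct {
    double capacity_kwh;   /* how much the buffer can hold */
    double stored_kwh;     /* current charge */
} buffer_t;

/* Returns what could not be balanced (positive = unmet demand,
 * negative = surplus that had to be thrown away). */
static double balance_tick(buffer_t *buf, double supplied_kwh, double demanded_kwh)
{
    double delta = supplied_kwh - demanded_kwh;

    if (delta > 0) {                      /* surplus: charge the buffer */
        double room = buf->capacity_kwh - buf->stored_kwh;
        double charged = delta < room ? delta : room;
        buf->stored_kwh += charged;
        return -(delta - charged);        /* wasted surplus, if any */
    } else {                              /* deficit: drain the buffer */
        double needed = -delta;
        double drained = needed < buf->stored_kwh ? needed : buf->stored_kwh;
        buf->stored_kwh -= drained;
        return needed - drained;          /* unmet demand, if any */
    }
}

int main(void)
{
    buffer_t grid_buffer = { .capacity_kwh = 100.0, .stored_kwh = 20.0 };
    double supply[] = { 50.0, 80.0, 10.0 };   /* e.g. solar + wind per tick    */
    double demand[] = { 40.0, 30.0, 60.0 };   /* consumers + charging vehicles */

    for (int i = 0; i < 3; i++) {
        double unbalanced = balance_tick(&grid_buffer, supply[i], demand[i]);
        printf("tick %d: stored=%.1f kWh, unbalanced=%.1f kWh\n",
               i, grid_buffer.stored_kwh, unbalanced);
    }
    return 0;
}
```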

Apple banishes Macs to old folks home

auser

Price?

"I wonder if this will make the price go down for the old used G4s?"

You can buy working G4s and imacs in vienna for 1 euro as e-trash... (working means you plug them in and they boot up with all the data of the last owner still intact)

Imho, macs have got worse and worse since Wozniak stopped working on them. I have an old G3 I got for free, but I can't really find much use for it, even though the video capture cards still work. Besides that, no matter how I look at it, the sticker still reads 'made in china', while at that time everybody else was assembling computers in ireland.

On the other hand, the oldest computer I'm using is a 17-year-old toshiba laptop, which happens to have the same form factor and screen size as today's netbooks. Surprisingly it still runs and it's still supported. (a year ago I could buy a bigger ssd for it in the first shop I tried)

Intel UMPC chip enters service as server CPU

auser
Thumb Up

The intel atom cpu is

actually a classic pentium mmx core produced with the newest process technology. It can execute at most 1.5 instructions/clock, compared to at least 2 instructions/clock for the pentium pro, which is why you get only around half the effective speed at the same clock compared to the ppro family. It was never intended to be a standalone cpu; it was designed to be embedded into gpus, because intel's gpu system is actually designed to run gpu shader code on x86 cpus. They just decided to sell them as standalone cpus too. However, imho intel should really come up with an atom variant with a built-in north bridge. That would solve the power consumption problem and allow smaller systems, while retaining the same functionality.

How to beat AVG's fake traffic spew

auser

This is a slightly buggy feature...

The original idea behind the scanner was to put an icon on the search result page that shows whether a site is dangerous or not, so the user doesn't have to click on every page to see if it's good or not. Imho this feature should be selectable independently of the normal link scanner, just as precaching is only an option and not mandatory in most browsers.

This feature only works in ie and firefox. There is no opera support, mainly because opera has a similar feature, but they use a central repository of known good and bad sites, so the browser can check a site without connecting to it. Apparently google has the same feature too, with a clickthrough screen. Recently microsoft has added this to their newest browsers, but only as an option, because it can be used to record user requests.
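
For illustration, a toy sketch in c of that repository approach (the host names are invented placeholders, not a real blocklist): the check is a local lookup, so nothing is ever fetched from the suspect site.

```c
#include <stdio.h>
#include <string.h>

/* Toy sketch of the "central repository" approach: look the host up in a
 * local copy of a known-bad list before the browser ever connects to it.
 * The entries below are invented placeholders. */

static const char *known_bad_hosts[] = {
    "malware.example",
    "phish.example",
    NULL
};

static int host_is_flagged(const char *host)
{
    for (int i = 0; known_bad_hosts[i] != NULL; i++)
        if (strcmp(host, known_bad_hosts[i]) == 0)
            return 1;
    return 0;
}

int main(void)
{
    const char *host = "phish.example";
    printf("%s -> %s\n", host,
           host_is_flagged(host) ? "flagged, do not prefetch" : "ok");
    return 0;
}
```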

Imho avg should just go and do it the right way and check the pages when they arrive at the computer, after the user has started downloading them. It might even pick out the bad data from the good, so a user could visit an infected page without seeing anything dangerous.

Warning sounded over black hole in UK physics teaching

auser

There is a very simple system...

for solving such problems. You have to separate physics education into five levels:

-physics scientists: they only teach at universities, but usually do the research work

-physics engineers: they work in the industry (this is the best paying one)

-physics teachers with a university degree: they can teach at a university or in high schools

-physics teachers with a college degree: they can teach in elementary or high schools

-general teachers with a college degree: they can only teach in elementary school

This way, the teachers who chose the easier, shorter and cheaper (maybe free) college degrees have to teach in schools, because they don't have the qualification to work as scientists or engineers.

Making every teacher take at least 2 (an average of 3) specialisations to get a degree usually results in teachers who can teach maths, physics and chemistry/biology. Any combination is possible, though I've never seen a teacher with physics and literature combined. (the usual other combination in the east european country called Hungary is to learn literature, history and grammar together, with art and foreign language teachers in another group)

I just don't understand what's happening in England, but I fear that the rest of Europe will soon start to copy it.

US air force chiefs sacked in robot-armada brouhaha

auser

The air force...

not only owns most us fighter aircraft, but also the bombers, transport planes and most ballistic missiles, and the us space command belongs to them too. So in case of a really big war, they have most of the job of killing everyone on this planet. This is not something that can be divided between the other branches.

On the other hand, the big problem with the nukes was not that they transported them across the us and left them unattended for some time, but that they exist. This weapon class was meant to be destroyed by an agreement between the usa and the soviet union. Yet they still had them in a combat ready condition, and thanks to this event everyone now knows that they didn't destroy them in the first place.

The drone operators in vegas aren't a big problem, because they are recruited from other branches of the air force, so there is no need for real pilots to be involved. The big resistance came from the leaders of the air force, not because of some remotely operated units, but because the government decided to use automated drones, without human controllers. Most people in the us air force think this is a bad idea and don't want to use them at all. Getting rid of these people was necessary before drones with fully autonomous ai can be used. The same thing will happen when ground units get replaced by robots in the future. Now the only question is which side will have better hackers?

Fellow from AMD ridicules Cell as accelerator weakling

auser

The cell has its problems...

but not because it has a power core; rather because the cell spes have a very low physical memory limit (256 Kbytes) and no way of using virtual memory. So they are very fast dsp chips. Even nvidia's gf8 line is better, because they can run c code (with cuda) without hacking around memory limits.

And the os never gets in the way, because most os kernels are fully multithreaded (except linux and macosx); even windows nt 4.0 was multithreaded. Not to mention that nvidia's gpu cores can run their own vliw risc based os, so they don't even need a cpu to work.

However the best idea comes from intel. They just put old pentium I cores into a single chip and call it a gpu. It's not the fastest solution, but certainly the most general purpose. You can even select a single core from this array and run a whole os on it; they call it atom. And yes, it runs crysis, and yes, it's a single core from intel's upcoming new gpu array.

On the other hand, while even nvidia is pushing out their own cpus, amd is stuck with a classic old big x86 cpu line and a traditional gpu line.

Google silences Android critics

auser

The google version of java...

As long as they don't open up their java engine to the public it will be a closed platform. The rest of the closed code doesn't really matter, since it's mostly just bytecode running on their virtual machine, but not opening up their vm means only a selected few hardware manufacturers can port it to their own handsets. This locks out open source communities and smaller companies who would choose android if they could get it to run on their own hardware. Without an open vm, this project could go the way of sony's closed source virtual machine, which slowly got replaced by a standard java stack even in their own phones because nobody supported it, despite having gcc as its compiler and open apis. Google should either say it's an open platform running on a closed system (like windows) or open it up for everyone.

Welcome to Las Vegas - Home of the technology superpower you've never heard of

auser
Happy

This is just a show...

like everything in las vegas. They bought an unused datacenter, painted it nicely and connected it to all sorts of networks. The thermal management system is a decorated version of the standard equipment everybody uses. Not to mention the cooling system, which could be based on a cheap closed cycle system, especially when the site is in the middle of a desert. I watched their cartoon too, and it looks like the owner is just playing the big boy and trying to attract customers.

Actually, I've seen better built datacenters in eastern europe. The ones used in hungary for the government's communications monitoring system are: more secure, almost always built underground, secured against terrorists and nuclear attacks, have their own backup power plants and a separate network backbone (not the small 40Gbps links switch uses, which is just a 4 color 10Gbps optical cable), fully distributed across the country, and officially nobody would admit they exist. (except for the funny failures, like the 'photographing is prohibited' sign in front of a plain mountain in one place, the elderly congressman who said 'of course we record everything', and the government's ability to pull phone and internet records and gsm location data out of their hat months after a crime is finally discovered by the police)

ps: Did you know that the easiest way to collect personal information on the people taking part in an antigovernment protest is to cross reference the gsm cell location info with the residential database of the area? As seen from the political trials currently under way, this results in a small 3% error rate. (a few tourists were accused too, because they were in the same area)

Landmine charity: Ban the killer robots before it's too late!

auser

Old sea mines are a bad example...

as they had a tendency to float off their anchors and blow up civilian ships, mostly red cross and hospital ships. While human operators also sometimes fire at civilians because they just want to kill them, the number of such cases is relatively low. One of the main reasons is that the level of intelligence tends to be higher in humans than in automated systems. A robot can only operate correctly if there are no neutral (civilian) targets in its area. While this is also a problem for humans, they usually have more 'common sense', which robots lack.

Richard Branson dupes entire wireless industry with Google on Mars gag

auser

What is really funny is...

that this sounds like a better idea for the colonization of mars than anything else the rest of the world has come up with.

Mac OS X Tiger out, Leopard back in

auser

Compatibility...

When the difference between the two macosx versions is just a minor version number, I wonder how they could make them so incompatible. It's like the difference between win5.0 (windows 2000) and win5.1 (windows xp): different graphics but the same os. The last time apple did a real os change was switching from macos9 to macosx, which was like switching from win3.1 to win2000. But since then the kernel has stayed the same. And companies selling software just want everyone to buy everything again, no matter how buggy it is. At least linux keeps compatibility with applications from the '60s, and bsd unix versions have even kept the whole os compatible for over 40 years now. Compared to this, the macosx problems and vista's "not even compatible with itself" approach are a big step back from what we have seen in the past.

DVB-H is the official mobile-TV standard

auser

There are two technologies here...

This is a case of broadcast tv versus ip tv. The first offers a more limited choice of channels, but it's free for everyone. The second offers everything, but can be controlled by the network operator, who can (and will) charge money for content. So the first is limited and the second is expensive. Imho, the best would be to give broadband access to everyone and let them download whatever they want from the internet. Of course this would be less profitable for big mobile network operators, but good for everyone else. (and you don't need new standards or protocols for that; imho the web is just good enough)

Microsoft admits big delay on Home Server bug fix

auser

I've actually read the documentation...

and it looks like microsoft's dynamic mount point handling has a bug in its software raid driver when configured in jbod mode. All that an application has to do is seek in a file while writing. So touching a file with almost any microsoft product will trigger the bug and trash the data partition on the disks. (office, ms photo, using a live account, etc.)
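
For illustration only, here is that access pattern in plain c: write, seek back, write again. On a healthy filesystem this sequence is completely ordinary, which is what makes the reported bug so easy to hit.

```c
#include <stdio.h>

/* The access pattern described above: write to a file, seek back into it,
 * and write again while it is still open. On any healthy filesystem this
 * is a perfectly normal sequence; the point is only how little it takes
 * to trigger the reported Home Server corruption. */

int main(void)
{
    FILE *f = fopen("demo.dat", "wb");
    if (!f) { perror("fopen"); return 1; }

    fwrite("AAAAAAAAAA", 1, 10, f);      /* write some data            */
    fseek(f, 2, SEEK_SET);               /* seek back while still open */
    fwrite("BB", 1, 2, f);               /* overwrite in the middle    */

    fclose(f);                           /* file now reads AABBAAAAAA  */
    return 0;
}
```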

In short, microsoft's implementation of the unix 'mount' command is bugged. On the other hand, all unix versions use mount and it tends to work. One of the reasons is that mounting is done directly by the kernel and not by a software trigger in one of the filesystem drivers, reaching back above its head to restart parsing on a different path. Because of this the unix way can't have the cool feature of having data implicitly duplicated across disks by the filesystem layer, but going top-down only clearly results in a more stable implementation. (common unix versions using the classic mount point structure include linux, bsd, macosx and others)

Why does this take so long to fix? Because they clearly left out a few locks from the kernel. And adding one big kernel lock around the whole code would slow certain metadata operations (like file delete) to a crawl.

Stroustrup and Sutter: C++ to run and run

auser

For a correct oop language...

you should try smalltalk. Java is the descendant of c and smalltalk, not of c++. Actually, as long as c++ doesn't have standard typeof() functionality or some form of dynamic classing (like the nonstandard microsoft class interface extension), it's very painful to write dynamic programs. C++ is a statically linked language, with such misfeatures as having static classes, class pointers and class references with different syntax and no proper unified interface. Not to mention the lack of dynamic interface types and proper property support. (all available as proprietary extensions in various microsoft products and as completely incompatible variants in other compilers)

The only truly supported standard language is c. You can write good oop code in c, and it's not really harder if you have a properly typed macro preprocessor (aka template support).
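
As a sketch of what oop in plain c can look like (the shape/circle names are just made up for the example), a struct of function pointers plays the role of the vtable:

```c
#include <stdio.h>
#include <stdlib.h>

/* Sketch of plain-C OOP: a struct of function pointers acts as the vtable,
 * and "objects" embed a pointer to it. The shape/circle names are only
 * illustrative. */

typedef struct shape shape_t;

typedef struct {
    double (*area)(const shape_t *self);
    void   (*destroy)(shape_t *self);
} shape_vtable_t;

struct shape {
    const shape_vtable_t *vt;
};

/* A concrete "subclass": the base struct comes first so a circle_t* can be
 * used wherever a shape_t* is expected. */
typedef struct {
    shape_t base;
    double radius;
} circle_t;

static double circle_area(const shape_t *self)
{
    const circle_t *c = (const circle_t *)self;
    return 3.14159265358979 * c->radius * c->radius;
}

static void circle_destroy(shape_t *self)
{
    free(self);
}

static const shape_vtable_t circle_vtable = { circle_area, circle_destroy };

static shape_t *circle_new(double radius)
{
    circle_t *c = malloc(sizeof *c);
    if (!c) return NULL;
    c->base.vt = &circle_vtable;
    c->radius = radius;
    return &c->base;
}

int main(void)
{
    shape_t *s = circle_new(2.0);
    printf("area = %.2f\n", s->vt->area(s));   /* "virtual" call */
    s->vt->destroy(s);
    return 0;
}
```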

ps: Why do kernel developers avoid c++? Because in c++ you can't make all memory allocations explicit, or if you do, you end up with plain c. And in a classic kernel, you have to jump through various hoops to get even the smallest amount of memory allocated. This is true for most kernels, including the winnt line, linux and bsd.

Giant PC outsourcer throws in the towel

auser

Foxconn...

They are the cheapest, but they produce the worst quality. In most brand-name machines I see, the part that breaks first is usually labeled foxconn. Considering their use of chinese youth (slave) labor, this isn't good news.

Time to rewrite DBMS, says Ingres founder

auser

This new technology is called an object oriented database

and its first working implementation was the object manager of smalltalk...

The idea is simple: just use data without transaction control and use automatic serialization only when two application logic threads happen to write to the same object at the same time. The result is a consistent in-ram object space that can be serialized down to disk when needed. This is the basic idea behind java, and this is what drives some hard realtime databases used for a few larger mmos.
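
A rough sketch of the idea in c (the names and the on-disk format are made up): each in-ram object carries its own lock, so writers only serialize when they hit the same object, and the whole object space can be dumped to disk on demand.

```c
#include <pthread.h>
#include <stdio.h>

/* Sketch: objects live in RAM, each carries its own lock, so writers only
 * serialize against each other when they touch the same object, and the
 * whole object space can be written out to disk on demand. Names and the
 * on-disk format are invented for illustration. */

typedef struct {
    pthread_mutex_t lock;
    long value;
} object_t;

#define OBJECT_COUNT 4
static object_t space[OBJECT_COUNT];

static void object_write(int id, long value)
{
    pthread_mutex_lock(&space[id].lock);   /* contend only on this object */
    space[id].value = value;
    pthread_mutex_unlock(&space[id].lock);
}

static void snapshot(const char *path)     /* serialize the object space */
{
    FILE *f = fopen(path, "w");
    if (!f) return;
    for (int i = 0; i < OBJECT_COUNT; i++) {
        pthread_mutex_lock(&space[i].lock);
        fprintf(f, "object %d = %ld\n", i, space[i].value);
        pthread_mutex_unlock(&space[i].lock);
    }
    fclose(f);
}

int main(void)
{
    for (int i = 0; i < OBJECT_COUNT; i++)
        pthread_mutex_init(&space[i].lock, NULL);

    object_write(0, 42);      /* two writers hitting different objects */
    object_write(1, 7);       /* would not block each other            */
    snapshot("space.txt");
    return 0;
}
```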

Automated crack for Windows Live captcha goes wild

auser

Spammers...

are simply using the algorithms developed for mmo games. In most games, the only way to interact with the system is through image recognition and simulated input events, because the game contains code against traditional hacking. The solution used there is to OCR the whole screen, extract the required data (like the location of a mob) and generate the input events (like attack). Using the same technology for captcha decoding is possible. The human vision system is well understood and there are good neural models that can give almost the same precision. There are cases when the captcha is so distorted that a computer has a better chance of understanding it than a human.

The problem is that if you require some intelligence to solve the problem, then some users won't be able to use the service. If you make it a hard but dumb task, a neural algorithm can be used to defeat it. And computers are getting better than some people.

For mass mail detection, a distributed database would work, where every email is recorded, fingerprinted and checked. If a mail matches one of the known spam mails, the spam can be revoked from all participating servers on the internet. The problem with this solution is that it lends itself to political censorship. Actually, the chinese government tries to do exactly this with every email and blog within china that contains any political meaning. A working, but usually nonpolitical, version is used by some israeli email providers to flag and filter spam. If a mail hits more than one of their user accounts, they flag it as spam, and if a user indicates that it's truly spam (by clicking), they remove it from all other accounts. The result is that at most one user sees each spam mail, no matter how many users get it in their inbox. Connecting multiple servers (and providers) decreases the redundancy of the checks and the amount of displayed spam.
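
A toy version in c of the fingerprint-and-revoke idea (the fnv-1a hash here just stands in for whatever fingerprint a real provider would use):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Toy version of the fingerprint-and-revoke idea: hash each message body
 * and look the hash up in a shared table of already-reported spam. FNV-1a
 * stands in here for whatever fingerprint a real provider would use. */

static uint64_t fnv1a(const char *s)
{
    uint64_t h = 1469598103934665603ULL;
    while (*s) {
        h ^= (unsigned char)*s++;
        h *= 1099511628211ULL;
    }
    return h;
}

#define MAX_KNOWN 1024
static uint64_t known_spam[MAX_KNOWN];
static int known_count;

static void report_spam(const char *body)          /* a user clicked "spam" */
{
    if (known_count < MAX_KNOWN)
        known_spam[known_count++] = fnv1a(body);
}

static int is_known_spam(const char *body)         /* checked on delivery   */
{
    uint64_t h = fnv1a(body);
    for (int i = 0; i < known_count; i++)
        if (known_spam[i] == h)
            return 1;
    return 0;
}

int main(void)
{
    report_spam("BUY CHEAP PILLS NOW");
    printf("copy 2: %s\n",
           is_known_spam("BUY CHEAP PILLS NOW") ? "filtered" : "delivered");
    printf("normal mail: %s\n",
           is_known_spam("see you at lunch?") ? "filtered" : "delivered");
    return 0;
}
```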

IBM explores 67.1m-core computer for running entire internet

auser

This is just simple application level distributed computing...

This is exactly the architecture google has been using its entire life. Ibm just tries to make it available to everyone as a package, instead of the do-it-yourself design that google uses. From a computational standpoint, there is no difference between using an N core ibm computer and using N single core pcs. (apart from the fact that pcs are cheaper)

Intel single-core 'Silverthorne' to sport HyperThreading

auser

This is the same core...

that will be used in intel's new multicore cpu, and it's based on the classic pentium design (the one before the pentium pro). They plan to use this cpu as a building block for the new generation of intel based videocards too, in groups of 8, 16, 32, etc. 64 such cores with 2 threads per core (=128 threads) would only mean 128 watts of power, about as much as a current day 8 core, 128 thread nvidia gf8800 card eats.

Man buys MacBook Air, pulls it apart, takes pics

auser

The layout of this system...

looks just like my old toshiba laptop. Even the ribbon cables are arranged in the same fashion. It's an ancient 10 inch laptop with a 7 inch color vga screen. The harddrive was placed right under the various plastic film ribbon cables, with a 2 mm thick led panel with glued-on components and an L shaped integrated cpu/vga/bridge/bios motherboard. The cooling was done with a nice piece of thin aluminium sheet covering the chips and conducting the heat towards the keyboard. (no fan on that old 486) The two differences were that it had a memory expansion stack-on card at the bottom and an expansion bay in place of the air's wireless card. Btw, the laptop I'm talking about is still working and in use; it has been outfitted with ssd storage for vibration resistance and installed as a remote operated vehicle control unit.

So far it seems the apple engineers have successfully copied a 20 year old design. Except for the led based backlight, that is a new technology. Imho, even an asus eee has more technology in it.

Intel to tell all about roaring 96GB/s QuickPath interconnect

auser

30MB of cache...

means around 8MB for every core and 4MB for every thread. This is the minimal amount that is required for an itanic core to work at all. The architecture is fixed, so they can't upgrade it without forcing users to recompile everything. Increasing the clock speed and the cache size are the only options. For an x86, you can always play with the number and depth of pipelines. (the x86 has some reserve power in it: with a broader hardware based optimizer you could theoretically get 256 instructions per clock, with current cpus doing 4, while the itanium is limited to 4 instructions by its own specification and can't evolve)

Israel electric car project aims to wipe out oil

auser

Ever heard of the Edison electric car?

He had a working prototype and planned to put battery exchange stations along the roads. Sadly, the electrical network was essentially nonexistent back then, but everybody had petrol for their lamps. His motor went into Ford's model-T as its starter engine/generator, and the car could run from its battery for a few dozen miles without petrol. This is why the Toyota prius used the same gear and motor alignment as the model-T. (with the embarrassing backing-up-on-slopes bug also copied in the first models)

This 'new' idea finally implements the electric car concept as it was intended around a hundred years ago. For a power source, you could use clean nuclear stations (pebble beds are good, because all waste can be recycled into weapons or fuel), hydro, geothermal or anything you want. For distribution, you could charge batteries at recharge stations. So a user has the choice to charge or swap.

About the car's lifetime, I have very positive experience with electric vehicles. Some 25+ year old electric buses are still operating perfectly in eastern europe, despite a lack of maintenance and notoriously bad roads. The same can be said about the 60 year old trams that were built after world war 2 and are still running regular service, or the old electric underground line of Budapest, where the first upgrade was performed roughly 80 years after it started operating. (it was constructed in the 19th century, being the second line after London and the first fully electric one, with electromechanical program controlled automated train control built in) In short, electric vehicles simply don't break too often or get old.

Home Sec in anti-terror plan to control entire web

auser

tor remarks...

""Because the ISP can see *who* your computer is talking to, and intelligence agencies have the internet tapped. They can see you visit website x, talking through tor node 3, talking through tor node 2, talking through tor node 1, talking to your computer. They can see who is talking to whom, and depending on your computer sending requests that get relaying through the network, they can follow the string right back to you because they are capable of observing the whole internet.""

This is only true if only one person is using the tor network at a time, or if tor uses a separate connection for every user between its relay points. If the whole internet is monitored (and in some western countries the whole country is monitored), then the system can map who uses tor and what requests come out of tor. But if the network is working properly, they can't connect the requests to the users, so they can't know who did what among all the users connected at the time of the requests. And I haven't even mentioned hiding traffic amongst normal requests or using privately owned nodes in countries without monitoring.

Censoring a network that is designed for safe communication and to resist censoring doesn't sound like a smart idea. The government would get better results if they put a police officer behind every user's back or banned the use of technology altogether.

Google's Android - big name, big question on payment

auser

Platform problems...

The android platform is somewhere between the single program, no os smartphones and the full blown desktop (windows) environment. This is the same place where windows mobile lives. The only difference is that the android platform is based on linux and google's vm. The platform is as open as its windows counterpart, because you can't get 100 percent of the source code, but you can get the apis. You can write a new dialer for windows mobile or android, but you can't make a fully open source system. This rules out hardware developers who are looking for a free platform, since you have to pay to get it ported to your hardware in both cases. (you can't do it yourself without having the source code)

New 3D chip transistor may reach 50GHz

auser

Cpu speeds...

Currently the fastest working cpu is around 10Ghz and made by ibm. Intel is working on a similar design and its fastest cpu is around 8Ghz. Packing transistors in a vertical fashion is not new, and the elimination of metal wires between transistors could make systems faster, smaller and cooler. All you need is a way to put multiple layers of conducting silicon and insulating silicon dioxide on top of each other. Adding metal oxide gates to this technology could result in systems around 100 Ghz, with 1 Thz attainable in research labs. Apparently Moore's law still pretty much works...

Nokia app monitors juice

auser

Shouldn't this be a built-in functionality?

Like in the case of the good old sony-ericssons, where it's part of the hardware info, along with the battery status and temperature, the cpu temperature, the radiated power and lots of other junk... You can even get this info through the standard (serial or usb) modem interface without starting an extra application.
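
For example, on phones that expose the standard modem interface you can ask for the battery charge with the AT+CBC command from the 27.007 spec. A rough c sketch (the serial device path and baud rate are only assumptions about a particular setup):

```c
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <termios.h>
#include <unistd.h>

/* Sketch of pulling battery info over the phone's standard modem interface
 * with the AT+CBC command (3GPP TS 27.007). The device path and baud rate
 * below are assumptions; on a given setup it might be a USB or bluetooth
 * serial port instead. */

int main(void)
{
    const char *port = "/dev/ttyACM0";         /* assumed device path */
    int fd = open(port, O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    if (tcgetattr(fd, &tio) != 0) { perror("tcgetattr"); return 1; }
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tio.c_lflag &= ~(ICANON | ECHO);           /* raw-ish input */
    tcsetattr(fd, TCSANOW, &tio);

    const char *cmd = "AT+CBC\r";              /* battery charge query */
    write(fd, cmd, strlen(cmd));

    char buf[128];
    ssize_t n = read(fd, buf, sizeof buf - 1); /* expect a "+CBC: 0,85" style reply */
    if (n > 0) {
        buf[n] = '\0';
        printf("reply: %s\n", buf);
    }
    close(fd);
    return 0;
}
```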

Melting ice kills polar bears, say boffins

auser

Only one small problem...

If we let evolution solve this, like it did with the dinos, then we will just die out like they did. So while evolution is a nice way to sort things out, shouldn't we go against it and save our _own_ species? Personally I don't care about what caused this situation, but I would really like to know how to turn it back. We can accept that it's a natural phenomenon and do the same as the dinos did and simply die out. I would rather see a future where human intervention reverses the current trend, be it natural or man made.

Motorola signs up for tiny projectors

auser

PDA phones...

Today's pda phones have the big problem of being unusable as laptop replacements. However, if you could get your phone to project a keyboard and a touchable screen onto your desk, using them would be much easier. Also, mobile tv and video are only viable if you can watch the films on a bigger screen when you have the space. Adding an extra, projected screen is also very cheap.

ps: Lately I've checked what is needed for a small rgb vga projector. It turns out that all you need is two servos (voice coils or piezo buzzers), a single high brightness rgb led and some tiny optics. The driver circuit is one microcontroller and at least one 8 channel mosfet driver ic, and you could get vga resolution with 2.1 sound. And I'm talking about homemade stuff... In theory this could go down to the size of a matchbox. Add a camera and you have a projected touchscreen.

Israel deploys robo-snipers on Gaza border

auser

What they call a mistake is...

when they shoot their own civilians. The system is capable of 360 degree fire, and on one side of the border they have their own civilians. The idf needs the operators to make sure they only shoot non-israeli children, not their own. Currently this is the only accident that would be considered a mistake. Current machine vision systems can only recognise targets by their size and movement speed, not their nationality or intentions.

Microsoft waves in Minority Report-style computing era

auser

The two ideas presented by microsoft

come from two university research programs. The multitouch interface was already mentioned in a comment; the other one is the 'items as keys' technology, which a japanese professor demonstrated by placing his mobile phone on a table as a key while it was scanned by a camera. Actually, this could be built very cheaply, but instead of infrared cameras one should use optical cams. 3d triangulation is easy with more than 3 fixed position cameras. The other factor is the price of the display.
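
To show why the triangulation part is easy, here is a small c sketch with invented camera positions: each camera contributes a ray towards the object, and the 3d position is taken as the closest-approach midpoint of two rays (with more cameras you would just least-squares over all the pairs).

```c
#include <stdio.h>

/* Sketch of the triangulation step: each camera gives a ray (its position
 * plus a direction towards the tracked object), and the 3d position is
 * taken as the midpoint of the shortest segment between two such rays.
 * The camera positions and directions below are invented test data. */

typedef struct { double x, y, z; } vec3;

static vec3 add(vec3 a, vec3 b)     { return (vec3){ a.x + b.x, a.y + b.y, a.z + b.z }; }
static vec3 sub(vec3 a, vec3 b)     { return (vec3){ a.x - b.x, a.y - b.y, a.z - b.z }; }
static vec3 scale(vec3 a, double s) { return (vec3){ a.x * s, a.y * s, a.z * s }; }
static double dot(vec3 a, vec3 b)   { return a.x * b.x + a.y * b.y + a.z * b.z; }

/* Closest-approach midpoint of rays p1 + t*d1 and p2 + s*d2. */
static vec3 triangulate(vec3 p1, vec3 d1, vec3 p2, vec3 d2)
{
    vec3 w = sub(p1, p2);
    double a = dot(d1, d1), b = dot(d1, d2), c = dot(d2, d2);
    double d = dot(d1, w),  e = dot(d2, w);
    double den = a * c - b * b;            /* near zero means parallel rays */
    double t = (b * e - c * d) / den;
    double s = (a * e - b * d) / den;
    vec3 q1 = add(p1, scale(d1, t));
    vec3 q2 = add(p2, scale(d2, s));
    return scale(add(q1, q2), 0.5);
}

int main(void)
{
    /* two cameras looking at a point that really sits at (1, 1, 0) */
    vec3 cam1 = { 0, 0, 0 }, dir1 = { 1, 1, 0 };
    vec3 cam2 = { 2, 0, 0 }, dir2 = { -1, 1, 0 };
    vec3 p = triangulate(cam1, dir1, cam2, dir2);
    printf("estimated position: (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    return 0;
}
```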

However, I think a better design would be a touch sensitive lcd set up as a drawing board. (but we already have that) This tabletop layout is good for meetings, because photos and documents can be laid out and shared between the viewers. In this case the fact that people can look at the content together is a plus. (the classic scenario of looking at family photos, with the owner marking things with his/her index finger and talking about the picture)

ps: For someone who asked if the system could detect empty beer glasses and order new ones: yes, it could do that, but it needs a barcode on the bottom of the glass.

My RFID-embedded car numberplate has a virus

auser

A classic example of technology used the wrong way.

License plates can be read with optical scanners, and many toll road systems and most police cameras already do just that. Adding rfid is not needed in this case.

The threat can be seen in a better light if we take active rfid tags into consideration. They have their own power source and can be reprogrammed via the radio link. They are used as writable stickers when the system needs more info on the chip than its serial number. Many japanese mobile phones contain such an active rfid chip. It's good because one can copy his entry card into his phone, or, in other cases, someone is able to 'borrow' a card from someone else and use it, along with the identity of the owner. This technology is largely unused now but already deployed.

Adding the strong cryptographic capabilities needed for secure data transfer would drive costs sky high, so this is not an option today. We are left with a chip that can only be protected with an electromagnetic shield (a faraday cage). But when we need this, it's much easier and usually cheaper to just use an optical scanner, like a videocamera.

ps: To crash or break a system with an rfid reader, you can just send garbage until you find a failure in the code.