Technology bottlenecks currently holding our future to ransom


W.H. Auden once wrote in his famous poem, ‘As I Walked Out One Evening’: “All the clocks in the city began to whirr and chime: O let not Time deceive you, you cannot conquer Time.” It’s a warning technology seems intent on ignoring at this juncture. Several years have passed since the introduction of 1 GHz CPUs, Blu-ray Discs and even electric cars. And yet, technology seems stuck, unable to push further into the beyond and reach the next stage of performance. That doesn’t mean the breakthroughs aren’t close, though. Let’s take a look at some of the biggest technology bottlenecks currently holding the future to ransom, and the advances coming our way to set us free (without the horrifying consequences that discovering a world conquered by machines would bring).

CPU Speed 

When processors broke the 1 GHz barrier, it was a significant achievement in computer history. Over the past few years, however, clock speeds have hovered around 2.5 GHz when you consider single-core processors. Yes, multi-core systems seemed to be the breakthrough we needed, but they carry their own bottlenecks (see next). And despite that breakthrough, a single core still clocks in at around 1.5-2 GHz. So why haven’t processor speeds increased over the years? A prominent factor is transistor size – transistors are getting smaller, which allows more of them to be packed onto a single die (thus obeying Moore’s Law), but they’re not getting faster. CPU transistors these days follow the metal-oxide-semiconductor field-effect transistor (MOSFET) scaling process, and they function as electronic signal switches. As they become smaller, transistors are supposed to switch faster, which leads to increased performance. However, MOSFET scaling has its fair share of bottlenecks. The electrical circuit faces higher subthreshold conduction (leakage that can account for as much as half of a chip’s total power consumption), difficulties in scaling the transistor down beyond a certain point (Intel has pushed this to a 22 nm process with its Ivy Bridge chips), interconnect capacitance (the metal-layer capacitance between various portions of the chip – as it grows, signals take longer to travel through the interconnect, lowering performance), and many more.
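To put the “more transistors, not faster transistors” trend in perspective, here is a minimal Python sketch that projects transistor counts under Moore’s Law (doubling roughly every two years) while holding clock speed flat. The starting figures are illustrative assumptions, not measured data.

```python
# Illustrative projection: transistor counts keep doubling (Moore's Law)
# while clock speeds stay roughly flat. Starting values are assumptions
# chosen for illustration, not vendor-published figures.

START_YEAR = 2002
START_TRANSISTORS = 50e6   # assumed ~50 million transistors on a desktop CPU
START_CLOCK_GHZ = 2.0      # assumed ~2 GHz clock speed
DOUBLING_PERIOD_YEARS = 2  # Moore's Law: density doubles roughly every 2 years

for year in range(START_YEAR, 2013, 2):
    elapsed = year - START_YEAR
    transistors = START_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)
    print(f"{year}: ~{transistors / 1e6:,.0f} million transistors, "
          f"~{START_CLOCK_GHZ:.1f} GHz clock")
```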
 
Another issue is the “interconnect bottleneck”. Integrated circuits are scaled down in size to allow transistors to run at higher frequencies – at the cost of packing the already dense CPU even tighter. This increases parasitic capacitance – a type of capacitance created simply by the proximity of electrical components – which causes them to fall short of their best possible performance. In more serious scenarios, it can lead to “crosstalk”, wherein signals from one circuit bleed into another, generating interference in operation. There’s also the issue of signal propagation delay, which further drags down speeds.
This chart showcases when the interconnect bottleneck occurs. As line width decreases, the interconnect delay grows larger
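As a back-of-the-envelope illustration of why the chart trends that way, the sketch below models wire delay with the simple RC approximation (delay is proportional to resistance times capacitance): shrinking a wire’s cross-section raises its resistance, so even as transistors get faster, the interconnect delay grows. All the numbers are arbitrary illustrative assumptions.

```python
# Toy RC model of interconnect delay: delay is proportional to the wire's
# resistance times its capacitance. Resistance rises as the cross-section
# shrinks, while capacitance per unit length stays roughly constant.
# All values below are arbitrary, for illustration only.

RESISTIVITY = 1.7e-8      # ohm-metre, roughly that of copper
CAP_PER_METRE = 2e-10     # farad per metre, assumed constant
WIRE_LENGTH = 1e-3        # a 1 mm wire

for width_nm in (250, 130, 90, 65, 45, 32, 22):
    width = width_nm * 1e-9
    cross_section = width * width            # assume a square cross-section
    resistance = RESISTIVITY * WIRE_LENGTH / cross_section
    capacitance = CAP_PER_METRE * WIRE_LENGTH
    delay_ps = resistance * capacitance * 1e12
    print(f"{width_nm:>4} nm line width -> ~{delay_ps:,.0f} ps RC delay")
```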
 
Solutions range from altering transistor materials to using optical components in integrated circuits, both currently expensive methods. Intel has been working on a new 3D integrated circuit, wherein the components are arranged both vertically and horizontally within a single circuit. This allows more transistors within a smaller volume of space, thus following Moore’s Law and providing increased performance. It would also result in shorter interconnects, a significant reduction in power consumption, lower production costs, and a brand new range of design possibilities.
 
Multi-Core Processors
Multi-core CPUs were born of a single question: How do chip manufacturers keep up with a faltering Moore’s Law without radically altering transistor design or size? The answer was to split the CPU into cores – distinct sections that follow the diktat of parallel processing. If a 1.5 GHz processor can push through 500 million instructions, then a quad-core processor clocked at the same speed can, in theory, push through 2,000 million. Theoretically, this means having a single CPU with the power of four. Yet two fast single-core CPUs can still easily eclipse the performance of a quad-core. Why has the potential thus far remained untapped? Blame it on current software architectures, which must be extensively rewritten to take advantage of this new avenue of power. After all, if a program only knows how to utilize one core while the remaining cores sit idle, what difference does it make? Smartphones with quad-core CPUs and mobile OSes like Android face this problem too. Developers tend to program for the lowest common denominator, since it’s not a given that everyone will have a multi-core CPU. Nonetheless, as multi-core chips become the norm across consumer systems, operating systems and applications will need to be retooled and optimized to fully exploit their power.
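To see why software must be written with cores in mind, here is a minimal Python sketch (assuming a quad-core machine) that sums a large range first on one core and then split across a process pool; only the second version can actually use the extra cores.

```python
# Minimal sketch: the same workload run serially (one core) and in
# parallel across a pool of worker processes. Only the parallel version
# can take advantage of a multi-core CPU.
import time
from multiprocessing import Pool

def partial_sum(bounds):
    start, end = bounds
    return sum(range(start, end))

if __name__ == "__main__":
    N = 50_000_000
    CORES = 4  # assumed core count, for illustration

    t0 = time.time()
    serial = sum(range(N))                    # runs on a single core
    print(f"serial:   {serial} in {time.time() - t0:.2f}s")

    chunks = [(i * N // CORES, (i + 1) * N // CORES) for i in range(CORES)]
    t0 = time.time()
    with Pool(CORES) as pool:                 # work split across cores
        parallel = sum(pool.map(partial_sum, chunks))
    print(f"parallel: {parallel} in {time.time() - t0:.2f}s")
```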
 
GPUs
Compared to CPUs, Graphics Processing Units (GPUs) are in a league of their own. Despite running into the same bottlenecks in transistor size and parallel processing, game developers and manufacturers have still devised algorithms and drivers to unleash their overwhelming power. On top of this, GPUs have only now begun realizing the full potential of multi-GPU setups after ATI’s rocky early days with CrossFire. Now, whichever manufacturer you pick, two graphics chips can work in parallel to boost a system’s performance. It’s no stretch to say that it’s often the other components of the system that bottleneck the GPU, simply by not being able to keep up. With this much power, however, energy consumption and heat become major factors – so much so that various, sometimes problematic, cooling solutions have been devised to cope with the heat. Energy consumption may be tackled a little differently in the years to come. One interesting development forsakes accuracy in computing for up to 15 times better power efficiency. Called “inexact” computing, it can best be described as “smart error management” for microchips, controlling the probability of errors while confining their frequency. This is achieved through “pruning”, or removing the rarely used portions of the processor, and tests showed it could cut energy usage by 15 times while keeping deviations in output within about 8 percent. Today’s CPUs should, in theory, be able to make up for this sacrifice; GPUs, with their excess of raw power, would cope even better and benefit handsomely from the reduced energy consumption. The technology is expected to see greater exposure by 2013.
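The trade-off at the heart of inexact computing can be imitated in software: the toy sketch below “prunes” low-order bits from an addition (a crude stand-in for removing rarely used hardware) and measures how far the cheap result strays from the exact one. It is purely illustrative and has nothing to do with the actual chip designs.

```python
# Toy illustration of the "inexact computing" trade-off: drop the low-order
# bits of the operands (a crude stand-in for pruning rarely used hardware)
# and measure how far the cheap answer strays from the exact one.
import random

def pruned_add(a: int, b: int, dropped_bits: int) -> int:
    mask = ~((1 << dropped_bits) - 1)   # zero out the lowest bits
    return (a & mask) + (b & mask)

random.seed(42)
pairs = [(random.randrange(1 << 16), random.randrange(1 << 16)) for _ in range(10_000)]

for dropped in (0, 2, 4, 8):
    errors = [abs((a + b) - pruned_add(a, b, dropped)) / (a + b or 1)
              for a, b in pairs]
    avg_error = 100 * sum(errors) / len(errors)
    print(f"dropping {dropped} low bits -> average error ~{avg_error:.2f}%")
```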
 
Front-Side Bus
The Front-Side Bus or FSB is a little-known but vital component of every system. Part of the motherboard, it is the central link between the CPU and the other devices in a computer. The overall bandwidth the FSB can provide depends on its clock frequency (the number of cycles it performs each second), the width of its data path and the number of data transfers it performs per cycle. The problem arises with the number of clock cycles an FSB can perform. Since it runs at a frequency set by the motherboard, and therefore a fixed number of clock cycles, its overall bandwidth is capped. Hence, no matter how much faster CPUs get, their speed will always be constrained if the FSB can’t keep up, with the CPU sitting idle for one or more clock cycles until memory returns its value. Memory is also accessed via the front-side bus, which limits overall transfer speeds further since the same bandwidth is shared for this purpose.
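That bandwidth relationship is simple enough to compute directly: peak FSB bandwidth = clock frequency × transfers per cycle × data-path width. The figures below are typical illustrative values, not any specific motherboard’s spec sheet.

```python
# Peak FSB bandwidth = clock frequency x transfers per cycle x bus width.
# Example values are typical/illustrative, not a specific board's spec.

def fsb_bandwidth_gbps(clock_mhz: float, transfers_per_cycle: int, width_bytes: int) -> float:
    """Return peak bandwidth in GB/s."""
    return clock_mhz * 1e6 * transfers_per_cycle * width_bytes / 1e9

# A 200 MHz "quad-pumped" 64-bit FSB (marketed as 800 MT/s)
print(fsb_bandwidth_gbps(200, 4, 8))    # -> 6.4 GB/s
# A 333 MHz quad-pumped 64-bit FSB (marketed as 1333 MT/s)
print(fsb_bandwidth_gbps(333, 4, 8))    # -> ~10.7 GB/s
```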
 
AMD’s HyperTransport forms point-to-point links between different components, eliminating the need for a front-side bus.
 
Both AMD and Intel have developed their own successors to the ageing FSB – HyperTransport and QuickPath Interconnect respectively – which replace the FSB’s central hub with direct point-to-point connections between components. AMD’s HyperTransport, for instance, allows a theoretical aggregate throughput of up to 51.2 GB/s. To ensure none of that point-to-point bandwidth is wasted on memory traffic, a memory controller integrated into the CPU itself is used to access RAM. Both companies already employ their respective technologies in various components, and they are steadily replacing the front-side bus in newer motherboard designs.
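For the curious, that 51.2 GB/s headline figure (quoted for HyperTransport 3.1) falls out of the same bandwidth arithmetic used for the FSB above: a 3.2 GHz link clock, double data rate, a 32-bit link width, and traffic counted in both directions.

```python
# HyperTransport 3.1 aggregate bandwidth, from the published link parameters:
# 3.2 GHz link clock, double data rate, 32-bit (4-byte) link, both directions.
clock_hz = 3.2e9
transfers_per_cycle = 2     # double data rate
width_bytes = 4             # 32-bit link
directions = 2              # full duplex, counted both ways ("aggregate")

aggregate_gbps = clock_hz * transfers_per_cycle * width_bytes * directions / 1e9
print(f"{aggregate_gbps:.1f} GB/s")   # -> 51.2 GB/s
```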
 
Intel showcases its new QuickPath Interconnect technology to replace front-side buses in the future
 
Bandwidth Limitations
At last count, more than a billion people around the world are connected to the internet. By 2016, the number of internet devices will outnumber the world’s people. Blame it on smartphones and netbooks, as the world lives half its life online. And yet, despite the advances made over dial-up and narrowband, internet speeds still feel inadequate. We’ve been using the same routers for the past five years to connect to the internet, which really doesn’t do justice to the amount of data flowing through the future-proof fiber optic cables running under our oceans. And even though Wi-Fi removes the last hurdle, beaming data through the air, it still depends on a transmitter or tower that is hard-wired into the vast global network of cables. Giving everyone their own personal fiber optic connection to overcome this bottleneck isn’t practical or economical.
 
However, optical memory devices are being developed to replace the routers of old. NTT, a Japan-based telecommunications company, has been working on such devices for years; they switch between light-transmitting and light-blocking states (much like the signal switching of a MOSFET) to create digital signals. The technology is very light on energy consumption, using just 30 nanowatts of power and retaining data for one microsecond. This makes it far more efficient than conventional electrical routers, and helps maintain the high data rates carried by fiber optic cables.
 
Wireless technology is also being constantly improved, and the introduction of LTE or 4G showcases how far mobile bandwidth has come, with peak download rates of 300 Mbit/s, peak uplink rates of 75 Mbit/s, support for carrier bandwidths from 1.4 MHz to 20 MHz, and transfer latency of less than 5 milliseconds. Of course, this depends on the equipment one is using. Not all devices currently support LTE, but as it becomes more popular, it should begin phasing out 3G in the years to come, and pave the way for even more advanced iterations.
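To put those peak rates in perspective, a quick calculation (using an illustrative 700 MB file and ideal conditions that real networks rarely deliver) shows what 300 Mbit/s down and 75 Mbit/s up would mean in practice.

```python
# Time to move a file at LTE's peak rates. 300 Mbit/s down and 75 Mbit/s up
# are the standard's peak figures; the 700 MB file size and the assumption
# of ideal, uncontended conditions are purely illustrative.
FILE_MB = 700
file_megabits = FILE_MB * 8

for direction, rate_mbps in (("download", 300), ("upload", 75)):
    seconds = file_megabits / rate_mbps
    print(f"{direction} at {rate_mbps} Mbit/s: ~{seconds:.0f} seconds")
```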
 
Hard Drives
If the FSB is one of the least known yet most limiting bottlenecks in a system, then the hard disk drive is the most familiar of the bunch. Hard disks record data by magnetizing a film of ferromagnetic material on the disk. Information is written to and read from platters – layers of circular disks that hold the recorded data – which rotate past read-and-write heads at speeds of 4,200 RPM (in ordinary systems) to 15,000 RPM (in high-performance servers and the like). As such, they rely on mechanical moving parts for data transfer, which are more susceptible to shock and damage, and limited in terms of speed – not least because of the time a disk needs to “get up to speed” before it can transfer data.
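Those spindle speeds translate directly into waiting time: on average the platter must spin half a revolution before the requested sector reaches the head, so average rotational latency = 0.5 / (RPM / 60). A quick sketch:

```python
# Average rotational latency: on average the platter must spin half a
# revolution before the requested sector passes under the read-write head.
def avg_rotational_latency_ms(rpm: int) -> float:
    revolutions_per_second = rpm / 60
    return 0.5 / revolutions_per_second * 1000  # convert seconds to ms

for rpm in (4200, 5400, 7200, 10000, 15000):
    print(f"{rpm:>5} RPM -> ~{avg_rotational_latency_ms(rpm):.1f} ms average latency")
```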
 
Solid State Drives (SSDs) were introduced to make up for the shortcomings of traditional hard drives. They are distinguished by their use of solid-state electronic components – flash memory – instead of spinning magnetic disks and other moving parts. This offers far lower access times and latency, and therefore better performance and data transfer rates. SSDs are also much more reliable than hard drives mechanically, having no exposed moving parts that can be damaged. One major drawback, however, is that when a solid state drive does fail, there is little to no hope of recovering any data (whereas data on a failed HDD can often be at least partially recovered). SSDs also remain expensive, and traditional hard drives offset the performance gap by offering far more storage for the same money.
 
Compared to an HDD, a Solid State Drive has no moving parts and stores information electronically in flash memory
 
Hybrid drives have since been introduced that incorporate aspects of both HDDs and SSDs: the familiar platters, paired with a non-volatile flash memory cache for frequently accessed data.
 
Battery
The battery is the quintessential component of just about every mobile device. Batteries work by converting chemical reactions into electrical energy: electrons flow from the negatively charged anode to the positively charged cathode through an external circuit, while ions pass through an electrolyte inside the cell. This difference in charge is what produces an electrical current. Power management matters in desktop systems, but even more so for smartphones and tablets, whose mobility and usefulness are defined by just how long you can go on a single charge. Yet the underlying technology has remained largely unchanged for more than eight years. This is evidenced by no less than the newest iPad, whose battery takes up the bulk of its internal space and weight. Lithium-ion cells offer good energy density, with very little self-discharge when not in use, and they rose to popularity because they are rechargeable, thanks in part to the lithium compound used as the electrode material.
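Battery capacity itself is simple arithmetic: energy (watt-hours) = nominal voltage × charge (amp-hours), and dividing by the device’s average power draw gives a rough runtime. The figures below are hypothetical example values, not any particular device’s specification.

```python
# Energy stored in a battery = nominal voltage (V) x charge (Ah), in watt-hours.
# Runtime estimate = stored energy / average power draw.
# All numbers are hypothetical example values, not a real device's spec sheet.
nominal_voltage_v = 3.7      # typical for a lithium-ion cell
capacity_mah = 10_000        # assumed tablet-class battery
avg_power_draw_w = 5.0       # assumed average draw with the screen on

energy_wh = nominal_voltage_v * capacity_mah / 1000
runtime_h = energy_wh / avg_power_draw_w
print(f"~{energy_wh:.1f} Wh stored, roughly {runtime_h:.1f} hours of use")
```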
 
Despite steady improvements in the technology powering mobile phones and tablets, the battery’s relative size and risks remain unchanged. Lithium-ion cells also have the disadvantage of gradually losing capacity with age and use. There is also the danger of leakage or, worse, combustion if they are overcharged. The sheer number of safety measures incorporated to prevent this takes up valuable space in the battery, making it bulkier.
 
This diagram showcases the movement of electrons between electrodes within a lithium-air cell.
 
But researchers have been looking for alternatives. Some propose moving on from lithium to other metal compounds. IBM’s approach, which uses air itself, looks the most promising of the bunch. The cathode is made of carbon rather than the usual lithium-based metal oxide, and it reacts with oxygen in the air to produce electricity. However, there are issues with getting the battery to recharge. Plus, lithium ignites on direct contact with moist air. Solutions to rid the battery of these issues are being worked on, and we could see developments by 2013, with plans for commercialization by 2020.
 
IBM showcases a comparison between current battery technologies and its upcoming lithium-air cells
 
Electric Cars
Take any science fiction movie from the past 20 years. Blade Runner showed us a future of flying cars by the year 2019 (not to mention Replicants indistinguishable from normal humans). And who could forget the time-travelling hover car from Back to the Future, affectionately retrofitted with technology from the year 2015? Yet, looking at the world, it’s still the usual oil and combustion engines dominating transportation. Whatever happened to the cars that could fly? Or cars that would drive themselves wherever we asked? Or, for that matter, the electric cars that re-emerged with renewed interest in the 21st century amid rising fuel prices and pollution concerns? The bottlenecks here are as economic as they are practical. A Chevy Volt sells for about US $40,000 – well out of the reach of most citizens even in developed nations. Then there’s the issue of charging. Whether we like it or not, for as long as the lithium-ion cell is being used, there will always be a cap on the true potential of the electric car.
 
A recharge station for electric cars in Israel, part of its new nationwide network
 
It hasn’t been easy, but we can comfortably say that we’re a lot further along in bringing significant advances to automobile technology than before. Google has already begun public testing of its experimental driverless vehicles. Better Place has opened the world’s first nationwide electric car network in Israel, with four battery-swapping stations currently open to travellers and another 40 planned to roll out this year. Several subsidies have also been introduced to help offset the net expense of electric cars. And in the end, it’s still a great investment that will pay for itself in the years to come.
 
Holographic Displays
Yes, the “ghost” of Tupac Shakur wasn’t really a hologram. But what we’re referring to here is something that’s been desired for years: an entire world emerging from our screens and seemingly becoming one with ours. Interactivity, functionality and true immersion have teased us for decades. So why are we still stuck with HDTVs and 3D glasses rather than watching the future as it was meant to be seen?
 
Paula Dawson’s ‘Luminous Presence’ uses ‘early mosaics and gilded aureoles to augment the interface between holographic images and the beholder’
 
Several companies like Zebra Imaging are making headway, with its ZSCAPE 3D prints displaying the sensation of parallax unique to holograms. There are several projects underway to develop holographic content, with applications in fields such as geology and architecture. True holographic displays are still a ways off, but researchers at MIT, building on Stephen Benton’s pioneering work, are still working towards commercializing them. The basic aim is to have this in a consumer device within the next few years. Considering the progress already demonstrated using the Microsoft Kinect motion sensor, it may not be long. However, you can bet it’ll be a while before the technology comes cheap.
 
The process that goes into Zebra Imaging’s holograms
 
Smart Homes
The Jetsons put forth the ideal, if quirky, notion of living in the future: a house that follows your every command and responds to what you want and need at any given time, with AI-controlled servants handling the more mechanical labour. Decades later, smart homes, part of the “home automation” effort, are more possible than ever. There are already many commercial homes that feature smart devices for regulating the use of lights and other appliances, intercoms for communicating between different rooms, motion sensors for security, and much more. However, the true smart home is one that is fully automated; one that is an ecosystem unto itself.
 
Such smart homes face all the usual problems, however: increased electricity costs, the sheer number of interacting devices required to run the entire system, the cost of purchasing said homes…and sadly, our future robot help still has a long road to travel before it’s ready to serve us.
 
Cisco Systems is taking a great initiative in Bangalore, partnering with Mantri Developers Pvt. Ltd to develop such smart homes for the Indian population. To make things more appealing cost-wise, Mantri is letting residents pick and choose which services they’d like to use, with options to upgrade in the future. One of the bigger obstacles in home automation rests in communication: just how do you get the house to do what you want, without getting into too many technical details? Microsoft is currently working on a solution to this with HomeOS, an operating system in development specifically for home automation. A proposed app store for devices would allow anyone with a Windows phone to set controls for the house, besides helping discover new devices for the set-up. Demonstrated functions include regulating internal temperature (switching off fans if someone turns up the heat) and sending notifications when someone’s at the door (along with a live feed covering the same); a simple rule of this sort is sketched below. Google has also announced its own smart home initiative, dubbed Android@Home, but so far there hasn’t been much news surrounding it.
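To give a flavour of the kind of rule such a system runs on, here is a minimal automation loop in Python. This is a hypothetical sketch, not the real HomeOS or Android@Home API; the device names, the Event class and the rules themselves are all invented for illustration.

```python
# Hypothetical sketch of home-automation rules of the kind described above.
# Device names, events and rules are invented for illustration; this is not
# the real HomeOS or Android@Home API.
from dataclasses import dataclass

@dataclass
class Event:
    device: str     # e.g. "thermostat", "doorbell"
    reading: float  # e.g. target temperature, or 1.0 for "pressed"

def handle(event: Event, devices: dict) -> None:
    # Rule 1: if the heat is turned up, switch the fans off.
    if event.device == "thermostat" and event.reading > 25:
        devices["fan"] = "off"
        print("Heat turned up -> fans switched off")
    # Rule 2: if someone is at the door, notify the owner and start a feed.
    if event.device == "doorbell" and event.reading == 1.0:
        devices["door_camera"] = "streaming"
        print("Someone is at the door -> notification sent, live feed started")

devices = {"fan": "on", "door_camera": "idle"}
for event in (Event("thermostat", 27.0), Event("doorbell", 1.0)):
    handle(event, devices)
print(devices)
```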

Ravi Sinha
Digit.in