Monday, May 7, 2012

ITE 221 - Spring 2012 - Chapter 7

My blog for Chapter 7 is related to speech-recognition software.  The article, titled “Can Speech-Recognition Software Work in Mandarin?”, was published on BusinessWeek.com in March 2012.  At the time the article was published, Apple was getting ready to release its Siri product in Chinese, and in anticipation of that event, the author decided to try Nuance Communications’ Dragon software in Mandarin Chinese. Dragon is another speech-recognition product, and Nuance Communications also produced the speech-recognition technology that Apple's Siri is built on. Although the author appeared to be impressed by Dragon's capabilities, he noted that Mandarin is a particularly challenging language for speech-recognition software, as there are only “400 monosyllabic sounds in Mandarin, which are differentiated by tone." Additionally, Nuance's vice president for Dragon research, Jim Wu, noted that "within mainland China, everyone has a different accent," so basically, Nuance was tasked with creating software that would work for “people who speak Mandarin with a slight accent." As mentioned in the textbook, speech-recognition software like Dragon is not 100% accurate, but it is built to "learn and improve." Dragon does this by selecting user speech data and updating the user's profile at the end of each session, so the more it is used, the more accurate it becomes.
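The article doesn't spell out how Dragon's profile updates actually work under the hood, but here is a minimal Python sketch of the general idea of per-user adaptation (the names and the reranking scheme are my own illustration, not Nuance's actual algorithm): confirmed dictation from each session is folded into a profile, which then biases the choice between acoustically similar candidates.

```python
from collections import Counter

class ToyUserProfile:
    """Toy stand-in for a per-user adaptation profile: counts the
    words a user actually dictates so future guesses can be biased."""
    def __init__(self):
        self.word_counts = Counter()

    def update(self, confirmed_text: str) -> None:
        # "End of each session": fold the user's confirmed words back in.
        self.word_counts.update(confirmed_text.lower().split())

    def pick(self, candidates: list[str]) -> str:
        # Among acoustically similar candidates, prefer the one this
        # user has dictated most often before.
        return max(candidates, key=lambda w: self.word_counts[w.lower()])

profile = ToyUserProfile()
profile.update("please email the nuance sales team")
profile.update("nuance released a new dragon version")
# An ambiguous sound could be "new ants" or "nuance"; the profile decides.
print(profile.pick(["new ants", "nuance"]))  # -> "nuance"
```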

I actually used Dragon to produce this summary :-)  I am a HUGE advocate for Nuance Communications!



Friday, May 4, 2012

ITE 221 - Spring 2012 - Chapter 6

My Chapter 6 blog is related to the subject of multicore processing. Published on PCWorld.com in December 2011, my article describes the trend toward more multicore processors in smartphones for 2012. While in 2011 dual-core processors became the standard for high-end smartphones, in 2012 it's all about quad-core.  At the time this article was published, there was only one actual quad-core product on the market: the Asus Eee Pad Transformer Prime TF201 tablet.  This tablet was the first device to hit the market with Nvidia's 1.3 GHz Tegra 3 quad-core processor; the product was praised for its "stunning graphics" and superfast processing speed. Nvidia was tight-lipped about when its quad-core phones would be released, but noted they were on track for Q1 2012.  Qualcomm was also slated to release its quad-core Snapdragon chip, the APQ8064 (part of its S4 line), in 2012.  The S4 chips' specs indicate that they run at clock speeds between 1.5 GHz and 2.5 GHz.  The article's author questioned whether more cores are indeed better. According to Nvidia, a quad-core processor will "bring your phone a level of performance comparable to that of a desktop computer." One argument against quad-core mobile devices is that not enough content is optimized to fully take advantage of the CPU's power. When looking for examples of applications that could fully take advantage of quad-core processors, gaming is a popular one. Quad-core processors are able to support apps that run multiple processes at once, allowing the user a more fluid experience with higher-quality graphics.  Another challenge is battery life. Processors are evolving at an exponential rate, and batteries are having difficulty keeping up. Both Nvidia and Qualcomm say that they are adapting their cores to handle different processes at different power levels (e.g., opening an e-mail versus accessing a Flash-based website with video).
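To make the "multiple processes at once" point concrete, here is a minimal Python sketch (my own illustration, not from the article) of the kind of speedup an app can get when independent CPU-heavy jobs are spread across four cores instead of run one at a time:

```python
import time
from multiprocessing import Pool

def busy_task(n: int) -> int:
    """Stand-in for one CPU-heavy job (physics, AI, audio, etc.)."""
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [3_000_000] * 4  # four independent workloads

    start = time.perf_counter()
    serial = [busy_task(n) for n in jobs]   # one core, one job at a time
    t_serial = time.perf_counter() - start

    start = time.perf_counter()
    with Pool(processes=4) as pool:         # spread across four cores
        parallel = pool.map(busy_task, jobs)
    t_parallel = time.perf_counter() - start

    assert serial == parallel
    print(f"serial: {t_serial:.2f}s  parallel: {t_parallel:.2f}s")
```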



Thursday, May 3, 2012

ITE 221 - Spring 2012 - Chapter 5

Although not published in the very recent past, my article has absolute relevance to Chapter 5 and the subject of data storage. The article, titled "In the Race between Optical and Magnetic Storage, We Win," published on DiscoverMagazine.com in September 2009, states that in 2009 scientists at Cornell University developed ferroelectric materials which would allow memory chips to "remember" even when the power is off. Ferroelectric materials are substances that hold an electric state without additional power. Applying a layer of a compound called strontium titanate to silicon caused the strontium titanate to become ferroelectric; scientists had theorized this result, but it had not been previously achieved. Jonathan Spanier, a materials scientist from Drexel University, noted that ferroelectric memory could "significantly reduce time spent reading by allowing a computer to keep a browser or document open without using energy."
In a separate project, James Chon, a scientist at Swinburne University of Technology in Australia, led a group of researchers in potentially shattering the current limitations on storing data optically. Chon's group was able to demonstrate that data could be stored in five dimensions, allowing more than a terabyte of data to be stored on a single DVD-size disk.  The researchers achieved this by embedding the disk with tiny gold particles (nanorods). The article states that "by varying the length and orientation of the rods, the researchers were able to record data based on the three dimensions of space, plus polarization (the orientation of light waves) and color."  Currently, the most commonly used DVD type (DVD-5) holds about 4.7 GB of data. There are a little over 1,000 GB in a terabyte, so using Chon's process would increase storage capacity by more than 200 times.
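Here is the quick arithmetic behind that "more than 200 times" figure, assuming the standard 4.7 GB DVD-5 capacity and 1,024 GB per terabyte:

```python
# Back-of-the-envelope check of the "more than 200 times" claim.
dvd5_capacity_gb = 4.7   # standard single-layer DVD-5 capacity
terabyte_gb = 1024       # "a little over 1,000 GB" in a terabyte

increase = terabyte_gb / dvd5_capacity_gb
print(f"roughly {increase:.0f}x the capacity of a DVD-5")  # ~218x
```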


My article link: http://discovermagazine.com/2009/sep/22-in-race-between-optical-magnetic-storage-we-win


Tuesday, April 10, 2012

ITE 221 - Spring 2012 - Chapter 4

My Chapter 4 blog relates to the future of microprocessors. Due to limitations of current microprocessor fabrication processes and semiconductor materials, technology corporations have had to explore other options, including "optical processing, hybrid optical-electrical processing, and quantum processing." My article describes the current initiative by IBM Corp. to build a quantum computer.
IBM scientists and researchers have made some really incredible advances toward quantum computing with regard to "retaining the integrity of quantum mechanical properties in quantum bits (qubits) and reducing errors in elementary computations."  Quantum computing was proposed by Richard Feynman, a Nobel Prize-winning physicist, in 1981, but at this point quantum computing is still just experimental; there are no practical quantum computers yet.
In quantum mechanics, particles of matter can exist in multiple states simultaneously (i.e., "on" and "off" at the same time). This is kind of hard for the average individual to wrap their head around! You think that's hard to comprehend? How about the fact that a single 250-qubit state contains more bits of information than there are atoms in the universe...fascinating! For more than 50 years, scientists have theorized that harnessing these properties for computing would be the natural progression from today's silicon-based transistors. IBM researchers speculate that quantum computers would have the ability to work on millions of computations at once and potentially solve previously unsolvable mathematical problems.
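To get a feel for that scale: a 250-qubit state is described by 2^250 values, and Python's arbitrary-precision integers make it easy to see just how enormous that number is (this is my own back-of-the-envelope check, not from the article):

```python
# How big is 2**250? Python integers have arbitrary precision,
# so we can compute the exact value and its order of magnitude.
states = 2 ** 250
print(states)             # the exact value, written out in full
print(len(str(states)))   # 76 digits, i.e. on the order of 10**75
```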
IBM's "superconducting qubit device" is approximately 1 mm and exists as a "qubit in the center of the cavity on a small sapphire chip," and although the device appears big income. Send to some of the "tiny conventional computer chips currently in use," IBM's scientists envision that future scaling will make it possible to operate perhaps thousands of qubits on the device the size.

Amazing…it sounds like the future is now, people!



Wednesday, March 7, 2012

ITE 221 - Spring 2012 - Chapter 3

     Maybe I’m taking this ITE 221 blog thing a little too seriously, but for WEEKS I felt held hostage by the inability to choose/find an article that would relate to any of the Chapter 3 topics (i.e., Boolean data, floating-point notation, and CPU data types).  After much deliberation, I resorted to associating my Chapter 3 blog with a “Technology Focus” box that appears on page 87 of the textbook.  I decided to do my blog on “The Birth of the IBM PC.”
     IBM introduced its personal computer to the American public in August of 1981.  It cost approximately $1,500 and included the following components: a system unit, a keyboard, and color/graphics capability. Buyers had the option of also purchasing a display, a printer, two diskette drives, extra memory, communications and game adapters, and application packages.  Just 20 years prior, an IBM computer had cost upwards of $9 million and required an air-conditioned quarter-acre of space and a staff of 60 people to keep it fully loaded with instructions. The system was powered by the Intel 8088 microprocessor, which had 29,000 transistors and a processing speed of 5 MHz, with a turbo version released later that ran at 8 MHz.  To put this in perspective, the Intel 8088 performed at a speed less than one-thousandth that of a modern processor (see the quick arithmetic after the table below).
     The following shows a brief comparison between the Intel 8088 and the more current Intel Core 2 Duo Processor E8300:

Processor                                      Clock Speed                        Transistors    Addressable Memory    Bus Speed                          Typical Use
Intel 8088 (introduced in 1979)                5 MHz (standard), 8 MHz (turbo)    29,000         1 MB                  5 MHz (standard), 8 MHz (turbo)    Desktop PCs
Intel Core 2 Duo E8300 (introduced in 2008)    2.83 GHz                           410 million    64 GB                 1,333 MT/s                         Desktop PCs
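And here is the quick back-of-the-envelope check on the "less than one-thousandth" claim, using the clock speeds from the table (my own arithmetic, not the article's):

```python
# Rough comparison of raw clock speeds from the table above.
mhz_8088_standard = 5
ghz_e8300 = 2.83

ratio = (ghz_e8300 * 1000) / mhz_8088_standard
print(f"the E8300's clock is ~{ratio:.0f}x faster")  # ~566x

# Clock speed alone understates the gap: the E8300 has two cores and
# executes far more instructions per clock cycle than the 8088, which
# is how the "less than one-thousandth" figure is reached.
```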

     The article was fairly short, but it was super interesting, almost more from a business-strategy perspective than a technological one.  Looking at the specs for the Intel 8088 didn’t really mean much until I compared them to more modern technology, and then it was like “Wow!”  For its time period, though, the personal computer and its capabilities were incredible.  It’s pretty amazing to see how far technology has come and how much it has advanced, isn’t it?
My article link: http://www-03.ibm.com/ibm/history/exhibits/pc25/pc25_birth.html

Friday, January 27, 2012

ITE 221 - Spring 2012 - Chapter 2

For my Chapter 2 blog, I chose to take a closer look at virtualization.  The article I reviewed, "10 Cool Things Virtualization Lets You Do," by Keir Thomas, appeared on PCWorld.com on February 25, 2011.  The article does a nice job of defining virtualization in layman's terms ("running two or more operating systems on one physical PC") and describes different ways of utilizing virtualization. The author notes that virtualization can benefit everyone, from the multi-billion-dollar corporation to the one-man shop.  VMware is the current virtualization market leader, with Oracle also functioning as a major contender; both companies even provide "free of charge" versions of their software.  Now on to the benefits… VMware Player offers a tool that allows you to run applications that were compatible with previous versions of Windows but have difficulty interacting with Windows 7 or Vista.  But wait, it gets better!  Not only can you recycle old applications, there is virtualization software that allows you to reuse old hardware, turning your organization's forgotten workstations into what are called "thin clients." WiseGeek.com defines a thin client as a computer that "contains enough information to start up and connect to a more powerful network server;" at that point, the server handles the computing workload.  In this respect, virtualization would benefit an organization in two major ways: (1) a reduction in cost for hardware upgrades and (2) a reduction in administration costs.  Another benefit of utilizing virtualization is risk management.  Operating on a virtual machine allows you to access corrupt or potentially virus-infected data with the ability to return the virtual operating system to its previous state, using a snapshot feature, if the virus causes your virtual environment to go bananas.  Virtualization offers additional benefits with the ability to back up an entire operating system, providing disaster recovery space (by giving you the ability to back up your server), and the ability to "run headless" (which provides an ideal testing environment for web developers).  So, this article confirms that virtualization does INDEED provide a little "something for everyone!"
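As a concrete example of that snapshot workflow: Oracle's free VirtualBox exposes snapshots through its VBoxManage command-line tool. Here is a minimal Python sketch that takes a snapshot before doing anything risky inside a VM and rolls back afterward (the VM name and snapshot name are hypothetical, and this is just one way to script it, not the article's method):

```python
import subprocess

VM_NAME = "TestVM"       # hypothetical VM name
SNAPSHOT = "clean-state"

def run(*args: str) -> None:
    """Call the VBoxManage CLI (ships with Oracle VirtualBox)."""
    subprocess.run(["VBoxManage", *args], check=True)

# Take a snapshot before opening anything suspicious inside the VM...
run("snapshot", VM_NAME, "take", SNAPSHOT)

# ...and if the environment "goes bananas", power off and roll back.
run("controlvm", VM_NAME, "poweroff")
run("snapshot", VM_NAME, "restore", SNAPSHOT)
```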


Tuesday, January 17, 2012

ITE 221 - Spring 2012 - Chapter 1

For my initial blog, I reviewed the website for the Association of Information Technology Professionals (AITP).  The AITP is a very valuable resource both for current IT professionals and for students who are pursuing an IT-related degree with the intent of entering an IT profession.  NVCC is not listed on the site as having an AITP student chapter, but I understand I could join as an “Individual Student Member At Large”.  AITP provides education through seminars, workshops, and various regional and national conferences, and it offers leadership-development resources and opportunities for networking with other IT professionals.  AITP also provides members with mentoring and knowledge-sharing opportunities, which would be extremely beneficial to all members, but especially to an IT student preparing to enter the workforce.  AITP assists members in staying on top of emerging technologies; one way it does this is through its Research and Strategy Advisory Group (RSAG).  RSAG researches trends in the IT industry, reports its findings and conclusions, and then recommends AITP strategy positions.  There are some interesting RSAG research papers listed on the website, with topics that include cloud computing, information security, and virtualization (IT “going green”).
www.aitp.org