Sunday, May 27, 2007

Today I helped my sister-in-law troubleshoot her computer

My sister-in-law Janis bought a notebook computer six months ago, but she keeps finding problems with her brand new machine. Today I went to her place and examined it. This is actually good news for IT specialists. Why? Basically, I found nothing wrong with her computer, but I did find that she has very limited IT knowledge. She has worked at a top American bank for many years as a secretary, so I had assumed she would be a power PC user. It shows that there are still quite a few people who need IT support.

Today I also read an article about Ray Kurzweil and the Technological Singularity, and then I started worrying about my future. Ray Kurzweil is, of course, a very bright person who has made great contributions to AI; according to O'Keefe (2007), the Microsoft chairman calls him 'a visionary thinker and futurist'. Kurzweil says, 'By 2027, computers will surpass humans in intelligence. And by 2045, "strictly Biological" Humans won't be able to keep up.' What does that actually mean? We are living in an accelerating era of change unlike anything we have ever seen. I somewhat agree with this, as I have mentioned the rapid change of the IT industry over the past two decades in my previous posts. This is no longer just the imagination of a Hollywood movie. O'Keefe (2007, p. 46) gives concrete evidence: back in the 1980s, Kurzweil predicted that a computer would beat the world chess champion by 1998 (it happened in 1997).


Before I get into the details, I need to introduce a term: the Singularity. According to the Singularity Institute for Artificial Intelligence, it means the technological creation of smarter-than-human intelligence. After reading through the first few pages of their website, I understand the grounds for this notion: they estimate that the capacity and speed of CPUs will one day overtake the human brain. The biological and physical arguments sound logical to me. But can we really create a computer that can think? I still remember movies such as 'The Terminator', 'Bicentennial Man', 'A.I.' and 'I, Robot'. In those movies, computers can not only think, hate and love but also fight against human beings. However, playing chess is much simpler than human emotion. Besides, there are tens of thousands of different human mindsets, and very often humans behave illogically. How can we possibly code infinite possibilities into a program? I really get stuck... Of course, this is extremely difficult, but that doesn't mean it is impossible, as long as we have a supercomputer. SIAI (2007) claims that 'By comparison, speeds in modern computer chips are currently at around 2GHz – a ten millionfold difference – and still increasing exponentially'. On this ground, I believe it will happen one day...
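Out of curiosity, I checked the arithmetic behind that 'ten millionfold' figure. This is only my own back-of-the-envelope sketch; the ~200 Hz neuron firing rate is my assumption about the biological number SIAI is comparing against.

```python
# Back-of-the-envelope check of the SIAI comparison (my own sketch;
# the 200 Hz neuron figure is an assumption, not from the article).
neuron_rate_hz = 200      # rough peak firing rate of a biological neuron
chip_clock_hz = 2e9       # ~2 GHz, a 2007-era CPU clock

ratio = chip_clock_hz / neuron_rate_hz
print(f"clock-to-neuron ratio: {ratio:,.0f}x")     # 10,000,000x

# If clock speeds kept doubling every two years (a Moore's-Law-style
# assumption), the raw-speed gap would only widen:
for years in (10, 20):
    print(f"after {years} years: {ratio * 2 ** (years / 2):,.0f}x")
```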

References
SIAI - see The Singularity Institute for Artificial Intelligence

O'Keefe, B 2007, '(CHECK ONE*) □The Smartest □The Nuttiest Futurist on Earth', Fortune Asia, vol. 155, no. 8, Time Asia (Hong Kong) Limited, 14 May, pp. 46-52.

The Singularity Institute for Artificial Intelligence 2007, 'What is the Singularity?', Overview, updated 2007, Singularity Institute for Artificial Intelligence, Inc., viewed 20 July 2007, <http://www.singinst.org/overview/whatisthesingularity>.

Sunday, May 20, 2007

The Obstacle of Open Source and Free Software I




Open-source and free software are being used by more than half of the Fortune 500 companies (Parloff 2007).


Microsoft, the software giant, has heavily patented its software products. Up until 2005, Microsoft had filed around 3,500 patent applications and registered over 1,500 patents (Microsoft, cited in Parloff 2007, p. 51). As I work at a legal firm myself, I know that patents are very costly and not many sole proprietors can afford to patent their inventions. Many software developers have, to a greater or lesser extent, included Microsoft's patented components in their products, and Microsoft has demanded that they pay licence fees. Take the Microsoft-Novell deal as an example: Microsoft and Novell not only agreed to jointly develop and market products that allow Windows and Linux to work together more smoothly, but Microsoft also agreed to indemnify Novell SUSE Linux Enterprise users against patent claims (Lemon 2007). This is definitely not good news for users; as a result, free software will not be free anymore. It is probably even worse news for corporate users like AIG, Wal-Mart and Goldman Sachs.

In December 2003, Microsoft's new licensing unit opened for business, and soon the company had signed cross-licensing pacts with such tech firms as Sun, Toshiba, SAP and Siemens (Parloff 2007).

Fortunately, the free and open-source software (FOSS) movement has been fighting for the free world. Free Software Foundation president Richard Stallman, a talented programmer, has dared to challenge the giant. I truly think his grounds for the battle with Microsoft are very reasonable and widely accepted by free worlders.

To be continued

References


Lemon, S 2007, 'Dell joins Microsoft, Novell in Linux collaboration', ComputerWorld Hong Kong Daily, 7 May, viewed 8 May 2007.


Parloff, R 2007, 'Microsoft takes on the Free World', Fortune Asia, vol. 155, no. 9, Time Asia (Hong Kong) Limited, 28 May, pp. 49-55.

Wednesday, May 16, 2007

Ubiquitous Computing

In my previous post I discussed why the thin client will become popular again. Nevertheless, ubiquitous computing will probably be the ultimate destination. Mark Weiser, the father of ubiquitous computing, put forward the idea in the early 90s. He believed that ubiquitous computing would be the third wave of computing, after mainframes and PCs (Weiser 1996). 'Ubiquitous Computing refers to the trend that we as humans interact no longer with one computer at a time, but rather with a dynamic set of small networked computers, often invisible and embodied in everyday objects in the environment' (UbiComp 2007).

If ubiquitous computing is indeed the destination, how far are we from it?

Mark Templeton, CEO and president of Citrix Systems, shared the company's vision of ubiquitous computing: he is helping customers shift from distributed computing to application delivery services, and he expects IT roles to change dramatically over the next five years in response to the forces shaping today's business environment (Ramos 2007, p. 26).


He lists several factors driving these IT trends:

Consolidation - workers are required to share all their information

Regulation - governments and industries are holding businesses more accountable for their information, so organisations must find a way to easily control and monitor information access

Disruption & globalisation - a highly mobile workforce will need applications delivered to any endpoint, under every access scenario

Echo generation - tech-savvy enterprise IT users will demand application access over a variety of wired and wireless communication links

Templeton (cited in Ramos 2007) concludes that '... I guess what really saw us through is increasing relevance of our basic thinking about enabling people to work from anywhere over any type of connection'. This reflects how much people want ubiquitous computing to be ready. But there are still many issues waiting to be resolved: broadband and wireless infrastructure is a basic requirement in the cities, and security is another important issue we cannot afford to ignore.

To be continued.

References

Ramos, L 2007, ‘Right place, right time’, Network World Asia, vol. 3, no. 4, pp. 26-27.

UbiComp 2007, 'What is Ubiquitous Computing?', 9th International Conference on Ubiquitous Computing, Innsbruck, Austria, viewed 21 May 2007, <http://www.ubicomp2007.org/scope/>.

Weiser, M 1996, 'Ubiquitous Computing', viewed 15 May 2007, <http://sandbox.xerox.com/ubicomp/>.

Monday, May 14, 2007

Thin client will dominate the market again

We all understand that PCs are getting more and more powerful; it is not unusual to have a Core 2 Duo desktop in your office nowadays. At the same time, we can see that thin clients are getting more popular. Why is the market splitting into these two extremes?

What is a thin client?

'Thin client is also used to describe software applications that use the client-server model where the server performs all the processing' (Chung, Zimmerman & Chandra 2006). Client-server technology has evolved a new kind of 'client'. HP, IBM, Sun and even Citrix have been promoting their thin client products. Ho (2006) reported that thin client sales in Asia/Pacific (including Japan) grew 64 percent over the previous year to 279,513 units, and IDC projects that the region's thin client market will expand at a compound annual growth rate of 34 percent. I would say this is a significant indication of the IT market trend: thin clients again.
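For readers who wonder what 'the server performs all the processing' looks like in code, here is a toy sketch of my own (not from the cited definition): the client only sends a request and displays the answer, while every bit of computation happens on the server.

```python
# Minimal thin-client sketch: all logic lives server-side.
import socket
import threading
import time

def server() -> None:
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("127.0.0.1", 9000))
    srv.listen(1)
    conn, _ = srv.accept()
    expr = conn.recv(1024).decode()                      # client's request
    result = str(sum(int(n) for n in expr.split("+")))   # server-side work
    conn.sendall(result.encode())
    conn.close()
    srv.close()

threading.Thread(target=server, daemon=True).start()
time.sleep(0.5)                                          # let server start

cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", 9000))
cli.sendall(b"1+2+3")            # the "thin" part: the client holds no logic
print(cli.recv(1024).decode())   # it just displays the result: 6
cli.close()
```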

'Over the last decade, however, companies have begun to realize the expense and effort involved with administering a PC-based infrastructure. Now, there is a reversion to shifting intelligence and storage back to a centralized location' (Jakobsen 2007a). This is a very valid statement in my experience. My firm has around 600 legal and support staff working across seven offices in Hong Kong and Asia. I find it really difficult to manage 600 workstations, even though we use tools to install patches, hotfixes, new anti-virus definition files and add-ins automatically. Whenever we need to upgrade the workstations, whether hardware or software, we have to re-ghost and reconfigure them one by one. With thin clients, we could save many hours of desktop deployment. For end-user support we have already adopted various kinds of remote access software, so I don't really see an issue there.
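To give a flavour of the scripting involved, here is a rough sketch of the kind of push we automate today. The share paths, hostnames and the C:\staging convention are all hypothetical placeholders of mine; a real rollout would go through proper management tools.

```python
# Hypothetical hotfix push to Windows workstations' admin shares.
import shutil
import subprocess

HOTFIX = r"\\fileserver\deploy\hotfix.exe"            # assumed staging share
WORKSTATIONS = [f"WS{n:03d}" for n in range(1, 601)]  # our ~600 machines

failed = []
for host in WORKSTATIONS:
    # Skip machines that don't answer a single ping (Windows syntax).
    if subprocess.run(["ping", "-n", "1", host],
                      capture_output=True).returncode != 0:
        failed.append(host)
        continue
    try:
        # Drop the hotfix on the admin share; an assumed startup script
        # on each client installs whatever appears in C:\staging.
        shutil.copy(HOTFIX, rf"\\{host}\C$\staging\hotfix.exe")
    except OSError:
        failed.append(host)

print(f"{len(failed)} workstations need a manual visit")
```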

Security also becomes less of an issue: users cannot copy documents (i.e. precedents) out of the system, and the CIO can focus on server security. Right now, we have to block users from using USB devices on their workstations; whenever users need to copy documents to or from the system, the IT team handles it for them. Many years ago, my firm ran diskless workstations (i.e. no hard drive or floppy drive), and the only storage was the file servers. However, the trend shifted to heavy clients as client-server technology was widely adopted by the IT industry. The intention was to offload the burden on the server and use more of the clients' power to execute programs, probably because the costs of maintaining mainframes or mid-range computers were very high, and their CPU power, memory and disk space were relatively expensive. Downsizing was a solution to that problem. But now we all realise that we have incurred other costs to support and deploy desktop computers and their applications.
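For the USB blocking, one common approach on Windows is to disable the USB mass-storage driver in the registry. Below is a sketch using Python's winreg module; it needs administrator rights, and in practice we would push this out by policy rather than run it machine by machine.

```python
# Disable USB mass storage by setting the USBSTOR service to "disabled".
import winreg

USBSTOR = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, USBSTOR, 0,
                    winreg.KEY_SET_VALUE) as key:
    # Start = 4 disables the service; 3 (demand start) re-enables it.
    winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, 4)

print("USB mass storage disabled (takes effect for newly attached devices)")
```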

The high availability of ICT systems is vital to the success of a business, and therefore thin clients, web portals, Internet access, mobile devices and so on make you more competitive in the knowledge-based economy. Mark Templeton, CEO and president of Citrix, has expressed his views on 'ubiquitous computing', as reported by Ramos (2007, pp. 26-27). I will cover ubiquitous computing in detail in my next post.

Jakobsen (2007b) adds that a thin-client network makes it easier for end users to access their data and emails remotely. It allows network administrators to cost-effectively manage personal computers in the data centre, while desktop users use standard network protocols to display the results on a simple access device across existing networks. As a result, the network infrastructure delivers lower total cost of ownership, greater security and high reliability, all while providing a consistent, predictable end-user computing experience.

To be continued


References

Chung, K, Zimmerman, PS & Chandra, S 2006, 'Networking Definition - Thin client', SearchNetworking.com Definitions, last updated 23 March, viewed 26 May 2007, <http://searchnetworking.techtarget.com/sDefinition/0,,sid7_gci213135,00.html>.

Ho, A 2006, 'End of PC life cycle signals thin client boom', Asia's Enterprise All Stars, Network World Asia, vol 2, no 11, p.11.

Jakobsen, J 2007a, ‘Why thin is fashionable again (Part I)’, Enterprise Innovation, Technology, viewed 7 May 2007, <http://www.enterpriseinnovation.net/article.php?cat1=2&id=1351>.

Jakobsen, J 2007b, ‘Why thin is fashionable again (Part II)’, Enterprise Innovation, Technology, viewed 7 May 2007, <http://www.enterpriseinnovation.net/article.php?cat1=2&id=1354>.

Ramos, L 2007, ‘Right place, right time’, Network World Asia, vol. 3, no. 4, pp. 26-27.

Thursday, May 10, 2007

Infrastructure II - Data Management

In order to cope with the high volume of transactions and requests from users, we have to upgrade or replace components of our network infrastructure from the front end to the back end. The most important thing is to identify the bottleneck in our network.

The capacity of our servers needs to be upgraded regularly, as our data is growing rapidly with emails and documents. As I mentioned in my last post on why emails are eating up server space, we need to upgrade the email servers nearly every year. Besides, we are now implementing a new document management system (DMS), since the existing in-house-developed system no longer meets our requirements. In the legal field, as in other professional fields, documents are assets of the firm; more precisely, knowledge management is tremendously important to us. We are all facing the problem of “information flooding” and are drowning in information. By the way, I would like to distinguish between “data” and “information”.

Whatis.com (2005) defines it as follows: ‘Information is stimuli that have meaning in some context for its receiver. When information is entered into and stored in a computer, it is generally referred to as data. After processing (such as formatting and printing), output data can again be perceived as information. When information is packaged or used for understanding or doing something, it is known as knowledge.’
Clearly, data, information and knowledge are interrelated. If we don't have good systems to convert our data into information, and good tools to retrieve that information, it will never become the knowledge we need. I have always believed that too much information is actually no information. With this in mind, we need powerful servers (i.e. high CPU speed and high capacity) to process and store our data. We are replacing the old servers with rack-mount systems that can stack many servers, and we have also expanded the data centre.
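To illustrate what 'converting data into information' can mean in practice, here is a toy inverted index, the kind of data structure a DMS search engine builds behind the scenes. The documents are made up; our real DMS is of course a commercial product.

```python
# Toy inverted index: from raw document text to searchable information.
from collections import defaultdict

documents = {
    1: "share purchase agreement precedent",
    2: "lease agreement for office premises",
    3: "share option scheme precedent",
}

index: dict[str, set[int]] = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)

def search(*terms: str) -> set[int]:
    """Return the documents containing every search term."""
    results = [index.get(t, set()) for t in terms]
    return set.intersection(*results) if results else set()

print(search("agreement"))             # {1, 2}
print(search("share", "precedent"))    # {1, 3}
```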

More powerful servers require more electricity and more cooling, so the power supply and the air-conditioning system for the data centre were upgraded accordingly. Actually, we should also look into the design of the servers themselves. Researchers at Purdue University have demonstrated an ionic wind engine that promises to remove the heat generated by semiconductors substantially faster than traditional cooling technologies can. The idea is to energise ions at the chip surface; the ions collide with air molecules and increase the airflow, cooling the chip more quickly (Lemon 2007). Details of this development were published in the 1 September issue of the Journal of Applied Physics. Anyway, I don't want to sidetrack you.

Currently, we use multiple backup devices, including magnetic tapes and optical disks. They just barely meet our needs and are still manageable. Penn in particular has reservations about optical technology, despite ‘the recent claims of optical disk supremacy and the rapid rise from burnable CDs to DVD-Rs and onto Blu-ray and/or HD-DVD’ (cited in Hammond 2007).

Apart from that, we use up a few backup tapes and optical disks every day, and as time goes by we have accumulated a huge volume of them. The metadata describing those tapes and disks is therefore becoming more and more important, because it strongly affects the recovery process. We all know that recovering data is very time-consuming and never an easy task, yet we are usually required to fulfil users' requests within a tight time frame. It is therefore crucial to implement an effective backup and recovery solution with a holistic view.
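The point about metadata deserves a small illustration. If the catalogue can answer 'which tape holds the latest copy of this file?' without mounting anything, recovery gets much faster. A minimal sketch with an illustrative schema (not our actual backup product):

```python
# Toy tape/disk metadata catalogue: find media without mounting them.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE backup_catalog (
    tape_label TEXT, backup_date TEXT, path TEXT)""")
db.executemany(
    "INSERT INTO backup_catalog VALUES (?, ?, ?)",
    [("TAPE-0412", "2007-05-08", r"\\dms\precedents\lease.doc"),
     ("TAPE-0413", "2007-05-09", r"\\dms\precedents\lease.doc"),
     ("TAPE-0413", "2007-05-09", r"\\mail\archive\partner1.pst")])

# The question users actually ask: most recent tape holding a given file.
row = db.execute(
    """SELECT tape_label, MAX(backup_date) FROM backup_catalog
       WHERE path = ?""", (r"\\dms\precedents\lease.doc",)).fetchone()
print(row)   # ('TAPE-0413', '2007-05-09')
```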
To be continued.
References
Hammond, S 2007, 'Metadata, data, and migration', Computerworld Hong Kong Daily, posted 1 August 2007, viewed 5 August 2007, <http://www.cw.com.hk/computerworldhk/TechWatch/Metadata-data-and-migration/ArticleStandard/Article/detail/447187>.
Lemon, S 2007, 'Researchers use ionic wind to keep chips cool', ComputerWorld Hong Kong Daily, viewed 19 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=449851>.

Whatis.com 2005, ‘What is information', last updated 1 April 2005, Whatis.com, viewed 6 August 2007, <http://searchsqlserver.techtarget.com/sDefinition/0,,sid87_gci212343,00.html>.

Infrastructure I - Data Management

Data management has been a concern in my firm. Our data has been growing exponentially even though the number of users has varied by less than 15% over the years, and I believe the same is happening in other businesses and industries. According to Graham Penn, Associate VP, Storage Asia Pacific at research firm IDC, the amount of data requiring business-level storage is escalating at 40-50% a year (Hammond 2007).
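It is worth working out what 40-50% annual growth actually means. A quick projection, assuming an illustrative 1 TB starting point:

```python
# What 40-50% annual growth means for a 1 TB store (illustrative size).
for rate in (0.40, 0.50):
    size_tb = 1.0
    for year in range(1, 6):
        size_tb *= 1 + rate
    print(f"{rate:.0%} growth: {size_tb:.1f} TB after 5 years")
# 40% growth: 5.4 TB after 5 years
# 50% growth: 7.6 TB after 5 years
```

At 50% a year, the store is roughly 7.6 times its original size after five years, which is exactly why we find ourselves upgrading servers almost annually.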

Nowadays data keeps increasing and accumulating in our offices. Probably people have changed their mindsets: they now accept softcopies of documents and are willing to eliminate the hardcopies. Take my firm as an example: our senior partners and consultants are adopting technologies such as wireless devices, email and remote access. I can say that the email system is crucial to our business. We once had our email system down for nearly two days, and it was truly chaotic for the firm, even though that happened over six years ago. If it happened today, it would be even worse. More users are using Outlook (i.e. the email client) as their personal filing system because of its mobility and availability: they can use Outlook Web Access (OWA) to access their mailbox at any time as long as they have Internet access. Alternatively, they can use mobile devices, including smartphones and BlackBerrys, even though those devices might not be able to view many file types.

For users, this is truly a convenient way to store and retrieve their documents. The negative impact is that multiple copies of a single document end up in different locations. In other words, it consumes a lot of network resources: disk space to store the files, and systems to back up and retrieve them. For our email system, we have made every effort to keep up with these demands.

Moore's Law says computing power will roughly double every 18 months. The logic is illustrated in the graph below:

[Figure: Gordon Moore's original graph from 1965]

‘The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer’ (Moore 1965).
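Moore's arithmetic is easy to verify: a factor of two per year from 1965 to 1975, starting from roughly 64 components per chip (my reading of the starting value on his original graph, not a quoted figure):

```python
# Checking Moore's 1965 extrapolation: doubling every year for ten years.
components_1965 = 64                       # assumed 1965 starting point
components_1975 = components_1965 * 2 ** (1975 - 1965)
print(components_1975)                     # 65536 -- Moore's "65,000"
```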

Moore’s Law is widely adopted and recognised by Intel, which claims that we can now put 1.7 billion transistors on a single chip (Intel Corporation 2005). What I actually wanted to illustrate is the growth of technology from another angle. Now I should probably come back to my actual application of these technologies.

To be continued


References


Hammond, S 2007, 'Metadata, data, and migration', Computerworld Hong Kong Daily, posted 1 August 2007, viewed 5 August 2007, <http://www.cw.com.hk/computerworldhk/TechWatch/Metadata-data-and-migration/ArticleStandard/Article/detail/447187>.


Intel Corporation 2005, video transcript, ‘Excerpts from A Conversation with Gordon Moore: Moore’s Law’, US, <ftp://download.intel.com/museum/Moores_Law/VideoTranscripts/Excepts_A_Conversation_with_Gordon_Moore.pdf>.

Moore, G 1965, ‘Cramming more components onto integrated circuits’, Electronics Magazine, vol. 38, no. 8.

Tuesday, May 8, 2007

Networking Infrastructures: OSI and TCP/IP Models

The Open Systems Interconnection (OSI) reference model was developed by the International Organisation for Standardisation (ISO) as a model for computer protocol architecture and as a framework for developing protocol standards. The OSI model comprises seven layers: Physical, Data Link, Network, Transport, Session, Presentation and Application (Ince 2004, p. 41).

The TCP/IP protocol architecture is the result of protocol research and development conducted on the experimental packet-switched network ARPANET, funded by the Defense Advanced Research Projects Agency (DARPA); it is generally referred to as the TCP/IP protocol suite. It has five independent layers: Application, Transport, Internet, Network Access and Physical (Stallings 2005, p. 106).
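To make the layering concrete, here is a small sketch of my own (not from either textbook) showing where the layers sit when a program opens a TCP connection: the application composes the message, the operating system provides the Transport (TCP) and Internet (IP) layers, and the network card covers Network Access and Physical.

```python
# Where the TCP/IP layers live when a program talks over the network.
import socket

# Application layer: compose an HTTP request by hand.
request = b"HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n"

# Transport + Internet layers: the OS builds the TCP segments and IP
# packets for us; SOCK_STREAM selects TCP, AF_INET selects IPv4.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(("example.com", 80))
sock.sendall(request)

# Network Access / Physical layers are handled entirely by the NIC.
print(sock.recv(200).decode(errors="replace"))   # start of the reply
sock.close()
```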

Stallings (2005) points out that the OSI model as a whole never flourished, for the following reasons:
  • The key TCP/IP protocols were mature and well tested at a time when similar OSI protocols were in the development stage.
  • When business began to recognize the need for interoperability across networks, only TCP/IP was available and ready to go.
  • Compared with the TCP/IP protocol architecture, the seven-layer OSI model is unnecessarily complex.

Today internetworking has largely adopted the TCP/IP architecture, and TCP/IP networks dominate the market. While there are many communications choices available, standardising on one particular protocol can make administration easier and reduce costs and complexity; reduced complexity can also translate into increased uptime and reduced configuration time. Sometimes we still maintain more than one protocol in a network owing to legacy systems and applications. For example, keeping TCP/IP, AppleTalk and IBM Systems Network Architecture (SNA) protocols in one network incurs considerable cost in translating data so that it can be accepted and communicated among all of them.

References

Ince, D 2004, ‘Developing Distributed and E-commerce Applications’, 2nd edn, Pearson Education, Harlow, pp. 41-42.

Stallings, W 2005, ‘Business Data Communications’, 8th edn, international edition, Pearson Education, Upper Saddle River, NJ, pp. 97-128.

IT trends trigger e-government

IT trends have not only moved businesses and industries but have also triggered e-government. E-government services are available in many Asian countries (Sharma 2007). Even the less developed nations have invested reasonably in them. Nevertheless, scalability and stability are obstacles to delivering e-government services to users: rather than doing some transactions online and others offline, citizens might end up doing all their transactions in person at government departments, like in the past. Of course, the ideal solution is to build a robust IT infrastructure. Lemon (2007) reports that Singapore's government expects to issue US$1.47 billion in IT tenders in 2007 as part of efforts to expand the use of technology. Probably not many less developed nations can afford such a big investment in IT, nor has IT been a priority for them.

The Business Development Director of the Wily Division highlights the following issues regarding e-government services (Sharma 2007):

On-demand Capacity

Measure the actual response times experienced by end users. The results can then serve as a benchmark for monitoring e-service response times and for detecting transaction problems as they happen.
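A bare-bones version of such measurement can be surprisingly simple. The sketch below times a request to a hypothetical e-service endpoint and flags it when a placeholder threshold is breached; real monitoring products like the Wily tools do far more, of course.

```python
# Minimal response-time probe (URL and threshold are my placeholders).
import time
import urllib.request

URL = "http://example.gov.sg/eservice"   # hypothetical endpoint
THRESHOLD_S = 2.0                        # illustrative service target

start = time.time()
try:
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
    elapsed = time.time() - start
    status = "SLOW" if elapsed > THRESHOLD_S else "OK"
    print(f"{status}: {elapsed:.2f}s for {URL}")
except OSError as exc:                   # covers URLError and timeouts
    print(f"FAILED: {exc} for {URL}")
```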


Inter-agency collaboration

Improve e-government service levels by increasing inter-agency integration and providing end-to-end e-services; this will encourage more citizens to use them. Whether the concern is cross-agency reach or depth of e-services, government agencies need common guidelines defining the online user experience that can be shared among IT staff and line-of-business owners.

Plan, do, check, analyze

The ability to manage an integrated IT infrastructure comprising highly distributed and complex web applications is vital, and it requires a powerful tool that monitors transactions thoroughly and detects problems. Behind the scenes, an entire transaction involves many components: enterprise portals, web services, application components, databases, web servers, legacy systems and connectors. End-to-end monitoring enables the e-service provider to track transactions from the end-user entry point through the application infrastructure and back.



References

Lemon, S 2007, 'Singapore to issue $1.5B in IT tenders this year', Computerworld Hong Kong Daily, viewed 26 April 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=422439>.

Sharma, D 2007, 'Taking e-government services to the next level', Enterprise Innovation, Technology, viewed 28 April 2007, <http://www.enterpriseinnovation.net/article.php?cat1=1&id=1384>.

Saturday, May 5, 2007

This is the first blog post of my life

I have been working as an IT specialist at a legal firm for over 13 years, so I have really witnessed the growth of IT in the legal industry. It has changed the old practices of lawyers: legal practitioners have always been conservative, but they have adapted to IT trends. Moreover, IT has successfully advanced the legal industry. Large legal firms invest substantially in developing and deploying IT to promote their businesses and make themselves more competitive by providing online services such as extranets and e-billing to their clients. Take my firm as an example: the IT staff has expanded greatly over the last decade, regardless of the economic downturns from the late 90s to the early millennium. I am part of a team of 20 that supports around 600 people in six different locations.

Burstiner (2006, p. 57) reports that big US legal firms are spending more to upgrade their technology and expand their IT staff: in general, technology capital expenditure increased 3.5% over the previous year while operating expenses declined 2.6%. This suggests that technology can increase firm-wide efficiency. Wireless access and data storage are the main areas of investment. I believe IT has moved other industries and businesses in the same way.

Reference

Burstiner, M 2006, ’Making It Better’, AMLAW Tech, pp. 55-59.