Wednesday, September 26, 2007

Security

Security is always important.

Sunday, September 23, 2007

Infrastructure IX - Next Generation Network (NGN)

The Telecommunication Standardization Sector of the International Telecommunication Union (ITU-T) has made recommendations on the Next Generation Network (NGN), which is conceived as a concrete implementation of the Global Information Infrastructure (GII). Over the years, different study groups have published the results of their research and development in this area.

ITU-T (2004) notes that ‘the target of NGN is to ensure that all elements required for interoperability and network capabilities support applications globally across the NGN while maintaining the concept of the separation between transport, services and applications.’

I quote the definition of NGN from the same recommendation (ITU-T 2004, p.2):

‘A packet-based network able to provide telecommunication services and able to make use of multiple broadband, QoS-enabled transport technologies and in which service-related functions are independent from underlying transport related technologies. It enables unfettered access for users to networks and to competing service providers and/or services of their choice. It supports generalized mobility which will allow consistent and ubiquitous provision of services to users.’

Nowadays there are many different communication networks, such as the PSTN (Public Switched Telephone Network), ISDN (Integrated Services Digital Network) and GSM (Global System for Mobile communications), and they will be internetworked with the NGN by means of gateways. In practice, the communication devices connected to the NGN will include analogue telephone sets, fax machines, ISDN sets, cellular mobile phones, GPRS (General Packet Radio Service) terminal devices, SIP (Session Initiation Protocol) terminals, IP phones through PCs (Personal Computers), digital set-top boxes, cable modems, etc.

The NGN is characterised by the following fundamental aspects (ITU-T 2004, p.3):

  • Packet-based transfer

  • Separation of control functions among bearer capabilities, call/session, and application/service

  • Decoupling of service provision from transport, and provision of open interfaces

  • Support for a wide range of services, applications and mechanisms based on service building blocks (including real time/streaming/non-real time services and multi-media)

  • Broadband capabilities with end-to-end QoS and transparency

  • Interworking with legacy networks via open interfaces

  • Generalised mobility

  • Unfettered access by users to different service providers

  • A variety of identification schemes which can be resolved to IP addresses for the purposes of routing in IP networks

  • Unified service characteristics for the same service as perceived by the user

  • Converged services between Fixed and Mobile networks

  • Independence of service-related functions from underlying transport technologies

  • Support of multiple last mile technologies

  • Compliance with all regulatory requirements, for example concerning emergency communications and security/privacy, etc.

The aspects above illustrate the characteristics of NGN. Needless to say, the scope of the ITU-T's work is very wide and in-depth, but I myself find the most conclusive aspect of NGN to be generalised mobility, which will allow a consistent provision of services to a user. In other words, the user will be regarded as a unique entity when utilising different access technologies, regardless of their types (ITU-T 2004, p.3).

Here I will focus the discussion on generalised mobility. In the future, mobility will be offered in a broader sense: users will have the ability to use more access technologies, allowing movement between public wired access points and public wireless access points of various kinds. This movement will not necessarily force an interruption of an application in use or of a customer service. However, it requires significant evolution of current network architectures, and enabling more transparent fixed-wireless broadband communications and mobility across various access technologies appears to be a major issue (ITU-T 2004, p.7).
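To make the idea concrete, here is a minimal sketch (my own illustration, not taken from the ITU-T recommendation) of the decision a ‘generalised mobility’ terminal has to make: keep the user's session while re-attaching to whichever access network is currently reachable and fastest. All the network names and rates below are hypothetical.

```python
# A toy connection manager: the user is one entity, and the terminal simply
# attaches to the best access network currently available.
from dataclasses import dataclass

@dataclass
class AccessNetwork:
    name: str            # hypothetical examples: "office LAN", "public Wi-Fi", "3G"
    available: bool
    downlink_kbps: int

def pick_access(networks):
    """Return the fastest access network that is currently reachable, or None."""
    usable = [n for n in networks if n.available]
    return max(usable, key=lambda n: n.downlink_kbps) if usable else None

if __name__ == "__main__":
    candidates = [
        AccessNetwork("office LAN", available=False, downlink_kbps=100_000),
        AccessNetwork("public Wi-Fi hotspot", available=True, downlink_kbps=11_000),
        AccessNetwork("3G", available=True, downlink_kbps=384),
    ]
    best = pick_access(candidates)
    print(f"Session continues over: {best.name if best else 'no network'}")
```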

The ICT industries are already working towards this objective. Gohring (2007) reports that RIM plans to issue a new BlackBerry model with both cellular and Wi-Fi capabilities, while Motorola and Nokia were both selling phones with Wi-Fi and cellular aimed at business users last year. This indicates that the developers and manufacturers of mobile devices need to make their products work with multiple operators and multiple access technologies.

    References

    Gohring N 2007, ‘RIM plans Wi-Fi/cell phone BlackBerry’, Computerworld Hong Kong Daily, posted 28 May 2007, viewed 15 September 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=429669>.

    ITU 2005, home page, ITU-T’s Definition of NGN, updated 19 December, viewed 13 September 2007, <http://www.itu.int/ITU-T/ngn/definition.html>.

    ITU-T 2004, ITU-T Recommendation Y.2001 (12/2004) - General overview of NGN, Series Y: Global Information Infrastructure, Internet Protocol Aspects and Next-Generation Networks, Next Generation Networks – Frameworks and functional architecture models, Geneva, Switzerland.

ITU-T – see Telecommunication Standardization Sector of ITU

    Monday, September 10, 2007

    Infrastructure VIII - IEEE 802.11n

The IEEE (Institute of Electrical and Electronics Engineers, Inc.) and the ITU (International Telecommunication Union) both have task groups researching and developing new standards for networking protocols and mobile technologies, and these point to the directions in which our infrastructure is being built. Meanwhile, the ICT industries have been making great efforts to launch new products and services in the wireless era; I myself receive a few promotions via email or phone every week.

First of all, I would like to look into the developments at the IEEE. Without doubt, the Institute of Electrical and Electronics Engineers, Inc. (IEEE) has long-standing recognition in the field. It established the ‘Wi-Fi’ standards (IEEE 802.11, part of the IEEE 802® family), one of the best-known mobile technologies, and has various working groups and study groups researching and developing these standards. Meanwhile, the ‘WirelessMAN’ IEEE 802.16 specifications support the development of fixed broadband wireless access systems to enable rapid worldwide deployment of innovative, cost-effective and interoperable multi-vendor broadband wireless access products (IEEE-SA 2007). More information regarding IEEE 802® can be found on the IEEE website. Another standard, IEEE 802.3™ for Ethernet wired LANs/WANs, is also progressing steadily. In addition, the IEEE 802.11 specifications address both the Physical (PHY) and Media Access Control (MAC) layers and are tailored to resolve compatibility issues between manufacturers of wireless LAN equipment.

The Wi-Fi Alliance is a global, non-profit industry association that has certified Wi-Fi products since March 2000. According to the Wi-Fi Alliance (2007), it has certified over 3,400 products and has more than 300 member companies devoted to promoting the growth of wireless Local Area Networks (WLANs).

Nowadays, ‘Wi-Fi’ has been widely adopted by the ICT industries. TGn (2007) approved IEEE 802.11n Draft 2.05 in July 2007, and Draft 3.0 is on the way, expected to be finalised and approved in 2008. In fact, many ICT companies have competed with one another to ship Draft 2.0 compliant equipment such as wireless routers, wireless switches and wireless clients. The Wi-Fi Alliance (2007, p.4) claims that multiple-input, multiple-output (MIMO) technology multiplies the performance of the Wi-Fi signal, which is reflected in the two, three or even more antennas found on some 802.11n routers, and that 802.11n also supports the 5GHz radio band. Additionally, its capacity is more than five times that of 802.11g, rising from 54Mbit/s to 300Mbit/s, which is able to meet the demands of today’s multimedia applications and products. This is a breakthrough in wireless technology. In practice, however, Judge (2007) reports that an access point powered over the 802.3af Ethernet (Power over Ethernet) standard will not reach the 300Mbit/s data rate, because 802.3af cannot support running both radio bands, the 2.4GHz band (802.11b/g/n) and the 5GHz band (802.11a/n), at the same time. Therefore, it will probably reach only half of the data rate (i.e. 150Mbit/s).
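As a rough illustration of what those link rates mean, here is a back-of-the-envelope calculation using the nominal figures quoted above; the file size is a hypothetical example, and real-world throughput will be well below these ideal numbers.

```python
# Compare transfer times at the nominal 802.11g and 802.11n link rates.
def transfer_seconds(file_mbytes: float, link_mbit_s: float) -> float:
    """Time to move a file at the nominal link rate, ignoring protocol overhead."""
    return file_mbytes * 8 / link_mbit_s

file_size_mb = 700  # e.g. a CD image (hypothetical example)
for label, rate in [("802.11g, 54 Mbit/s", 54),
                    ("802.11n draft, one band, 150 Mbit/s", 150),
                    ("802.11n draft, full rate, 300 Mbit/s", 300)]:
    print(f"{label}: about {transfer_seconds(file_size_mb, rate):.0f} s")
```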

The other concern about the 802.11n standard, and wireless LANs in general, is security. So (2007) mentions four major attacks on wireless LANs: intrusion, denial of service, phishing and eavesdropping. The riskiest is eavesdropping, because the attacker listens to the traffic on the wireless network and captures useful information, including passwords for online banking and e-commerce, yet can hardly be identified.

I would like to discuss the ITU's research and development on mobile technologies tomorrow.

    To be continued

    References

    Judge P 2007, ‘Aruba set to launch 802.11n access point’, Computerworld Hong Kong Daily, posted 14 September 2007, viewed 15 September 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=457611>.

Wi-Fi Alliance 2007, ‘Wi-Fi CERTIFIED™ 802.11n draft 2.0: Taking Wi-Fi® to the Next Level’, published May 2007, pp. 1-2.

    So R 2007, ‘Wi-Fi threats stay alive’, Computerworld Hong Kong Daily, posted 10 May 2007, viewed 15 September 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=425942>.

TGn 2007, Status of Project IEEE 802.11n, Standard for Enhancements for Higher Throughput, updated July 2007, San Francisco, California, US, viewed 17 September 2007, <http://grouper.ieee.org/groups/802/11/>.

    TGn - see IEEE P802.11 Task Group n 2007

    Infrastructure VII - Disaster Recovery in practice

People still think Disaster Recovery (DR) is important but not necessary, or not cost-effective. Many years ago, we proposed to the management that the Uninterruptible Power Supply (UPS) in our server room be replaced. The management asked us when the last power failure had occurred; from that I understood that they did not see the need for it.

    Fonseca (2007) reported that

    ‘Peltzman said he understands why corporate management puts constraints on disaster recovery spending even though business people are the ones who complain when their systems fail. However, many of his IT brethren are often at odds with business leaders on the importance of business continuity and disaster recovery technology.’

Steven Peltzman, the CIO of The Museum of Modern Art in New York, has also faced disagreement with business executives over IT spending on disaster recovery. In fact, many CIOs and business executives have very diverse points of view on Disaster Recovery/Business Continuity (DRBC), according to SunGard's survey. Harris Interactive Inc. polled 176 corporate executives and 351 IT managers in February and March. The brief results of the survey are listed below.

From the survey results, we can see that IT respondents are more likely to focus on back-end operations rather than front-end applications. When identifying which systems are essential to safeguard from disaster, business and IT executives agree on the top four systems that impact revenue: both groups identified e-mail, back-office applications, customer service and telecommunications in their lists of the top five systems that would affect the bottom line if they were unavailable. In other words, they have diverse views but also share some of them.

As an IT specialist, I very often see only one corner of the picture. To achieve a really successful disaster recovery solution, we should at least pull together the key decision makers from all parts of the business, such as business development, finance, HR and IT, and work out the DR plan in advance, going through a business impact analysis to define the critical systems and applications so that the plan is clearly defined before any interruption occurs.

    To be continued

    References

Robins B 2007, ‘Survey Reveals Limiting IT Downtime Window Major Concern; Business And IT Executives Disagree On Importance Of Disaster Preparedness’, SunGard, <http://www.sungard.com/news/default.aspx?id=2&announceId=844>.

Fonseca B 2007a, ‘IT trying to work with execs on disaster recovery’, Computerworld Hong Kong Daily, posted 2 May 2007, US, viewed 6 September 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=423990>.

Fonseca B 2007b, ‘Survey: Business, IT differ on disaster recovery’, Computerworld Hong Kong Daily, posted 2 May 2007, US online, viewed 6 September 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=423976>.

    Infrastructure VI - Disaster Recovery

Last time I mentioned contingency plans. Today I take the discussion further in a practical sense and bring up Disaster Recovery (DR), which is in fact a kind of contingency plan.

The purpose of DR is to ensure business continuity whenever a crisis arises, such as fire, power failure, storm, disease outbreak (e.g. SARS) or any other unexpected event which can damage your business and your precious data.

    Smit (2007) reports that:

    ‘According to the Meta Group, the average cost of an hour of downtime for data centre applications is $330,000. According to the Strategic Research Corp., if that data centre belongs to a credit card authorization company, the loss jumps to $2.6 million. And if it belongs to a brokerage house, it climbs to $6.5 million. One day of lost productivity costs a company an average of $432 per employee.’

Without doubt this is a great loss to a company. Don't expect your clients to understand your difficulties and accept your apologies. The best solution is to plan ahead before the disaster occurs: reducing the downtime means cutting down the loss. But how?

Smit (2007) gives us directions for ensuring high availability and business continuance.

    Protecting, replicating and backing up data

First of all, we need to build a high-capacity, low-latency data centre interconnected to a MAN (Metropolitan Area Network) and a WAN (Wide Area Network). This can enable zero-data-loss data mirroring to protect user sessions, prevent transaction loss and support automatic failover between mirrored sites. SAN (Storage Area Network) technologies enhance the distance, security and bandwidth utilisation of replication and backup to remote sites, although they have not yet become really popular. In addition, technologies such as write acceleration, tape acceleration and server-less backup reduce latency, extend distances and reduce the application impact of storage replication. Moreover, the data centre needs to support business continuance applications, especially those that provide replication and data protection.

    sourced from Javvin
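As a small, concrete illustration of one piece of this, here is a sketch (my own, not from Smit's article, with hypothetical paths) of how one might verify that a replica directory still matches its primary by comparing file checksums.

```python
# Verify a mirrored copy by comparing per-file SHA-256 digests.
import hashlib
from pathlib import Path

def checksums(root: Path) -> dict:
    """Map each file's relative path under root to its SHA-256 digest."""
    digests = {}
    for path in root.rglob("*"):
        if path.is_file():
            digests[str(path.relative_to(root))] = hashlib.sha256(path.read_bytes()).hexdigest()
    return digests

def verify_mirror(primary: Path, replica: Path) -> list:
    """Return the relative paths that are missing or different on the replica."""
    src, dst = checksums(primary), checksums(replica)
    return sorted(name for name, digest in src.items() if dst.get(name) != digest)

if __name__ == "__main__":
    drift = verify_mirror(Path("/data/primary"), Path("/data/replica"))  # hypothetical paths
    print("Replica is in sync" if not drift else f"Out of sync: {drift}")
```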


    Enhancing application resilience

    Companies can remove single points of server failure by deploying high-availability clusters or load-balancing technology across Web and application servers. Apart from that, connectivity can be extended between clusters in different data centres to protect against major disruptions. Achieving this type of redundancy requires a high-speed, low-latency metro network.
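To show what ‘removing a single point of server failure’ looks like at its simplest, here is a toy sketch of round-robin load balancing with health checks; the host names are made up and this is an illustration of the idea, not a production design.

```python
# Round-robin load balancing that skips servers which failed a health probe.
import itertools

class LoadBalancer:
    def __init__(self, servers):
        self.servers = list(servers)
        self.healthy = set(servers)
        self._cycle = itertools.cycle(self.servers)

    def mark_down(self, server):
        self.healthy.discard(server)      # e.g. after a failed health check

    def mark_up(self, server):
        self.healthy.add(server)

    def next_server(self):
        """Return the next healthy server in round-robin order."""
        for _ in range(len(self.servers)):
            candidate = next(self._cycle)
            if candidate in self.healthy:
                return candidate
        raise RuntimeError("no healthy servers available")

if __name__ == "__main__":
    lb = LoadBalancer(["app1.example.local", "app2.example.local"])  # hypothetical hosts
    lb.mark_down("app1.example.local")    # simulate one node failing
    print(lb.next_server())               # traffic continues on app2.example.local
```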

    Ensuring user access

Companies can employ technologies such as VPNs to allow users from branch offices and telecommuters to reconnect to applications quickly as soon as the applications are up and running again. In addition, technologies such as global site selectors can allow users to connect, manually or automatically, to the most available web application at any given time. In the case of a disruption in any one application environment, users continue to have access via the alternate site.

Needless to say, we all realise the devastating impact of 9/11, yet it happened only once in the past six years. Do we really have to focus on such incidents so much and then spend tens of thousands of dollars on the above systems, some of which may not be used even once in ten years? The answer is: absolutely. I still remember a disaster that happened around six years ago. A malfunctioning fire sprinkler in an office on a high floor flooded the whole commercial building with water, and the power was then suspended for a day. At the time, all we could do was shut down our mission-critical servers before the UPS (Uninterruptible Power Supply) batteries ran out, in order to protect our servers and data. Luckily we had installed UPS units for all mission-critical servers.

You can probably imagine how big a loss this incident caused. DR is very unlikely to eliminate the loss entirely, but at least it can lighten it. In the end, DR is a choice of investment. What is your choice?

    To be continued

    References

Javvin, ‘Metropolitan Area Network and MAN Protocols’, Javvin Technologies, Inc, California, <http://www.javvin.com/protocolMAN.html>.

Javvin, ‘Storage Area Network and SAN Protocols’, Javvin Technologies, Inc, California, <http://www.javvin.com/protocolSAN.html>.

    Javvin, ‘WAN: Wide Area Network’, Javvin Technologies, Inc, California, <http://www.javvin.com/networkingterms/WAN.html>.

Smit A 2007, ‘Data centre safety measures protect business’, Enterprise Innovation, Technology, posted 28 August 2007, viewed 8 September 2007, <http://www.enterpriseinnovation.net/article.php?cat1=2&id=1847>.

    Sunday, September 9, 2007

    Infrastructure V – Business Concern

In the business world, people have to be very dynamic and able to forecast changes. We IT people need this sense as well, to adopt new technologies and adapt to the changes. If we fail to see the problems, the consequences can sometimes be very costly and unbearable.

Last month, typhoon Pabuk made Hong Kong very chaotic because the Hong Kong Observatory (HKO) announced that it would hoist Signal 8 within an hour, then changed its mind and issued the Signal 8 almost immediately. Due to this sudden change, the mobile networks were overloaded and landlines also crashed, while the HKO website kept crashing under the load as people checked the latest typhoon conditions (Hammond 2007). Obviously, this was a communication breakdown. The question I would ask is: why didn't the infrastructure function properly in an urgent situation? Were there any contingency plans?

The other case happened in the US recently. Owing to the breakdown of a major switch at the LA airport, over 20,000 people were trapped on planes and 60 planes were left sitting on the tarmac (Hammond 2007). Hammond asks: ‘If that switch was mission-critical, why was there no backup system?’

From these two incidents, we can see that systems often function properly under normal circumstances but fail when a crisis hits. In the service industry, this is neither acceptable nor bearable. The worst part is that you lose your credibility, and you end up paying a very high price for not investing in a contingency plan or a disaster recovery solution.

    To be continued

    Reference

    Hammond S 2007, ‘Communication breakdown’, Computerworld Hong Kong Daily, posted 1 September 2007, viewed 5 September 2007, <http://www.cw.com.hk/computerworldhk/Communication-breakdown/ArticleStandard/Article/detail/454130>.

    Wednesday, September 5, 2007

    Infrastructure IV - Mobile Solutions

I have mentioned ubiquitous computing previously in this blog. Very often, my friends ask me for advice on purchasing computers. I normally recommend that they buy notebook computers even when mobility is not their major concern. Why? First of all, computers are necessities and no longer a luxury for most households in Hong Kong. Take my family as an example: I have two notebook computers at home, one for me and the other for my wife and my son, and all of us need Internet access. The best and simplest solution is to set up a wireless router at home so that we can all share the access; not many people would cable their homes for this. Apartments in Hong Kong are relatively small, so an 802.11g wireless router should provide ample coverage for an apartment or even two. (I am actually still using an 802.11b model at home.)

Secondly, the Hong Kong government has started providing Wi-Fi facilities at government premises (GovHK 2007). Besides, over 2,300 wireless access points have been established in Hong Kong by registered licensees, including the Airport Authority. Apart from that, 3G has been available locally for a few years and the major mobile carriers are licensees, although 3G access is still quite costly at this stage. As a result, the notebook computer, which can be very small now, becomes a powerful ubiquitous device.

Without a good infrastructure, there is no point in having the best mobile devices. We have to monitor the IT market closely and always give the best solution to our users. For example, we have mobile devices such as BlackBerries available for loan, and we are now upgrading them from GPRS (General Packet Radio Service) to 3G, which has been implemented very well in some Asian countries including South Korea and Japan. The throughput rates of GPRS and 3G are up to 40 kbit/s and 384 kbit/s respectively, according to the GSM Association (2007). The following graph illustrates the development of mobile technologies.


    sourced from GSM World
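To put those two figures side by side, here is a quick back-of-the-envelope calculation; the attachment size is a hypothetical example and the rates are the peak figures quoted above, so real transfers will be slower.

```python
# What the jump from GPRS to 3G means for a typical email attachment.
RATES_KBIT_S = {"GPRS": 40, "3G": 384}   # peak rates per the GSM Association figures above

def download_seconds(size_kbytes: float, rate_kbit_s: float) -> float:
    """Time to pull a file of the given size at the given (ideal) rate."""
    return size_kbytes * 8 / rate_kbit_s

attachment_kb = 500   # hypothetical attachment size
for tech, rate in RATES_KBIT_S.items():
    print(f"{tech}: about {download_seconds(attachment_kb, rate):.0f} s for a {attachment_kb} KB attachment")
```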

Certainly, from 2G to 3G is a big jump in terms of the data rate, and we will probably see bigger jumps in the near future. NTT DoCoMo, Japan's largest cellular carrier, has been working on Super 3G for some time and anticipates introducing the technology in Japan around 2009 as a stepping stone between current so-called 3.5G technology and future 4G systems; it has also been aggressively pursuing 4G development. In experiments conducted in late December last year, the carrier came close to hitting a 5Gbit/s data transmission speed from an experimental 4G system to a receiver moving at 10 kilometres per hour (Williams 2007). 4G and 5G are on the way…

    To be continued

    References

GovHK 2007, ‘Public IT Facilities’, Hong Kong Government, <http://www.gov.hk/en/residents/communication/publicit/itfacilities/wifi/index.htm>.


    GSM Association 2007, ‘GSM World’, viewed 1 September 2007, <http://www.gsmworld.com/technology/gprs/index.shtml>.

    Williams, M 2007, 'NTT DoCoMo targets 300M bps in Super 3G experiment', Computerworld Hong Kong Daily, posted 13 July 2007, Tokyo, viewed 18 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=441398>.

    Tuesday, September 4, 2007

    Infrastructure III - Data Communications

As the IT trend moves towards e-commerce, m-commerce, web portals and so on, back-end processing is far more complicated and substantial than it was a decade ago. The two-tier client-server model is retiring, and the multi-tier model has become dominant; processing has therefore shifted to the application servers, web servers and database servers. For more information about thin clients or multi-tier client-server systems, please refer to my previous posts in this blog.

Right back in 2000, we started using 1000-megabit switches. We have always focused on the infrastructure: a bad infrastructure hinders the performance of good systems and applications, while a good infrastructure lightens the problems of bad systems and applications. For example, if an application is coded badly and inefficiently, it will still run reasonably fast on a high-speed network.

Within the office, we are now running 1000BASE-T gigabit Ethernet over very high quality network cable, Category 5e or 6 shielded twisted pair (STP). Certainly, we can see the return from this ‘luxury’ infrastructure in the high data rate and good transmission quality, and so far our network is much more stable. In the past (around 10 years ago) we were running a Fast Ethernet network (i.e. 100BASE-T) and the 10/100 Ethernet dumb hubs could not stabilise the network traffic: whenever one connection malfunctioned, for whatever reason, it would affect the operation of all other connections on the same hub. I do not really come across this situation any more; basically our network is very stable, and we are planning to move forward to 10-gigabit networking once the technology is mature (of course, we won't extend it to every workstation). Amul (2006) claims that 10G Ethernet will increase the capacity of the network backbone by eliminating some of the elements required to run TCP/IP and data traffic over an infrastructure originally designed to transport voice. This will definitely benefit the ISPs. How about us? I would say yes. Even though we are not an ISP and are not adopting an IP phone system, it will still ease the high volume of data traffic among the servers and hence reduce the response time and latency of user requests, which will be very effective for heavy back-end processing systems.

The decentralisation of computing power has been the IT trend of this decade, and mobile technologies are moving onto a super-highway. The first third-generation (3G) wireless phone service was launched by NTT DoCoMo in Tokyo on 1 October 2001. At that time, the 3G service was confined to within Tokyo National Route 16, a 30 km radius from central Tokyo (Purvis 2001). But six years later, NTT DoCoMo has developed a Super 3G cellular system which can transmit data at up to 300Mbit/s and is pursuing 3.5G and future fourth-generation (4G) systems (Williams 2007). I will go further into this issue tomorrow.

    To be continued


    References

    Amul 2006, ‘10 Gigabit Ethernet’, Network World, viewed 5 September 2007, <http://www.networkworld.com/details/460.html>.

Hammond, S 2007, ‘Consumer tech in the enterprise space’, Computerworld Hong Kong Daily, posted 1 August 2007, viewed 18 August 2007, <http://www.cw.com.hk/computerworldhk/content/printContentPopup.jsp?id=447180>.

    Purvis J 2001, ‘World's first 3G launch on 1st October severely restricted’, International Market News, posted 4 October 2001, Hong Kong Trade Development Council, Tokyo, viewed 1 September 2007, <http://www.tdctrade.com/imn/01100401/info14.htm>.

Williams, M 2007, 'NTT DoCoMo targets 300M bps in Super 3G experiment', Computerworld Hong Kong Daily, posted 13 July 2007, Tokyo, viewed 18 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=441398>.

    Tuesday, August 21, 2007

The vulnerabilities of the iPhone - Fact or Fiction?

Apple Inc. has sold the iPhone in the United States since 29 June 2007, through Apple retail and online stores and from AT&T Mobility. On 18 September 2007, Apple announced at a special event that the iPhone would be available on 9 November 2007 on the carrier O2 in the United Kingdom; on 19 September 2007, Apple and Deutsche Telekom's T-Mobile announced the iPhone would go on sale on 9 November 2007 throughout Germany; and on 20 September 2007, France Télécom announced it would also be selling the iPhone in France (Wikimedia Foundation, Inc. 2007). However, concern about the security of the iPhone is rising in the IT industry (Wikimedia Foundation, Inc. 2007). By the way, I would like to introduce a video clip about the iPhone that may cheer you up.

First of all, we need to understand the security issues of the iPhone in general. Keizer (2007) has consolidated different views from IT security specialists. Some really worry that if one uses an iPhone to connect to the corporate network, it will introduce vulnerabilities into the network, as Apple did not originally design the iPhone for enterprise use.

Secondly, the fact that it runs Mac OS X means there is a good possibility that vulnerabilities found in the OS will also affect the iPhone, and hackers may be able to port the exploits they find on one to the other. Moreover, all the recent press around the iPhone makes it a very enticing target for hackers.

Thirdly, Lemon (2007) reports that ‘hackers may successfully unlock an iPhone in as soon as three to seven days, according to a representative of one effort that aims to unlock Apple Inc.'s new handset’. They are actually cracking the iPhone activation process so that users do not need to use iTunes to carry it out and hence do not need to pay AT&T.

Fourthly, Reed (2007) reports that Apple's CEO Steve Jobs has declared war on iPhone hackers and that Apple's option is to stop hackers from creating new open-source programs for the iPhone. This indicates that the hackers' actions have created a real threat to Apple.

Apart from the above factors, do we really think the iPhone is a particularly unsafe mobile device that will do more harm to enterprise security than anything else? Many IT security specialists hold a view similar to mine.

    Neel Mehta, team lead for Internet Security Systems Inc.'s advanced research group, claims that ‘the iPhone poses the same risks as any other device connected to the network. It's going to be very hard to control who uses it, so the best thing to do is take the defense-in-depth approach’ (Keizer 2007).

    Damoulakis (2007) also reports that ‘the boundaries of where data actually resides within an organization now extend well beyond the data centre to desktop computers, remote offices, employees' homes and laptops, USB drives, and, yes, phones. The problem that I have with some of the iPhone alarmism is that it leaves an impression that enterprise data is highly secure and that there aren't lots of other potentially much larger holes on which to focus’.

Finally, the question I pose to you again: ‘Is the data protection and security of corporate laptops more akin to the BlackBerry or the iPhone?’ (Damoulakis 2007).

    References

    Damoulakis J 2007, ‘Is your iPhone more secure than your laptop?’, ComputerWorld Hong Kong Daily, posted 5 July, viewed 6 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=439492>.

    Keizer G 2007, ‘iPhone security: Nightmare for IT or no big deal?’, ComputerWorld Hong Kong Daily, posted 27 June, viewed 6 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=438058>.

    Lemon S 2007, ‘Unlocked iPhones coming in one week or less, hacker says’, ComputerWorld Hong Kong Daily, posted 7 July, viewed 23 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=439427>.

    McMillan R 2007,'With Black Hat approaching, a rush to patch iPhone', ComputerWorld Hong Kong Daily, posted 27 July, viewed 3 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=444882>.

    Wikimedia Foundation, Inc., 2007, iPhone, Wikipedia, The free encyclopedia, last modified 22:00, 25 September 2007, Wikimedia Foundation, Inc., US, viewed 25 September, <http://en.wikipedia.org/wiki/IPhone>.

Reed B 2007, ‘Apple’s options for stopping open source iPhone use’, NetworkWorld.com, posted 20 September, viewed 22 September, <http://www.networkworld.com/news/2007/092007-apple-stop-open-source-iphone.html?page=1>.

    Wednesday, August 8, 2007

Cell phones can provide more information than you expect

GPRS and 3G have enabled your cell phone to browse the web for some time now. Nevertheless, many websites are not designed for cell-phone screens, and browsing them can be quite painful. Recently, cell phone makers and Internet content providers have been working together to resolve the issue. Williams (2007b) reports that LG Electronics Inc. plans to launch at least 10 cell phones this year with preinstalled software and services from Google Inc., and a Japanese web portal operator has repackaged the content of Wikipedia so it can be searched and viewed on cell phones (Williams 2007a). This really is great news for customers.

I see two points here. Firstly, very thin clients and wireless technologies will definitely be the direction of ICT development. Secondly, we have the infrastructure ready, but we still need the applications and software to drive it. The Taiwanese are crazy about the iPhone even though it is not yet on sale in Taiwan and will not operate as a cell phone there until at least 2008 (Nystedt 2007). I do not really understand why people rush to pre-order a cell phone that is missing its key function. Oh well, they can still use the camera and music player functions of the iPhone, or perhaps they can go to the US or Europe and use it there.

But anyway, this is the iPhone hype. We dream that one day a cell phone will be much more than a communication device: a digital camera, DV, MP3 player, PDA, TV, wireless portal, GPS..., even a computer. This is the concept of ‘ubiquitous computing’. Are we getting there soon?

    References

Nystedt, D 2007, 'Taiwan hit by iPhone craze', ComputerWorld Hong Kong Daily, Taipei Bureau, viewed 30 June 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=438634>.

Williams, M 2007a, 'Wikipedia appears on cell phones', ComputerWorld Hong Kong Daily, viewed 29 April 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=423022>.

Williams, M 2007b, 'LG phones to carry Google software', ComputerWorld Hong Kong Daily, viewed 1 April 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=414987>.

    Thursday, July 12, 2007

    The evolution of thin clients

IDC expects enterprise converged mobile device shipments to reach 63 million units worldwide by 2010, up from 7.3 million in 2005. Wow! That is more than an eightfold increase within half a decade. Research In Motion's (RIM) BlackBerry has a firm standing in the market, but Microsoft, Motorola, Palm, Nokia and others are joining the competition, so RIM's position is being challenged (NWA Staff 2006).

The launch of the iPhone has been a success for Apple Inc., and most financial firms have forecast high iPhone sales in 2007 and 2008 (Nystedt 2007). However, it has since emerged that the iPhone has security issues; Apple is, in any case, responding to them and providing solutions.

The other issue is the ‘thumbs-down’ story: a Colorado man supposedly had his thumbs surgically whittled to sharper points so he could dial his Apple Inc. iPhone. The story can be found on Snopes.com, and the suburban Denver monthly that ran the piece recently admitted it was a joke (not true) (Keizer 2007). It gives me the insight that these gadgets can be too tiny to operate, or simply not very practical. My company has purchased a number of BlackBerries for loan since 2004. Over the years we have bought a few different models, including the 7100, 7200, 7700 and 8700 series. I do not know which model is the best or the worst, but I do know the most unpopular one. Why? Users prefer to borrow the larger models with a proper keyboard rather than the smaller ones with a combined phone and QWERTY layout (e.g. the 7100g). In other words, the popularity of a model is determined by its user-friendliness rather than its outward appearance.

Last month I went to the Microsoft TechEd 2007 Hong Kong conference. Many IT companies had booths there to sell their latest technologies and products, and thin clients and blade PCs were among the popular ones. One product I found comparatively more impressive (not necessarily better) was a wireless touch panel that also has an RJ45 network interface. Without doubt, this is essentially a ‘colourful dumb terminal’ with Wi-Fi support. Doctors would really find it very useful in their work.

Finally, I want to share my view that thin clients, and indeed any digital devices, must be well designed for practical use in terms of both functions and human factors.


    References

    Keizer, G 2007, 'iPhone 'thumbed-down' story waves facts goodbye', ComputerWorld Hong Kong Daily, posted 14 August, viewed 19 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=449526>.

    Nystedt, D 2007, ‘Taiwan hit by iPhone craze’, Computerworld Hong Kong Daily, posted 29 June, viewed 23 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=438634>.

    Sunday, May 27, 2007

    Today I help my sister-in-law troubleshoot her computer

My sister-in-law, Janis, purchased a notebook computer six months ago but keeps finding problems with her brand new machine. Today I went to her place and examined it. This is actually good news for IT specialists. Why? Basically, I found nothing wrong with her computer, but found that she has very limited IT knowledge. She has been working at a top American bank for many years as a secretary, so I thought she would be a power PC user. It shows that there are still quite a few people who need IT support.

But today I read an article about Ray Kurzweil and the Technological Singularity, and I started worrying about my future. Of course, Ray Kurzweil is a very bright person who has made great contributions to AI; 'The Microsoft chairman calls him a visionary thinker and futurist', O'Keefe (2007) writes. Kurzweil says, 'By 2027, computers will surpass humans in intelligence. And by 2045, "strictly Biological" Humans won't be able to keep up.' What does that actually mean? We are living in an era of accelerating change unlike anything we have ever seen. I somewhat agree with this statement, as I have mentioned the speedy change of the IT industry over the past two decades in my previous posts. This is no longer just the imagination of a Hollywood movie. O'Keefe (2007, p.46) gives a concrete example: back in the 1980s Kurzweil predicted that a computer would beat the world chess champion by 1998 (it happened in 1997).


Before I get into the details, I need to introduce a term: Singularity. According to the Singularity Institute for Artificial Intelligence, this means the technological creation of smarter-than-human intelligence. After reading through the first few pages of their website, I understand the grounds for this notion: they estimate that the capacity and speed of CPUs will one day overtake the human brain, and the biological and physical arguments sound logical to me. But can we really create a computer that can think? I still remember movies such as 'The Terminator', 'Bicentennial Man', 'A.I.' and 'I, Robot', in which computers not only think, hate and love but also fight against human beings. However, playing chess is much simpler than human emotion. Besides, there are tens of thousands of different human mindsets, and very often humans behave illogically. How can we possibly code infinite possibilities into a program? I really get stuck... Of course, this is extremely difficult, but it does not mean it is impossible, as long as we have a supercomputer. SIAI (2007) claims that 'By comparison, speeds in modern computer chips are currently at around 2GHz – a ten millionfold difference – and still increasing exponentially'. On this basis, I believe it will happen one day...

    References
    SIAI - see The Singularity Institute for Artificial Intelligence

    O'Keefe, B 2007, '(CHECK ONE*) □The Smartest □The Nuttiest Futurist on Earth', Fortune Asia, vol. 155, no.8, Time Asia (Hong Kong) Limited, 14 May, pp. 46 - 52.

The Singularity Institute for Artificial Intelligence 2007, 'What is Singularity?', Overview, updated 2007, Singularity Institute for Artificial Intelligence, Inc., viewed 20 July 2007, <http://www.singinst.org/overview/whatisthesingularity>.

    Sunday, May 20, 2007

The obstacles to Open Source and Free Software I




Open source and free software are being used by more than half of the Fortune 500 companies (Parloff 2007).


Microsoft, the software giant, has heavily patented its software products: up until 2005, Microsoft had filed around 3,500 and registered over 1,500 patents (Microsoft, cited in Parloff 2007, p.51). As I work at a legal firm myself, I know that patents are very costly and not many sole proprietors can afford to patent their inventions. Many software developers have, to a greater or lesser extent, included Microsoft's patented components in their products, and Microsoft has demanded that they pay licence fees. Take the Microsoft-Novell deal as an example: Microsoft and Novell not only agreed to jointly develop and market products that allow Windows and Linux to work together more smoothly, but Microsoft also agreed to indemnify Novell SUSE Linux Enterprise users against patent claims (Lemon 2007). This is definitely not good news for users, as free software will not be free any more; it is probably even worse news for corporate users like AIG, Wal-Mart and Goldman Sachs.

    In December 2003, Microsoft's new licensing unit opened for business, and soon the company had signed cross-licensing pacts with such tech firms as SUN, Toshiba, SAP and Siemens (Parloff 2007).

Fortunately, the Free and Open-Source Software (FOSS) movement has been fighting for the free world. Free Software Foundation president Richard Stallman, a talented programmer, has dared to challenge the giant. I truly think his grounds for the battle against Microsoft are very reasonable and widely accepted in the free software world.

    To be continued

    References


Lemon, S 2007, 'Dell joins Microsoft, Novell in Linux collaboration', ComputerWorld Hong Kong Daily, 7 May, viewed 8 May 2007.


    Parloff, R 2007, 'Microsoft takes on the Free World', Fortune Asia, vol. 155, no.9, Time Asia (Hong Kong) Limited, 28 May, pp. 49 - 55

    Wednesday, May 16, 2007

    Ubiquitous Computing

I have discussed before in this blog why the thin client will become popular again. Nevertheless, ubiquitous computing will probably be the destination. Mark Weiser, the father of ubiquitous computing, put forward the idea of 'ubiquitous computing' in the early 90s and believed it would be the third wave of computing, after mainframes and PCs (Weiser 1996). 'Ubiquitous Computing refers to the trend that we as humans interact no longer with one computer at a time, but rather with a dynamic set of small networked computers, often invisible and embodied in everyday objects in the environment' (UbiComp 2007).

Ubiquitous computing will be the destination, but how far are we from it now?

Mark Templeton, CEO and president of Citrix Systems, shared the company's vision of ubiquitous computing: he is helping customers shift from distributed computing to application delivery services, and IT roles will change dramatically over the next five years in response to the forces shaping today's business environment (Ramos 2007, p.26).


He lists five factors driving IT trends:

    Consolidation - workers are required to share all their information

Regulation - governments and industries are holding businesses more accountable for their information, so organisations must find a way to easily control and monitor information access

Disruption & globalisation - the high mobility of the workforce will require delivery of applications to any endpoint, under every access scenario

Echo generation - tech-savvy enterprise IT users will demand application access over a variety of wired and wireless communication links

Templeton (2007) makes a conclusive statement: '... I guess what really saw us through is increasing relevance of our basic thinking about enabling people to work from anywhere over any type of connection'. This is the desire that drives people to get ready for ubiquitous computing. But there are still many issues waiting to be resolved: broadband and wireless infrastructure are the basic requirements in cities, and security is another important issue we cannot afford to ignore.

    To be continued.

    References

    Ramous, L 2007, ‘Right place, right time’, Network World Asia, vol 3, no 4, pp. 26-27.

    UbiComp 2007, 'What is Ubiquitous Computing?', 9th International Conference on Ubiquitous Computing, Innsbruck, Austria, viewed 21 May 2007, <http://www.ubicomp2007.org/scope/>.

    Weiser, M 1996, 'Ubiquitous Computing', viewed 15 May 2007, <http://sandbox.xerox.com/ubicomp/>.

    Monday, May 14, 2007

    Thin client will dominate the market again

We all understand that PCs are getting more and more powerful; it is not unusual to have a Core 2 Duo desktop in your office nowadays. However, we also see that thin clients are getting more popular. Why are things falling into these two extremes?

    What is Thin Client?

'Thin client is also used to describe software applications that use the client-server model where the server performs all the processing' (Chung, Zimmerman & Chandra 2006). Client-server technology has evolved a new kind of 'client', and HP, IBM, Sun and even Citrix have been promoting thin client products. Ho (2006) reported that thin client sales in Asia/Pacific (including Japan) reached 279,513 units, 64 percent growth over the previous year, and IDC projects the region's thin client market to expand at a compound annual growth rate of 34 percent. I would say this is a significant indication of the IT market trend: thin clients again.

'Over the last decade, however, companies have begun to realize the expense and effort involved with administering a PC-based infrastructure. Now, there is a reversion to shifting intelligence and storage back to a centralized location' (Jakobsen 2007a). This is a very valid statement in my experience. My firm has around 600 legal and support staff working across 7 offices in Hong Kong and Asia, and I really find it difficult to manage 600 workstations, even though we use tools to install patches, hot fixes, new anti-virus definition files and add-ins automatically. When we need to upgrade the workstations, whether hardware or software, we have to re-ghost and reconfigure them one by one. With thin clients, we could save many hours of desktop deployment. For end-user support we have adopted various kinds of remote access software, so I don't really see an issue there.
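As a small illustration of the fat-client administration burden described above, here is a sketch of the kind of inventory check that precedes a re-ghosting or patching round; the host names, patch levels and target build are entirely made up, and this is not our actual tooling.

```python
# Compare what each workstation reports against a target build.
TARGET = {"os_patch": "SP2", "antivirus_def": "2007-09-20"}   # hypothetical target build

# What each workstation reported in the last inventory run (made-up data).
inventory = {
    "HK-WS-001": {"os_patch": "SP2", "antivirus_def": "2007-09-20"},
    "HK-WS-002": {"os_patch": "SP1", "antivirus_def": "2007-09-20"},
    "SG-WS-014": {"os_patch": "SP2", "antivirus_def": "2007-08-30"},
}

def out_of_date(reported, target):
    """Return the settings on one workstation that do not match the target build."""
    return [key for key, value in target.items() if reported.get(key) != value]

for host, reported in sorted(inventory.items()):
    gaps = out_of_date(reported, TARGET)
    print(f"{host}: {'OK' if not gaps else 'needs ' + ', '.join(gaps)}")
```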

With thin clients, security is much less of an issue: users cannot copy documents (i.e. precedents) off the system, and the CIO can focus on server security. As things stand, by contrast, we need to block users from using USB devices on their workstations, and whenever users need to copy documents to or from the system, the IT team handles it for them. Many years ago, my firm ran diskless workstations (i.e. no hard drive and no floppy drive); the only storage was the file servers. However, the trend then shifted to heavy clients as client-server technology was widely adopted by the IT industries: the intention was to offload the burden from the server and use more of the clients' power to execute the programs, probably because the costs of maintaining mainframes or mid-range computers were very high and their CPU power, memory and disk space were relatively expensive, so downsizing was a solution to that problem. But now we all realise that we have incurred other costs to support and deploy desktop computers and their applications.

The high availability of ICT systems is vital to the success of a business, and thin clients, web portals, Internet access, mobile devices and so on therefore make you more competitive in the knowledge-based economy. Mark Templeton, CEO and president of Citrix, has expressed his views on 'ubiquitous computing', as reported by Ramos (2007, pp. 26-27). I will go into 'ubiquitous computing' in my next post.

Jakobsen (2007b) adds that a thin client network makes it easier for end users to access their data and emails remotely. It allows network administrators to cost-effectively manage personal computers in the data centre, while desktop users use standard network protocols to display the results on a simple access device across existing networks. As a result, this network infrastructure delivers a lower total cost of ownership, greater security and high reliability, all while providing a consistent, predictable end-user computing experience.

    To be continued


    References

Chung, K, Zimmerman, P S & Chandra, S 2006, 'Networking Definition - Thin client', SearchNetworking.com Definitions, last updated 23 March, viewed 26 May 2007, <http://searchnetworking.techtarget.com/sDefinition/0,,sid7_gci213135,00.html>.

    Ho, A 2006, 'End of PC life cycle signals thin client boom', Asia's Enterprise All Stars, Network World Asia, vol 2, no 11, p.11.

Jakobsen, J 2007a, ‘Why thin is fashionable again (Part I)’, Enterprise Innovation, Technology, viewed 7 May 2007, <http://www.enterpriseinnovation.net/article.php?cat1=2&id=1351>.

Jakobsen, J 2007b, ‘Why thin is fashionable again (Part II)’, Enterprise Innovation, Technology, viewed 7 May 2007, <http://www.enterpriseinnovation.net/article.php?cat1=2&id=1354>.

    Ramous, L 2007, ‘Right place, right time’, Network World Asia, vol 3, no 4, pp. 26-27.

    Thursday, May 10, 2007

    Infrastructure II - Data Management

In order to cope with the high volume of transactions and requests from users, we have to upgrade or replace the components of our network infrastructure from the front end to the back end. The most important thing is to identify the bottleneck in our network.

The capacity of the servers needs to be upgraded regularly as our data grows rapidly due to emails and documents. As I mentioned in the last post about why our emails are eating up the server space, we need to upgrade the email servers nearly every year. Besides, we are now implementing a new document management system (DMS), because the existing in-house developed system no longer meets our requirements. In the legal field, as in other professional fields, documents are the firm's assets; more precisely, knowledge management is tremendously important to us. We are all now facing the problem of ‘information flooding’ and are drowning in information. By the way, I would like to distinguish between ‘data’ and ‘information’.

    Whatis.com defines that ‘Information is stimuli that have meaning in some context for its receiver. When information is entered into and stored in a computer, it is generally referred to as data. After processing (such as formatting and printing), output data can again be perceived as information. When information is packaged or used for understanding or doing something, it is known as knowledge.’
Definitely, data, information and knowledge are interrelated. If we do not have good systems to convert our data into information and tools to retrieve that information, it will never become the knowledge we need; I always believe too much information is actually no information. With this in mind, we need powerful servers (i.e. high CPU speed and high capacity) to process and store our data. We are replacing the old servers with rack-mounted systems that can stack many servers, and we have also expanded the size of the data centre.

More powerful servers require more electricity and cooling, so the power supply and the air conditioning system for the data centre were upgraded accordingly. Actually, we should also look into the design of the server itself. Researchers at Purdue University have demonstrated an ionic wind engine that promises to remove the heat generated by semiconductors at a substantially faster rate than is possible with traditional cooling technologies. The idea is to generate ions at the surface of the chip; the ions collide with air molecules and so increase the airflow, which cools the chip more quickly (Lemon 2007). Details of this development were published in the 1 September issue of the Journal of Applied Physics. Anyway, I don't want to sidetrack you.

Currently, we are using multiple backup devices, including magnetic tapes and optical disks. They just barely meet our needs and are still manageable. Penn in particular has reservations about optical technology despite ‘the recent claims of optical disk supremacy and the rapid rise from burnable CDs to DVD-Rs and onto Blu-ray and/or HD-DVD’ (Penn, cited in Hammond 2007).

Apart from that, every day we use up a few backup tapes and optical disks, and as time goes by we accumulate a huge volume of them. The metadata about these tapes and disks therefore becomes more and more important, as it strongly affects the recovery process. We all know that data recovery is very time-consuming and never an easy task, yet we are usually required to fulfil users' requests within a tight time frame. As a result, it is crucial to implement an effective backup and recovery solution with a holistic view.
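To illustrate why that metadata matters, here is a toy sketch of a media catalogue; the backup sets, tape labels and dates are made up, and a real catalogue would of course live in the backup software rather than a Python list.

```python
# Look up which tape or disk to pull for a restore, given a backup set and date.
import datetime as dt

catalogue = [
    {"backup_set": "email",     "media": "LTO-0412", "written": dt.date(2007, 9, 3)},
    {"backup_set": "email",     "media": "LTO-0415", "written": dt.date(2007, 9, 10)},
    {"backup_set": "documents", "media": "DVD-0097", "written": dt.date(2007, 9, 9)},
]

def media_for_restore(backup_set, as_of):
    """Return the most recent media written on or before the requested date."""
    candidates = [r for r in catalogue
                  if r["backup_set"] == backup_set and r["written"] <= as_of]
    return max(candidates, key=lambda r: r["written"], default=None)

record = media_for_restore("email", dt.date(2007, 9, 8))
print(record)   # -> the 3 September email tape, LTO-0412
```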
    To be continued.
    References
    Hammond, S 2007, 'Metadata, data, and migration', Computerworld Hong Kong Daily, posted 1 August 2007, viewed 5 August 2007, <http://www.cw.com.hk/computerworldhk/TechWatch/Metadata-data-and-migration/ArticleStandard/Article/detail/447187>.
    Lemon 2007, 'Researchers use ionic wind to keep chips cool', ComputerWorld Hong Kong Daily, viewed 19 August 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=449851>.

    Whatis.com 2005, ‘What is information', last updated 1 April 2005, Whatis.com, viewed 6 August 2007, <http://searchsqlserver.techtarget.com/sDefinition/0,,sid87_gci212343,00.html>.

    Infrastructure I - Data Management

Data management has been a concern in my firm. Our data has been growing exponentially even though the number of users has varied by less than 15% over the years, and I believe the same is happening in other businesses and industries. According to Graham Penn, Associate VP, Storage Asia Pacific at research firm IDC, the amount of data requiring business-level storage is escalating at 40 – 50% (Hammond 2007).

Nowadays data keeps increasing and accumulating in our offices. Probably people have changed their mindsets, so that they accept the softcopy of documents and are willing to eliminate the hardcopy. Take my firm as an example: our senior partners and consultants are adopting technologies such as wireless devices, email and remote access, and I can say that the email system is crucial to our business. We once had our email system down for nearly two days; it was a really chaotic situation for the firm even though it happened over six years ago, and if it happened today it would be even worse. More and more users are using Outlook (i.e. the email client) as their personal filing system because of its mobility and availability: they can use Outlook Web Access (OWA) to access their mailbox at any time as long as they have access to the Internet, or alternatively they can use mobile devices such as smartphones and BlackBerries, even though those devices might not be able to view many file types.

For users, this is truly a convenient way to store and retrieve their documents. The negative impact is keeping multiple copies of a single document in different locations; in other words, it consumes a lot of network resources, disk space to store the files, and systems to back up and retrieve the files. For our email system, we have made every effort to keep up with these tasks.
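To get a feel for how much space those multiple copies waste, here is a minimal sketch that hashes every file under a folder and totals the bytes held by the extra copies; the path is hypothetical, and a real exercise would run against the mail store or file shares.

```python
# Measure how much space duplicate copies of the same content are taking up.
import hashlib
from collections import defaultdict
from pathlib import Path

def duplicate_report(root: Path) -> int:
    """Group files by content hash and total the bytes held by extra copies."""
    by_digest = defaultdict(list)
    for path in root.rglob("*"):
        if path.is_file():
            by_digest[hashlib.sha256(path.read_bytes()).hexdigest()].append(path)
    wasted = 0
    for paths in by_digest.values():
        if len(paths) > 1:
            wasted += paths[0].stat().st_size * (len(paths) - 1)
    return wasted

if __name__ == "__main__":
    extra_bytes = duplicate_report(Path("/shares/mail-attachments"))  # hypothetical path
    print(f"Space held by duplicate copies: {extra_bytes / 1_048_576:.1f} MB")
```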

Moore's Law says computing power will roughly double every 18 months. The logic is illustrated in the graph below:

    Gordon Moore's original graph from 1965

    ‘The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer’ (Moore 1965).

    Moore’s Law is well adopted and recognised by Intel, which claims that we can now put 1.7 billion silicon transistors on a single chip (Intel Corporation 2005). What I actually want to illustrate is the growth of technologies from another angle. Perhaps I should come back to my actual application of technologies.
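    A quick calculation shows how the two common statements of the law differ. Moore's original 1965 observation was a doubling roughly every year, which is how he reached the 65,000 figure in the quote above; the '18 months' version I quoted at the start grows much more slowly. The snippet below is just that arithmetic, assuming about 64 components per chip in 1965, which is roughly what his original graph implies.

    # Moore's 1965 extrapolation: components per chip doubling every year.
    # Assumes roughly 64 components per chip in 1965 (an assumption taken
    # from his original graph).
    components_1965 = 64
    components_1975 = components_1965 * 2 ** (1975 - 1965)
    print(components_1975)  # 65536 -- the "65,000" figure in the quote

    # The popular restatement (doubling every 18 months) grows more slowly:
    doublings = (1975 - 1965) / 1.5
    print(round(components_1965 * 2 ** doublings))  # about 6,500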

    To be continued


    References


    Hammond, S 2007, 'Metadata, data, and migration', Computerworld Hong Kong Daily, posted 1 August 2007, viewed 5 August 2007, <http://www.cw.com.hk/computerworldhk/TechWatch/Metadata-data-and-migration/ArticleStandard/Article/detail/447187>.


    Intel Corporation 2005, video transcript, ‘Excerpts from A Conversation with Gordon Moore: Moore’s Law’, US, <ftp://download.intel.com/museum/Moores_Law/VideoTranscripts/Excepts_A_Conversation_with_Gordon_Moore.pdf>.

    Moore, Gordon 1965, ‘Cramming more components onto integrated circuits’, Electronics Magazine, vol. 38, no.8.

    Tuesday, May 8, 2007

    Networking Infrastructures OSI and TCP/IP models

    The Open System Interconnection (OSI) reference model was developed by the International Organisation for Standardisation (ISO) as a model for a computer protocol architecture and as a framework for developing protocol standards. The OSI model has 7 layers: Physical, Data Link, Network, Transport, Session, Presentation and Application (Ince 2004, p.41).

    The TCP/IP Protocol Architecture is the result of protocol research and development conducted on the experimental packet-switched network, ARPANET, funded by the Defense Advanced Research Projects Agency (DARPA), and is generally referred to as the TCP/IP protocol suite. It has 5 independent layers: Application, Transport, Internet, Network Access and Physical (Stallings 2005, p.106).
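    For quick reference, the small sketch below lays the two stacks side by side, using the correspondence commonly drawn in textbooks; the alignment of the upper OSI layers with the TCP/IP Application layer is a convention rather than an exact standard.

    # Rough correspondence between the 7-layer OSI model and the
    # 5-layer TCP/IP architecture (upper-layer alignment is conventional).
    osi_to_tcpip = [
        ("Application",  "Application"),
        ("Presentation", "Application"),
        ("Session",      "Application"),
        ("Transport",    "Transport"),
        ("Network",      "Internet"),
        ("Data Link",    "Network Access"),
        ("Physical",     "Physical"),
    ]

    for osi_layer, tcpip_layer in osi_to_tcpip:
        print(f"{osi_layer:<13} -> {tcpip_layer}")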

    Stallings (2005) points out that the OSI model never flourished, for the following reasons:
    • The key TCP/IP protocols were mature and well tested at a time when similar OSI protocols were in the development stage.
    • When business began to recognize the need for interoperability across networks, only TCP/IP was available and ready to go.
    • Compared with the TCP/IP Protocol Architecture, the OSI model is unnecessarily complex with 7 layers.

    Today internetworking has largely adopted the TCP/IP architecture, and TCP/IP networks dominate the market. While there are many communications choices available, standardising on one particular protocol can make administration easier and reduce costs and complexity; reduced complexity can also translate into increased uptime and reduced configuration time. Sometimes we still maintain more than one protocol in a network owing to legacy systems and applications. For example, keeping TCP/IP, AppleTalk and IBM Systems Network Architecture (SNA) in the same network incurs considerable cost to translate data so that it can be accepted and communicated among all of them.

    References

    Ince, D 2004, ‘Developing Distributed and E-commerce Applications’, 2nd edn, Pearson Education Limited, Harlow, Essex, pp.41-42.

    Stallings, W 2005, ‘Business Data Communications’, International Edition, 8th edn, Pearson Education, Inc., Upper Saddle River, NJ, pp.97-128.

    IT trends trigger e-government

    IT trends have not only moved businesses and industries but have also triggered e-government. E-government services are available in many Asian countries (Sharma 2007). Even the less developed nations have invested reasonably in e-government services. Nevertheless, scalability and stability are the obstacles to delivering e-government services to users. Rather than doing some transactions online and others offline, citizens might end up doing all transactions in person at government departments, as in the past. Of course, the ideal solution is to build a robust IT infrastructure. Lemon (2007) reports that Singapore’s government expects to issue US$1.47 billion in IT tenders in 2007 as part of efforts to expand the use of technology. Probably not many less developed nations can afford such a big investment in IT, nor may IT have been a priority for them.

    The Business Development Director of the Wily Division highlights the following issues regarding e-government services:

    On-demand Capacity

    Measure the actual response times experienced by end users. The results can then serve as a benchmark to monitor e-service response times and to detect transaction problems as they happen.
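    A very small example of what measuring user response time can mean in practice is timing a request against the public entry point of an e-service. The sketch below uses a placeholder URL and benchmark; real monitoring products do far more, but the principle is the same.

    import time
    import urllib.request

    def measure_response_time(url, timeout=10):
        """Return the elapsed seconds for one request, or None on failure."""
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                response.read()
        except OSError:
            return None
        return time.monotonic() - start

    # Placeholder URL and benchmark -- substitute the real e-service entry point.
    SERVICE_URL = "https://egov.example.gov/portal/login"
    BENCHMARK_SECONDS = 2.0

    elapsed = measure_response_time(SERVICE_URL)
    if elapsed is None:
        print("request failed -- possible transaction problem")
    elif elapsed > BENCHMARK_SECONDS:
        print(f"slow response: {elapsed:.2f}s exceeds the {BENCHMARK_SECONDS}s benchmark")
    else:
        print(f"ok: {elapsed:.2f}s")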


    Inter-agency collaboration

    Improve e-government service levels by increasing inter-agency integration and providing end-to-end e-services. This will encourage more citizens to use e-government services. Whether the issue is cross-agency reach or the depth of e-services, government agencies need common guidelines that define the online user experience and can be shared among IT staff and line-of-business owners.

    Plan, do, check, analyze

    The ability to manage an integrated IT infrastructure comprising highly distributed and complex web applications is vital. It requires a powerful tool that monitors transactions thoroughly and is able to detect problems. Behind the scenes, an entire transaction involves many components such as enterprise portals, web services, application components, databases, web servers, legacy systems and connectors. Such a tool enables the e-service provider to track transactions end-to-end, from the end-user entry point through the application infrastructure and back, as sketched below.
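    To illustrate what tracking a transaction end-to-end involves, the toy sketch below threads a single correlation ID through several hypothetical components (portal, web service, database) and logs how long each step takes. It only shows the principle behind commercial transaction-monitoring tools; it is not any particular product.

    import logging
    import time
    import uuid

    logging.basicConfig(level=logging.INFO, format="%(message)s")
    log = logging.getLogger("e-service")

    def traced(component):
        """Decorator that logs the duration of a step under one correlation ID."""
        def wrap(func):
            def inner(correlation_id, *args, **kwargs):
                start = time.monotonic()
                result = func(correlation_id, *args, **kwargs)
                log.info("%s | %s took %.3fs", correlation_id, component,
                         time.monotonic() - start)
                return result
            return inner
        return wrap

    # Hypothetical components of a single e-government transaction.
    @traced("portal")
    def handle_request(cid, citizen_id):
        return lookup_record(cid, citizen_id)

    @traced("web service")
    def lookup_record(cid, citizen_id):
        return query_database(cid, citizen_id)

    @traced("database")
    def query_database(cid, citizen_id):
        time.sleep(0.05)  # stand-in for real work
        return {"citizen": citizen_id, "status": "ok"}

    correlation_id = str(uuid.uuid4())  # one ID follows the whole transaction
    handle_request(correlation_id, citizen_id="HK-123456")

    Because every component logs against the same correlation ID, a slow or failed step can be traced back to the exact tier where the transaction broke down.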



    References

    Lemon, S 2007, 'Singapore to issue $1.5B in IT tenders this year', Computerworld Hong Kong Daily, viewed 26 April 2007, <http://www.cw.com.hk/computerworldhk/article/articleDetail.jsp?id=422439>.

    Sharma, D 2007, 'Taking e-government services to the next level', Enterprise Innovation, technology, viewed 28 April 2007, <http://www.enterpriseinnovation.net/article.php?cat1=1&id=1384>.

    Saturday, May 5, 2007

    This is the first blog post of my life

    I have been working as an IT specialist in a legal firm for more than 13 years, so I have really witnessed the growth of IT in the legal industry. It has changed the old practices of lawyers. Legal practitioners have always been very conservative, but they have adapted to IT trends, and IT has successfully advanced the legal industry. Large legal firms invest substantially in the development and employment of IT to promote their businesses and make themselves more competitive by providing online services to their clients, such as extranets and e-billing. Take my firm as an example: the IT staff has expanded greatly over the last decade regardless of the economic downturns from the late 90s to the early millennium. I am part of a team of 20 that supports around 600 people in six different locations.

    Burstiner (2006, p.57) reports that the big US legal firms are spending more to upgrade their technology and expand their staff; in general, technology capital expenses increased 3.5% over the previous year while operating expenses declined 2.6%. This indicates that technology can increase firm-wide efficiency. Wireless access and data storage are the main areas of investment. I believe IT has also moved other industries and businesses.

    Reference

    Burstiner, M 2006, 'Making It Better', AMLAW Tech, pp.55-59.