
Parish declares that Facebook and Christianity don’t mix

A Roman Catholic parish in Chicago is warning parishioners about the dangers of Facebook. St. John Cantius parish leaders wrote in the church bulletin this past Sunday that Facebook is against the Christian culture. Social networking sites in general apparently encourage vanity and dishonesty by providing an outlet for children to create their own electronic version of reality, concocting their own identities and social realities with a reduced risk of real-world consequences.

 

Best Microsoft MCTS Certification – Microsoft MCITP Training at Certkingdom.com


 

The Chicago Tribune has the full quote of what parish leaders wrote in the church bulletin:

[Facebook] is exactly the opposite of the Christian culture where people go into the secrecy and sacredness of the confessional to blot out their sins forever. God entrusted parents with the care of their children for one particular purpose, and that is to teach them the way “to know, love, and serve God in this life and save their souls hereafter.” Everything leads us to think that Facebook fits poorly into this plan and was devised for a very different goal.

The church wants families to raise children without Facebook, claiming it helps youth defy their parents and cultivates feelings of lust. It’s rather worrying that families trying to raise their children in a wholesome environment are being told to avoid the platform rather than educate their children about it.

Kids are future adults, and must thus learn about this world as much as they can, since they’ll be the ones managing it one day. At least for the foreseeable future, Facebook is part of this world.

TheStartupBuzz VC Showcase

This first-of-its-kind event in India will bring venture capitalists, angel investors, start-up founders and entrepreneurs under one roof.

TheStartupBuzz VC Showcase will be held at Capitol Hotel, Rajbhavan Road, Bengaluru on 31 October 2009. The VC Showcase is the first event of its kind in India at which venture capitalists and angel investors present their firms and explain what they expect from entrepreneurs. Each VC/investor will have a dedicated table set up solely for meeting promising companies and entrepreneurs.

The event will bring together VCs/investors, start-up founders and entrepreneurs, giving founders a forum to share experiences and interact with other start-ups. Three start-up founders will share their experiences of starting a venture, sustaining it and taking it to the next level.

Around 20 VCs/investors will showcase their firms, and 200 start-up CEOs and aspiring CEOs will meet and network during the event. In addition, there will be a wide range of sessions focusing on building successful start-ups, best practices, funding theory and more.

IT’s About Securing The Information DNA, And More!

The conference will provide opportunities for industry leaders, corporate decision makers, academics and government officials to exchange ideas on technology trends and best practices.




Securitybyte and OWASP India, organisations committed to raising InfoSec awareness in the industry, are hosting an information security event called Securitybyte & OWASP AppSec Asia 2009 at Hotel Crowne Plaza, Gurgaon, Delhi NCR from 17 November to 20 November 2009.

The highlight of the event is India’s first information-security-focused India Technology Leadership Summit 2009, featuring a panel discussion on security concerns around off-shoring among industry leaders representing outsourcers, service providers and regulators. The panel will be moderated by cyber security expert Howard Schmidt.

This year’s conference will draw information security professionals from all over the world. Eighteen international speakers are coming in from the USA, New Zealand, Sweden, Germany, the UK, Canada, Thailand and Taiwan to deliver talks such as “The international state of cyber security: Risk reduction in a high threat world” by Howard Schmidt and “Critical infrastructure security: Danger without borders” by John Bumgarner, to name a few.

The conference has three main tracks focused on security professionals, developers and leaders in the security space. Speakers like Kevvie Fowler will address security professionals on techniques used to bypass forensics in databases. Additionally, speakers like Jason Lam will reveal how the SANS DShield web honeypot project is coming along, and the Microsoft Security Response Center will reveal how things work under the covers in its team.

People attending the event will have the opportunity to partake in three different types of war games. These scenario-based games not only include attacking Web applications and networks, but also show how real world cyber attacks take place.

This event also marks the entry of international information security training leaders SANS and ISC2, who are conducting two days of hands-on training delivered by their instructors from the USA. The four-day event will also host many advanced trainings, such as advanced forensics techniques, building advanced network security tools, advanced Web hacking and in-depth assessment techniques.


Educated beyond common sense

A follow-on to Knowledge Normalization Methodology.

A perspective on the evolution of technology and the corresponding task of educating users to interface with next-generation computer applications. The next frontier is understanding knowledge in such a way that it can be stored, interrogated, changed and managed according to business models, and engineering the meaning of relating knowledge elements according to specific models, algorithms and formulae, all seeking to explain the how, why, who, where and when of a ‘decision making’ action that produced the correlated relational data elements.


From clay-tablet transaction record keeping to punch-card file processing, information was stored as data elements (fields), then retrieved and presented as knowledge regarding a business activity. A watershed was reached when the hardware evolution introduced random-access capability for data elements: the Relational Data Base.


● The advent of Data Normalization Methodology.

This era of integrating RDB capabilities with management methodologies created a software industry to rationalize the results of human endeavors through data reasoning logic.

The USER education continues;____________________________

Canonical synthesis

From Wikipedia, the free encyclopedia

Generally, in mathematics, a canonical form (often called normal form or standard form) of an object is a standard way of presenting that object.

Database normalization

Entity-relationship model

From Wikipedia, the free encyclopedia

A sample Entity-relationship diagram using Chen’s notation

In software engineering, an entity-relationship model (ERM) is an abstract and conceptual representation of data. Entity-relationship modeling is a database modeling method, used to produce a type of conceptual schema or semantic data model of a system, often a relational database, and its requirements in a top-down fashion. Diagrams created by this process are called entity-relationship diagrams, ER diagrams, or ERDs.

Data modeling.

From Wikipedia, the free encyclopedia

Data modeling in software engineering is the process of creating a data model by applying formal data model descriptions using data modeling techniques.

Database normalization.

From Wikipedia, the free encyclopedia

In the field of relational database design, normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable characteristics (insertion, update, and deletion anomalies) that could lead to a loss of data integrity.[1]

Database Normalization Basics

By Mike Chapple, About.com Guide

…“If you’ve been working with databases for a while, chances are you’ve heard the term normalization. Perhaps someone’s asked you “Is that database normalized?” or “Is that in BCNF?” All too often, the reply is “Uh, yeah.” Normalization is often brushed aside as a luxury that only academics have time for. However, knowing the principles of normalization and applying them to your daily database design tasks really isn’t all that complicated and it could drastically improve the performance of your DBMS.

What is Normalization?

Normalization is the process of efficiently organizing data in a database. There are two goals of the normalization process: eliminating redundant data (for example, storing the same data in more than one table) and ensuring data dependencies make sense (only storing related data in a table). Both of these are worthy goals as they reduce the amount of space a database consumes and ensure that data is logically stored.
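Both goals can be shown concretely. The sketch below is a hypothetical illustration (the customer/order tables are invented, not from the article) using Python’s built-in sqlite3 module: the flat table repeats the customer’s address on every order, while the normalized pair of tables stores it once, so an address change is a single, anomaly-free update.

```python
import sqlite3

# In-memory database for a hypothetical normalization example.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Un-normalized: the customer's address is duplicated on every order row,
# so updating it in one row but not another corrupts the data.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer TEXT, address TEXT, item TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?)", [
    (1, "Acme", "12 Elm St", "widget"),
    (2, "Acme", "12 Elm St", "gadget"),
])

# Normalized: each fact is stored once; orders reference the customer by key.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, address TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, item TEXT)")
cur.execute("INSERT INTO customers VALUES (1, 'Acme', '12 Elm St')")
cur.executemany("INSERT INTO orders VALUES (?,?,?)", [(1, 1, "widget"), (2, 1, "gadget")])

# The address change is now a single UPDATE with no risk of anomalies.
cur.execute("UPDATE customers SET address = '99 Oak Ave' WHERE id = 1")
rows = cur.execute(
    "SELECT o.order_id, c.address FROM orders o "
    "JOIN customers c ON o.customer_id = c.id ORDER BY o.order_id"
).fetchall()
print(rows)  # every order sees the updated address
```

Note how the redundant data is eliminated (the address lives in one place) and the dependency makes sense (orders store only order facts plus a key).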

● The advent of Data Normalized Project Management Methodology.

This era of integrating RDB capabilities with project management methodologies created new software industries specializing in project management products and services specific to particular market industries.

The USER education continues;____________________________

from: The history of PERT Network and project management.

Wiest, Jerome D., and Levy, Ferdinand K., A Management Guide to PERT/CPM, New Delhi: Prentice-Hall of India Private Limited, 1974

1. INTRODUCTION

Basically, CPM (Critical Path Method) and PERT (Programme Evaluation Review Technique) are project management techniques, which have been created out of the need of Western industrial and military establishments to plan, schedule and control complex projects.

1.1 Brief History of CPM/PERT

CPM/PERT or Network Analysis as the technique is sometimes called, developed along two parallel streams, one industrial and the other military.

CPM was the discovery of M.R.Walker of E.I. Du Pont de Nemours & Co. and J.E. Kelly of Remington Rand, circa 1957. The computation was designed for the UNIVAC-I computer. The first test was made in 1958, when CPM was applied to the construction of a new chemical plant. In March 1959, the method was applied to maintenance shut-down at the DuPont works in Louisville, Kentucky. Unproductive time was reduced from 125 to 93 hours.

PERT was devised in 1958 for the POLARIS missile program by the Program Evaluation Branch of the Special Projects Office of the U.S. Navy, helped by the Lockheed Missile Systems division and the consulting firm of Booz-Allen & Hamilton. The calculations were arranged so that they could be carried out on the IBM Naval Ordnance Research Computer (NORC) at Dahlgren, Virginia.
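The core of CPM, a forward pass that computes each task’s earliest finish time and then traces the longest (critical) path, fits in a few lines. The sketch below uses an invented toy task graph (the task names and durations are illustrative, not from the history above):

```python
# Minimal CPM forward pass on a toy task graph (invented durations).
# Each task maps to (duration, list of predecessor tasks).
tasks = {
    "design":  (3, []),
    "procure": (5, ["design"]),
    "build":   (4, ["design"]),
    "test":    (2, ["procure", "build"]),
}

earliest = {}  # memoized earliest finish times

def earliest_finish(name):
    """Earliest finish = duration + latest earliest-finish among predecessors."""
    if name not in earliest:
        dur, preds = tasks[name]
        earliest[name] = dur + max((earliest_finish(p) for p in preds), default=0)
    return earliest[name]

# Project length is the largest earliest-finish over all tasks.
project_length = max(earliest_finish(t) for t in tasks)

def critical_path():
    """Walk backwards from the last-finishing task along the longest chain."""
    path, t = [], max(tasks, key=earliest_finish)
    while True:
        path.append(t)
        preds = tasks[t][1]
        if not preds:
            return list(reversed(path))
        t = max(preds, key=earliest_finish)

print(project_length, critical_path())  # 10 ['design', 'procure', 'test']
```

Here "procure" (finishing at 8) dominates "build" (finishing at 7), so the critical path runs design → procure → test, for a 10-unit project length.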

● The advent of Data Normalized Business Intelligence Methodology.

Now begins an era of reverse engineering the meaning of relating data elements according to specific models, algorithms, formulae and more, across differing RDB designs (ORACLE, SYBASE, Microsoft), all seeking to explain the how, why, who, where and when of a ‘decision making’ action that produced the correlated relational data elements.

Each industry using RDB technology (that’s pretty much everybody) has developed unique language descriptions to best fit its business, product and service operations. Combined with each industry’s unique product or service needs, this has created a new software niche for developing ‘business theory models’ that analyze the fit between software logic and business logic.

The ongoing business intelligence software development process will continue to orchestrate language as a source of knowledge to explain the how, why, who, where and when an action produced the correlated relational data elements. This process is ultimately destined to examine business knowledge frozen in time and to rewrite the language (English grammatical sentences, rules) by which decisions are made that affect business operations. In other words, it reverse engineers (rewrites) the business language that best explains decision making at a particular point in time. Real-time access to the language of business decision-making would therefore provide for faster reaction to changes in the operation of the enterprise. Achieving a real-time relationship with an enterprise operation requires that the language of the enterprise be normalized into an enterprise operation’s Relational Knowledge Base.

The USER education continues;____________________________

Business intelligence

From Wikipedia, the free encyclopedia


Business intelligence (BI) refers to computer-based techniques used in spotting, digging-out, and analyzing business data, such as sales revenue by products and/or departments or associated costs and incomes. [1]

BI technologies provide historical, current, and predictive views of business operations. Common functions of Business Intelligence technologies are reporting, online analytical processing, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.

Business Intelligence often aims to support better business decision-making.[2] Thus a BI system can be called a decision support system (DSS).[3] Though the term business intelligence is often used as a synonym for competitive intelligence, because they both support decision making, BI uses technologies, processes, and applications to analyze mostly internal, structured data and business processes while competitive intelligence is done by gathering, analyzing and disseminating information with or without support from technology and applications, and focuses on all-source information and data (unstructured or structured), mostly external, but also internal to a company, to support decision making.

● The advent of Knowledge Normalization Methodology.

IDENTIFYING KNOWLEDGE

From: WorldComp2010, Published research paper:

Paper ID #: ICA4325

Title:      Knowledge Normalization Methodology

ICAI’10 – the 2010 edition, the 12th annual International Conference on Artificial Intelligence.


Introduction to CAMBO, a multi-Expert System Generator, and a new paradigm, knowledge normalization (patent pending). “Any entity’s knowledge which can be described in ‘English Grammatical Sentences’ can be extracted and managed by software (rule processing) through a Normalized Knowledge Base.” Unlike single-EXPERT system generators such as AION, Eclipse, XpertRule and RuleBook, CAMBO’s kernel logic, the Knowledge Normalization Methodology, is based on a methodology that closely observes the rules of medical science in identifying, applying and developing the science of machine intelligence.

Abstract of the Disclosure.

The invention’s name is CAMBO, an acronym for Computer Aided Management By Objective. The title is “multi-EXPERT System Generator”, and the vision is an “artificial intelligent bridge” between technology and the ability to automate the instruments of the MBO methodology, namely charters, organization charts, operational plans, project management, performance planning and others, all containing the knowledge, expressed in English grammatical sentences, upon which an enterprise conducts business. It would require the design of a unique combination of advanced methodology and technology capabilities, built upon and working in concert with the current state-of-the-art, data-normalized Relational Data Base structure. The “AI Bridge” would include an advanced methodology for normalizing knowledge, a unique definition for a unit or element of knowledge, an advanced structure for a Spatial Relational Knowledge Base, and a 5th-generation programming language to support a Natural Language Processing interface.

The USER education continues;____________________________

from: International Cognitive Computing, CAMBO a multi-Expert system generator.

What is it? Each employee of a business enterprise is a human expert possessing a measure of talent, developed through the experience of practical application, interaction and training with other employees. The result of job-related experience is an acquired understanding of how, when and where each employee is expected to contribute towards the operation of an enterprise. This understanding is expressed and communicated through simple conversational language, in which each sentence represents a “knowledge element.” CAMBO describes knowledge in segments called elements, with each element structured as a simple sentence in which the employee expresses a single step; when added to other related elements, these steps provide a complete description of a job-related task. CAMBO imposes a single convention upon the formulation of each element, as a control parameter to support CAMBO’s fifth-generation programming language, “Language Instructions Per Sentence” (LIPS). The convention requires that each sentence begin with an action verb selected from the CAMBO-LIPS “Action Verb List.”
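The single convention described, that every knowledge element begin with an approved action verb, is easy to check mechanically. A minimal sketch, assuming an invented verb list and invented sentences (CAMBO’s actual Action Verb List is not published in this article):

```python
# Hypothetical check of the action-verb convention: each knowledge element
# (a simple English sentence) must begin with a verb from an approved list.
ACTION_VERBS = {"create", "verify", "assign", "schedule", "approve"}  # invented sample

def is_valid_element(sentence):
    """Return True if the sentence starts with a listed action verb."""
    words = sentence.strip().rstrip(".").split()
    return bool(words) and words[0].lower() in ACTION_VERBS

elements = [
    "Verify the purchase order against the approved budget.",
    "The budget is checked by the manager.",  # rejected: no leading action verb
]
results = [is_valid_element(e) for e in elements]
print(results)  # [True, False]
```

The second sentence fails because it opens with a noun phrase rather than an action verb, which is exactly the kind of element such a convention would force the author to rewrite.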

“Analytical thinking requires the ability to compose business issues into logical and functional models that correctly reflect business processing, coupled with the skill to communicate the results to all levels of an organization”

Normalizing Knowledge

Knowledge engineering, which includes methodologies, techniques and tools, produces knowledge models for populating a storyboard layout for the design of a multi-expert system. Each knowledge engineering model is a particular life-cycle view of activity, and it models the functionality of a knowledge engine that drives the events within the life cycle. These models identify, capture, profile and relate the language of the enterprise that the multi-expert system supports. Knowledge engineering models the relationship between the business practices of an enterprise and the functionality of an ERP methodology, and this information contributes toward the knowledge normalization process. The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies the business, science and engineering knowledge into its most basic form, the English Grammatical Sentence (EGS).

Each EGS is grouped into rule-sets that become part of a knowledge domain, and because the knowledge normalization process establishes cross-domain relationships, the knowledge of many disciplines unites to answer questions. The procedure for asking questions is a simple, intuitive and interactive web-based menu system that leads the user through a question-and-answer cycle, including a cross-discipline review of the issues leading to a final answer. It responds as though the questions were asked of numerous engineers or business managers, in different disciplines, all contributing their knowledge towards identifying and answering issues on a specific business or engineering requirement. Meanwhile, the methodology for data normalization remains the standard for developing a relational data base, and the processes described are integrated with the processes for knowledge normalization. Data element definitions, profiles, formats, relationships, where-used lists and ontological associations all complement the process of knowledge normalization.
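The grouping of sentences into domain rule-sets with cross-domain links can be sketched as a plain data structure. Everything here is invented for illustration (the domains, rules and question are hypothetical, not CAMBO’s actual representation):

```python
# Hypothetical sketch: rule-sets (English sentences) grouped by knowledge
# domain, with cross-domain links so one question draws on several disciplines.
knowledge_base = {
    "finance":     ["Approve invoices under the department budget cap."],
    "engineering": ["Verify tolerances before releasing a drawing."],
    "operations":  ["Schedule maintenance during the plant shutdown window."],
}

# Cross-domain relationships: which domains contribute to a given issue.
cross_links = {"release a product change": ["engineering", "finance"]}

def answer(question):
    """Collect the rules from every domain linked to the question."""
    rules = []
    for domain in cross_links.get(question, []):
        rules.extend(knowledge_base[domain])
    return rules

print(answer("release a product change"))
```

A single question thus pulls rules from both the engineering and finance domains, mimicking the "numerous experts in different disciplines" behavior the text describes.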

“The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies the business’s philosophy, science, engineering and arts (the four prime domains of knowledge) “Relational Knowledge Base” into its most basic form, language”

Knowledge elements contain the logic by which data elements are created and manipulated. The status or condition of any individual data element is the result of a computer program executing a series of knowledge elements (rules). Knowledge engineering is used to front-end the data methodologies for ORACLE, SYBASE, DB2, FoxPro, SAS, SAP, IEF, MicroStrategy and all RDB application-generating software.

For those readers who remember Herman Hollerith, congratulations on still being part of the technology evolution landscape. We have witnessed and been part of a most incredible industry journey. For this reason my hope is that your common sense, derived from decades of structured thinking, will view knowledge normalization as the next logical step in the technology evolution. Within this article I have expressed a direction for next-generation technology: a language-based methodology to normalize knowledge. And while it does not replace relational data base structure, knowledge normalization (storyboards, rules and NLP) will identify and relate data elements and function as a front-end designer for building an RDB.

Knowledge normalization raises many questions about design, structure and, most importantly, its business application. Consider the capability of reviewing, monitoring and testing new ideas about the manner in which an enterprise conducts business: reviewing the decision-making relationships that reflect the rationale and reasoning logic of each individual employee’s job description; monitoring the working relationships between employees, in which each job description’s knowledge elements (rules) are associated with all other employees’ job responsibilities; and testing the knowledge storyboard by changing the knowledge elements (English grammatical sentences) and coordinating your changes with ALL employees’ decision-making job-description rules.

The future of knowledge normalization will repeat the post-data-normalization technology evolution. Business intelligence, project management and process control products concerned with the manner in which an enterprise conducts business will need application developers to reprogram them. The migration from data base systems to knowledge based systems will pass from gradually integrating and adapting new technology interfaces to adopting the benefits of real-time decision-making management.

* Special appreciation and kudos to the folks at Wikipedia, the free encyclopedia, which provides an open forum for the development and communication of new ideas.

Dell’s new servers come preloaded with hundreds of virtual machines

Dell on Thursday served up a new choice for buying its servers: plug-and-play configurations that include up to 200 VMware virtual machines along with the networking and storage needed to run them.

Dell is also offering a ready-made virtual desktop infrastructure in the same fashion, letting users buy servers pre-configured with hundreds to thousands of virtual desktops in two flavors: VMware or Citrix XenDesktop.



But wait, there’s more. The company tossed in an announcement about a new e-mail backup and archiving service and an expanded partnership with Microsoft, then promised to build 10 new data centers worldwide in the next 24 months (three in the U.S.) to support what it hopes will be a massive move to its cloud. In addition to hosting virtual storage, desktops and e-mail backups, Dell says these data centers will serve up VMware, OpenStack and Azure clouds. All told, the company says it is spending $1 billion on the cloud.

Dell executives explained that despite what virtualization has done for hardware efficiency, it isn’t easy to manage. “Customers are more interested in buying VMs than in buying the physical hardware they have to assemble,” said Stephen Schuckenbrock, president of Dell Services, during a telephone press conference this week. He added that research reports predict that soon “virtually half the workloads will be running on virtual machines … but it’s not that easy to do, to optimize those virtual machines.”

Enter the new vStart line of servers. These are built with Xeon-based Dell PowerEdge servers, VMware hypervisors, Dell EqualLogic storage and Dell PowerConnect switches (which Dell OEMs from Juniper), and include essential virtualization management extensions. The infrastructure is pre-assembled by Dell and delivered to the enterprise’s site as a single unit, racked and cabled. Customers can own these units, though Dell is also willing to set them up as managed servers, promised Schuckenbrock.

The vStart 100 is a unit preconfigured with 100 VMs and priced at $99,000, while the vStart 200 is configured with 200 VMs and priced at $169,000. The vStart family will eventually include configurations that offer more and fewer VMs, said Schuckenbrock, but these two standard configurations are available now.
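A quick check of the quoted list prices shows the larger configuration works out cheaper per VM:

```python
# Per-VM list price implied by the two vStart configurations quoted above.
vstart_100_per_vm = 99_000 / 100   # dollars per VM on the vStart 100
vstart_200_per_vm = 169_000 / 200  # dollars per VM on the vStart 200
print(vstart_100_per_vm, vstart_200_per_vm)  # 990.0 845.0
```

That is $990 per VM on the vStart 100 versus $845 on the vStart 200, a roughly 15 percent per-VM discount for buying the bigger unit.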


Not wanting to completely tick off the mighty Microsoft, Dell also announced a vague three-year agreement with Redmond that will let virtualized systems be managed by Microsoft System Center. This agreement will one day also allow Dell to offer Hyper-V as a VM choice. When the Hyper-V option materializes, it will be a less expensive choice, hinted Praveen Asthana, vice president of Enterprise Solutions & Strategy. “Microsoft System Center and Hyper-V will give customers more choice and radically change the economics of virtualization,” he said. However, the configurations available today star VMware.

Plus, Dell is encouraging enterprises to take the plunge with VDI by similarly offering plug-and-play servers bundled with hundreds or thousands of Citrix XenDesktop or VMware virtual desktops. Customers can choose the number of desktops according to their needs. Dell technically announced its Desktop Virtualization Solutions (DDVS) offering in March, but on Thursday it reiterated that the VDI bundle can be had any way the customer wants it: as an owned piece of gear, as a managed service, as a cloud service, or as some kind of custom package in between, with Dell doing all the integration work to ensure customers’ apps work with it. It, too, is based on Dell PowerEdge servers, EqualLogic storage and PowerConnect networking.

The Best 3D TVs

For all the hype they’ve received in the past year, 3D TVs aren’t exactly flying off store shelves. The biggest problem is that there’s still very little 3D content available. Also, when they started hitting the market last year, 3D sets were priced much higher than their 2D counterparts. Now, however, prices have come down a bit—probably because these models aren’t selling as quickly as manufacturers had hoped.


If you’re in the market for a new HDTV anyway, it doesn’t hurt to get a 3D-ready set if you can find a deal. After all, all the 3D TVs listed below deliver excellent 2D performance too. One thing to keep in mind, though: 3D TV doesn’t always just mean paying for the set itself. Many manufacturers sell the required glasses separately for as much as $150 a pair. So if you want to be able to enjoy 3D with family and friends, tack on a few hundred dollars to the price of your TV. (The only exception here is Vizio’s XVT3D650SV, which uses passive 3D and comes bundled with four pairs of glasses.) And if you’re going to watch 3D Blu-ray discs, there’s also the cost of a 3D-enabled player. But the good news is that you don’t have to buy everything at once; you can get the set first and add the 3D accessories later.

If you’re ready to make the move to 3D, check out our list of the best 3D TVs below, along with current street prices, or compare these 3D-ready HDTVs side by side. For a top-rated 2D TV, check out The 10 Best HDTVs. And for general HDTV buying advice, read How to Buy an HDTV.

Why Google’s tighter control over Android is a good thing

Limiting the availability of Android 3.0 code and an apparent tightening of Android smartphone standards mean that Google finally gets it about the platform

I’ve argued before that Android’s fragmentation, encouraged by its open source model, was a mistake. Google should drive the platform forward and ride herd on those who use it in their devices. If it wants to make the OS available free to stimulate adoption, fine. But don’t let that approach devolve into the kind of crappy results that many device makers are so clueless (or eager, take your pick) to deliver.


So far, Google’s been lucky in that the fragmentation has been largely in cosmetic UI areas, which doesn’t affect most Android apps and only annoys customers when they switch to a new device. The fragmentation of Android OS versions across devices is driving many Android developers away, as are fears over a fractured set of app stores. Along these lines, Google has to break the carriers’ update monopoly, as Apple did, so all Android devices can be on the same OS page.

It is true that HTC’s Eris brought some useful additions to the stock Android UI, serving as a model for future improvements. But the HTC example is the exception, and Google’s apparent new policy would still allow such enhancements if Google judges them worthwhile.

More to the point is what the tablet makers such as ViewSonic, Dell, and Samsung did with their first Android tablets. Their half-baked products showed how comfortable they are soiling the Android platform. For them, Android is just another OS to throw on hardware designed for something else in a cynical attempt to capture a market wave. The consistently low sales should provide a clue that users aren’t buying the junk. But do they blame the hardware makers or Google? When so many Android devices are junk, it’ll be Google whose reputation suffers.

Let’s not forget Google’s competition, and why Google can’t patiently teach these companies about user experience: Apple, a company that knows how to nurture, defend, and evangelize a platform. Let’s also not forget the fate of Microsoft and Nokia, who let their Windows Mobile and Symbian OSes fragment into oblivion. And let’s remember that the one company that knows how the vanilla-PC game is played, Hewlett-Packard, has decided to move away from the plain-vanilla Windows OS and stake its future on its own platform, WebOS, for both PCs and mobile devices. In that world, a fragmented, confused, soiled Android platform would have no market at all.

If Google finally understands that Android is a platform to be nurtured and defended, it has a chance of remaining a strong presence in the mobile market for more than a few faddish years. If not, it’s just throwing its baby into the woods, where it will find cruel exploitation, not nurturing or defense.

Why Google’s tighter control over Android is a good thing

Limiting the availability of Android 3.0 code and an apparent tightening of Android smartphone standards mean that Google finally gets it about the platform

Last week, Google said it would not release the source for its Android 3.0 “Honeycomb” tablet OS to developers and would limit the OS to select hardware makers, at least initially. Now there are rumors, reported by Bloomberg Businessweek, that Google is requiring Android device makers to get UI changes approved by Google.


As my colleague Savio Rodrigues has written, limiting the Honeycomb code is not going to hurt the Android market. I believe reining in the custom UIs imposed on Android is a good thing. Let’s be honest: They exist only so companies like Motorola, HTC, and Samsung can pretend to have any technology involvement in the Android products they sell and claim they have some differentiating feature that should make customers want their model of an Android smartphone versus the umpteenth otherwise-identical Android smartphones out there.


The reality of Android is that it is the new Windows: an operating system used by multiple hardware vendors to create essentially identical products, save for the company name printed on it. That of course is what the device makers fear — both those like Acer that already live in the race-to-the-bottom PC market and those like Motorola and HTC that don’t want to.

But these cosmetic UI differences cause confusion among users, sending the message that Android is a collection of devices, not a platform like Apple’s iOS. As Android’s image becomes fragmented, so does the excitement that powers adoption. Anyone who’s followed the cell phone industry has seen how that plays out: There are 1 billion Java-based cell phones out there, but no one knows it, and no one cares, as each works so differently that the Java underpinnings offer no value to anyone but Oracle, which licenses the technology.

Google initially seemed to want to play the same game as Oracle (and Sun before it): providing an under-the-hood platform for manufacturers to use as they saw fit. But a couple of curious things happened:

* Vendors such as Best Buy started selling the Android brand, to help create a sense of a unified alternative to BlackBerry and iOS, as well as to help prevent customers from feeling overwhelmed by all the “different” phones available. Too much choice confuses people, and salespeople know that.
* Several mobile device makers shipped terrible tablets based on the Android 2.2 smartphone OS — despite Google’s warnings not to — because they were impatient with Google’s slow progress in releasing Honeycomb. These tablets, such as the Galaxy Tab, were terrible products and clear hack jobs that only demonstrated the iPad’s superiority. I believe they also finally got the kids at Google to understand that most device makers have no respect for the Android OS and will create the same banal products for it as they do for Windows. The kids at Google have a mission, and enabling white-box smartphones isn’t it.

IBM: An Education Tourism Programme For IT Professionals And Students

Slated to start in August 2009, the programme is for global students and IT professionals.

Here is an IT education tourism programme for IT professionals and students. Launched by IBM, the programme will enable IT professionals and students to come to India and receive IBM certified training in Bengaluru.
To avail of this offer, an individual needs to register for a course from the IBM Power and IBM System Storage curriculum.

 


 

The IT Education Tourism programme, slated to start in August 2009, is an initiative in which IBM has partnered with Stratom IT Solutions Pvt Ltd, India to offer IT education tourism as a package for global students and IT professionals. IBM plans to target around 300 participants in India, each for a minimum of 30 days, this year. According to the company, participants will gain hassle-free, world-class education and training, as well as exposure to India’s IT industry, one of the fastest-growing, most diverse and most mature IT markets in the world.

As part of the programme, IBM will offer a portfolio of technical training and education services for systems designed for individuals, companies and public organisations to acquire, maintain and optimise their IT skills. The new initiative will enable students to be further equipped with IBM technologies like IBM Power and IBM System Storage.

India Needs More Homegrown PhDs In Computer Science

India does not have an adequate number of PhDs in computer science. It is an issue the country needs to address if it wants to invigorate its innovation drive.

The rate of innovation in any country depends heavily on its research capabilities. And to increase research activity, a country needs a strong base of researchers, a precondition India has not met in the computer science domain. This is one big reason, cited repeatedly by industry leaders, for the lack of innovation in the country. Both industry and government are well aware of this. Bill Gates, chairman of Microsoft Corp, and Kapil Sibal, Union Human Resource Development Minister, Government of India, recently shared some insights on the subject.

Bill Gates expressed his concern over the lack of PhDs in computer science at an event held in New Delhi recently. According to him, research activities could bring fruitful results in different fields if India had a good number of homegrown PhDs in computer science. He said, “There is a shortage of homegrown PhDs in India in computer science. The ratio of PhDs compared with engineering graduates is very low. If 70,000 students enroll as engineering graduates, only 1500 go for higher programmes. But the irony is, out of those 1500, only 250 enroll for PhDs.”

 

 


 

Kapil Sibal listed comparative statistics about India and China in support of this. Sibal said, “For research activities, you need a good number of PhDs in India to allow you to move ahead as a country. During 1991-2001, the number of PhDs in India increased only by 20 percent, whereas in the same period, China had an 80 percent increase in its PhD researchers, which is unacceptable for us. Similarly, between 2001 to 2006, China’s GER (gross enrollment ratio) improved to 10-20 percent, whereas in the 11th Plan we have set a goal of achieving a GER of a mere 5 percent.” GER is the total enrollment of pupils in a grade or cycle or level of education, expressed as a percentage of the corresponding eligible age-group population in a given school year.
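The GER figures Sibal cites are simple ratios. A minimal sketch of the calculation, using purely hypothetical numbers that are not from the article:

```python
def gross_enrollment_ratio(enrolled, eligible_population):
    """Gross enrollment ratio (GER): total pupils enrolled at a level of
    education, expressed as a percentage of the eligible age-group
    population for that level in a given school year."""
    return 100.0 * enrolled / eligible_population

# Hypothetical illustration: 5 million enrolled out of an eligible
# age-group population of 100 million gives a GER of 5 percent,
# the 11th Plan target mentioned in the quote above.
print(gross_enrollment_ratio(5_000_000, 100_000_000))  # 5.0
```

Note that GER can exceed 100 percent, since over-age and under-age pupils are counted in the numerator but not in the age-group denominator.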

Sibal also urged corporates to come forward and help the government set up world-class research institutions in India.

Most experts believe that if India invested more in research and development, it would be able to drive growth through innovation. But they also point to the country’s weak research infrastructure and believe the government needs to address this issue. It remains to be seen how government and industry will collaborate to increase the number of researchers and the volume of research activity in the country. If and when that happens, it will surely give a huge push to India’s economic growth.