Educated beyond common sense

A follow-up to Knowledge Normalization Methodology.

A perspective on the evolution of technology and the corresponding need to educate users to interface with next-generation computer application visions. The next frontier is understanding knowledge in such a way that it can be stored, interrogated, changed and managed according to business models, and engineering the meaning of relating knowledge elements according to specific models, algorithms and formulae, all seeking to explain the how, why, who, where and when a ‘decision making’ action produced the correlated relational data elements.

From clay-tablet transaction record keeping to punch-card file processing, information was stored as data elements (fields), retrieved and presented as knowledge regarding a business activity. A watershed was reached when the hardware evolution introduced random-access capability to data elements: the Relational Database.

● The advent of Data Normalization Methodology.

This era of integrating RDB capabilities with management methodologies created a software industry to rationalize the results of human endeavors through data reasoning logic.

The USER education continues;____________________________

Canonical form

From Wikipedia, the free encyclopedia

Generally, in mathematics, a canonical form (often called normal form or standard form) of an object is a standard way of presenting that object.

Entity-relationship model

From Wikipedia, the free encyclopedia

[Figure omitted: a sample entity-relationship diagram using Chen’s notation]

In software engineering, an entity-relationship model (ERM) is an abstract and conceptual representation of data. Entity-relationship modeling is a database modeling method, used to produce a type of conceptual schema or semantic data model of a system, often a relational database, and its requirements in a top-down fashion. Diagrams created by this process are called entity-relationship diagrams, ER diagrams, or ERDs.
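
As a minimal illustration of what an ER model captures (a hypothetical sketch of my own, not part of the Wikipedia material, with invented entity and attribute names), the Python fragment below records two entities and one named relationship with a cardinality, the same information a Chen-notation diagram conveys graphically.

```python
from dataclasses import dataclass, field

# Hypothetical, minimal ER-style metadata: entities with attributes and a
# named relationship between them, as an ER diagram would show conceptually.

@dataclass
class Entity:
    name: str
    attributes: list[str] = field(default_factory=list)

@dataclass
class Relationship:
    name: str            # e.g. "places"
    left: Entity
    right: Entity
    cardinality: str     # e.g. "1:N"

customer = Entity("Customer", ["customer_id", "name"])
order = Entity("Order", ["order_id", "order_date"])
places = Relationship("places", customer, order, "1:N")

print(f"{places.left.name} --{places.name} ({places.cardinality})--> {places.right.name}")
# Customer --places (1:N)--> Order
```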

Data modeling.

From Wikipedia, the free encyclopedia

Data modeling in software engineering is the process of creating a data model by applying formal data model descriptions using data modeling techniques.

Database normalization.

From Wikipedia, the free encyclopedia

In the field of relational database design, normalization is a systematic way of ensuring that a database structure is suitable for general-purpose querying and free of certain undesirable characteristics (insertion, update, and deletion anomalies) that could lead to a loss of data integrity.[1]

Database Normalization Basics

By Mike Chapple, About.com Guide

…“If you’ve been working with databases for a while, chances are you’ve heard the term normalization. Perhaps someone’s asked you “Is that database normalized?” or “Is that in BCNF?” All too often, the reply is “Uh, yeah.” Normalization is often brushed aside as a luxury that only academics have time for. However, knowing the principles of normalization and applying them to your daily database design tasks really isn’t all that complicated and it could drastically improve the performance of your DBMS.

What is Normalization?

Normalization is the process of efficiently organizing data in a database. There are two goals of the normalization process: eliminating redundant data (for example, storing the same data in more than one table) and ensuring data dependencies make sense (only storing related data in a table). Both of these are worthy goals as they reduce the amount of space a database consumes and ensure that data is logically stored.”
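
To make those two goals concrete, here is a small, hypothetical Python sketch of my own (the table and column names are invented) that takes a ‘flat’ order list with repeated customer details and splits it into a customers table and an orders table related by customer_id, removing the redundancy while keeping the dependency.

```python
# Hypothetical example: one denormalized table with repeated customer data ...
orders_flat = [
    {"order_id": 1, "customer_id": 7, "customer_name": "Acme", "city": "Austin", "total": 120.0},
    {"order_id": 2, "customer_id": 7, "customer_name": "Acme", "city": "Austin", "total": 75.5},
    {"order_id": 3, "customer_id": 9, "customer_name": "Globex", "city": "Boston", "total": 42.0},
]

# ... normalized into a customers table (one row per customer) and an orders
# table that refers to the customer only by its key.
customers = {}
orders = []
for row in orders_flat:
    customers[row["customer_id"]] = {"customer_name": row["customer_name"], "city": row["city"]}
    orders.append({"order_id": row["order_id"], "customer_id": row["customer_id"], "total": row["total"]})

print(customers)   # customer details stored exactly once
print(orders)      # each order references the customer by customer_id only
```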

● The advent of Data Normalized Project Management Methodology.

This era of integrating RDB capabilities with project management methodologies created new software industries specializing in project management products and services specific to particular market industries.

The USER education continues;____________________________

from: The history of PERT Network and project management.

Wiest, Jerome D., and Levy, Ferdinand K., A Management Guide to PERT/CPM, New Delhi: Prentice-Hall of India Private Limited, 1974

1. INTRODUCTION

Basically, CPM (Critical Path Method) and PERT (Programme Evaluation Review Technique) are project management techniques, which have been created out of the need of Western industrial and military establishments to plan, schedule and control complex projects.

1.1 Brief History of CPM/PERT

CPM/PERT or Network Analysis as the technique is sometimes called, developed along two parallel streams, one industrial and the other military.

CPM was the discovery of M. R. Walker of E.I. Du Pont de Nemours & Co. and J. E. Kelly of Remington Rand, circa 1957. The computation was designed for the UNIVAC-I computer. The first test was made in 1958, when CPM was applied to the construction of a new chemical plant. In March 1959, the method was applied to a maintenance shut-down at the DuPont works in Louisville, Kentucky. Unproductive time was reduced from 125 to 93 hours.

PERT was devised in 1958 for the POLARIS missile program by the Program Evaluation Branch of the Special Projects office of the U.S. Navy, helped by the Lockheed Missile Systems division and the consulting firm of Booz-Allen & Hamilton. The calculations were so arranged that they could be carried out on the IBM Naval Ordnance Research Computer (NORC) at Dahlgren, Virginia.
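
For readers who want to see the arithmetic behind the technique, the following is a toy sketch of my own (the activity network and durations are invented, and real CPM tools also compute a backward pass for slack) showing the forward pass that yields the project length, i.e. the longest path through the dependency network.

```python
# Hypothetical CPM-style forward pass over a small activity network.
# Each activity has a duration and a list of predecessor activities.
activities = {
    "A": {"duration": 3, "preds": []},
    "B": {"duration": 5, "preds": ["A"]},
    "C": {"duration": 2, "preds": ["A"]},
    "D": {"duration": 4, "preds": ["B", "C"]},
}

earliest_finish = {}

def ef(name):
    """Earliest finish = latest earliest finish of predecessors + own duration."""
    if name not in earliest_finish:
        start = max((ef(p) for p in activities[name]["preds"]), default=0)
        earliest_finish[name] = start + activities[name]["duration"]
    return earliest_finish[name]

project_length = max(ef(a) for a in activities)
print(earliest_finish)   # {'A': 3, 'B': 8, 'C': 5, 'D': 12}
print(project_length)    # 12 -- the critical path is A -> B -> D
```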

● The advent of Data Normalized Business Intelligence Methodology.

Now begins an era of reverse engineering the meaning of relating data elements according to specific models, algorithms, formulae and others from differing RDB designs (ORACLE, SYBASE, Microsoft), all seeking to explain the how, why, who, where and when a ‘decision making’ action produced the correlated relational data elements.

Each industry using RDB technology (that’s pretty much everybody) has developed unique language descriptions to best fit its business, product and services operations. Combined with each industry’s unique product or service needs, this has created a new software niche for developing ‘business theory models’ for analysis between software logic and business logic.

The ongoing business intelligence software development process will continue to orchestrate language as a source of knowledge to explain the how, why, who, where and when an action produced the correlated relational data elements. This process is ultimately destined to examine business knowledge frozen in time and to rewrite the language (English Grammatical Sentences, RULES) by which decisions are made that affect business operations. It simply reverse engineers (rewrites) the business language that best explains decision making at a particular point in time. Real-time access to the language of business decision-making would therefore provide for faster reaction to changes in the operation of the enterprise. Achieving a real-time relationship with an enterprise operation requires that the language of the enterprise be normalized into an enterprise operation’s Relational Knowledge Base.

The USER education continues;____________________________

Business intelligence

From Wikipedia, the free encyclopedia

Business intelligence (BI) refers to computer-based techniques used in spotting, digging-out, and analyzing business data, such as sales revenue by products and/or departments or associated costs and incomes. [1]

BI technologies provide historical, current, and predictive views of business operations. Common functions of Business Intelligence technologies are reporting, online analytical processing, analytics, data mining, business performance management, benchmarking, text mining, and predictive analytics.

Business Intelligence often aims to support better business decision-making.[2] Thus a BI system can be called a decision support system (DSS).[3] Though the term business intelligence is often used as a synonym for competitive intelligence, because they both support decision making, BI uses technologies, processes, and applications to analyze mostly internal, structured data and business processes while competitive intelligence is done by gathering, analyzing and disseminating information with or without support from technology and applications, and focuses on all-source information and data (unstructured or structured), mostly external, but also internal to a company, to support decision making.

● The advent of Knowledge Normalization Methodology.

IDENTIFYING KNOWLEDGE

From: WorldComp2010, Published research paper:

Paper ID #: ICA4325

Title:      Knowledge Normalization Methodology

ICAI’10 – The 2010 International Conference on Artificial Intelligence, the 12th annual conference.

Introduction to CAMBO, a multi-Expert System Generator, and a new paradigm, knowledge normalization (patent pending). “Any entity’s knowledge, which can be described in English Grammatical Sentences, can be extracted and software (rule processing) managed through a Normalized Knowledge Base.” Unlike single-EXPERT system generators (AION, Eclipse, XpertRule, RuleBook), CAMBO’s kernel logic, Knowledge Normalization Methodology, is based upon a methodology that closely observes the rules of medical science in identifying, applying and developing the science of machine intelligence.

Abstract of the Disclosure.

The invention’s name is CAMBO, an acronym for Computer Aided Management By Objective. The title is a “multi-EXPERT System Generator”, and the vision an “artificial intelligent bridge” between technology and the ability to automate the instruments of the MBO methodology, namely Charters, Organization Charts, Operational Plans, Project Management, Performance Planning and others, all containing the knowledge, expressed in ‘English Grammatical Sentences’, upon which an enterprise conducts business. It would require the design of a unique combination of advanced methodology and technology capabilities, built upon and working in concert with the current state-of-the-art, ‘Data Normalized’, Relational Database structure. The “AI Bridge” would include an advanced methodology for Normalizing Knowledge, a unique definition for a unit or element of knowledge, an advanced structure for a Spatial Relational Knowledge Base and a 5th generation programming language to support a Natural Language Processing interface.

The USER education continues;____________________________

from: International Cognitive Computing, CAMBO, a multi-Expert system generator.

What is it? Each employee of a business enterprise is a human expert possessing a measure of talent, developed through the experience of practical application, interaction and training with other employees. The result of job-related experience is an acquired understanding of how, when and where each employee is expected to contribute towards the operation of an enterprise. This understanding is expressed and communicated through simple conversational language, in which each language sentence represents a “knowledge element.” CAMBO describes knowledge in segments called elements, each element structured as a simple sentence in which the employee expresses a single step that, when added to other related elements, provides a complete description of a job-related task. CAMBO imposes a single convention upon the formulation of each element, as a control parameter to support CAMBO’s fifth generation programming language called “Language Instructions Per Sentence” (LIPS). The convention requires that each sentence must begin with an action verb, selected from the CAMBO-LIPS “Action Verb List.”
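
As an illustration only, assuming a tiny invented verb list (the actual CAMBO-LIPS Action Verb List and rule processing are not published in this article), the sketch below shows what enforcing that single convention could look like: a knowledge element is accepted only if its sentence opens with an approved action verb.

```python
# Hypothetical sketch of the CAMBO-LIPS convention: each knowledge element
# (an English grammatical sentence) must begin with an approved action verb.
# The verb list here is invented for illustration.
ACTION_VERBS = {"create", "review", "approve", "schedule", "assign", "update"}

def accept_knowledge_element(sentence: str) -> bool:
    """Return True if the sentence starts with a verb from the action-verb list."""
    first_word = sentence.strip().split()[0].lower().rstrip(",.")
    return first_word in ACTION_VERBS

elements = [
    "Review the weekly production report for missing line items.",
    "The weekly production report is reviewed by the supervisor.",   # rejected: no leading action verb
]

for e in elements:
    print(accept_knowledge_element(e), "-", e)
# True  - Review the weekly production report ...
# False - The weekly production report is reviewed ...
```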

“Analytical thinking requires the ability to compose business issues into logical and functional models that correctly reflect business processing, coupled with the skill to communicate the results to all levels of an organization”

Normalizing Knowledge

Knowledge engineering, which includes methodologies, techniques and tools, produces knowledge models for populating a storyboard layout for the design of a multi-expert system. Each knowledge engineering model is a particular life-cycle view of activity, and it models the functionality of a knowledge engine that drives the events within the life cycle. These models identify, capture, profile and relate the language of the enterprise that the multi-expert system supports. Knowledge engineering models the relationship between the business practices of an enterprise and the functionality of an ERP methodology, and this information contributes toward the knowledge normalization process. The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies the business, science and engineering knowledge into its most basic form, the English Grammatical Sentence (EGS).

Each EGS is grouped into rule-sets that become part of a knowledge domain, and because the knowledge normalization process establishes cross-domain relationships, the knowledge of many disciplines unites to answer questions. The procedure for asking questions is a simple, intuitive and interactive web-based menu system that leads the user through a question and answer cycle, including a cross-discipline review of the issues leading to a final answer. It responds as though the questions were asked of numerous engineers or business managers, in different disciplines, all contributing their knowledge towards identifying and answering issues on a specific business or engineering requirement. However, while the methodology for data normalization remains the standard for developing a relational database, the processes described are integrated with the processes for knowledge normalization. Data element definitions, profiles, formats, relationships, where-used references and ontological associations all complement the process of knowledge normalization.
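
A minimal data shape for that grouping might look like the following sketch; the domain, rule-set and rule wording are invented for illustration and this is not CAMBO’s actual design. It shows EGS collected into rule-sets, rule-sets assigned to domains, and explicit cross-domain links so that a question in one discipline also surfaces the related rules of another.

```python
# Hypothetical structure: EGS -> rule-sets -> domains, plus cross-domain links.
knowledge_base = {
    "engineering": {
        "pump_maintenance": [
            "Schedule a pump inspection every 500 operating hours.",
            "Replace the seal kit when vibration exceeds the alarm limit.",
        ],
    },
    "business": {
        "purchasing": [
            "Approve spare-part orders above 5000 USD at the plant-manager level.",
        ],
    },
}

# Cross-domain relationships: answering a maintenance question also surfaces
# the purchasing rules that govern it.
cross_domain = {("engineering", "pump_maintenance"): [("business", "purchasing")]}

def related_rules(domain, rule_set):
    rules = list(knowledge_base[domain][rule_set])
    for d, rs in cross_domain.get((domain, rule_set), []):
        rules.extend(knowledge_base[d][rs])
    return rules

for rule in related_rules("engineering", "pump_maintenance"):
    print("-", rule)
```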

“The methodology for knowledge normalization expresses a knowledge element as an English grammatical sentence. Knowledge engineering codifies the business philosophy, science, engineering and arts (the four prime domains of knowledge) “Relational Knowledge Base” into its most basic form, language.”

Knowledge elements contain the logic by which data elements are created and manipulated. The status or condition of any individual data element is the result of a computer program executing a series of knowledge elements (rules). Knowledge engineering is used to front-end the data methodologies for ORACLE, SYBASE, DB2, FoxPro, SAS, SAP, IEF, MicroStrategy and all RDB application-generating software.
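
To illustrate that claim with a toy example of my own (not any vendor’s API, and the rule wording and values are invented), each rule below pairs an English sentence with a small action; executing the rules in order is what sets the data element values a database would finally store.

```python
# Hypothetical toy rule execution: knowledge elements (rules) are what create
# and manipulate the data elements that end up in the relational database.
record = {"order_total": 1200.0, "discount": 0.0, "status": "new"}

rules = [
    ("Apply a 5 percent discount when the order total exceeds 1000.",
     lambda r: r.update(discount=round(r["order_total"] * 0.05, 2)) if r["order_total"] > 1000 else None),
    ("Set the order status to 'approved' once a discount has been applied.",
     lambda r: r.update(status="approved") if r["discount"] > 0 else None),
]

for sentence, action in rules:
    action(record)           # the data element values are the result of the rules
    print(sentence, "->", record)
```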

For those readers who remember Herman Hollerith, congratulations on still being part of the technology evolution landscape. We have witnessed and been part of a most incredible industry journey. For this reason, my hope is that your common sense, derived from decades of structured thinking, will view knowledge normalization as the next logical step in the technology evolution. Within this article I have expressed a direction for next-generation technology towards a language-based methodology to normalize knowledge. And while it does not replace relational database structure, knowledge normalization (storyboards, rules and NLP) will identify and relate data elements and function as a front-end designer for building an RDB.

Knowledge normalization raises many questions about design, structure and, most importantly, its business application. Consider the capability of reviewing, monitoring and testing new ideas about the manner in which an enterprise conducts business. Reviewing the decision-making relationships that reflect the rationale and reasoning logic of the individual employee job description. Monitoring the working relationships between employees, in which each job description’s knowledge elements (rules) are associated with all other employees’ job responsibilities. Testing the knowledge storyboard by changing the knowledge elements (English grammatical sentences) and coordinating your changes with ALL employees’ decision-making job description rules (English grammatical sentences).

The future of knowledge normalization will reflect a repeat of the post-data-normalization technology evolution. Business intelligence, project management and process control products, concerned with the manner in which an enterprise conducts business, will need their ‘application developers’ to reprogram them. The migration from database systems to knowledge-based systems will pass from gradually integrating and adapting new technology interfaces to adopting the benefits of real-time decision-making management.

* Special appreciation and kudos to the folks at Wikipedia, the free encyclopedia, who provide an open forum for the development and communication of new ideas.
