Communication is the process by which people transmit information to one another. In the classic communication model, an information source passes through an encoding process and then through a channel; the receiving system decodes the information, which then reaches its destination. Noise may interfere with the message, while feedback may improve it; these are the functional components of communication. Communication can also be nonverbal, which entails reading body language.
In project management, communication skills include speaking, listening, technology-mediated interaction, interpersonal communication, writing, and team communication. Speaking entails making oral presentations, controlling anxiety, and responding to audience questions. Listening involves following directions, paying attention, taking notes, and inferring emotional meanings. Technology-mediated interaction entails writing good emails, demonstrating effective telephone skills, and leaving clear voicemails.
Interpersonal communication denotes giving sound instructions, demonstrating effective conversation skills, showing professional courtesy to others, persuading others, and carrying out interviews. Writing entails making proper use of grammar, writing effective formal documents, and constructing questionnaires. Team communication covers meeting contributions, managing conflict, giving advice, and leading.
Cultural competency is a key trait in communication: individuals must demonstrate an understanding of cultural norms and interact effectively with other cultures. The direction of communication may be upward (to one’s superiors), lateral (to colleagues), or downward (to one’s subordinates). Unidirectional communication is one-way, while bidirectional communication is two-way and involves equal exchanges. Cardinality refers to selecting the best method of communication for a given audience. Audiences may include suppliers, colleagues, authorities, shareholders, customers, and management. The communication square entails taking a message and breaking it into its purpose, argument, audience, and persona.
Taylorism focuses on maximizing efficiency by producing many goods at low prices. This school of thought uses division of labor, mechanization, salary manipulation, time maximization, and the scientific organization of work. Post-Taylorism aims at achieving quality, flexibility, productivity (efficiency), meeting deadlines, and variety. It achieves this through networking, autonomous workgroups, knowledge management, and process reengineering.
Volvo’s team experiments gave rise to the notion of existential leadership. The theory states that employees must find meaning in their work for their organizations to succeed. It therefore advocates flat structures, polyvalence, responsibility for one’s function, and performance skills.
The Dreyfus model describes skill acquisition. The lowest level is the novice, who rigidly adheres to rules. The advanced beginner begins to use situational perception, and the competent worker can cope with divergent demands. The proficient level involves taking a holistic view, while the expert level involves transcending rules and guidelines.
McGregor’s Theory X of motivation holds that people predominantly dislike work, so management must control or threaten them. Theory Y presumes that workers want respect and involvement in decision-making. Organizations can exercise control through competition, bureaucratic, or clan systems. The competition system, common at General Electric, involves working with the market. Bureaucratic systems involve strong command structures and emphasize efficiency, as at McDonald’s. Clan systems motivate workers through a shared environment and shared ownership of values; examples include IBM and Kodak.
Belbin developed a model of team roles comprising coordinators, shapers, monitor evaluators, team workers, implementers, specialists, plants, resource investigators, and completer-finishers. The model is useful for finding missing roles and dealing with tensions.
Good Enough Software
In IS project management, quality is a subjective attribute that involves tradeoffs. The project team should trade off aspects of scope, quality, and time. Project members must determine when good-enough quality is the right approach, which depends on whether clients can notice the difference between the best quality and good-enough quality. Therefore, the user or sponsor is the best judge of good-enough software. A case in point was Apple’s HyperCard, which had 500 identifiable bugs yet recorded massive success; the company focused on the effects of the bugs rather than their quantity.
The utilitarian approach provides a theoretical basis for good-enough software. The IS scholar James Bach asserted that effective software aims at creating the greatest happiness for the greatest number of people. It entails the use of heroic teams, evolutionary development, and dynamic processes and infrastructure. However, good-enough software approaches also have limitations. They do not sit well with quality standards like ISO 9000, and the programmer’s perceptions may diverge from these parameters. The approach may not be applicable in several domains. Good-enough software also necessitates frequent testing and release of new builds, which may strain the organizational resources of time and money.
One typical example of how good enough software works for organizations is Nokia. The company has the challenge of doing large-scale development of software through small groups. Its navigation through this predicament is a solid illustration of how good enough software works in practical business contexts. First, the company’s software process improvement group has mapped software processes based on certain traits that are relevant to their business situation.
These include complexity, technology, and size. The software teams do not apply standard processes; instead, they analyze a given context and choose the aspects best suited to it. The contexts include the easy zone, the sweet zone, and the difficult zone. In the easy zone, the group develops small and uncomplicated software and applies minimal software process improvement (SPI). In the sweet zone, the team dedicates only a reasonable amount of SPI effort; here, it develops software that is complex but context-stable, or small-scale but complicated.
In the difficult zone, the group develops software for complex and volatile business situations. By dividing their processes into various components, they ensure that only the required qualities are part of software development, which constitutes good-enough practice. The organization uses software reuse and maturity-increment strategies, such as agile and CMMI, to tackle the different levels of volatility and growth in software size. In CMMI, Nokia trades off the time it takes to assess software against confidence in the accuracy of the assessment results.
When measuring quality, a project team needs to consider the following parameters: usability, users’ confidence in performance, ease of verifying performance, security, ease of maintenance, and graceful degradation under dire conditions. Good-quality systems ought to have software that is easy to change. It should be portable, convert easily into other applications, interface easily with other systems, and use system resources efficiently. Performance should be easy to expand as well.
In quality systems, the key areas of concern are portability, maintainability, efficiency, usability, reliability, and functionality. Functionality refers to the security, compliance, interoperability, accuracy, and suitability of the system. Reliability encompasses the recoverability, fault tolerance, and maturity of the system. Usability denotes its operability, learnability, and understandability. Efficiency refers to time and resource behavior. Maintainability covers the testability, stability, changeability, and analyzability of the system. Portability encompasses replaceability, conformance, adaptability, and installability.
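The six characteristics and their sub-characteristics map closely onto the ISO/IEC 9126 quality model. A minimal sketch of how a review checklist could be built from that taxonomy (the function name is an invented example, not part of any standard):

```python
# Quality-characteristic taxonomy from the text (mirrors ISO/IEC 9126).
QUALITY_MODEL = {
    "functionality": ["suitability", "accuracy", "interoperability",
                      "compliance", "security"],
    "reliability": ["maturity", "fault tolerance", "recoverability"],
    "usability": ["understandability", "learnability", "operability"],
    "efficiency": ["time behavior", "resource behavior"],
    "maintainability": ["analyzability", "changeability", "stability",
                        "testability"],
    "portability": ["adaptability", "installability", "conformance",
                    "replaceability"],
}

def review_checklist(model=QUALITY_MODEL):
    """Flatten the taxonomy into one review question per sub-characteristic."""
    return [f"Does the system meet '{sub}' under {char}?"
            for char, subs in model.items() for sub in subs]
```

Flattening the model this way turns an abstract taxonomy into a concrete artifact a quality review can walk through.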
To recognize good code, a person must read and analyze it; however, the reader must know what to look for. Two useful lenses are connectivity and modularity. Connectivity concerns coupling, the way modules interact. Interactions occur across a continuum from no coupling, which is desirable, to content coupling, which is undesirable. In content coupling, one module reaches directly into another’s internals and modifies its data.
Conversely, no coupling involves independent pairs of modules that do not communicate. Between these extremes, in order of preference, lie data coupling, stamp coupling, control coupling, and common coupling. In data coupling, component pairs x and y communicate using simple parameters, with no control element. Stamp coupling entails passing an entire record as a parameter. Control coupling involves passing parameters from x to y that control the behavior of y. Common coupling involves modules referring to the same global data, so a change to the global data necessitates changes in all coupled modules.
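The middle of the coupling spectrum can be illustrated in code. A hypothetical sketch (all function and variable names are invented for illustration):

```python
# Data coupling (most preferred here): modules exchange only simple parameters.
def net_price(gross: float, tax_rate: float) -> float:
    return gross * (1 + tax_rate)

# Stamp coupling: a whole record is passed although only part of it is used.
def shipping_label(order: dict) -> str:
    return f"{order['name']}, {order['address']}"

# Control coupling: a flag passed by the caller dictates the callee's behavior.
def format_amount(value: float, as_cents: bool) -> str:
    return f"{int(value * 100)}c" if as_cents else f"{value:.2f}"

# Common coupling (least preferred here): modules share mutable global data,
# so any change to the global's layout forces changes in every module using it.
CONFIG = {"tax_rate": 0.2}

def net_price_global(gross: float) -> float:
    return gross * (1 + CONFIG["tax_rate"])
```

The two `net_price` variants compute the same result, but only the globally coupled one breaks if `CONFIG` is restructured, which is exactly why the continuum ranks it lower.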
Modularity refers to the functional robustness of system components, measured as cohesion. Modules can exhibit coincidental cohesion, which is unwanted, or functional cohesion, which is desirable. Coincidental cohesion involves a module performing several logically unrelated functions. Functional cohesion refers to a module with a single well-defined function. Between these extremes, in increasing order of preference, lie logical, temporal, procedural, communicational, and sequential cohesion.
Logical modules group two or more logically related functions. Temporal modules group functions that must occur in the same period, such as initialization. Procedural cohesion covers functions that follow a common procedure or control flow. Communicational modules perform multiple functions that operate on the same data. Sequential modules perform two or more functions in which the output of one is the input of the next.
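The two ends of the cohesion spectrum, plus the preferred middle case, can be sketched as follows (a hypothetical illustration; all names are invented):

```python
# Coincidental cohesion (undesirable): one module bundles logically
# unrelated jobs that merely happen to live together.
def misc_utilities(path: str, price: float) -> tuple:
    filename = path.split("/")[-1]   # file handling
    discounted = price * 0.9         # pricing -- unrelated to file handling
    return filename, discounted

# Functional cohesion (desirable): each module does exactly one
# well-defined job.
def basename(path: str) -> str:
    return path.split("/")[-1]

def apply_discount(price: float, rate: float = 0.1) -> float:
    return price * (1 - rate)

# Sequential cohesion: the output of one step is the input of the next.
def normalised_basename(path: str) -> str:
    return basename(path).lower()
```

Splitting `misc_utilities` into `basename` and `apply_discount` changes no behavior, but each resulting module can now be tested, reused, and modified independently.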
Some scholars question the relevance of these parameters, asserting that it is difficult to see code in modern business contexts. However, if a system’s structure is inaccessible, it is difficult to maintain; these parameters therefore remain highly relevant even in modern programming contexts.
During information system project management, the top cost-estimation risks include personnel shortages, unrealistic budgets and schedules, developing the wrong functions, developing the wrong user interface, and gold plating. Other risks are continual changes in requirements, failures in external components, failures in externally performed tasks, real-time performance shortfalls, and straining computer-science capabilities.
In the field, cost-estimating project teams encounter the following problems: they rarely use professional estimators, and they make estimates before accessing specification details. Cost estimators can learn a lot from engineering. In that field, project teams use professional estimators and base their estimates on firm specifications. They rely on better data and better measures, and they often separate the known from the innovative.
Firms can produce cost estimates using the analogy, lines-of-code, programming, and analysis-effort methods. They may also use the Delphi technique, function point analysis, and direct estimation. In the analogy method, estimators compare the similarities and differences of parameters such as the scope and size of the application, the business type, and technical matters like language and standards. Other comparable factors include the extent of management support, the extent of computer culture, and the company culture of the buyer. Conversely, lines-of-code estimates involve counting the number of lines of code.
In the analysis-effort method, estimators simplify the systems development cycle into analysis, design, coding and unit testing, and testing. Conversely, the programming method focuses on programming effort: estimators determine whether each program in the system is small, medium, or large, and then establish whether it is simple, average, or complex. In direct estimation, a list of project tasks allows complete specification; however, the method is of limited use since project tasks are rarely known at the beginning of a project.
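The programming method amounts to a size-by-complexity lookup. A minimal sketch, assuming an illustrative effort table in person-days (the weights are invented for the example, not published figures):

```python
# Hypothetical effort table for the programming method: each program is
# classified by size (small/medium/large) and complexity
# (simple/average/complex), then mapped to person-days. The numbers are
# illustrative assumptions only.
EFFORT_DAYS = {
    ("small", "simple"): 2,  ("small", "average"): 4,   ("small", "complex"): 7,
    ("medium", "simple"): 5, ("medium", "average"): 10, ("medium", "complex"): 18,
    ("large", "simple"): 12, ("large", "average"): 24,  ("large", "complex"): 40,
}

def estimate_programming_effort(programs):
    """Sum the table entries for a list of (size, complexity) pairs."""
    return sum(EFFORT_DAYS[(size, complexity)] for size, complexity in programs)

# Example: one medium/average program plus one small/simple one.
total = estimate_programming_effort([("medium", "average"), ("small", "simple")])
```

An organization would calibrate such a table from its own historical project data rather than reuse someone else’s weights.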
The Delphi technique uses anonymous, independent assessors who follow an iterative process until their estimates converge; the converged value is the best estimate. Function point analysis follows either Albrecht’s method or Symons’s method; the latter considers performance influences such as environment and personnel. Albrecht’s method is a three-phase process in which estimators first identify the inputs and outputs of the system and assign function points to them.
The second step is the calculation of the processing-complexity adjustment; thereafter, the final project scores are used to derive man-hours. A typical example is a user who needs a spell checker within a word processor: the words to be checked and the personal dictionary are function points entering the spell checker, while reports of the numbers of processed words and misspelled words are the function points exiting it to the user.
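The three phases can be sketched numerically. The weights below are the standard average-complexity weights from Albrecht-style counting, and the adjustment factor follows the usual form (fourteen general system characteristics rated 0-5, giving a multiplier between 0.65 and 1.35); the productivity rate of 8 person-hours per function point is an invented assumption for the example:

```python
# Albrecht-style function point sketch. Average-complexity weights per
# component type; the final hours-per-FP rate is an assumption.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def unadjusted_fp(counts: dict) -> int:
    """Phase 1: weight and sum the identified inputs, outputs, etc."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

def adjusted_fp(ufp: int, gsc_ratings: list) -> float:
    """Phase 2: apply the processing-complexity adjustment.
    Fourteen characteristics rated 0-5 yield a factor of 0.65-1.35."""
    return ufp * (0.65 + 0.01 * sum(gsc_ratings))

# Spell-checker example from the text: words and the personal dictionary
# enter; processed-word and misspelled-word reports exit.
counts = {"external_inputs": 2, "external_outputs": 2,
          "external_inquiries": 0, "internal_files": 1,
          "external_interfaces": 0}
ufp = unadjusted_fp(counts)             # 2*4 + 2*5 + 1*10 = 28
# Phase 3: convert to effort, assuming 8 person-hours per function point.
hours = adjusted_fp(ufp, [3] * 14) * 8
```

With all fourteen characteristics rated 3, the adjustment factor is 1.07, so the 28 unadjusted points become roughly 30 adjusted points before the man-hour conversion.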
When estimating, an entity may need to account for proportional activities such as team leading, quality control, quality assurance, familiarization, documentation, customer reviews, post-implementation support, and technical training. The project team may also need elapsed-time activities such as project management, implementation management, the project office, configuration management, and systems management. It is better to rely on more than one cost-estimation method. User and staff characteristics, as well as warranty and installation, influence cost estimation. Furthermore, one must incorporate fixed and overhead costs, and remember that estimates often depend on assumptions.