Systems analysts typically gather facts about existing systems by interacting with end users.


Recap of Agile DW/BI Coding Practices

Ralph Hughes MA, PMP, CSM, in Agile Data Warehousing for the Enterprise, 2016

Systems Analyst

The systems analyst (SA) is another role that goes by many names. Calling this position a business analyst, process architect, or the team’s technical lead shifts the emphasis of the role somewhat, but a core set of duties always remains. Whereas the DM supplies the design of the warehouse’s data structures, the SA provides the design of the transformation processing that will populate those structures with information. This responsibility brings with it four further duties that greatly assist developers when it comes time to code data transforms:

1. Maintain the application’s upper-level data flow diagrams

2. Standardize the design patterns and technical requirements for the ETL modules

3. Author the source-to-target mappings that the ETL modules will implement (using the 80/20 specs described later)

4. Profile the source data

As with the PA role, the SA can delegate portions of the work to the other developers on the team but in the end must be able to certify the results and thus cannot escape overall responsibility for these duties.

The Agile Manifesto claims that the best designs emerge from self-organizing teams. In practice, however, the project will move faster and achieve a more coherent result if the SA takes the lead on authoring level 0 and level 1 data flow diagrams. Similarly, for the sake of intelligibility and maintainability, all warehouse modules should follow a single set of design patterns, and DW/BI teams save time if the SA takes ownership of authoring and validating these patterns as well.

Although teams gain considerable speed and quality when ETL developers work face-to-face with the customer to derive source-to-target logic, this happy arrangement is unfortunately rare. The analysis required to plan the transforms frequently falls just outside the skill set of the average ETL coder, so most programmers are neither willing nor able to perform this work. When the data transforms become sufficiently complex and intertwined, most ETL developers will insist that team leaders provide at least draft specifications for them to follow while programming.

Because they are charged with drafting source-to-target maps, SAs often inherit the duty of profiling source data. One cannot draft source-to-target logic without understanding the source data, which must be explored and documented, either with a software utility or by hand. Identifying relationships and data quality problems within operational systems is likewise outside the interest and skills of typical ETL developers, so the SA addresses these objectives as well while profiling source data.
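
To make the profiling duty concrete, here is a minimal sketch of hand-rolled profiling in Python with pandas. The file name, column names, and candidate key are hypothetical stand-ins for a real source extract; a dedicated profiling utility would go considerably further.

import pandas as pd

# Hypothetical source extract; in practice this would come from an
# operational system rather than a CSV file.
orders = pd.read_csv("orders_extract.csv")

profile = {}
for col in orders.columns:
    s = orders[col]
    profile[col] = {
        "dtype": str(s.dtype),
        "null_pct": round(s.isna().mean() * 100, 2),  # completeness
        "distinct": s.nunique(),                      # domain size
        "samples": s.dropna().unique()[:5].tolist(),  # domain spot check
    }

# A candidate natural key should be fully populated and unique.
key = "order_id"  # assumed key column
print(key, "unique:", orders[key].is_unique, "| nulls:", int(orders[key].isna().sum()))

for col, stats in profile.items():
    print(col, stats)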


URL: https://www.sciencedirect.com/science/article/pii/B9780123964649000059

Adapting Agile for Data Warehousing

Ralph Hughes, in Agile Data Warehousing Project Management, 2013

Systems analyst

The systems analyst is responsible for specifying the transforms implied by both the source and target databases. This responsibility requires him to understand not only the intent of the system as provided by the project architect, but also the selected data sources in detail and the data models provided by the project’s data architect.

The need to understand the sources requires the systems analyst to perform data profiling on relevant portions of the source systems. These data profiles provide extensive information on such matters as the cardinality of records between source tables (i.e., whether the relationship is one to many or many to many), the domains and consistency of values within the columns, and the ability to join source data dependably to other source tables. The need to understand the target makes the systems analyst one of the first and most important reviewers of the data models provided by the data architect. That position allows him to request columns as needed to support the data transformations that the project architect’s solution objectives will demand.
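
As an illustration of those profiling questions, the sketch below (Python with pandas; the table and column names are invented) checks the cardinality between two source tables and whether they can be joined dependably.

import pandas as pd

# Hypothetical source tables extracted for profiling.
customers = pd.read_csv("customers_extract.csv")
orders = pd.read_csv("orders_extract.csv")

# Cardinality: if customer_id is unique in customers but repeats in
# orders, the relationship is one to many; if it repeats on both
# sides, it is many to many.
left = "one" if customers["customer_id"].is_unique else "many"
right = "one" if orders["customer_id"].is_unique else "many"
print(f"customers to orders: {left} to {right}")

# Join dependability: order rows whose customer_id has no match in
# customers would be lost by an inner join (or orphaned by an outer one).
orphans = ~orders["customer_id"].isin(customers["customer_id"])
print("orphaned order rows:", int(orphans.sum()))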

The responsibility to specify the data transforms for the developers requires the systems analyst to comprehend thoroughly the user stories provided by the product owner. Moreover, he must understand the developer stories just as well, so that he can detail not just the end-to-end data transforms but also how to transform data as it moves from one layer of the warehouse’s data architecture to the next. The systems analyst must collaborate with the data architect to specify the data integrity rules that determine which records from one layer, including the source systems, the warehouse will accept into the next.

In addition to profiles of source data, the primary work product of the systems analyst is the source-to-target map (STM). Often there is one STM per target table in each physical model of the data warehouse. The simplest items on these mappings are the columns replicated unchanged between source and target. More complicated are the metadata columns and aggregates. The most involved STM elements are usually the derived columns, for which the systems analyst must specify the inputs, the transformation algorithm, and the target columns. Systems analysts must pay particular attention to a few special situations occurring in the source systems, such as instances (a) when records get deleted in the source systems, (b) when certain columns trigger the need to update slowly changing dimension records, and (c) when foreign key values for a target record do not identify a record in the related table. Source-to-target mappings become even more complex when the systems analyst adds in rules for the disposition of records that cannot be loaded, as well as variations for the initial load versus incremental loads of a table.
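
One way to picture an STM is as a table of mapping rules, one per target column. The sketch below is a minimal Python rendering with invented names: it shows a straight-copy column, a derived column with its inputs and transformation algorithm, a metadata column, and a fallback for case (c), the foreign key that identifies no record in the related dimension.

from datetime import datetime, timezone

UNKNOWN_CUSTOMER_KEY = -1  # surrogate key of a pre-seeded "unknown" dimension record

# A simplified source-to-target map: one rule per target column.
stm = [
    {"target": "order_id",     "rule": lambda row, dims: row["order_id"]},                 # straight copy
    {"target": "order_amount", "rule": lambda row, dims: row["qty"] * row["unit_price"]},  # derived
    {"target": "load_ts",      "rule": lambda row, dims: datetime.now(timezone.utc)},      # metadata
    {"target": "customer_key",  # foreign key lookup with a default for missing members
     "rule": lambda row, dims: dims["customer"].get(row["cust_id"], UNKNOWN_CUSTOMER_KEY)},
]

def apply_stm(row, dims):
    return {m["target"]: m["rule"](row, dims) for m in stm}

dims = {"customer": {"C001": 17, "C002": 42}}  # natural key -> surrogate key
src = {"order_id": 9001, "qty": 3, "unit_price": 9.99, "cust_id": "C999"}
print(apply_stm(src, dims))  # customer_key falls back to -1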


URL: https://www.sciencedirect.com/science/article/pii/B9780123964632000089

Systems Analysis and Database Requirements

Jan L. Harrington, in Relational Database Design and Implementation (Fourth Edition), 2016

Conducting the Needs Assessment

The needs assessment, in the opinion of many systems analysts, is the most important part of the systems development process. No matter how well developed, even the best information system is useless if it doesn’t meet the needs of its organization. A system that is not used represents wasted money.

A systems analyst has many tools and techniques available to help identify the needs of a new or modified information system:

Observation: The systems analyst observes employees without interference. This allows users to demonstrate how they actually use the current system (be it automated or manual).

Interviews: The systems analyst interviews employees at various levels in the organizational hierarchy. This process allows employees to communicate what works well with the current system and what needs to be changed. During the interviews, the analyst attempts to identify the differences among the perceptions of managers and those who work for them.

Sometimes a systems analyst will discover that what actually occurs is not what is supposed to be standard operating procedure. If there is a difference between what is occurring and the way in which things “should” happen, then either employee behavior will need to change or procedures will need to change to match employee behavior. It is not the systems analyst’s job to make the choice, but only to document what is occurring and to present alternative solutions.

Occasionally observations and interviews can expose informal processes that may or may not be relevant to a new system. Consider what happened to a systems analyst who was working on a team that was developing an automated system for a large metropolitan library system. (This story is based on an incident that actually occurred in the 1980s.) The analyst was assigned to interview staff of the mobile services branch, the group that provided bookmobiles as well as individualized service to home-bound patrons. The process in question was book selection and ordering.

Here is how it was supposed to happen. Each week, the branch received a copy of a publication called Publishers Weekly. This magazine, which is still available, not only documents the publishing trade, but also lists and reviews forthcoming media (at the time, primarily books). The librarians (four adult librarians and one children’s librarian) were to go through the magazine and place a checkmark by each book the branch should order. Once a month, the branch librarian was to take the marked up magazine to the central order meeting with all the other branch librarians in the system. All books with three or more checks should be ordered, although the branch librarian was to exercise her own judgment and knowledge of the branch patrons to help make appropriate choices.

The preceding is what the systems analyst heard from the branch librarian. The five librarians, however, told a different story. At one point, they concluded that the branch librarian wasn’t exercising any judgment at all but was simply ordering all books with three checks. There was only one children’s librarian and therefore children’s books almost never received three checks. Few children’s books were being ordered.

To test their theory, the librarians placed four checkmarks next to a significantly inappropriate title—a coffee table book that was too heavy for many of their elderly patrons and that exceeded the branch’s price limit—and waited to see what would happen. The coffee table book was ordered. It was clear to them that no professional judgment was being used at the order meeting.

The librarians, therefore, took the situation into their own hands. When the branch librarian returned from the order meeting, she gave the copy of the Publishers Weekly to one of the clerks, who created cards for each ordered book. The cards were to be matched to the new books when they arrived. However, the librarians arranged for the clerk to let them see the cards as soon as they were made. The librarians removed books that shouldn’t be ordered and added those that had been omitted (primarily, children’s books). The clerk then phoned the changes to the order department.

What should the analyst have done? The process was clearly broken. The branch librarian wasn’t doing her job. The librarians had arranged things so that the branch functioned well, but they were circumventing standard operating procedure (SOP) and were probably placing their jobs in danger by doing so. This was a case of “the end justifies the means.” No one was being harmed and the branch patrons were being helped. How should the analyst have reported her findings? Should she have exposed what was really happening or should she simply have documented how the procedure was supposed to work? What would happen when the ordering process was automated and there were no longer any centralized order meetings? There would be no order cards held at the branch and no opportunity for the librarians to correct the book order.

This was a very delicate situation because, if it were exposed, either the branch librarian or the other librarians (or both) would face significant problems. A systems analyst’s job is to observe, interview, and record, not to intervene in employee relations. The book-ordering process would be changing anyway with an automated system. If the librarians needed to continue working around the branch librarian, they would need to change their informal process as well. Therefore, the best strategy for the analyst probably was to remove herself from the personnel problems and report the process as it was supposed to work.

In other cases, where informal procedures do not violate SOP, an analyst can feel free to report what is actually occurring. This will help in tailoring the new information system to the way in which the business actually operates.

Questionnaires: The systems analyst prepares questionnaires to be completed by employees. Like interviews, questionnaires give the systems analyst information about what is working currently and what needs to be fixed. The drawback to questionnaires, however, is that they are limited by the questions asked—even if they include open-ended questions—and may miss important elements of what a system needs to do.

Focus groups: The systems analyst conducts a focus group to hear from employees who don’t use the system directly and those who may not be employees, but who use the system in some way. For example, accountants may not use the payroll system directly, but may receive output from that system as input to the accounting system. A focus group can give them a forum to express how well the input meets their needs and how it might be changed to better suit them. A retail firm that sells through a Web site may conduct a focus group for customers to find out what changes should be made to the catalog and shopping cart interfaces.

Focus groups can be a double-edged sword. The members of the focus group are not privy to many of the constraints under which a business operates and the limitations of technology. Their suggestions may be impractical. The analyst conducting the focus session needs to be careful not to make promises to group members that can’t be kept. Participants in a focus group can have their expectations raised so high that those expectations can never be met, creating disappointment and disaffection with the business.

Brainstorming sessions: When employees know that something is not right with the current system, but are unable to articulate how it should be changed, a brainstorming session allows people to toss about ideas that may or may not be feasible. The intent is to stimulate everyone’s thinking about the needs of a new or modified system without being critical.

The results of the needs assessment are collected into a requirements document or a requirements database. At this point in the process, the needs identified by the analyst are expressed in general terms. For example, the requirements document might include a need for a new Web site shopping cart that allowed users to check out on one page rather than three. The fact that the redesigned Web page needs to include the customer’s area code as a separate piece of data is not documented at this point.


URL: https://www.sciencedirect.com/science/article/pii/B9780128043998000028

Data Flow Diagrams

Sagjin Yoo, in Encyclopedia of Information Systems, 2003

II. Future Direction of DFD Methodology

About twenty years ago, systems analysts—Gane and Sarson, DeMarco—began to resolve the conflicts between managerial and information systems design perspectives in an elegant way. This solution described processes by focusing almost exclusively on the data that each process generated (data-oriented process activities). This notion suggested that by focusing on data movement and the processing activities that occurred within a business process, one could simplify process descriptions in ways that supported successful information systems. To a large extent, this idea proved valuable: systems analysis techniques based on this perspective now dominate most software systems engineering work. Focusing on the data had the following advantages:

1. A focus on data minimized the distractions generated by other process characteristics. For example, knowing what data an order entry process required enabled system designers to build systems that would support order entry no matter who did the work, no matter where the work was done, and no matter when it was done. The system design, from this perspective, depends less on attributes of the existing process than on the data within the process. In this sense, a focus on data alone enabled systems analysts to move quickly beyond process details to recognize the minimum necessary characteristics of a process that their systems would have to support.

2. A focus on data made it easier to recognize potentially new process designs. Abstracting a general picture of the data used in a process from the specific details that represented examples of that process tended to provide new insights for process redesign. Consider the ordering example we discussed above. In the more complex version of that process, Accounting matches two types of purchase orders with a shipping manifest. A systems analyst using data-driven techniques would quickly recognize that all three forms carried largely the same data, ask why they were all needed, and ask why the matching step needed to take place. In organizations where Accounting waited days or weeks to obtain copies of all three forms prior to sending out an order (not really so unlikely in many businesses), such a process innovation could have important business benefits (e.g., shorter delivery times).

In effect, focusing on data enabled systems analysts to extract only those details most necessary to the development of the information system from the masses of detail collected to describe organizational processes. This simplified process view often offered insights for potential process redesigns. Insights arose for at least two reasons:

1. Developing software-based process support forces analysts to examine the logic by which organizations make choices within processes. Electronic support for order entry, for example, requires that a system “know” how orders are officially approved, how products are confirmed to be backordered, how packages are approved for shipment, and so on.

2. Developing the algorithms (i.e., the specific instructions) that software can use across a range of process variations forces analysts to generalize generic process characteristics from observed process specifics. Orders from both large and small customers, for example, are likely to have features in common as well as specific differences based upon customer size: an efficient software system would handle the common characteristics with the same software code, thereby minimizing the amount of customized code that would have to be produced to manage process variations.

This interaction between process descriptions and the requirements of software engineering led to a standardized way of analyzing business processes. This perspective understood processes from the point of view of the data and processing steps that they generated. It led to a four-step analysis of data flows and their related processing steps:

1. Data flows and processing steps were observed within the process as it was currently practiced in the organization. The resulting model was often referred to as the “physical” or “implementation” model of the system.

2. The underlying logic represented by observed process steps was abstracted to build a generalized model of the process. This model was often referred to as the “logical” or “essential” model of the system. “Logical,” in this sense, referred to the programming logic that analysts would have to use to write the software code needed to build the system.

3. After closely examining patterns of data flows and processing that emerged from the physical and logical models, analysts could often suggest ways of executing process logic more effectively. In other words, they could suggest process improvements. The logical model of these improvements would comprise the key characteristics of a new process that, if implemented, would enable improvements in business performance based on a combination of process redesign and newly available information system support.

4. If management decided to adopt the new process, the combination of process design and information system support would become the basis of new practice within the organization, giving rise to a new set of physical process models.

A summary of this approach to process analysis is shown below in Fig. 2. It suggests how a systems analyst could analyze data flows (DFA-perspective) to move from observations of current practice in an organization to understanding (1) key characteristics of current practice, (2) key characteristics of improved practice, and (3) potential redesigns that could become future practice to support improved business performance. The curved arrow in the figure suggests the analytical progression that would support such conclusions.


Figure 2. Types of data flow process models: physical, logical, and process redesign. See http://roger.babson.edu/Osborn/doit/readings/dfdsumry.htm.

We will refer to the analytical perspective that focuses on data movement and the processing implications of work tasks as DFA. The following sections describe a graphical approach to understanding data flows that provides a way to build data flow models quickly and consistently. These models offer a useful way to compress large amounts of process information into a two-dimensional space that assists in both understanding the logic of a process and identifying key data entities within the process. For this reason, DFA can be seen as a technique complementary to ERD. Indeed, DFA, the methodology behind data flow diagrams, can be seen as a data-driven process representation that is one step further removed from the database-design-specific characteristics of ERD.
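
As a small illustration of building such models consistently, the sketch below (Python; the node names are invented) represents a level 0 data flow model as plain data and applies two classic DFD consistency checks: a process with outputs but no inputs (a “miracle”) and a process with inputs but no outputs (a “black hole”).

# Nodes are name -> kind; kinds: "entity", "process", "store".
nodes = {
    "Customer":   "entity",
    "Take Order": "process",
    "Ship Order": "process",
    "Orders":     "store",
}

# Flows are (source, target, data) triples.
flows = [
    ("Customer", "Take Order", "order details"),
    ("Take Order", "Orders", "approved order"),
    ("Orders", "Ship Order", "pending order"),
    ("Ship Order", "Customer", "shipment notice"),
]

for name, kind in nodes.items():
    if kind != "process":
        continue
    has_input = any(t == name for _, t, _ in flows)
    has_output = any(s == name for s, _, _ in flows)
    if not has_input:
        print(f"'{name}' is a miracle: outputs with no inputs")
    if not has_output:
        print(f"'{name}' is a black hole: inputs with no outputs")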

In conclusion, while DFDs do depict data, most authors more aptly describe them as process oriented. In DFDs, data is shown in chunks (data stores), and the structure and details of the data are not the primary focal point, even though they are covered by techniques related to DFDs such as the data dictionary (DD). In DFDs, processes are the focus of decomposition and refinement; hence the notion of DFDs being process oriented. The data flows to and from the processes are indeed important, but processes drive the construction and use of the diagram. At most, it is reasonable to say that DFDs are both process and data flow driven. While DFA is a tool that analyzes the movement or patterns of data within a process, ERD focuses on the data itself rather than the process, and in most cases relies on the DBMS for the analysis, design, and implementation of data-oriented information systems (Fig. 3).


Figure 3. Complementary data-driven system design techniques and process design. See http://roger.babson.edu/Osborn/doit/readings/dfdsumry.htm.


URL: https://www.sciencedirect.com/science/article/pii/B0122272404000319

Remediation and Improvement Planning

David Loshin, in The Practitioner's Guide to Data Quality Improvement, 2011

12.4.2 Root Cause Elimination

If the data quality team and systems analysts have determined the specific location and root cause of the introduction of the error, and they have determined that there are correctable flaws in the processes, they can assess the options for correcting the process to eliminate that root cause. At this point, they can:

Evaluate the level of effort for each alternative;

Select one of the alternatives as the proposed fix;

Determine the time frame for the fix;

Provide a development plan; and

Provide a test plan.

If the level of effort and the associated costs are reasonable and the resources are available, then it makes sense to execute the plan to eliminate the root cause of the issue.
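
As a simple illustration of weighing those options, the sketch below (Python; the alternatives and figures are invented) selects the cheapest fix whose level of effort and cost fit the available resources.

from dataclasses import dataclass

@dataclass
class Alternative:
    name: str
    effort_days: int  # estimated level of effort
    cost: float       # estimated cost

alternatives = [
    Alternative("Patch the feed-handler validation", 5, 4000.0),
    Alternative("Rework the upstream entry screen", 20, 18000.0),
    Alternative("Add a cleansing step in the ETL", 8, 6500.0),
]

available_days, budget = 10, 8000.0
feasible = [a for a in alternatives
            if a.effort_days <= available_days and a.cost <= budget]
proposed = min(feasible, key=lambda a: a.cost) if feasible else None
print("proposed fix:", proposed.name if proposed else "none feasible; revisit scope")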


URL: https://www.sciencedirect.com/science/article/pii/B9780123737175000129

Systems Analysis

Tonya Barrier, in Encyclopedia of Information Systems, 2003

II. People Involved

The individuals involved in systems analysis projects include systems analysts and nontechnical individuals (sponsors and users). The systems analyst is the individual (or individuals) who works closely with the nontechnical individuals to determine the system's requirements. Basically, it is the systems analyst's responsibility to manage the specific methodology so that the system can be developed or modified. The nontechnical people include sponsors and users. The sponsors are the individuals responsible for assigning resources (money, time, materials, labor hours, etc.) to the project. The users are the individuals who receive the benefits of the system. The user may or may not have direct contact with the system. A user could simply receive a written report or have the ability to retrieve an online report from the system.


URL: https://www.sciencedirect.com/science/article/pii/B0122272404001775

Artifacts for the Enterprise Requirements Value Chain

Ralph Hughes MA, PMP, CSM, in Agile Data Warehousing for the Enterprise, 2016

Special Requirements

This section of the use case allows the systems analyst to describe important details that cannot be recorded easily on the diagrams used for flow of events. For DW/BI modules, developers often use the special requirements section to document design topics that can be addressed in a sentence or two. A common example is the action that a fact-table load module should take when a lookup fails to provide a surrogate key for a dimension table record. Typically, EDW load modules place a surrogate key value in the fact table that links to a pre-seeded dimension record indicating “unknown value.” The need for a seeded record and its role as a default return value should be noted in a module use case’s special requirements section.
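
A minimal sketch of that default-return behavior in Python (the dimension and key values are invented): the lookup returns the seeded “unknown” surrogate key whenever the dimension holds no match, so the fact-table load never fails on a missing member.

UNKNOWN_KEY = 0  # surrogate key of the pre-seeded "unknown value" record

# Hypothetical dimension lookup: natural key -> surrogate key.
product_dim = {"SKU-100": 1, "SKU-200": 2}

def lookup_product_key(natural_key):
    # Link to the seeded "unknown" record rather than rejecting the fact row.
    return product_dim.get(natural_key, UNKNOWN_KEY)

fact_row = {"sku": "SKU-999", "qty": 4}
fact_row["product_key"] = lookup_product_key(fact_row.pop("sku"))
print(fact_row)  # {'qty': 4, 'product_key': 0}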

Another common example is the restart logic that a given module should follow after an ETL run ends abnormally. The systems analyst preparing the module use case might note in this section that the records loaded by the earlier, failed ETL process should be removed at the beginning of the next run so that the load module can simply restart its work at the top of the landing table without fear of duplicating target records.
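
A sketch of that restart rule in Python with sqlite3 (the table, columns, and batch ids are invented): rows tagged with the failed run’s batch id are deleted first, so the module can reprocess the landing table without duplicating target records.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE fact_sales (order_id INTEGER, amount REAL, batch_id INTEGER)")
# Batch 41 completed; batch 42 ended abnormally partway through.
conn.execute("INSERT INTO fact_sales VALUES (1, 10.0, 41), (2, 5.0, 42)")

def restart_load(conn, failed_batch_id):
    # Remove the partial output of the failed run; the next run can then
    # start again at the top of the landing table.
    conn.execute("DELETE FROM fact_sales WHERE batch_id = ?", (failed_batch_id,))
    conn.commit()

restart_load(conn, failed_batch_id=42)
print(conn.execute("SELECT COUNT(*) FROM fact_sales").fetchone()[0])  # 1 row remains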

A third special requirement frequently found in ETL module use cases is the desire of certain roles in the organization to be notified when the ETL for a given portion of EDW data has finished processing. Frequently, different managers monitor different areas of the warehouse. Order entry management cares most about the conformed customer dimension, for example, and product managers care only about loading the conformed product tables. A module use case would spell out this notification pattern in the special requirements section.


URL: https://www.sciencedirect.com/science/article/pii/B9780123964649000102


Managing Information Security

John J. Fay, David Patterson, in Contemporary Security Management (Fourth Edition), 2018

The Security Analyst

The security analyst performs work similar to that of the typical systems analyst, except that the security analyst has a background in information security. The role of the security analyst is to inspect the risks posed to electronic assets, investigate relevant threats and vulnerabilities, and recommend safeguards appropriate to the size, scale, and complexity of the business. Usually, the security analyst will have at least a bachelor of science degree in mathematics, engineering, or computer science, a business or industry background, and technology or industry certifications. The security analyst may not have direct hands-on experience with implementing security countermeasures, and it is not unusual for an analyst to transition to engineering work in the course of their career.


URL: https://www.sciencedirect.com/science/article/pii/B9780128092781000177

Business Process Management

Fred A. Cummins, in Building the Agile Enterprise, 2009

Modeling Languages

Before computer-based modeling tools, managers and systems analysts would draw diagrams on paper with boxes and arrows to depict the flow of control or work products, or they would write “playscripts” expressing the sequence of “who does what” in a tabular structure. These models were cumbersome and inflexible.

Computer-based tools were developed to help design and manage these models. The computer can draw the boxes and arrows more quickly and precisely, the boxes and lines can easily be moved, and the computer can do some basic checks to see that the diagrams are consistent with process concepts.

These tools can provide specifications of processes that can be automated. Some tools provide high-level “business” views of processes. These high-level process models might be viewed as “requirements” for process automation specifications. Some tools are designed to specify process details, to resolve exceptions and abnormal terminations that must be precisely defined for automated processes.

Standards of the Object Management Group (OMG), an international standards organization, have been developed to provide robust modeling for business processes, including both those performed by humans and those that are automated. Business Process Modeling Notation (BPMN) was developed to give businesspeople standard graphical modeling elements so that they could share and collaborate on business process diagrams.

The BPMN graphics have been implemented in many business process modeling systems (BPMS) for the specification of automated processes. However, due to differences in interpretation and technical approaches, along with proprietary extensions, BPMN diagrams are not entirely consistent from one tool to the next. Until recently, there was no standard for the exchange of BPMN diagrams. XML Process Definition Language (XPDL) was developed by the Workflow Management Coalition for this purpose. However, though XPDL may reproduce the same diagrams in the new tool, variations in interpretation of the graphics mean that a process model transferred from one tool to another still may not produce the same runtime result.

The Business Process Definition Metamodel (BPDM) was developed by OMG as a computer-based representation of BPMN diagrams to provide a platform-independent modeling (PIM) language for business-level process modeling. The rigor of designing the computer language clarifies the meaning of BPMN graphics. Insights on modeling processes for SOA have made BPDM more robust than BPMN, particularly for reconciling an internal process with a specification for interaction with another process (that is, a choreography). Finally, BPDM supports XML Metadata Interchange (XMI), a standard form of model exchange, so business process models can be exchanged between tools and interpreted consistently. At the time of this writing, an OMG initiative is under way to reconcile BPMN and BPDM to a single language specification, called Business Process Model and Notation (BPMN) 2.0, preserving the BPMN brand.

Some tools translate BPMN models to languages designed for automation. For example, Business Process Execution Language (BPEL) is an alternative language used for specification of automated processes. It is likely that tools relying on BPEL will be phased out in favor of tools that implement BPMN models more directly and display the processes in terms of BPMN graphics during actual execution. This will simplify BPMN application and will improve understanding and control of the business processes.

BPMN 2.0 will provide full representation and integration of choreography. Choreography provides a shared specification of the interaction between participants in a collaboration. BPEL and XPDL do not address choreography. There are currently two alternative choreography specification languages: Web Services Choreography Definition Language (WS-CDL) from the World Wide Web Consortium (W3C) and e-business Business Process (ebBP) from the Organization for the Advancement of Structured Information Standards (OASIS). Both of these languages are based on XML. It may be appropriate to transform the choreography specification of a BPMN model to a choreography specification in one of these languages for runtime specification of service interactions.


URL: https://www.sciencedirect.com/science/article/pii/B9780123744456000030

Which of the following is a technique used to create Web sites that will conform to the screen resolution of the user?

Responsive web design entails the process of designing websites that adapt to all devices, platforms, and screen sizes in terms of both form and function.

Which is the first step in developing a new information system?

Gathering information requirements is the first step in developing a new information system.

Which method below uses fourth-generation computer languages?

End-user development uses fourth-generation computer languages, enabling users to code with little expert assistance.