Project Paper: Network Infrastructure Vulnerabilities
Name
Institutional Affiliations
Section 1: Infrastructure Document
Computer networks have become ubiquitous, particularly among organizations that strive for operational excellence and those that want to adopt cloud technology and virtualization. Today, most organizations that set up a business ensure that they have an efficient computer network infrastructure connecting the business to the outside world through the Internet. Research has shown that present-day businesses depend heavily on network infrastructure platforms that make communication easy, efficient, available, and accessible. Yet, although robust computer networks have made interaction easier and brought many people and businesses together, they have also contributed to growing security concerns across various sectors and industries over the past years. This paper therefore identifies some of the possible network infrastructure vulnerabilities and describes a comprehensive security policy that helps protect the company's infrastructure and assets by applying the CIA principle.
A network consists of devices such as routers, firewalls, and generic hosts, which include servers and workstations. There are thousands of network vulnerabilities, so organizations should focus on tests that produce a good overall assessment of the network, especially when they store their data in the cloud, where there is a risk of regulatory non-compliance due to a lack of control over where data is stored. Possible network infrastructure vulnerabilities include improper system configuration, poor firewall deployment, poor anti-virus implementation, weak password implementation, lack of effective physical security, lack of appropriate security policies, and many others.
Vulnerabilities can be successfully contained by putting measures in place; for example, the network administrator should be in a position to gather information about viruses and worms and to identify network vulnerabilities by collecting the information needed to prevent security problems. Security assessments of network vulnerabilities proceed through three main stages: planning, conducting, and inference (Markluec, 2010). In the planning stage, an official agreement is signed between the concerned parties. The signed document is important because it contains both legal and non-disclosure clauses that serve to protect the ethical hacker against possible lawsuits. The conducting stage involves the evaluation of technical reports prepared from testing potential vulnerabilities. Lastly, in the inference stage, the results of the evaluation are communicated to the organization and corrective action is taken if needed.
A logical and a physical layout of the planned network
A logical topographical layout of the network illustrates all of its logical features, comprising the logical networks, routing tables, and the IP addresses assigned to the various hosts and devices. A physical layout of the network, on the other hand, represents the physical location of, and associations between, the different devices participating in the network. In a physical network layout, each computer is connected to the hub by a cable run, which is important because it helps the administrator visualize how much equipment he or she will need. Having a well-planned view of both the physical and logical network is significant because it helps identify probable security problems. For example, when an unauthorized visitor attempts to acquire access to sensitive data or information, he or she will first generate a map of the network to check which security checkpoints, such as firewalls or similar devices, are in place and what access can be attained. The diagrams below show a logical and a physical layout of the planned network.
Physical network diagram
[Figure: physical layout showing the Internet connected through a router to two switches and a Wi-Fi router, which serve the server, several PCs, a printer, a scanner, and two IP phones arranged in a ring.]
Logical Network diagram
[Figure: logical layout showing the PCs and server with assigned IP addresses 172.16.44.1, 172.16.48.21, 172.16.52.1, 172.16.56.23, and 172.16.60.9.]
Illustrate the possible placement of servers
When designing the logical and physical network, the network administrator should thoroughly illustrate the possible placement of servers, including the access paths to firewalls and the Internet. Facility limitations, routers, printers, switches, bridges, workstations, and access points should also be considered when designing the network. The basic design of this network shows the connection to the Internet through a border router and firewall. To design a well-secured network, various factors must be taken into account, such as the topology and placement of hosts inside the network, the selection of software and hardware technologies, and the careful configuration of the components.
Comprehensive Security Policy for the Company
A comprehensive security policy provides assurance that the confidentiality, integrity, and availability (CIA) of information are not violated. For instance, many companies keep backups in case data is lost when critical issues arise. Confidentiality, integrity, and availability, also known as the CIA triad, form a model designed to guide information security policies within organizations. Confidentiality is a set of rules that limits access to information, integrity is the assurance that the information is accurate, and availability is a guarantee of reliable access to the information by authorized people. A business's assets may be measured in terms of its employees and buildings, but much of its value is stored in the form of information, whether electronic data or written documents. If this information is disclosed to unauthorized individuals, is inaccurate, or is not available when needed, the business may suffer significant harm, including loss of customer confidence, contract damages, or even a reduction in market share.
Confidentiality of information is very important not only for corporations but for governments as well. Most organizations have developed measures to ensure the confidentiality of information by preventing sensitive information from reaching the wrong people and by making sure that the right people get the information at the time they need it. Access must be restricted to those authorized to view the data; sometimes, safeguarding data confidentiality also requires special training, which typically covers the security risks that may threaten information. One way to ensure confidentiality of information is through data encryption. In addition, users are encouraged to use passwords of at least eight characters, which makes it harder for attackers to crack them and gain access to personal information. Integrity, on the other hand, involves maintaining the accuracy and consistency of data over its entire life cycle. For example, data should not be changed in transit, and steps should be taken to ensure that data cannot be altered by unauthorized people (TechTarget, 2016).
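As a rough illustration of these controls in practice, the sketch below uses MySQL syntax (the same dialect as the table definitions later in this document) to create a reporting account with a password longer than eight characters, restrict it to read-only access on a single table, and require an encrypted connection so data is protected in transit. The database name, user name, table, and password are hypothetical placeholders, not part of the company's actual configuration.
-- Hypothetical least-privilege, encrypted-access setup (MySQL syntax)
CREATE USER 'report_user'@'%' IDENTIFIED BY 'S3cure!ReportPass';  -- placeholder password, more than eight characters
GRANT SELECT ON companydb.orders TO 'report_user'@'%';            -- read-only access to one table only
ALTER USER 'report_user'@'%' REQUIRE SSL;                         -- connections must be encrypted in transit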
Employees within organizations should ensure that they do not disclose information about the organization, because doing so may give unauthorized parties an opportunity to leak information by hacking into the system. Users should also ensure that they create passwords that are hard for attackers to guess. Passwords should never consist only of lowercase or only of uppercase letters; mixing both cases makes a password more difficult to guess. Organizations, for their part, should make sure that they have implemented a secure network using firewalls that help prevent hackers from bringing the whole system down.
Section 2: Revised Project Plan
One of the critical factors for project success is a well-developed project plan. There are several steps to follow when creating a project plan, including the project description, project objectives, project management plan purpose, project deliverables, project milestones, project roles and responsibilities, project scope management, project time management, and many others. In this case, we shall revise the previous project plan while updating the project plan template from Project Deliverable 4: Cloud Technology and Virtualization with at least three new project tasks, each consisting of five to ten subtasks.
I personally support the use of cloud technology and virtualization. The use of cloud computing can change the risk posture and profile of a company. For example, companies that have adopted cloud technology avoid the risk that a large investment in IT resources will not pay off. Cost savings in hardware infrastructure are also relatively easy to quantify: for companies that adopt cloud technologies in a public cloud, on-site hardware is replaced less often and less new hardware is purchased.
Cloud and virtualization technology align with the company's business processes and assist in attaining organizational goals by providing a more productive environment for collaborative working and by improving the company's productivity, since participants in a business ecosystem can share processing logic (Parker, 2012). Some of the new project tasks in the project deliverable for Cloud Technology and Virtualization include host server patching, monitoring virtual machine sprawl, and exploring backup options. These tasks are important in ensuring an efficient, healthy, and secure virtual environment. Some of the subtasks involved in host server patching include configuring software update installation and running compliance and vulnerability scans. A compelling recommendation for solution providers and partners that could help a company secure a firm competitive advantage using cloud and virtualization technologies is to ensure that all systems are kept up to date and that firewalls are installed to prevent attackers from hacking into those systems.
References
Markluec, M. (2010). Some Common network vulnerabilities
persist. Retrieved on 26 Feb, 2016 from
http://gsnmagazine.com/article/20994/some_common_network_
vulnerabilities_persist
TechTarget. (2016). Confidentiality, integrity, and availability
(CIA triad). Retrieved on 26 Feb, 2016 from
http://whatis.techtarget.com/definition/Confidentiality-
integrity-and-availability-CIA
Parker, J. (2012). Business Requirements vs. Functional
Requirements. Retrieved on 26 Feb, 2016 from
http://enfocussolutions.com/business-requirements-vs-
functional-requirements/
Project Paper
Name
Institutional Affiliations
Section 1: Design Document
There are many reasons why companies want to adopt cloud technology and virtualization; the most significant are business performance resourcing, rapid go-to-market, business agility, and cost reduction. Often there is no single decisive reason why an organization chooses to adopt cloud computing; the decision usually rests on a complex combination of factors rather than on one alone. The use of cloud computing can change the risk posture and profile of a company. For example, companies using a public cloud avoid the risk that a large investment in IT resources will not pay off; however, the public cloud can introduce security risks because resources are shared with unknown parties. There may also be a risk of regulatory non-compliance due to a lack of control over where data is stored. The case for using cloud technology within the company rests on the ability to take advantage of new business opportunities, cost, agility, and productivity.
Cloud computing delivers improved agility because of its on-demand, rapid elasticity: IT resources can be deployed quickly and scaled up as needed to meet demand. This gives enterprises opportunities to innovate, introduce new products and services, enter new markets, and adapt to changing circumstances. Another reason for using cloud technology and virtualization within the company is that it increases productivity. Cloud technology provides a more productive environment for collaborative working and improves the company's productivity by enabling participants in a business ecosystem to share processing logic. Cloud technologies also create new business opportunities; for example, an enterprise can become a provider of cloud services or value-added services. Companies that excel in the quality of their IT can become public SaaS (Software-as-a-Service), PaaS (Platform-as-a-Service), or IaaS (Infrastructure-as-a-Service) providers. A good example is a company that implements a private cloud, has spare capacity, and sells that capacity as a public cloud. Cloud computing also cuts business costs. Given the benefits in agility, productivity, and quality that cloud computing delivers, companies might expect it to be more expensive; however, this is not the case, and reduced cost is one of the main reasons why many companies are turning to cloud technologies (Perilli, 2009).
Research has shown that few technologies have affected the IT industry as much as cloud computing, which delivers computing as a service. Part of the cloud's appeal is evidently financial: it allows organizations to shed some of their expensive IT infrastructure (software and hardware) and shift computing costs to more manageable operational expenses. Cloud computing is not only about avoiding the purchase of hardware and infrastructure; companies and individuals usually compare the cost of an on-premises server to the cost of a cloud server. In their view, an on-premises server typically depreciates over roughly three years, while a cloud server is treated as a continuous expenditure. Cost savings in hardware infrastructure are relatively easy to quantify. For example, if companies adopt cloud technologies in a public cloud, on-site hardware is replaced less often and less new hardware has to be purchased, which in turn leads to much-reduced power and cooling costs and less space needed in the data center. Studies show that medium-sized cloud deployments lead to about 62 percent savings within a year compared with an on-premises system, while for larger cloud deployments the annual savings may be about 50 percent compared with on-premises implementations (Schwartz, 2013).
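To make the scale of these figures concrete, assume a hypothetical medium-sized deployment costing $100,000 per year on-premises: a 62 percent saving would bring the comparable cloud cost to roughly $38,000 per year, while a larger deployment at 50 percent savings would drop from, say, $1,000,000 to about $500,000 per year. The dollar amounts are illustrative only; only the percentages come from the study cited above.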
Labor cost savings are one of the reductions in human capital costs that can be realized from implementing cloud and virtualization technologies. By off-loading software, an application, or a platform to a private cloud platform, less time is needed to maintain, administer, and troubleshoot the technologies. For instance, a system administrator who is in charge of about 140 on-premises servers can be responsible for thousands of cloud-based servers (Schwartz, 2013). IT management costs can also be reduced through automated provisioning; in the cloud, virtual servers are usually provisioned automatically instead of manually, which reduces downtime along with compliance issues.
The diagram below illustrates how cloud and virtualization technology aligns with the company's business processes and assists with the attainment of organizational goals.
[Figure: cloud reference model showing a Cloud Consumer and a Cloud Provider; the provider's service orchestration stack comprises the SaaS, PaaS, and IaaS service layers, a resource abstraction and control layer, and a physical resource layer, supported by service management.]
The main actors in the diagram above are the cloud consumer and the cloud provider. A cloud consumer is an individual or organization that acquires and uses cloud products and services, while a cloud provider is a person or organization responsible for making a service available to cloud consumers.
One area of concern for many organizations regarding cloud technologies and virtualization is how to trust the infrastructure on which their data and workloads will run. As a result, companies that are interested in moving to the cloud and managing their risk profiles should look for cloud service providers that offer security and trust services, as well as the ability to enforce and audit policy on the data being deployed. A compelling recommendation for solution providers that could help the company secure a firm competitive advantage using cloud and virtual technologies is to advance server security for physical, virtual, and cloud servers, in line with whatever enterprises choose for storing their information in the cloud. Enterprises should adopt data security that protects enterprise applications and data from breaches and business disruptions; this is important because it helps organizations simplify security operations while enabling regulatory compliance and cloud projects. Another compelling recommendation for solution providers that could help the company secure a firm competitive advantage using cloud and virtual technologies is automation and self-service. Organizations should discover best practices in the cloud and encapsulate them in new procedures for ongoing support, provisioning of resources, and application design. With automation, firms can radically reduce the human cost of IT by taking advantage of scheduled maintenance and provisioning.
Section 2: Revised Project Plan
One of the critical factors for project success is having a well-
developed project plan. There are various step approaches to
follow when creating a project plan which includes; project
description, project objectives, project management plan
purpose, project deliverables, project milestones, project Roles
and Responsibilities, project scope management, project time
management, and many others. In this case, we shall revise the
previous project plan while updating the project plan template;
from Project Deliverables 3: Database and Data Warehousing
Design with at least three new project tasks each consisting five
to ten subtasks.
A database is an organized collection of data that can easily be accessed, managed, and updated. Databases are set up so that one set of software programs provides access to all of the data, and information is stored in rows and columns. A data warehouse is electronic storage for a large amount of information, kept in a manner that is secure, reliable, and easy to manage and retrieve. A data warehouse is usually the only option for a company that collects huge volumes of data; for example, every time users visit a single page on a website, thousands of records may be generated and saved. Some of the new project tasks in the project deliverable for Database and Data Warehousing Design include improving information access, bringing users in touch with their data, enhancing the quality of decisions, managing schema objects such as tables, indexes, and materialized views, managing users and security, and providing cross-function integration.
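As a brief sketch of the schema-object management task named above, the MySQL statements below create an index and a view on an orders table like the one defined later in this document; MySQL has no materialized views, so they would typically be approximated with summary tables refreshed by scheduled jobs. The object names and the 'Shipped' status value are illustrative assumptions.
-- Index to speed up lookups of a customer's orders
CREATE INDEX idx_orders_customer ON orders (customerNumber);
-- View exposing only open (not yet shipped) orders to reporting users
CREATE VIEW v_open_orders AS
  SELECT orderNumber, customerNumber, orderDate, status
  FROM orders
  WHERE status <> 'Shipped';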
All of the above project tasks in the Database and Data Warehousing Design deliverable are important because they determine what the user wants, which is the difficult step in designing a software product: users are often unable to communicate the entirety of their needs, and the information they provide may also be incomplete (Parker, 2012). For managing schema objects such as tables, indexes, and materialized views, the subtasks include defining alternatives to triggers through data version control and management, while the subtasks of managing users and security include creating loginless users, implementing certificate-based security, and defining appropriate database roles and permissions for the users.
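The sketch below shows what the roles-and-permissions subtask could look like in MySQL 8 syntax; the role, user, database, and password names are hypothetical, and the loginless-user and certificate-based options mentioned above vary by database product, so they are not shown here.
-- Define a role with the minimum privileges needed for order entry
CREATE ROLE 'order_entry';
GRANT SELECT, INSERT ON companydb.orders TO 'order_entry';
GRANT SELECT, INSERT ON companydb.orderdetails TO 'order_entry';
-- Create a user and grant the role rather than direct table privileges
CREATE USER 'clerk01'@'localhost' IDENTIFIED BY 'Str0ng!ClerkPass';
GRANT 'order_entry' TO 'clerk01'@'localhost';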
References
Perilli, W. (2009). The benefits of virtualization and cloud
computing. Retrieved on 18 Feb, 2016 from
http://virtualization.sys-con.com/node/870217
Parker, J. (2012). Business Requirements vs. Functional
Requirements. Retrieved on 12 Feb, 2016 from
http://enfocussolutions.com/business-requirements-vs-
functional-requirements/
Schwartz, P. (2013). Cloud Computing can generate massive
savings for agencies. Retrieved on 18 Feb, 2016 from
https://fcw.com/microsites/2011/cloud-computing-
download/financial-benefits-of-cloud-computing-to-federal-
agencies.aspx
Databases and Data warehousing
Name
Institutional Affiliations
Section 1: Design Document
A database is a collection of data organized so that it can easily be accessed, managed, and updated. Generally, databases are set up so that one set of software programs provides access to all of the data, and information is stored in table formats made up of rows and columns. A relational database is a collection of data items organized as a set of formally described tables from which data can be accessed or reassembled. A data warehouse, by contrast, is electronic storage for a large amount of information, kept in a manner that is secure, reliable, and easy to manage and retrieve. A data warehouse is usually the only option for a company that collects huge volumes of data; however, it is not as rigidly structured as a relational database. For example, every time users visit a single page on a website, thousands of records may be generated and saved.
Data can become outdated, but it cannot simply be deleted, because a few months down the line it may be used for analysis that shapes the future prospects of the business. The quality of that analysis depends on how the data has been collected and arranged to give a clear picture. From a management point of view, the use of relational databases and warehousing helps companies avoid serious challenges in an intensely competitive environment. For example, a supermarket that has not implemented a data warehouse will find it very difficult to analyze which products sold, which did not, when sales go up, and several other queries. This creates challenges because the organization has to decide whether or not a particular product is a hit.
A database schema refers to a logical grouping of objects such as tables, views, and stored procedures; generally, it is the skeleton structure representing the logical view of the entire database. In the company's business and processes, there are entities such as Offices, Employees, Payments, OrderDetails, Orders, Customers, and ProductLines. In database terms, an entity is a single person, place, or thing about which data can be stored. The ProductLines table will store the list of product and service line categories, the Orders table will store orders placed by customers, the OrderDetails table will store the line items of each order, the Payments table will store payments made by customers against their accounts, the Employees table will store all employee information, including the organizational unit structure, and the Offices table will store sales office data. The following are the database tables for the entities stated above that support the company's business and processes.
CREATE TABLE `products` (
  `productCode` varchar(15) NOT NULL,
  `productName` varchar(70) NOT NULL,
  `productLine` varchar(50) NOT NULL,
  `productScale` varchar(10) NOT NULL,
  `productVendor` varchar(50) NOT NULL,
  `productDescription` text,
  `addressLine2` varchar(45) DEFAULT NULL,
  `quantityInStock` smallint(6) NOT NULL,
  `buyPrice` double,
  `MSRP` double,
  PRIMARY KEY (`productCode`),
  KEY `quantityInStock` (`quantityInStock`),
  KEY `productLine` (`productLine`)
  -- productLine is intended to reference the ProductLines table described above, which is not defined in this excerpt
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `customers` (
  `customerNumber` int(12) NOT NULL,
  `customerName` varchar(55) NOT NULL,
  `contactLastName` varchar(45) NOT NULL,
  `contactFirstName` varchar(45) NOT NULL,
  `phone` varchar(45) NOT NULL,
  `addressLine1` varchar(45) NOT NULL,
  `addressLine2` varchar(45) DEFAULT NULL,
  `city` varchar(45) NOT NULL,
  `state` varchar(45) DEFAULT NULL,
  `postalCode` varchar(18) DEFAULT NULL,
  `country` varchar(45) NOT NULL,
  `salesRepEmployeeNumber` int(12) DEFAULT NULL,
  `creditLimit` double DEFAULT NULL,
  PRIMARY KEY (`customerNumber`),
  KEY `salesRepEmployeeNumber` (`salesRepEmployeeNumber`),
  -- assumes the employees table (defined below) is created first
  CONSTRAINT `customers_ibfk_1` FOREIGN KEY (`salesRepEmployeeNumber`)
    REFERENCES `employees` (`employeeNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
In the customers table above, the PRIMARY KEY is customerNumber, an attribute that uniquely identifies each customer. The FOREIGN KEY is salesRepEmployeeNumber, which references the employees table.
CREATE TABLE `employees` (
  `employeeNumber` int(12) NOT NULL,
  `jobTitle` varchar(45) NOT NULL,
  `lastName` varchar(45) NOT NULL,
  `firstName` varchar(45) NOT NULL,
  `email` varchar(90) NOT NULL,
  `reportsTo` int(12) DEFAULT NULL,
  `officeCode` varchar(12) NOT NULL,
  `postalCode` varchar(18) DEFAULT NULL,
  PRIMARY KEY (`employeeNumber`),
  KEY `officeCode` (`officeCode`),
  -- assumes the offices table (defined below) is created first
  CONSTRAINT `employees_ibfk_1` FOREIGN KEY (`reportsTo`) REFERENCES `employees` (`employeeNumber`),
  CONSTRAINT `employees_ibfk_2` FOREIGN KEY (`officeCode`) REFERENCES `offices` (`officeCode`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `offices` (
  `officeCode` varchar(12) NOT NULL,
  `city` varchar(45) NOT NULL,
  `phone` varchar(45) NOT NULL,
  `addressLine1` varchar(45) NOT NULL,
  `addressLine2` varchar(45) DEFAULT NULL,
  `country` varchar(45) NOT NULL,
  `state` varchar(45) DEFAULT NULL,
  `postalCode` varchar(18) NOT NULL,
  `territory` varchar(12) NOT NULL,
  PRIMARY KEY (`officeCode`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `orderdetails` (
  `orderNumber` int(12) NOT NULL,
  `productCode` varchar(16) NOT NULL,
  `quantityOrdered` int(12) NOT NULL,
  `priceEach` double NOT NULL,
  `orderLineNumber` smallint(8) NOT NULL,
  PRIMARY KEY (`orderNumber`, `productCode`),
  KEY `productCode` (`productCode`),
  CONSTRAINT `orderdetails_ibfk_2` FOREIGN KEY (`productCode`) REFERENCES `products` (`productCode`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `orders` (
  `orderNumber` int(11) NOT NULL,
  `orderDate` date NOT NULL,
  `requiredDate` date NOT NULL,
  `shippedDate` date DEFAULT NULL,
  `status` varchar(15) NOT NULL,
  `comments` text,
  `customerNumber` int(11) NOT NULL,
  PRIMARY KEY (`orderNumber`),
  KEY `customerNumber` (`customerNumber`),
  CONSTRAINT `orders_ibfk_1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
CREATE TABLE `payments` (
  `customerNumber` int(12) NOT NULL,
  `checkNumber` varchar(45) NOT NULL,
  `paymentDate` date NOT NULL,
  `amount` double NOT NULL,
  PRIMARY KEY (`customerNumber`, `checkNumber`),
  KEY `checkNumber` (`checkNumber`),
  CONSTRAINT `payments_ibfk_1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
The database schema below lists the entities and attributes of the company's business and processes.
EMPLOYEES: employeeNumber int(12), jobTitle varchar(45), lastName varchar(45), firstName varchar(45), email varchar(90), reportsTo int(12), officeCode varchar(12), postalCode varchar(18)
PRODUCTS: productCode varchar(15), productName varchar(70), productLine varchar(50), productScale varchar(10), productVendor varchar(50), productDescription text, addressLine2 varchar(45), quantityInStock smallint(6), buyPrice double, MSRP double
CUSTOMERS: customerNumber int(12), customerName varchar(55), contactLastName varchar(45), contactFirstName varchar(45), phone varchar(45), addressLine1 varchar(45), addressLine2 varchar(45), city varchar(45), state varchar(45), postalCode varchar(18), country varchar(45), salesRepEmployeeNumber int(12), creditLimit double
ORDERS: orderNumber int(11), orderDate date, requiredDate date, shippedDate date, status varchar(15), comments text, customerNumber int(11)
OFFICES: officeCode varchar(12), city varchar(45), phone varchar(45), addressLine1 varchar(45), addressLine2 varchar(45), country varchar(45), state varchar(45), postalCode varchar(18), territory varchar(12)
ORDERDETAILS: orderNumber int(12), productCode varchar(16), quantityOrdered int(12), priceEach double, orderLineNumber smallint(8)
PAYMENTS: customerNumber int(12), checkNumber varchar(45), paymentDate date, amount double
Each entity also carries indexes on its key columns, as defined in the CREATE TABLE statements above.
Below is an entity-relationship (E-R) diagram of the company's business and processes for the database schema above.
[Figure: E-R diagram showing the entities Customers, Products, Payments, Orders, OrderDetails, Employees, Offices, and Warehouse connected by 1:M relationships labeled orders, contains, checks, places, stores, makes, and has.]
Below is a description of the E-R diagram above; a query sketch that exercises one of these relationships follows the list.
* 1 instance of a warehouse stores 0 to many products
* 1 instance of a customer orders 1 to many products
* 1 instance of a customer places 1 to many orders
* 1 instance of an order contains 1 to many OrderDetails
* 1 instance of an employee checks 1 to many OrderDetails
* 1 instance of a customer makes 0 to many payments
* 1 instance of an office has one warehouse
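The sketch below runs one of these relationships as a query: it counts, for each customer, the orders that customer has placed, using the customers and orders tables defined earlier. No specific data values are assumed.
-- Customers and the number of orders each has placed (the 1:M "places" relationship)
SELECT c.customerNumber, c.customerName, COUNT(o.orderNumber) AS ordersPlaced
FROM customers c
LEFT JOIN orders o ON o.customerNumber = c.customerNumber
GROUP BY c.customerNumber, c.customerName;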
Data flow diagrams (DFDs) help in identifying business processes. A DFD looks at how data flows through a system and is concerned with questions such as where data comes from and where it is stored. DFDs are also useful tools for defining the boundaries of a system, and they can represent the system at different levels. Below is a DFD relating to the tables of the database schema discussed previously. The DFD also shows the flow of data, both inputs and outputs, for the use of a data warehouse, mapping data between source systems, operational systems, and the data warehouse.
[Figure: data flow diagram showing the Customers external entity and the Warehouse, the processes Ship Products, Receive Order, and Collect Payment, and the Orders, Customers, and Invoices data stores.]
Section 2: Revised Project Plan
One of the critical factors for project success is having a well-
developed project plan. There are various step approaches to
follow when creating a project plan which includes; project
description, project objectives, project management plan
purpose, project deliverables, project milestones, project Roles
and Responsibilities, project scope management, project time
management, and many others. In this case, we shall update the
project plan template, from Project Deliverables 2: Business
Requirements with at least three new project tasks each
consisting five to ten subtasks.
Some of the new project tasks in the project deliverable for the business requirements include the stakeholder requirements, solution requirements, and transitional requirements. Stakeholder requirements refer to user needs or user requirements; one of the subtasks performed here is documenting the users' requirements using use cases and event-response tables. The important and difficult step in designing a software product is determining what the user wants, because users are often unable to communicate the entirety of their needs, and the information they provide may also be incomplete (Parker, 2012). For transitional requirements, the subtasks involved during project development include data conversion and migration, user acceptance testing, production turnover and transition, user preparation and transition, and user access and security rights. Solution requirements, on the other hand, describe the characteristics of a solution that meets the business requirements and stakeholder requirements (Lannon, 2014). Subtasks for solution requirements in project development include validation, user interactions, promotion of tools and engines, and many others.
References
Lannon, R. (2014). Four requirements that make a difference in
creating solutions. Retrieved on 12 Feb, 2016 from
http://www.batimes.com/articles/four-requirements-that-make-
a-difference-in-creating-solutions.html
Parker, J. (2012). Business Requirements vs. Functional
Requirements. Retrieved on 12 Feb, 2016 from
http://enfocussolutions.com/business-requirements-vs-
functional-requirements/
Section 1 Business Requirements Document
Student’s Name:
Institutional Affiliation:
Section 1 Business Requirements Document
The scope of the project
The scope of the project is to enable the creation and implementation of the information systems infrastructure of a company. The information system of an organization is crucial, and thus the IT network and systems need to be improved so that the efficiency of the entire IT system is enhanced. Procuring quality business requirements is an essential step toward the design of quality information systems. The completion of a quality requirements document allows user needs and expectations to be captured so that the infrastructure and information system can be designed properly.
Justifications for the scope
The infrastructure of an information system should include the relevant security mechanisms that protect critical information in an organization. Information system security can be evaluated using the CIA (Confidentiality, Integrity, and Availability) triangle, which largely defines the security policy of an organization. A network solution is chosen to support the conceived information systems as well as to allow for scalability.
How to control the scope
The design of the repository-based data collection system involves a combination of data warehouses, OLTP, OLAP, and data mining. It is essential to acknowledge that the relational database to be implemented in this case is designed for a specific purpose, and the purpose of a data warehouse differs from that of an OLTP database. A data warehouse database is designed for the analysis of different business measures by categories and attributes (Silberschatz et al., 2011). It is optimized for bulk loads and for complex, unpredictable queries that access many rows per table, and its loading is consistent and yields valid data that does not necessarily require real-time validation (Connolly & Begg, 2014). However, data warehouses support only a few concurrent users. An OLTP database, on the other hand, is designed for the real-time operations of the business. It is optimized for common transaction sets, adding or retrieving single rows at a time per table, and it can support thousands of concurrent users. The organization's data collection and analysis system will incorporate an OLTP system to provide the user interface while using data warehouse databases for storage purposes (Vaisman & Zimányi, 2014).
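To make the contrast concrete, the sketch below shows the two access patterns described above in MySQL syntax, reusing the order tables defined earlier in this document; the specific order number and the revenue measure are illustrative assumptions.
-- OLTP pattern: touch a single row to serve one user interaction
SELECT status FROM orders WHERE orderNumber = 10100;
-- Warehouse/OLAP pattern: scan many rows and aggregate a business measure by category and attribute
SELECT p.productLine,
       YEAR(o.orderDate) AS orderYear,
       SUM(d.quantityOrdered * d.priceEach) AS revenue
FROM orders o
JOIN orderdetails d ON d.orderNumber = o.orderNumber
JOIN products p ON p.productCode = d.productCode
GROUP BY p.productLine, YEAR(o.orderDate);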
Possible risks, constraints, and assumptions
The recommended option for the development of the data collection system is a data warehouse database. However, various issues are likely to arise from using a data warehouse database for the analysis and storage of information. The user interface of the company's system will largely depend on an OLTP system to provide access to multiple users, while data warehouse databases are used for effective storage. The risks associated with the design are as follows: first, combining an OLTP system on one end with a data warehouse database on the other will be complex to implement (Connolly & Begg, 2014); second, data scrubbing during the merge between the OLTP system and the data warehouse may lead to inconsistency in the data stored in the system.
The needed integration with other systems and infrastructure
The design of the data collection and analysis system will involve integrating an OLTP system with a data warehouse database to offer excellent and acceptable performance. This incorporation of OLTP systems and data warehousing is aimed at meeting the company's specific needs in data collection and analysis. In this hybrid system, the OLTP system is used to provide an excellent user interface, so an engaging user experience is achieved. The data warehouse, on the other hand, is used to collect and analyze data by categories and attributes, which ensures that the stored data is segmented into different categories (Hellerstein, 2005). The infrastructure of the system will therefore comprise two ends: the OLTP system and a data warehouse database. The metadata will represent the data and application organization of the different OLTP components, which has a direct link to the user experience.
The relational data, in this case, is organized in a way that makes collection and analysis more effective, since the OLTP system will admit thousands of users into the organization's system. When the OLTP data moves to the data warehouse, it must be transformed into warehouse data for storage purposes. The process of building the data warehouse will involve reorganizing the OLTP data stored in the relational tables into multidimensional cubes (Silberschatz et al., 2011). This is regarded as the transformation stage, as OLTP data is changed into warehouse data for storage. It involves three phases: extraction of data from the OLTP system, transformation of the data into a usable warehouse format, and finally loading of the data into the data warehouse or data mart. Once the data is loaded into the data warehouse, decision makers will access and analyze the data in the data marts or data warehouses (Hellerstein, 2005). This structure of the overall system is suitable for the company's needs.
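A minimal sketch of the three phases described above is shown below in MySQL syntax: data is extracted from the OLTP order tables defined earlier, transformed by aggregation, and loaded into a warehouse summary table. The warehouse table name and its monthly revenue-by-product-line grain are assumptions for illustration, not the company's actual warehouse design.
-- Hypothetical warehouse (data mart) table
CREATE TABLE dw_sales_by_line (
  productLine  varchar(50) NOT NULL,
  orderMonth   date        NOT NULL,
  totalRevenue double      NOT NULL,
  PRIMARY KEY (productLine, orderMonth)
);
-- Extract from the OLTP tables, transform (aggregate), and load into the warehouse table
INSERT INTO dw_sales_by_line (productLine, orderMonth, totalRevenue)
SELECT p.productLine,
       DATE_FORMAT(o.orderDate, '%Y-%m-01') AS orderMonth,
       SUM(d.quantityOrdered * d.priceEach) AS totalRevenue
FROM orders o
JOIN orderdetails d ON d.orderNumber = o.orderNumber
JOIN products p     ON p.productCode = d.productCode
GROUP BY p.productLine, DATE_FORMAT(o.orderDate, '%Y-%m-01');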
Potential outsourcing/offshoring needs
The company might need to outsource services in order to improve its productivity effectively. Some of the outsourcing needs might include engaging a third party to manage the user interface and hiring outside server companies to host the company's data. The company can also delegate the marketing aspect of its site to other parties in order to increase its relevance in the competitive market. This is crucial for meeting the different needs of potential consumers, as the parties contracted for marketing purposes will require greater access to the user interface. Essentially, a third party will be needed to reach more customers as the company's data collection and analysis improve (Khan, 2003). Contracting a third party to manage and host the company's system will also cut production costs, as the contracted party will provide the infrastructure for running the site and all servers.
Justify the necessary resources
Since the company anticipates a 20 percent annual increase in data warehouse storage space, it will need to increase its storage capacity in order to accommodate huge volumes of data. With the 20 percent increase each year, the company should also plan to increase its capacity to accommodate more customers at its user interfaces. Notably, this will involve enlarging the capacity of the OLTP system to handle more users without frequent failures (Khan, 2003). The system servers, on the other hand, should be regularly upgraded to handle the projected increase of users of the company's system each year.
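To illustrate what 20 percent annual growth implies, assume a hypothetical starting capacity of 10 TB: compounding at a factor of 1.2 per year gives roughly 10 x 1.2^3 = 17.3 TB after three years and 10 x 1.2^5 = 24.9 TB after five years, so capacity planning should allow storage to roughly two and a half times its starting size over five years. The 10 TB figure is illustrative only; the 20 percent growth rate is the company's own projection stated above.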
References
Connolly, T. M., & Begg, C. E. (2014). Database systems: A practical approach to design, implementation, and management.
Hellerstein, J. M. (2005). Readings in database systems. Cambridge, MA: MIT Press.
Khan, A. (2003). Data warehousing 101: Concepts and implementation. San Jose, Calif: Khan Consulting and Publishing.
Silberschatz, A., Korth, H. F., & Sudarshan, S. (2011). Database system concepts. New York: McGraw-Hill.
Vaisman, A., & Zimányi, E. (2014). Data warehouse systems: Design and implementation.
Section 1: Project Introduction
Student’s Name:
Institutional Affiliation:
Section 1: Project Introduction
Background Information of the Company
LiquiTel Communications is an innovative Internet-based organization with operations in the US, UK, and Canada. The company serves its clientele by providing cloud storage space to customers across North America and Europe. It offers high-quality services and also distributes hardware to its customer base to underline its commitment to the IT business. The company has recently implemented a data warehouse with DB2 to increase its capacity to analyze and store data from its customers across North America and Europe. To attain this, LiquiTel Communications outsourced its information technology needs to IBM Corporation, so that customers in every location can access its system and be attended to promptly (Bosworth et al., 2009).
Type of business activities associated with the Company
LiquiTel Communications operates within the ICT sector, and its ventures mainly entail the provision of services in the field of information technology. Some of its common activities for its clients include the installation of business intelligence platforms, the implementation of Statistical Analysis System (SAS) software for different companies in the US, the UK, and Canada, and the creation of effective network infrastructure in various companies.
Speculations on outsourcing and offshoring opportunities
A data warehouse is an information system used for data analysis and reporting. It is a central repository of integrated data, usually drawn from one or more disparate sources. These systems are capable of storing both current and historical data, so they are used to create trending reports for high-end management reporting, such as annual or quarterly comparisons. LiquiTel Communications Ltd is building a data warehouse with DB2 to improve the effectiveness and performance of its system for receiving customer reports in different parts of the region and, equally, to enhance its process of data analysis. With the data warehouse implemented, customers will access the services through the company's websites and make their desired transactions online (Bosworth et al., 2009). The data warehouse will increase the capacity of the company's information systems to support a broad online platform that serves its huge customer base with precision.
The data warehouse allows the organization to pursue several marketing strategies aimed at increasing its customer base. With an information system supported by an elaborate DB2-based data warehouse, the company's user interface will improve significantly, allowing the company to advertise its services and register more customers into its programs. The data warehouse will allow the company's system to keep its website running across the whole of North America without failing, which means that, as a market strategy, the company can reach out to a large customer base. The number of online programs or services offered on the company's website can also be increased to reach a larger circle of groups across the country (Ivanov et al., 2012). This was not possible before the implementation of the data warehouse, since the previous system did not have the capacity to handle a large number of users across the vast region. The previous system also could not provide elaborate data analysis for the many customers reaching the company through its online platform, and its mode of storage was not adequate for the huge volumes of data streaming into the previous database (Ivanov et al., 2012).
Although the data warehouse is commonly known for its many benefits, its use also carries several risks. First, data warehouses pose a huge risk around data inputs, as it is difficult to know all the data that is being stored; a data classification policy is required and must be applied to all data entering the system, and the lack of such a policy poses a huge risk of data being compromised. The data outputs also require the company to have software that monitors outgoing data to ensure it is secure; without it, the data is highly exposed. In addition, the security system of the data warehouse should be implemented through programs that assign different roles (Ivanov et al., 2012); improper use of such a program puts the system at huge risk, as it remains largely unsecured.
It is in this context that it is important for LiquiTel Communications to outsource some of its tasks to established and reputable companies such as IBM. The reason for LiquiTel Communications Ltd to choose to work with IBM's data warehouse technology could be that IBM is a well-established information technology company that has proven able to offer reliable services on a large scale (Ivanov et al., 2012). Considering that IBM is an IT company with adequate resources to support huge data volumes, it could be considered the most appropriate company to handle the needs of LiquiTel Communications Ltd.
Data warehouse technology provides a robust environment that supports data reporting, data analysis, and storage (Ivanov et al., 2012). Considering the large number of customers that LiquiTel Communications Ltd has, implementing a data warehouse would support its programs, particularly for the data channeled to it from its online platforms, which are accessible across the vast region. Its decision to use an IBM data warehouse is sensible considering that IBM, as an ICT company, has a large capacity to meet the needs of LiquiTel Communications Ltd, hence improved performance for the company throughout.
Overview of the Company’s:
a) Operation systems
The company plans to utilize a business intelligence platform. The platform requires the completion of a quality requirements document that allows user needs and expectations to be captured so that the infrastructure and information system can be designed properly. The scope of this part of the project is to create a repository for data collection beyond the standard relational databases (Harrington, 2009), with the aim of designing a quality information system that allows user needs and expectations to be captured in the data collection and analysis platforms.
b) Databases and Data warehousing
LiquiTel Communications Ltd uses a database schema, understood here as a relational database, to support its business transactions. Data warehouses are critical in supporting business decisions by collecting, consolidating, and organizing data for reporting, and by enabling analysis with tools such as data mining and online analytical processing (OLAP) (Harrington, 2009).
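As a small illustration of the OLAP-style reporting mentioned above, the sketch below uses MySQL's WITH ROLLUP to produce subtotals and a grand total over the order tables defined earlier in this document; it is an illustrative query, not LiquiTel's actual DB2 reporting workload.
-- Units ordered by product line and order status, with rollup subtotals
SELECT p.productLine, o.status, SUM(d.quantityOrdered) AS unitsOrdered
FROM orders o
JOIN orderdetails d ON d.orderNumber = o.orderNumber
JOIN products p ON p.productCode = d.productCode
GROUP BY p.productLine, o.status WITH ROLLUP;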
c) Cloud technology and virtualization
LiquiTel Communications Ltd seeks to install a cloud technology and virtualization platform after seeing the benefits realized by large companies that have used them for improved, efficient, and advanced analysis. Furthermore, the company recognizes that the use of cloud technology and virtualization would boost its performance and equally set new expectations and performance goals (Ivanov et al., 2012).
d) Network infrastructure and Security
The company has its information system security evaluated using the CIA (Confidentiality, Integrity, and Availability) triangle, which largely defines the security policy of an organization. A network solution is chosen to support the conceived information systems as well as to allow for scalability. It should be understood that the CIO of the company is responsible for the required design of the infrastructure and security protocols upon implementation of the network infrastructure in the company (Ivanov et al., 2012). The design of the infrastructure should be able to support the operations of the organization and equally protect its data.
References
Bosworth, S., Kabay, M. E., & Whyne, E. (2009). Computer security handbook. Hoboken, NJ: John Wiley & Sons.
Harrington, J. L. (2009). Relational database design and implementation: Clearly explained. Amsterdam: Morgan Kaufmann/Elsevier.
Ivanov, I., Sinderen, M. J., & Shishkov, B. (2012). Cloud computing and services science. New York: Springer.
More Related Content

Running Head NETWORK INFRASTRUCTURE VULNERABILITIES1NETWORK .docx

  • 1. Running Head: NETWORK INFRASTRUCTURE VULNERABILITIES1 NETWORK INFRASTRUCTURE VULNERABILITIES3 Project Paper: Network Infrastructure vulnerabilities Name Institutional Affiliations Section 1: Infrastructure Document Computer networks have increasingly become ubiquitous and synonymous especially with the organizations that thrive on excellence, as well as, those who would want to adopt cloud technology and virtualization within their companies. Today, most organizations that set up their businesses ensure that they have incorporated an efficient computer network infrastructure that will connect the business to the outside world through Internets. This is because, research has shown that the present business depend heavily on network infrastructure platforms that make communication easy, efficient, available, as well as, accessible. Consequently, despite the fact that robust computers networks have made it easier by providing a basis of interactivity and bringing a whole lot of people and businesses together, all these at one point have amounted to growing security concerns over the past years across various sectors and industries. This paper will therefore identify some of the
  • 2. possible network infrastructure vulnerabilities, as well as, describing a comprehensive security policy that helps in protecting the company infrastructure and assets by applying the principle of CIA. A network consists of devices such as routers, firewalls, generic and hosts which include servers and workstations. Equally, there are thousands of network vulnerabilities; therefore, organizations should ensure that they focus on tests that will produce a good overall assessment of the network especially when they store their data in the cloud, however, there may be risk of non-compliance and regulation, due to lack of control over where data is stored. The possible network infrastructure vulnerabilities include; improper system configuration, poor firewall deployment, poor anti-virus implementation, weak password implementation, lack of efficient physical security, lack of appropriate security policies and many others. Vulnerabilities can be successfully contained by putting measure in place, for example, the Network Administrator should be in position to gather information about viruses and worms, as well as, identifying network vulnerabilities by getting information that helps in preventing security problems. Security measures for Network vulnerabilities can be accessed through three main stages which involve planning, conducting and inference (Markluec, 2010). In planning stage, there is an official agreement that is signed between the concerned parties. The document signed is important because it will contain both legal and non-disclosure causes that serve to protect the ethical hacker against possible law suit. Conducting stage involves the evaluation of technical reports prepared based on testing potential vulnerabilities. Lastly, in inference stage, the results of the evaluation are communicated to the organization and corrective action is taken if needed. A logical and a physical layout of planned network A logical topographical layout of network illustrates all the logical features of the network which comprises of logical
  • 3. networks, routing tables, as well as, assigned IP addresses to a variety of hosts and devices. Conversely, a physical layout of the networks, on the other hand, represents the physical location and the association between the different devices participating on the network. In a physical network layout, each computer is connected to the hub and a cable line, this is important because it helps the administrator to visualize how much equipment he or she will need. Having a well planned illustration view of both physical and logical network is significant because it helps to facilitates identify probable security problems. For example, when unnecessary visitor attempts to acquire access to sensitive data or information, he or she will first generate a map of network to check on security checkpoint that is firewall or other similar devices that are established, and to what access can be attained. The diagrams below show a logical and a physical layout of planned network. Physical network diagram Internet Router Switch WIFI router Switch PC PC PC PC Ring Server PC Printer Scanner PC IP Phone IP Phone
  • 4. Logical Network diagram PC PC PC PC 172.16.44.1172.16.48.21 172.16.52.1 172.16.56.23172.16.60.9 PC Illustrate the possible placement of servers When designing a logical and physical network, the network administrator should thoroughly illustrate the possible placement of servers, including the access paths to firewalls, as well as, internets. Conversely, facility limitations, routers, printers, switches, bridges, workstations, and access points should be as well considered when designing a network. The basic design of this network demonstrates the relation to the Internet using a broader router and firewall. In order to design a well secured network, various factors must be considered into contemplation such as the topology and placement of hosts inside the network, the selection of software and hardware technologies, as well as, the suspicious configuration of the components. Comprehensive Security Policy for the Company
  • 5. The acts of providing trust information; confidentiality, Integrity and availability (CIA) of the information are not violated. For instance many companies ensure that they have backups just in case data is lost when critical issues arise. Confidentiality, integrity and availability, also known as the CIA triad is a model that has been designed to guide policies for information security within organizations. Confidentiality is a set of rules that limits access to information, integrity is the assurance that the information is accurate, while availability is a guarantee of reliable access to the information by authorized people. The business’ assets may be measured in terms of its employees and buildings, all these are stored in the form of information, whether electronic data or written documents, therefore, if these information are disclosed to unauthorized individuals, is inaccurate, or not available when it is needed, then the business may suffer significant harm, which include, loss of customer confidence, contract damages, or even reduction in market share. Confidentiality of information is very important not only for corporations but for Governments as well. Most organizations have developed various measures to ensure the confidentiality of information by preventing sensitive information from reaching wrong people, as well as, making sure that the right people get the information at the right time they need them. Access must be restricted to those authorized to view data, however, sometimes, safeguarding data confidentiality may require special training, which typically include security risks that may threaten information. One way to ensure confidentiality of information is through data encryption. For example, users are encouraged to use Passwords that have at least eight characters that will prevent the attackers from hacking into their passwords and gaining access to their personal information. Integrity on the other hand involves maintaining the accuracy and consistency of data over its entire life cycle. For example, data should not be changed in transit; therefore, steps should be taken to ensure that data cannot be
  • 6. Employees should not disclose information about the organization, because doing so can give unauthorized parties the opportunity to leak it or to hack into the system. Users should also create passwords that are hard for attackers to guess: a password should never consist of only lower-case or only upper-case letters, since mixing the two makes it considerably more difficult for hackers to guess. Organizations, for their part, should implement a secured network protected by firewalls that prevent hackers from bringing the whole system down.
Section 2: Revised Project Plan
One of the critical factors for project success is having a well-developed project plan. There are several steps to follow when creating a project plan, including the project description, project objectives, the purpose of the project management plan, project deliverables, project milestones, project roles and responsibilities, project scope management, project time management, and many others. In this case, we shall revise the previous project plan while updating the project plan template from Project Deliverable 4: Cloud Technology and Virtualization with at least three new project tasks, each consisting of five to ten subtasks. I personally support the use of cloud technology and virtualization. Adopting cloud computing can change the risk posture and profile of a company; for example, companies that adopt cloud technology avoid the risk that a large investment in IT resources will not pay off. Likewise, cost savings in hardware infrastructure are relatively easy to quantify, particularly for companies that adopt cloud technologies in a public cloud: on-site hardware will be
  • 7. replaced less often and fewer new machines have to be purchased. Cloud and virtualization technology align with the company's business processes and assist with the attainment of organizational goals by providing a more productive environment, not only for collaborative working but also by improving the company's productivity through letting participants in a business ecosystem share processing logic (Parker, 2012). Some of the new project tasks in the project deliverable for Cloud Technology and Virtualization include host server patching, monitoring virtual machine sprawl, and exploring backup options. These tasks are important for ensuring an efficient, healthy, and secure virtual environment. The subtasks involved in host server patching include configuring the installation of software updates and running compliance and vulnerability scans. A compelling recommendation for solution providers and partners that could help a company secure a firm competitive advantage through cloud and virtualization technologies is to ensure that all systems are kept up to date and that firewalls are installed to prevent attackers from hacking the systems.
References
Markluec, M. (2010). Some common network vulnerabilities persist. Retrieved on 26 Feb, 2016 from http://gsnmagazine.com/article/20994/some_common_network_vulnerabilities_persist
TechTarget. (2016). Confidentiality, integrity, and availability (CIA triad). Retrieved on 26 Feb, 2016 from http://whatis.techtarget.com/definition/Confidentiality-integrity-and-availability-CIA
Parker, J. (2012). Business requirements vs. functional requirements. Retrieved on 26 Feb, 2016 from http://enfocussolutions.com/business-requirements-vs-functional-requirements/
  • 8. Running Head: PROJECT PAPER
Project Paper
Name
Institutional Affiliations
Section 1: Design Document
There are many reasons why companies want to adopt cloud technology and virtualization, the most significant being business performance resourcing, rapid go-to-market, business agility, and cost reduction. Often there is no single decisive reason why an organization chooses to adopt cloud computing; the decision usually rests on a complex combination of factors rather than on one alone. Use of cloud computing can change the risk posture and profile of a company. For example, by using a public cloud, companies avoid the risk that a large investment in IT resources will not pay off; however, it can introduce security risks because the resources are shared with unknown parties. There may also be a risk of non-compliance with regulation, owing to the lack of control over where data is stored. The case for using cloud technology within the company rests on its ability to take advantage of new business opportunities, cost,
  • 9. agility, and productivity. Cloud computing delivers improved agility because of its on-demand rapid elasticity: IT resources can be deployed more quickly and increased as needed to meet demand. This gives enterprises the opportunity to innovate, introduce new products and services, enter new markets, and adapt to changing circumstances. Another reason for using cloud technology and virtualization within the company is that it increases productivity. Cloud technology provides a more productive environment, not only for collaborative working but also by improving the company's productivity, since it enables participants in a business ecosystem to share processing logic. Cloud computing also creates new business opportunities; for example, it can give an enterprise a new line of business as a provider of cloud or value-added services. A company that excels in the quality of its IT can become a public SaaS (Software-as-a-Service), PaaS (Platform-as-a-Service), or IaaS (Infrastructure-as-a-Service) provider. A good example is a company that implements a private cloud, has spare capacity, and sells that capacity as a public cloud. Cloud computing also cuts business costs. Given the benefits of agility, productivity, and quality that cloud computing offers, companies might expect it to be generally more expensive; however, this is not the case, and reduced cost is one of the main reasons many companies are turning to cloud technologies (Perilli, 2009). Few technologies have affected the IT industry as much as cloud computing, which delivers computing as a service. Part of the cloud's appeal is evidently financial: it allows organizations to shed some of their expensive IT infrastructure (software and hardware) and shift computing costs to more manageable operational expenses. Cloud computing is not only about purchasing hardware and infrastructure; companies, as well as individuals, usually compare the cost of an on-premise server to the cost of a
  • 10. cloud server. In the minds of these entities, the on-premise server will depreciate over perhaps three years, whereas the cloud server is treated as a continuous operating expenditure. Cost savings in hardware infrastructure are relatively easy to quantify. For example, if a company adopts cloud technologies in a public cloud, on-site hardware will be replaced less often and less new hardware has to be purchased; this in turn leads to much-reduced power and cooling costs and less space needed in the data center. Studies show that a medium-sized cloud deployment leads to about 62 percent savings within a year compared with an on-premise system, while for larger cloud deployments the annual savings may be about 50 percent compared with on-premise implementations (Schwartz, 2013). Labor cost savings are one of the reductions in human capital that can be realized from implementing cloud and virtualization technologies. By off-loading software, an application, or a platform to a private cloud platform, less time is needed to maintain, administer, and troubleshoot the technology. For instance, a system administrator who is in charge of about 140 physical servers can be responsible for thousands of cloud-based servers as well (Schwartz, 2013). IT management costs can also be reduced through automated provisioning: in the cloud, virtual servers are usually provisioned automatically instead of manually, which reduces downtime along with compliance issues.
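To make the scale of such savings concrete, the short calculation below applies the 62 percent figure quoted above to a hypothetical on-premise budget; the dollar amounts are assumed purely for illustration.

    # Hypothetical three-year comparison using the savings rate quoted above
    # (62 % first-year savings for a medium deployment; the dollar figures
    # are invented for illustration only).
    on_premise_annual = 100_000        # assumed yearly cost of on-site servers
    medium_savings_rate = 0.62         # from the text (Schwartz, 2013)

    cloud_annual = on_premise_annual * (1 - medium_savings_rate)
    print(f"On-premise per year: ${on_premise_annual:,.0f}")
    print(f"Cloud per year:      ${cloud_annual:,.0f}")
    print(f"Saved over 3 years:  ${(on_premise_annual - cloud_annual) * 3:,.0f}")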
  • 11. The diagram below illustrates how cloud and virtualization technology align with the company's business processes and assist with the attainment of organizational goals.
[Figure: a cloud reference model showing a Cloud Consumer and a Cloud Provider; within the provider, Service Orchestration (SaaS, PaaS, and IaaS) sits above a resource abstraction and control layer and the physical resource layer, alongside service management.]
The main actors in the diagram are the cloud consumer and the cloud provider. A cloud consumer is an individual or organization that acquires and uses cloud products and services, while a cloud provider is a person or organization responsible for making a service available to cloud consumers. One area of concern for many organizations regarding cloud technologies and virtualization is how to trust the infrastructure upon which their data and workloads will run. As a result, companies that are interested in moving to the cloud and managing their risk profiles should look for cloud service providers that offer security and trust services, as well as the ability to enforce and audit policy on the data being deployed. A compelling recommendation for solution providers that could help the company secure a firm competitive advantage using cloud and virtual technologies is to strengthen server security for physical, virtual, and cloud servers, whatever enterprises choose to use for storing their information in the cloud. Enterprises should adopt data security measures that protect enterprise applications and data from breaches and business disruptions; this helps organizations simplify security operations while enabling regulatory compliance and cloud projects. Another compelling recommendation for solution
  • 12. providers that could help the company secure a firm competitive advantage using cloud and virtual technologies is automation and self-service. Organizations should discover best practices in the cloud and encapsulate them in new procedures for ongoing support, provisioning of resources, and application design. With automation, firms can radically reduce the human cost of IT by taking advantage of scheduled maintenance and provisioning.
Section 2: Revised Project Plan
One of the critical factors for project success is having a well-developed project plan. There are several steps to follow when creating a project plan, including the project description, project objectives, the purpose of the project management plan, project deliverables, project milestones, project roles and responsibilities, project scope management, project time management, and many others. In this case, we shall revise the previous project plan while updating the project plan template from Project Deliverable 3: Database and Data Warehousing Design with at least three new project tasks, each consisting of five to ten subtasks. A database is an organized collection of data that can easily be accessed, managed, and updated. Databases are set up so that one set of software programs gives users access to all of the data, which is stored in tables of rows and columns. A data warehouse is an electronic store of a large amount of information, kept in a manner that is secure, reliable, and easy to manage and retrieve. A data warehouse is often the only practical option for a data-collection company that handles a huge volume of data; for example, every time a user visits a single page of a website, thousands of records may be generated and need to be saved. Some of the new project tasks in the project deliverable for Database and Data Warehousing Design include improving information access, bringing the user in
  • 13. touch with their data, enhancing the quality of decisions, managing schema objects such as tables, indexes, and materialized views, managing users and security, and providing cross-function integration. All of these project tasks in the Database and Data Warehousing Design deliverable are important, and requirements gathering remains a difficult step in designing a software product because it determines what the user wants; users often are unable to communicate the entirety of their needs, and the information they provide may be incomplete (Parker, 2012). For managing schema objects such as tables, indexes, and materialized views, some of the subtasks include defining alternatives to triggers and defining data version control and management, while the subtasks of managing users and security include creating login-less users, implementing certificate-based security, and defining appropriate database roles and permissions for the users.
References
Perilli, W. (2009). The benefits of virtualization and cloud computing. Retrieved on 18 Feb, 2016 from http://virtualization.sys-con.com/node/870217
Parker, J. (2012). Business requirements vs. functional requirements. Retrieved on 12 Feb, 2016 from http://enfocussolutions.com/business-requirements-vs-functional-requirements/
Schwartz, P. (2013). Cloud computing can generate massive savings for agencies. Retrieved on 18 Feb, 2016 from https://fcw.com/microsites/2011/cloud-computing-download/financial-benefits-of-cloud-computing-to-federal-agencies.aspx
Running Head: DATABASES AND DATA WAREHOUSING
  • 14. Databases and Data Warehousing
Name
Institutional Affiliations
Section 1: Design Document
A database is an organized collection of data arranged so that it can easily be accessed, managed, and updated. Generally, databases are set up so that one set of software programs gives users access to all of the data, which is stored in table formats made up of rows and columns. A relational database is a collection of data items organized as a set of formally described tables from which data can be accessed or reassembled. A data warehouse, by contrast, is an electronic store of a large amount of information, kept in a manner that is secure, reliable, and easy to manage and retrieve. A data warehouse is often the only option for a data-collection company with a huge volume of data; however, it is not as rigidly structured as a relational database. For example, every time a user visits a single page of a website, thousands of records are generated and must be saved. Data can become outdated, yet it cannot simply be deleted, because a few months down the line it will be used for analysis that shapes the future prospects of the business. The value of that analysis depends on how efficiently the data has been collected and arranged to give a clear picture. From a management point of view, the use of relational databases and warehousing helps companies avoid serious
  • 15. challenges in terms of intense competition. Consider, for example, a supermarket that has not implemented a data warehouse and consequently finds it very difficult to analyze which products sold, which did not, when sales go up, and several other queries; the organization struggles to decide whether a particular product is a hit or not. A database schema is a logical grouping of objects such as tables, views, and stored procedures; it is the skeleton structure representing the logical view of the entire database. In the company's business and processes there are entities such as offices, employees, payments, order details, orders, customers, and product lines. An entity, in relation to a database, refers to a single person, place, or thing about which data can be stored. The product lines table will store the list of product and service line categories, the orders table will store orders placed by customers, the order details table will store the line items of each order, the payments table will store payments made by customers against their accounts, the employees table will store all employee information, including the organizational unit structure, and the offices table will store sales office data. The following are the database tables for the entities stated above that support the company's business and processes.

CREATE TABLE `products` (
  `productCode` varchar(15) NOT NULL,
  `productName` varchar(70) NOT NULL,
  `productLine` varchar(50) NOT NULL,
  `productScale` varchar(10) NOT NULL,
  `productVendor` varchar(50) NOT NULL,
  `productDescription` text,
  `quantityInStock` smallint(6) NOT NULL,
  `buyPrice` double,
  `MSRP` double,
  PRIMARY KEY (`productCode`),
  • 16.   KEY `quantityInStock` (`quantityInStock`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE `customers` (
  `customerNumber` int(12) NOT NULL,
  `customerName` varchar(55) NOT NULL,
  `contactLastName` varchar(45) NOT NULL,
  `contactFirstName` varchar(45) NOT NULL,
  `phone` varchar(45) NOT NULL,
  `addressLine1` varchar(45) NOT NULL,
  `addressLine2` varchar(45) DEFAULT NULL,
  `city` varchar(45) NOT NULL,
  `state` varchar(45) DEFAULT NULL,
  `postalCode` varchar(18) DEFAULT NULL,
  `country` varchar(45) NOT NULL,
  `salesRepEmployeeNumber` int(12) DEFAULT NULL,
  `creditLimit` double DEFAULT NULL,
  PRIMARY KEY (`customerNumber`),
  KEY `salesRepEmployeeNumber` (`salesRepEmployeeNumber`),
  CONSTRAINT `customersibfk1` FOREIGN KEY (`salesRepEmployeeNumber`) REFERENCES `employees` (`employeeNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

In the customers table above, the PRIMARY KEY is `customerNumber`, the attribute that uniquely identifies each customer, and the FOREIGN KEY is `salesRepEmployeeNumber`.

CREATE TABLE `employees` (
  `employeeNumber` int(12) NOT NULL,
  `jobTitle` varchar(45) NOT NULL,
  `lastName` varchar(45) NOT NULL,
  • 17.   `firstName` varchar(45) NOT NULL,
  `email` varchar(90) NOT NULL,
  `reportsTo` int(12) DEFAULT NULL,
  `officeCode` varchar(12) NOT NULL,
  `postalCode` varchar(18) DEFAULT NULL,
  PRIMARY KEY (`employeeNumber`),
  KEY `officeCode` (`officeCode`),
  CONSTRAINT `employeesibfk2` FOREIGN KEY (`officeCode`) REFERENCES `offices` (`officeCode`),
  CONSTRAINT `employeesibfk1` FOREIGN KEY (`reportsTo`) REFERENCES `employees` (`employeeNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE `offices` (
  `officeCode` varchar(12) NOT NULL,
  `city` varchar(45) NOT NULL,
  `phone` varchar(45) NOT NULL,
  `addressLine1` varchar(45) NOT NULL,
  `addressLine2` varchar(45) DEFAULT NULL,
  `country` varchar(45) NOT NULL,
  `state` varchar(45) DEFAULT NULL,
  `postalCode` varchar(18) NOT NULL,
  `territory` varchar(12) NOT NULL,
  PRIMARY KEY (`officeCode`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE `orderDetails` (
  `orderNumber` int(12) NOT NULL,
  `productCode` varchar(16) NOT NULL,
  `quantityOrdered` int(12) NOT NULL,
  `priceEach` double NOT NULL,
  `orderLineNumber` smallint(8) NOT NULL,
  PRIMARY KEY (`orderNumber`, `productCode`),
  KEY `productCode` (`productCode`),
  • 18.   CONSTRAINT `orderdetailsibfk2` FOREIGN KEY (`productCode`) REFERENCES `products` (`productCode`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE `orders` (
  `orderNumber` int(11) NOT NULL,
  `orderDate` date NOT NULL,
  `requiredDate` date NOT NULL,
  `shippedDate` date DEFAULT NULL,
  `status` varchar(15) NOT NULL,
  `comments` text,
  `customerNumber` int(11) NOT NULL,
  PRIMARY KEY (`orderNumber`),
  KEY `customerNumber` (`customerNumber`),
  CONSTRAINT `ordersibfk1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;

CREATE TABLE `payments` (
  `customerNumber` int(12) NOT NULL,
  `checkNumber` varchar(45) NOT NULL,
  `paymentDate` date NOT NULL,
  `amount` double NOT NULL,
  PRIMARY KEY (`customerNumber`, `checkNumber`),
  KEY `checkNumber` (`checkNumber`),
  CONSTRAINT `paymentsibfk1` FOREIGN KEY (`customerNumber`) REFERENCES `customers` (`customerNumber`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
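To show how the keys defined above tie the entities together, the sketch below loads trimmed-down SQLite versions of two of the tables and runs a join. The sample rows are invented, and the SQLite column types are a simplification of the MySQL definitions above.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        -- Trimmed-down SQLite versions of two of the tables defined above.
        CREATE TABLE customers (
            customerNumber INTEGER PRIMARY KEY,
            customerName   TEXT NOT NULL
        );
        CREATE TABLE orders (
            orderNumber    INTEGER PRIMARY KEY,
            orderDate      TEXT NOT NULL,
            customerNumber INTEGER NOT NULL REFERENCES customers(customerNumber)
        );
        INSERT INTO customers VALUES (103, 'Sample Customer Ltd');
        INSERT INTO orders VALUES (10100, '2016-02-12', 103);
    """)

    # The foreign key lets us answer questions that span both entities.
    for row in conn.execute("""
            SELECT c.customerName, o.orderNumber, o.orderDate
            FROM orders AS o
            JOIN customers AS c ON c.customerNumber = o.customerNumber
        """):
        print(row)  # ('Sample Customer Ltd', 10100, '2016-02-12')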
  • 19. The database schema created below shows the entities and attributes of the company's business and processes.
EMPLOYEE
· employeeNumber int(12)
· jobTitle varchar(45)
· lastName varchar(45)
· firstName varchar(45)
· email varchar(90)
· reportsTo int(12)
· officeCode varchar(12)
· postalCode varchar(18)
INDEXES
PRODUCTS
· productCode varchar(15)
· productName varchar(70)
· productLine varchar(50)
· productScale varchar(10)
· productVendor varchar(50)
· productDescription text
· quantityInStock smallint(6)
· buyPrice double
· MSRP double
INDEXES
CUSTOMERS
· customerNumber int(12)
· customerName varchar(55)
· contactLastName varchar(45)
· contactFirstName varchar(45)
· phone varchar(45)
  • 20. · addressLine1 varchar(45)
· addressLine2 varchar(45)
· city varchar(45)
· state varchar(45)
· postalCode varchar(18)
· country varchar(45)
· salesRepEmployeeNumber int(12)
· creditLimit double
INDEXES
ORDERS
· orderNumber int(11)
· orderDate date
· requiredDate date
· shippedDate date
· status varchar(15)
· comments text
· customerNumber int(11)
INDEXES
OFFICE
· officeCode varchar(12)
· city varchar(45)
· phone varchar(45)
· addressLine1 varchar(45)
· addressLine2 varchar(45)
· country varchar(45)
· state varchar(45)
· postalCode varchar(18)
· territory varchar(12)
INDEXES
ORDERDETAILS
· orderNumber int(12)
  • 21. · productCode varchar(16)
· quantityOrdered int(12)
· priceEach double
· orderLineNumber smallint(8)
INDEXES
PAYMENT
· customerNumber int(12)
· checkNumber varchar(45)
· paymentDate date
· amount double
INDEXES
Below is an Entity-Relationship (E-R) diagram of the company's business and processes for the above database schema.
[E-R diagram: the entities Customers, Products, Payments, Orders, OrderDetails, Offices, Employees, and Warehouse, linked by the relationships orders, contains, checks, places, stores, makes, and has, most of them with 1:M cardinality.]
  • 22. Below is the description of the above E-R diagram:
* 1 instance of a warehouse stores 0 to many products
* 1 instance of a customer orders 1 to many products
* 1 instance of a customer places 1 to many orders
* 1 instance of an order contains 1 to many order details
* 1 instance of an employee checks 1 to many order details
* 1 instance of a customer makes 0 to many payments
* 1 instance of an office has one warehouse
Data Flow Diagrams (DFDs) help in identifying business processes. A DFD looks at how data flows through a system and is concerned with questions such as where data will come from and where it will be stored. DFDs are also useful for defining the boundaries of a system, and they can represent the system at different levels of detail. Below is a DFD relating to the tables of the database schema discussed previously. The DFD also shows the flow of data, both inputs and outputs, for the use of a data warehouse, mapping data between source systems, operational systems, and the data warehouse.
[Data flow diagram: the Customers external entity interacts with processes for taking orders and shipping products, receiving orders, and collecting payment, with data stores for customers, invoices, and the warehouse.]
  • 23. Section 2: Revised Project Plan
One of the critical factors for project success is having a well-developed project plan. There are several steps to follow when creating a project plan, including the project description, project objectives, the purpose of the project management plan, project deliverables, project milestones, project roles and responsibilities, project scope management, project time management, and many others. In this case, we shall update the project plan template from Project Deliverable 2: Business Requirements with at least three new project tasks, each consisting of five to ten subtasks. Some of the new project tasks in the project deliverable for the
  • 24. business requirements include the stakeholder requirements, the solution requirements, and the transitional requirements. Stakeholder requirements refer to user needs or user requirements; among the subtasks performed here, the users' requirements are documented using use cases and event-response tables. An important and difficult step in designing a software product is determining what the user wants, because users often are unable to communicate the entirety of their needs, and the information they provide may be incomplete (Parker, 2012). For transitional requirements, the subtasks involved during project development include data conversion and migration, user acceptance testing, production turnover and transition, user preparation and transition, and user access and security rights. Solution requirements, on the other hand, describe the characteristics of a solution that meet the business requirements and stakeholder requirements (Lannon, 2014). Subtasks in solution requirements during project development include validation, user interactions, promotion of tools and engines, and many others.
References
Lannon, R. (2014). Four requirements that make a difference in creating solutions. Retrieved on 12 Feb, 2016 from http://www.batimes.com/articles/four-requirements-that-make-a-difference-in-creating-solutions.html
Parker, J. (2012). Business requirements vs. functional requirements. Retrieved on 12 Feb, 2016 from http://enfocussolutions.com/business-requirements-vs-functional-requirements/
  • 25. Running head: SECTION 1 BUSINESS REQUIREMENTS DOCUMENT
Section 1 Business Requirements Document
Student's Name:
Institutional Affiliation:
  • 26. Section 1 Business Requirements Document
The scope of the project
The scope of the project is to enable the creation and implementation of the information systems infrastructure of a company. The information system of an organization is crucial, and the IT network and systems need to be improved so that the efficiency of the entire IT environment is enhanced. Procuring quality business requirements is an essential step toward the design of quality information systems. Completing a quality requirements document allows user needs and expectations to be captured so that the infrastructure and information system can be designed properly.
Justifications for the scope
The infrastructure of an information system should include the relevant security mechanisms that protect critical information in an organization. Information system security can be evaluated using the CIA (Confidentiality, Integrity, and Availability) triangle, which largely defines the security policy of an organization. A network solution is chosen to support the conceived information systems and to allow for scalability.
How to control the scope
  • 27. The design of a repository data collection system involves a combination of data warehouses, OLTP, OLAP, and data mining. It is essential to acknowledge that the relational database to be implemented in this case is designed for a specific purpose. The purpose of a data warehouse differs from that of an OLTP database: a relational design can support a data warehouse, while an OLTP database has a different design. A data warehouse database is designed for the analysis of business measures by categories and attributes (Silberschatz et al., 2011). It is optimized for bulk loads and for complex, unpredictable queries that access many rows per table, and its loading is consistent and yields valid data that does not necessarily require real-time validation (Connolly & Begg, 2014). Data warehouses, however, support only a few concurrent users. An OLTP database, on the other hand, is designed for the real-time operations of the business. It is optimized for a common set of transactions, typically adding or retrieving a single row at a time per table, and it can support thousands of concurrent users. The organization's data collection and analysis system will incorporate an OLTP system to provide the user interface while using data warehouse databases for storage purposes (Vaisman & Zimányi, 2014).
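The contrast described above can be illustrated with a small sketch: single-row transactional writes in the OLTP style, followed by an aggregate, warehouse-style query that scans many rows. The table and values are hypothetical and serve only to show the two access patterns.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (sale_date TEXT, product TEXT, amount REAL)")

    # OLTP style: many small transactions, each touching one row at a time.
    with conn:
        conn.execute("INSERT INTO sales VALUES ('2016-02-18', 'cloud storage', 49.0)")
    with conn:
        conn.execute("INSERT INTO sales VALUES ('2016-02-18', 'backup plan', 19.0)")

    # Warehouse style: unpredictable analytical queries that scan many rows
    # and aggregate them by category or attribute.
    for product, total in conn.execute(
            "SELECT product, SUM(amount) FROM sales GROUP BY product"):
        print(product, total)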
  • 28. Possible risks, constraints, and assumptions
The recommended option for the development of the data collection system is a data warehouse database. However, several issues are likely to accompany the use of a data warehouse database for the analysis and storage of information. The user interface of the company's system will depend largely on an OLTP system to provide access to multiple users, while data warehouse databases are used for effective storage. The risks associated with this design are as follows: first, incorporating an OLTP system on one end and a data warehouse database on the other will be extremely complex to implement (Connolly & Begg, 2014); second, there is a likelihood of data being scrubbed in the merge between the OLTP system and the data warehouse, which may lead to inconsistency in the data stored in the system.
The needed integration with other systems and infrastructure
The design of the data collection and analysis system will involve integrating an OLTP system with a data warehouse database to offer excellent and acceptable performance. This combination of OLTP systems and data warehousing is aimed at meeting the company's specific needs in data collection and analysis. The relationship in this hybrid system is that the OLTP system will only be used for providing an excellent
  • 29. user interface, so that an engaging user experience is achieved. The data warehouse, on the other hand, is used to collect and analyze data by categories and attributes, which ensures that the stored data is segmented into different categories (Hellerstein, 2005). The infrastructure of the system will therefore consist of two ends: the OLTP system and a data warehouse database. The metadata will represent the data and application organization of the different OLTP components, which have a direct link to the user experience. The relational data is organized so as to make collection and analysis more effective, since the OLTP system will admit thousands of users into the organization's system. When the OLTP data moves to the data warehouse, it must be transformed into warehouse data for storage. Building the data warehouse will involve reorganizing the OLTP data stored in relational tables into multidimensional cubes (Silberschatz et al., 2011). This is the transformation stage, in which OLTP data is changed into warehouse data for storage. It involves three phases: extraction of data from the OLTP system, transformation of the data into a usable form, and finally loading of the data into the data warehouse or data mart. Once the data is loaded into the data warehouse, decision makers will access and analyze the data in the data marts or data warehouses (Hellerstein, 2005).
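A schematic code sketch of those three phases is shown below, using in-memory SQLite databases to stand in for the OLTP system and the warehouse. The table names and the monthly-revenue transformation are assumptions made only for illustration, not part of the company's actual design.

    import sqlite3

    oltp = sqlite3.connect(":memory:")       # stands in for the OLTP database
    warehouse = sqlite3.connect(":memory:")  # stands in for the data warehouse

    oltp.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, order_date TEXT)")
    oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(1, 120.0, '2016-02-01'), (2, 80.0, '2016-02-01'),
                      (3, 50.0, '2016-03-05')])

    # 1. Extract: pull the operational rows out of the OLTP system.
    rows = oltp.execute("SELECT order_id, amount, order_date FROM orders").fetchall()

    # 2. Transform: reshape the rows into the warehouse's analytical form,
    #    here a simple monthly revenue summary.
    monthly = {}
    for _, amount, order_date in rows:
        month = order_date[:7]
        monthly[month] = monthly.get(month, 0.0) + amount

    # 3. Load: write the transformed data into the warehouse (or data mart).
    warehouse.execute("CREATE TABLE monthly_revenue (month TEXT, revenue REAL)")
    warehouse.executemany("INSERT INTO monthly_revenue VALUES (?, ?)", monthly.items())

    print(warehouse.execute("SELECT * FROM monthly_revenue ORDER BY month").fetchall())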
  • 30. The diagram above shows the structure of the entire system suitable for the company's needs.
Potential outsourcing/offshoring needs
The company might need to outsource services in order to improve its productivity effectively. Some of the outsourcing needs might include engaging a third party to manage the user interface and hiring outside server companies to run its data. The company can also delegate the marketing aspect of its site to other parties in order to increase its relevance in the competitive market. This is crucial in meeting the different needs of potential consumers, as the parties contracted for marketing will require considerable access to the user interface. Essentially, a third party will be needed to bring in more customers as the company's data collection and analysis improves (Khan, 2003). Contracting a third party to manage and host the company's system will also cut production costs, as the contracted party will provide the infrastructure for running the site and all servers.
Justify the necessary resources
Since the company anticipates a 20 percent annual increase in data warehouse storage space, it will need to increase its storage capacity in order to accommodate the growing volume of data. With the 20 percent increase each year, the company should also plan to increase its capacity to accommodate more customers at its user interfaces. Notably, this will involve enlarging the capacity of the OLTP system to handle more users without frequent failures (Khan, 2003). The system servers, for their part, should be regularly upgraded to handle the projected yearly increase of users on the company's system.
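As a rough illustration of that growth, the calculation below compounds a hypothetical starting capacity at the 20 percent annual rate mentioned above; the 10 TB starting point is an assumed figure, not a measurement of the company's current warehouse.

    # Projected warehouse storage need with the 20 percent annual growth noted
    # above; the 10 TB starting point is assumed for illustration only.
    current_tb = 10.0
    growth_rate = 0.20

    for year in range(1, 6):
        current_tb *= 1 + growth_rate
        print(f"Year {year}: about {current_tb:.1f} TB of warehouse storage")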
  • 31. References
Connolly, T. M., & Begg, C. E. (2014). Database systems: A practical approach to design, implementation, and management.
Hellerstein, J. M. (2005). Readings in database systems. Cambridge, Mass.: MIT Press.
Khan, A. (2003). Data warehousing 101: Concepts and implementation. San Jose, Calif.: Khan Consulting and Publishing.
Silberschatz, A., Korth, H. F., & Sudarshan, S. (2011). Database system concepts. New York: McGraw-Hill.
Vaisman, A., & Zimányi, E. (2014). Data warehouse systems: Design and implementation.
  • 32. Running head: SECTION 1: PROJECT INTRODUCTION
Section 1: Project Introduction
Student's Name:
Institutional Affiliation:
Section 1: Project Introduction
Background Information of the Company
LiquiTel Communications is an innovative Internet-based company with operations in the US, the UK, and Canada. The company serves its clientele by providing cloud storage space to customers across North America and Europe. It offers high-quality services and also distributes hardware to its customer base to underline its commitment to the IT business. The company has recently implemented a data warehouse with DB2 to increase its capacity to analyze and store the data of its customers
  • 33. across North America and Europe. To attain this, LiquiTel Communications has engaged IBM Corporation to meet its information technology needs, so that customers in every location can access its system and be attended to promptly (Bosworth et al., 2009).
Type of business activities associated with the Company
LiquiTel Communications operates within the ICT sector, and its ventures mainly entail the provision of information technology services. Some of its common activities for clients include the installation of business intelligence platforms, the implementation of Statistical Analysis System (SAS) software for companies in the US, the UK, and Canada, and the creation of effective network infrastructure in various companies.
Speculations on outsourcing and offshoring opportunities
A data warehouse is an information system used for data analysis and reporting. It is a central repository of integrated data, usually drawn from one or more disparate sources. Such systems can store both current and historical data, and they are therefore used to create trending reports for senior management, such as annual or quarterly comparisons. LiquiTel Communications is building a data warehouse with DB2 to improve the
  • 34. effectiveness and performance of its system, so that it can receive customer reports from different parts of the region and enhance its data analysis process. With the implementation of the data warehouse, customers access the services through the company's websites and make their desired transactions online (Bosworth et al., 2009). The data warehouse will increase the capacity of the company's information systems to support a broad online platform that serves its huge customer base with precision. It also allows the organization to pursue several marketing strategies aimed at increasing its customer base. With an information system supported by an elaborate data warehouse running DB2, the user interface will improve significantly, and with those improvements the company will be able to advertise its services and register more customers onto its programs. The data warehouse will allow the company's system to support the running of its website across North America without failing, which means that, as a market strategy, the company will reach a large customer base. The number of online programs or services offered on the company's website can also be increased to reach a larger circle of groups across the country (Ivanov et al., 2012). This was not possible before the implementation of the data warehouse, since
  • 35. the previous system did not have the capacity to support a platform handling a large number of users across the vast region. The previous system also could not provide elaborate data analysis for the many customers reaching the company through its online platform, and its mode of storage was not adequate for the huge volumes of data streaming into the old database (Ivanov et al., 2012). Although the data warehouse is known for its many benefits, its use also carries several risks. First, data warehouses pose a considerable risk on the data-input side, since it is difficult to know everything that is being stored; a data classification policy is required and must be applied to all data entering the system, and the lack of such a policy creates a high risk of data being compromised. On the output side, the company needs software that monitors outgoing data to ensure it is secure, failing which the data is easily compromised. In addition, the security of the data warehouse should be enforced through programs that assign different roles (Ivanov et al., 2012); improper use of such programs puts the system at great risk, leaving it effectively unsecured. It is in this context that it is important for LiquiTel Communications to outsource some of its tasks to larger and
  • 36. more reputable companies such as IBM. The reason for LiquiTel Communications to work with IBM's data warehouse technology is that IBM is a well-established information technology company with a proven record of offering reliable services on a large scale (Ivanov et al., 2012). As an IT company with adequate resources to support huge data volumes, IBM can be considered the most appropriate company to handle LiquiTel Communications' needs. Data warehouse technology provides a robust environment that supports data reporting, data analysis, and storage (Ivanov et al., 2012). Considering the large number of customers that LiquiTel Communications has, implementing a data warehouse will support its programs, particularly for the data channeled to them from its online platforms, which are accessible across the vast region. Its decision to use an IBM data warehouse is sound, given that IBM has the capacity to meet LiquiTel Communications' needs and thereby improve the company's performance throughout.
Overview of the Company's:
a) Operation systems
  • 37. The company plans to utilize a business intelligence platform. Completing a quality requirements document for the platform allows user needs and expectations to be captured so that the infrastructure and information system can be designed properly. The scope of this part of the project is to create a repository for data collection beyond the standard relational databases (Harrington, 2009), with the aim of designing a quality information system that allows user needs and expectations to be captured in the data collection and analysis platforms.
b) Databases and Data warehousing
LiquiTel Communications uses a database schema, understood here as a relational database, to demonstrate the skills acquired in using databases for business transactions. Data warehouses are critical in supporting business decisions by collecting, consolidating, and organizing data for reporting and analysis with tools such as data mining and online analytical processing, OLAP (Harrington, 2009).
c) Cloud technology and virtualization
LiquiTel Communications seeks to install a cloud technology and virtualization platform after observing the benefits realized by large companies that have been using them for
  • 38. improved, efficient, and more advanced analysis. Furthermore, the company recognizes that the use of cloud technology and virtualization would boost its performance and help it set new expectations and performance goals (Ivanov et al., 2012).
d) Network infrastructure and Security
The company has its information system security evaluated using the CIA (Confidentiality, Integrity, and Availability) triangle, which largely defines the security policy of an organization. A network solution is chosen to support the conceived information systems and to allow for scalability. The CIO of the company is responsible for the design of the infrastructure and security protocols once the network infrastructure is implemented (Ivanov et al., 2012). The design of the infrastructure should be able to support the operations of the organization and equally protect its data.
References
Bosworth, S., Kabay, M. E., & Whyne, E. (2009). Computer security handbook. Hoboken, N.J.: John Wiley & Sons.
Harrington, J. L. (2009). Relational database design and implementation: Clearly explained. Amsterdam: Morgan Kaufmann/Elsevier.
  • 39. Ivanov, I., Sinderen, M. J., & Shishkov, B. (2012). Cloud computing and services science. New York: Springer.