Software weaknesses in design, architecture, code and deployment have led to software vulnerabilities exploited by perpetrators. Although countermeasure tools such as patch management systems, firewalls and antivirus software have been developed, perpetrators possess sophisticated tools such as malware built on crypto-lock and crypto-wall technologies. Current countermeasure technologies are based on the detect-and-respond model or on risk management frameworks, which are no match for attacker technologies based on speed, such as machine-generated malware, and on precision or stealth, such as command-and-control node malware. Although much ink has been poured on advances in measuring and preventing software weaknesses under the detect-and-respond concept, this study is motivated to explore state-of-the-art advances in the novel concept of Continuous Trust Restoration (CTR). Continuous Trust Restoration is a process of breaking down the attacker's kill chain of activities and restoring system trust. The CTR concept deploys speed, precision and stealth technologies: random route mutation, random host mutation, hypervisors, trusted boot, software identities and software-defined infrastructure. Moreover, to deploy these technologies the study further explores a common security architectural framework with software metrics such as CVE (Common Vulnerabilities and Exposures), CWE (Common Weakness Enumeration), CVSS (Common Vulnerability Scoring System), CWSS (Common Weakness Scoring System), and CAPEC (Common Attack Pattern Enumeration and Classification). Finally, the study recommends a paradigm shift in software security countermeasure research from the current detect-and-respond models to the Continuous Trust Restoration concept, and from risk management frameworks to a Common Security Architectural Framework.
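As a small illustration of how such metrics feed an architectural framework, the sketch below implements the published CVSS v3.1 base-score equations for the scope-unchanged case. The metric weights are the standard v3.1 values; the example vector is arbitrary.

```python
import math

# CVSS v3.1 metric weights (scope unchanged)
AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}   # Attack Vector
AC = {"L": 0.77, "H": 0.44}                         # Attack Complexity
PR = {"N": 0.85, "L": 0.62, "H": 0.27}              # Privileges Required
UI = {"N": 0.85, "R": 0.62}                         # User Interaction
CIA = {"H": 0.56, "L": 0.22, "N": 0.0}              # C/I/A impact

def roundup(x):
    # CVSS "Roundup": smallest one-decimal number >= x
    return math.ceil(x * 10 - 1e-9) / 10

def base_score(av, ac, pr, ui, c, i, a):
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    impact = 6.42 * iss
    exploitability = 8.22 * AV[av] * AC[ac] * PR[pr] * UI[ui]
    if impact <= 0:
        return 0.0
    return roundup(min(impact + exploitability, 10))

# AV:N/AC:L/PR:N/UI:N/C:H/I:H/A:H — a critical remote vulnerability
print(base_score("N", "L", "N", "N", "H", "H", "H"))  # → 9.8
```

The same scored vocabulary (CVSS for vulnerabilities, CWSS for weaknesses) is what lets a common framework rank findings from CVE, CWE and CAPEC sources on one scale.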
Evasion Streamline Intruders Using Graph Based Attacker Model Analysis and Co... (Editor IJCATR)
Network Intrusion Detection and Countermeasure Selection in virtual network systems (NICE) establishes a defense-in-depth intrusion detection framework. For better attack detection, NICE incorporates attack-graph analytical procedures into the intrusion detection process. Note that NICE is not designed to improve any existing intrusion detection algorithm; rather, it employs a reconfigurable virtual networking approach to detect and counter attempts to compromise VMs, thus preventing zombie VMs. NICE includes two main phases. First, a lightweight mirroring-based network intrusion detection agent (NICE-A) is deployed on each cloud server to capture and analyze cloud traffic; NICE-A also periodically scans the virtual-system vulnerabilities within a cloud server to establish Scenario Attack Graphs (SAGs). Second, based on the severity of the identified vulnerabilities with respect to the collaborative attack goals, NICE decides whether or not to put a VM into a network inspection state. Once a VM enters the inspection state, Deep Packet Inspection (DPI) is applied, and/or virtual network reconfigurations can be deployed to the inspected VM to make potential attack behaviors prominent.
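The paper's actual decision procedure is not reproduced here; the following minimal sketch only illustrates the shape of phase two, assuming each VM in the SAG carries CVSS-like severities and a goal-path flag. The sample data and the `INSPECTION_THRESHOLD` policy knob are invented for illustration.

```python
# Hypothetical SAG fragment: vm -> list of (vulnerability, severity, leads_to_goal)
sag = {
    "vm-web":  [("CVE-2017-0144", 9.3, True), ("CVE-2016-2183", 5.3, False)],
    "vm-db":   [("CVE-2015-0204", 4.3, True)],
    "vm-idle": [],
}

INSPECTION_THRESHOLD = 7.0  # assumed policy knob

def select_for_inspection(sag, threshold=INSPECTION_THRESHOLD):
    """Return VMs whose goal-relevant vulnerabilities are severe enough
    to warrant the network inspection state (DPI, reconfiguration)."""
    flagged = []
    for vm, vulns in sag.items():
        if any(sev >= threshold and on_goal_path
               for _, sev, on_goal_path in vulns):
            flagged.append(vm)
    return flagged

print(select_for_inspection(sag))  # → ['vm-web']
```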
Vulnerability scanners: a proactive approach to assess web application security (ijcsa)
With the increasing concern for security in networks, many approaches have been laid out that try to protect the network from unauthorised access. New methods have been adopted to find the potential discrepancies that may damage the network. The most commonly used approach is vulnerability assessment. By vulnerability we mean the potential flaws in a system that make it prone to attack. Assessment of these system vulnerabilities provides a means to identify and develop new strategies to protect the system from the risk of being damaged. This paper focuses on the usage of various vulnerability scanners and their related methodology to detect the vulnerabilities present in web applications or on a remote host across the network, and tries to identify new mechanisms that can be deployed to secure the network.
Secure intrusion detection and attack measure selection (Uvaraj Shan)
This document proposes NICE, a framework for secure intrusion detection and attack mitigation in virtual network systems. NICE uses distributed agents on cloud servers to monitor traffic, detect vulnerabilities, and generate attack graphs. It profiles virtual machines to identify their state and vulnerabilities. When potential attacks are detected, NICE can quarantine suspicious VMs and inspect their traffic. The attack analyzer correlates alerts, constructs attack graphs, and selects appropriate countermeasures based on the graphs. Evaluations show NICE can effectively detect attacks while minimizing performance overhead for the cloud system.
A NOVEL SECURITY PROTOCOL FOR WIRELESS SENSOR NETWORKS BASED ON ELLIPTIC CURV... (IJCNCJournal)
With the growing usage of wireless sensors in a variety of applications, including the Internet of Things, the security aspects of wireless sensor networks have been a priority for researchers. Due to the resource constraints of wireless sensor networks, it has always been a challenge to design efficient security protocols for them. A novel elliptic curve signcryption based security protocol for wireless sensor networks is presented in this paper, which provides anonymity, confidentiality, mutual authentication, forward security, secure key establishment and key privacy, while at the same time providing resistance to replay, impersonation, insider, offline dictionary and stolen-verifier attacks. Results reveal that the proposed elliptic curve signcryption based protocol consumes the least time in comparison to other protocols while providing the highest level of security.
Secure intrusion detection and countermeasure selection in virtual system usi... (eSAT Publishing House)
IJRET: International Journal of Research in Engineering and Technology is an international peer-reviewed online journal published by eSAT Publishing House for the enhancement of research in various disciplines of Engineering and Technology. The aim and scope of the journal is to provide an academic medium and an important reference for the advancement and dissemination of research results that support high-level learning, teaching and research in the fields of Engineering and Technology. We bring together scientists, academicians, field engineers, scholars and students of related fields of Engineering and Technology.
This document discusses the potential threat of a "Superworm", a theoretical worm that could incorporate successful propagation techniques from past worms to spread rapidly and cause widespread damage. It describes the features such a worm may have, including exploiting multiple vulnerabilities across many operating systems and using various proliferation methods. The document also examines a past university network security incident and two security technologies that could help detect and limit the spread of such a worm: an early worm detection system and a modified reverse proxy server.
A technical review and comparative analysis of machine learning techniques fo... (IJECEIAES)
Machine learning techniques are widely used to develop intrusion detection systems (IDS) for detecting and classifying cyber attacks at the network level and the host level in a timely and automatic manner. However, traditional IDS based on traditional machine learning methods lack reliability and accuracy. Instead of the traditional machine learning used in previous research, we believe deep learning has the potential to perform better at extracting features from massive data, considering the volume of cyber traffic in real life. Mobile ad hoc networks (MANETs) generally provide low physical security for mobile devices because of properties such as node mobility, lack of centralized management and limited bandwidth. Traditional cryptography schemes cannot completely safeguard MANETs against novel threats and vulnerabilities; by applying deep learning methods in IDS, the system can adapt to the dynamic environments of MANETs and make decisions on intrusion while continuing to learn about its mobile environment. An IDS in a MANET is a sensing mechanism that monitors nodes and network activities in order to detect malicious actions and attempts performed by intruders. Recently, multiple deep learning approaches have been proposed to enhance the performance of intrusion detection systems. In this paper, we make a systematic comparison of three deep learning-based intrusion detection models, an Inception-architecture convolutional neural network (Inception-CNN), a bidirectional long short-term memory network (BLSTM) and a deep belief network (DBN), using the NSL-KDD dataset, which contains information about intrusions and regular network connections. The goal is to provide basic guidance on the choice of deep learning models in MANETs.
Application of Attack Graphs in Intrusion Detection Systems: An Implementation (CSCJournals)
This document discusses integrating attack graphs with intrusion detection systems to help identify complex multi-stage attacks. It proposes an architecture where an intrusion detection system detects alerts and stores them in a database. A vulnerability scanner identifies vulnerabilities, and an attack graph generator uses the alerts and vulnerabilities to generate and update an attack graph. A tool then analyzes the alerts and attack graph to highlight detected intrusions on the graph. The goal is to help administrators better understand the progression of attacks by visualizing how alerts may be related across different stages of an attack. As a proof of concept, the paper implements this using SNORT for intrusion detection, NESSUS for vulnerability scanning, and MULVAL for attack graph generation.
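As a rough sketch of that alert/attack-graph correlation step, assuming an invented graph and alert format (not the actual SNORT or MULVAL data models), the analyzer can mark the largest set of observed stages lying along a single graph path:

```python
# Hypothetical attack graph: directed edges between attack stages
attack_graph = {
    "scan":        ["exploit-web"],
    "exploit-web": ["priv-esc"],
    "priv-esc":    ["exfiltrate"],
    "exfiltrate":  [],
}

# Alerts as the IDS reported them: (stage, timestamp) — invented format
alerts = [("scan", 10), ("exploit-web", 42), ("exfiltrate", 90)]

def correlate(graph, alerts):
    """Return the largest set of alerted stages lying along one
    attack-graph path, i.e. the detected multi-stage progression."""
    seen = {stage for stage, _ in alerts}
    best = []

    def walk(node, path):
        nonlocal best
        if node in seen:
            path = path + [node]
        if len(path) > len(best):
            best = path
        for nxt in graph.get(node, []):
            walk(nxt, path)

    for root in graph:
        walk(root, [])
    return best

print(correlate(attack_graph, alerts))  # → ['scan', 'exploit-web', 'exfiltrate']
```

Note how the unalerted `priv-esc` stage is skipped but the chain around it is still recovered, which is the visualization benefit the paper describes.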
This document summarizes security schemes for wireless sensor networks, including TinySec, IEEE 802.15.4, and others. It discusses the challenges of WSNs like power constraints and limited resources. It also outlines common security threats to WSNs such as denial of service attacks, attacks on information in transit, Sybil attacks, black hole/sinkhole attacks, and hello flood attacks. The document evaluates the feasibility of applying basic security schemes like cryptography and steganography to WSNs given their unique constraints and requirements.
Malware Risk Analysis on the Campus Network with Bayesian Belief Network (IJNSA Journal)
This document discusses using a Bayesian Belief Network (BBN) to analyze malware risk on a university campus network. It begins by introducing the campus network monitoring tools and SIR epidemiological model used to model malware propagation. It then provides background on BBN principles, including defining nodes, conditional probabilities, and using the network to compute joint probabilities. The document proposes applying a BBN to assess malware prevalence risk by relating threat, vulnerability, and cost impact on network assets. It aims to provide understandable risk assessments to inform decision making.
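The mechanics of combining conditional probability tables into joint and marginal probabilities can be sketched with a toy three-node network (Threat and Vulnerability as parents of Incident); all probabilities below are invented for illustration:

```python
from itertools import product

# Illustrative (invented) probabilities: Threat -> Incident <- Vulnerability
p_threat = {True: 0.3, False: 0.7}
p_vuln   = {True: 0.4, False: 0.6}
# Conditional probability table: P(incident | threat, vulnerability)
p_incident = {(True, True): 0.8, (True, False): 0.2,
              (False, True): 0.1, (False, False): 0.01}

def p_joint(t, v, i):
    """Chain rule: P(T, V, I) = P(T) * P(V) * P(I | T, V)."""
    p_i = p_incident[(t, v)] if i else 1 - p_incident[(t, v)]
    return p_threat[t] * p_vuln[v] * p_i

# Marginal risk of an incident, summing out threat and vulnerability
risk = sum(p_joint(t, v, True) for t, v in product([True, False], repeat=2))
print(round(risk, 4))  # → 0.1642
```

The same enumeration over parent states is what a BBN tool performs (more efficiently) when it turns per-node threat/vulnerability estimates into an overall prevalence risk.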
INVESTIGATING & IMPROVING THE RELIABILITY AND REPEATABILITY OF KEYSTROKE DYNA... (IJNSA Journal)
One of the most challenging tasks facing the security expert remains the correct authentication of human beings, which has been crucial to the fabric of our society. The emphasis is now on reliable person identification for computerized devices, as the latter form an integral part of our daily activities. Moreover, with the increasing geographical mobility of individuals, the identification problem has become more acute. One alternative for curbing the increasing number of computer-related crimes is keystroke biometric technology, which enhances password mechanisms by incorporating typing rhythms. Since the captured timing is critical to the performance of the identifier, it must satisfy certain requirements at a suitable degree of acceptability. This paper presents an evaluation of timing options for keystroke dynamics, paying attention to their repeatability and reliability as well as their portability across different systems. In actual password schemes, users enrol using one computer and access resources using other configurations at different locations, without concern for the different underlying operating systems.
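The timing features at issue (dwell and flight times) and a naive comparison score can be sketched as follows; the event timestamps and the mean-absolute-deviation distance are illustrative assumptions, not the paper's method:

```python
# Key events as (key, press_ms, release_ms) — invented sample data
def features(events):
    """Dwell time = release - press; flight time = next press - this release."""
    dwell = [r - p for _, p, r in events]
    flight = [events[k + 1][1] - events[k][2] for k in range(len(events) - 1)]
    return dwell + flight

def distance(template, attempt):
    """Mean absolute deviation between enrolled and presented timing vectors."""
    return sum(abs(a - b) for a, b in zip(template, attempt)) / len(template)

enrolled = features([("p", 0, 90), ("a", 140, 230), ("s", 290, 370)])
genuine  = features([("p", 0, 95), ("a", 150, 235), ("s", 300, 385)])
imposter = features([("p", 0, 60), ("a", 260, 300), ("s", 520, 560)])

d_gen = distance(enrolled, genuine)
d_imp = distance(enrolled, imposter)
print(d_gen < d_imp)  # → True (the genuine rhythm is closer to the template)
```

Repeatability and portability, the paper's concerns, amount to asking how stable these millisecond values are across sessions and across operating systems' timer resolutions.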
IRJET- 3 Juncture based Issuer Driven Pull Out System using Distributed Servers (IRJET Journal)
This document discusses network security visualization and proposes a classification system for network security visualization systems. It begins by introducing the importance of visualizing network security data due to the large quantities of data produced. It then reviews existing network security visualization systems and outlines key aspects they monitor like host/server monitoring, port activity, and intrusion detection. The document proposes a taxonomy to classify network security visualization systems based on their data sources and techniques. It concludes by stating papers were selected for review based on their relevance to network security, novelty of techniques, and inclusion of evaluations.
HYBRID ARCHITECTURE FOR DISTRIBUTED INTRUSION DETECTION SYSTEM IN WIRELESS NE... (IJNSA Journal)
This document proposes a hybrid architecture for a distributed intrusion detection system using multiple agents. The key aspects of the architecture include:
- Using multiple independent tracker agents that monitor hosts and generate reports sent to monitors and storage.
- Monitors analyze activity and compare to signatures to detect known attacks, or send data to anomaly detectors.
- Anomaly and misuse detectors use classification and pattern matching to detect known and unknown attacks.
- An inference module coordinates entities across hosts to classify new attacks using a knowledge base and signature generator.
- A countermeasure module alerts administrators and can take actions like dropping packets in response to detected attacks.
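The report flow through these components can be sketched roughly as below; the signature set, the anomaly score field and the threshold are invented stand-ins for the architecture's detectors:

```python
SIGNATURES = {"syn-flood", "port-scan"}          # known-attack patterns (illustrative)
ANOMALY_THRESHOLD = 3.0                          # assumed anomaly-score cut-off

def countermeasure(report, reason):
    # The countermeasure module alerts the admin and drops packets.
    return {"action": "drop", "host": report["host"], "reason": reason}

def monitor(report):
    """Monitors compare tracker-agent reports to signatures first;
    unmatched data goes to the anomaly detector, as in the
    architecture above."""
    if report["pattern"] in SIGNATURES:
        return countermeasure(report, reason="signature")
    if report["score"] > ANOMALY_THRESHOLD:
        return countermeasure(report, reason="anomaly")
    return {"action": "allow", "host": report["host"]}

print(monitor({"host": "h1", "pattern": "syn-flood", "score": 0.2}))  # drop (signature)
print(monitor({"host": "h2", "pattern": "unknown",   "score": 4.1}))  # drop (anomaly)
print(monitor({"host": "h3", "pattern": "unknown",   "score": 0.5}))  # allow
```

In the full architecture the inference module would additionally turn repeated anomaly hits into a new signature, closing the loop between the two detector types.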
IMPROVED IDS USING LAYERED CRFS WITH LOGON RESTRICTIONS AND MOBILE ALERTS BAS... (IJNSA Journal)
With the ever-increasing number and diversity of attacks, including new and previously unseen attacks, the effectiveness of an intrusion detection system is very important. Hence there is a high demand to reduce the threat level in networks and make the data and services they offer more secure. In this paper we develop an effective test suite for improving the efficiency and accuracy of an intrusion detection system using layered CRFs. We set up different types of checks at multiple levels in each layer. Our framework examines various attributes at every layer in order to effectively identify any breach of security. Once an attack is detected, the system administrator is notified via mobile phone so the server system can be safeguarded. We establish experimentally that layered CRFs can thus be more effective in detecting intrusions than other previously known techniques.
The document proposes a security model for wireless sensor networks using zero knowledge protocol. It addresses security threats like cloning attacks, man-in-the-middle attacks, and replay attacks. The model uses a unique fingerprint for each node based on its neighboring nodes to detect cloning. It also uses zero knowledge protocol for sensor nodes to verify authenticity without transmitting cryptographic information, preventing man-in-the-middle and replay attacks. The paper analyzes the performance and security of the proposed model.
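One classical zero-knowledge identification scheme (a Fiat–Shamir-style round) illustrates the verify-without-revealing idea the model relies on; the paper's exact protocol may differ, and the parameters below are deliberately tiny and insecure:

```python
import random

# Tiny, insecure demo parameters: n = p*q with p, q kept secret
n = 61 * 53                 # public modulus
s = 123                     # prover's secret
v = (s * s) % n             # public verification value

def zk_round(rng):
    """One round: commit, challenge, respond, verify. The verifier
    learns nothing about s beyond the public v = s^2 mod n."""
    r = rng.randrange(1, n)          # prover's fresh random nonce
    x = (r * r) % n                  # commitment
    e = rng.randrange(2)             # verifier's challenge bit
    y = (r * pow(s, e, n)) % n       # response
    return (y * y) % n == (x * pow(v, e, n)) % n

rng = random.Random(7)
print(all(zk_round(rng) for _ in range(20)))  # → True
```

Each round a cheating prover succeeds with probability at most 1/2, so repeating the round drives the impersonation probability down exponentially, which is why such schemes resist the man-in-the-middle and replay threats the paper lists.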
Comparative Analysis of Different Denial of Service Attacks (theijes)
This document summarizes and compares different types of denial of service (DoS) attacks. It discusses Ping of Death, Smurf, buffer overflow, teardrop, SYN flooding, and permanent DoS attacks. For each attack, it describes how the attack occurs and how systems can potentially be recovered. It concludes that while some attacks like Ping of Death can flood systems, Smurf attacks can slow processes down, and permanent DoS attacks can destroy hardware. Defenses include firewalls, configuring routers, and replacing damaged hardware.
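A minimal sketch of one common SYN-flood indicator, counting half-open handshakes per source, follows; the threshold and traffic trace are invented for illustration:

```python
from collections import Counter

HALF_OPEN_LIMIT = 3   # assumed per-source threshold

def syn_flood_suspects(packets, limit=HALF_OPEN_LIMIT):
    """Count SYNs that never completed the handshake (no matching ACK)
    and flag sources exceeding the limit."""
    half_open = Counter()
    for src, flag in packets:
        if flag == "SYN":
            half_open[src] += 1
        elif flag == "ACK" and half_open[src] > 0:
            half_open[src] -= 1
    return [src for src, count in half_open.items() if count > limit]

traffic = [("10.0.0.9", "SYN")] * 6 + [("10.0.0.5", "SYN"), ("10.0.0.5", "ACK")]
print(syn_flood_suspects(traffic))  # → ['10.0.0.9']
```

Firewalls and router configurations mentioned as defenses typically enforce exactly this kind of per-source half-open budget (e.g. SYN cookies or connection-rate limits).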
Investigate the Use of Shear Walls in Concrete Structures, Considering the Ex... (theijes)
Buildings with cast-in-situ reinforced concrete shear walls are widespread in many earthquake-prone countries and regions, such as Canada, Chile, Iran, Romania, Turkey, Colombia, the republics of the former Soviet Union, etc. This type of construction has been practiced since the 1960s in urban regions for medium- to high-rise buildings (4 to 35 stories high). Shear wall buildings are usually regular in plan and in elevation. However, in some buildings, lower floors are used for commercial purposes and the buildings have larger plan dimensions at those floors. This paper gives an overview of the various studies conducted on shear walls, such as experimental dynamic tests, finite element modeling, rocking behavior and nonlinear modeling. In the future, the development of FE models of complete buildings will be studied. In the case of a building with several stories, the simplified model of the shear wall should be able to account for the overturning phenomenon (the refined model already can). In the case of a single-story structure, the main outlook of this work is the development of an FE model and its confrontation with experimental data, which is currently ongoing research.
MRI rat organ Assessment under recurrent Interferon administration (theijes)
This study aims to assess serial and transverse quantitative Magnetic Resonance Imaging (MRI) in four rat groups given different doses of a formulation based on combinations of IFN alpha-2b and gamma. Axial and coronal T1, T2 and diffusion MRI images were acquired in order to follow up morphological and tissue-texture changes in the rat brain, cerebellum, spinal cord and kidney. No morphological changes were observed over 28 days in any of the four groups, including the placebo group, even though doses up to 15 times the therapeutic dose were administered. This MRI study provides robust, complementary evidence that the pharmaceutical formulation mixing IFNs alpha-2b and gamma in the same vial is safe. For the first time, results of a longitudinal MRI study in rats on the effects of this pharmacological combination are reported.
A Survey on Mobile Commerce Security Issues and Applications (theijes)
Electronic banking and mobile banking are regarded as among the best business-to-consumer applications in electronic and mobile commerce. The use of e-banking and m-banking, particularly in developed countries, has grown quickly. Low fees, time savings and independence from time and place have been found to be the most important features of e-banking and m-banking. These services are easy to use, convenient and compatible with users' lifestyles, and the speed of service delivery is fast. Two types of services are offered in e-banking and m-banking: (a) notifications and alerts, and (b) data, in which the bank sends messages containing information or notices required by the client. This paper presents a new scheme for improving the security of these messages by using steganography and cryptography together.
Sediment Source and Transport in River Channels: Implications for River Struc... (theijes)
This document discusses sediment source and transport in river channels and its implications for river structures. It contains 3 key points:
1) Rivers naturally transport and accumulate sediments from erosion. Understanding sediment characteristics and transport processes is important for managing rivers and engineering structures like dams and bridges.
2) Sediments are transported via various modes including bedload, suspended load, and dissolved load. Factors like particle size, water flow, and channel geometry determine sediment transport rates.
3) High sediment influx can negatively impact structures like filling reservoirs in dams, disturbing turbine blades, and exposing foundations around bridge piers to erosion. Understanding sediment dynamics helps mitigate these problems and extend the lifespan of river engineering projects.
On essential pseudo principally-injective modules (theijes)
Pseudo-injectivity is a generalization of quasi-injectivity. Essential pseudo-injective modules were introduced by R. Jaiswal, P.C. Bharadwaj and S. Wongwai [7]. This paper generalizes the above notion and establishes new properties.
Mediating Role of Gratitude in Effect of Bonds on Customer Loyalty (theijes)
The purpose of this study was to identify and analyze the role of gratitude in mediating the effect of relational bonds on customer loyalty. The population in this study comprises all deposit customers of Bank Sultra, Bombana Branch, totaling 95 customers. The sample consists of 77 respondents, determined using the Slovin formula and obtained by convenience sampling. Data for the analysis were collected with questionnaires using a 5-point Likert scale. The data were then analyzed with a structural equation modeling approach using the Partial Least Squares software SmartPLS version 2.0. The analysis showed that gratitude mediates the effect of relational bonds on customer loyalty at Bank Sultra, Bombana Branch.
Strength Improvement of Mud Houses Through Stabilization of the Lateritic Mat... (theijes)
This paper reports an experimental investigation of the compressive strength of laterite stabilized with cement (CSL), lime (LSL) and rice straw (RSL) respectively. The laterites were collected from a borrow pit used by locals in Bauchi, Nigeria to build mud houses. Unfortunately, the mud houses have experienced massive failures through wall collapses over the years during the flooding cycles of the rainy seasons. An attempt is made to stabilize the lateritic soil materials used for the mud house walls in order to strengthen them against rains and flood erosion. Briefly discussed are factors that affect performance and strength; these include mix proportions, compaction, characteristics of the lateritic soil, mix procedure and curing. The results showed that the lateritic soils in the investigated area were relatively high in sand and low in clay, making cement the best stabilizer for strength: it increased the compressive strength by 661%, from 0.61 N/mm2 at zero stabilization (ZSL) to 4.64 N/mm2 at 8% cement content after 28 days of curing. LSL and RSL at the same contents had strengths of 1.21 N/mm2 (a 98.4% increase) and 0.71 N/mm2 (a 16.4% increase) respectively. At 6% content the strength values were 4.33 N/mm2, 1.16 N/mm2 and 0.66 N/mm2 respectively; at 4% content they reduced to 3.14 N/mm2, 0.82 N/mm2 and 0.44 N/mm2 respectively. While CSL increased non-linearly in density with increase in cement content, LSL and RSL decreased in density with increase in their respective contents. The results show that, with cement as the stabilizer, mud house walls constructed with CSL bricks will resist collapse failures due to the perennial flooding in the area. Moreover, given their relatively high compressive strengths, they can be used for load-bearing walls as much as sandcrete blocks.
Application of Central Limit Theorem to Study the Student Skills in Verbal, A... (theijes)
In this paper we analyse the application of the central limit theorem to study the verbal, aptitude and reasoning skills of students. The planning of teaching is based on mathematical knowledge about the theorem. The different meanings of this theorem were analyzed using the history of its development and previous research studies related to it. The results of this work will serve to improve the correct application of the different elements of meaning of the central limit theorem when solving the selected problem, and to prepare new proposals for teaching statistics to students. The central limit theorem forms the basis of inferential statistics, and it would be difficult to overestimate its importance. In a statistical study, the sample mean is used to estimate the population mean. However, the number of different samples (of a given size) that could be taken is extremely large, and these different samples would have different means; some would be lower than the mean of the population and some would be higher. The central limit theorem states that, for samples of size n from a normal population, the distribution of sample means is normal with a mean equal to the mean of the population and a standard deviation equal to the standard deviation of the population divided by the square root of the sample size. (For suitably large sample sizes, the central limit theorem also applies to populations whose distributions are not normal.)
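The theorem's statement is easy to check numerically. The sketch below draws many samples from a uniform (non-normal) population and confirms that the sample means cluster around the population mean with spread close to the population standard deviation divided by the square root of the sample size:

```python
import math
import random
import statistics

rng = random.Random(42)
POP_MEAN, POP_SD = 0.5, 1 / math.sqrt(12)   # uniform(0, 1) population
n, trials = 30, 20000

# Draw `trials` samples of size n and record each sample mean
means = [statistics.fmean(rng.random() for _ in range(n)) for _ in range(trials)]

print(round(statistics.fmean(means), 3))   # ≈ 0.5, the population mean
print(round(statistics.stdev(means), 3))   # ≈ 0.053 = POP_SD / sqrt(30)
```

That the uniform population works here, despite not being normal, is exactly the "suitably large sample sizes" clause of the theorem in action.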
The Ideal, The Mission, The Vision, The Goals and The Competences (theijes)
The competences that individuals who graduate from an educational institution must possess are intimately related to the ideal, the vision and the educational project that the institution has set for itself. In this article the authors show the interrelationships that exist between those terms.
Optimum Conditions for the Removal of Cadmium from Aqueous Solution with Bamb... (theijes)
The performance of two varieties of bamboo activated carbon (CABC washed and CABC unwashed), produced by chemical activation with ZnCl2, was evaluated through batch adsorption studies for the removal of cadmium from aqueous solution. The effects of adsorbent dose, initial concentration of cadmium, agitation time, adsorbate pH and particle size were used as variables to obtain the optimum conditions for the removal of cadmium. Results revealed that as the adsorbent dose increased, the amount adsorbed per unit mass decreased, indicating that more active sites were utilized at a smaller adsorbent dose. Also, the effects of initial cadmium concentration showed that the percentage removal rate increased with increasing cadmium concentration due to the availability of more metal ions at higher concentrations. Thus, all the carbons achieved at least 82.62% removal at an initial cadmium concentration of 50 mg/l. The optimum pH was 5 and 7 for CABC unwashed and CABC washed respectively, while the optimum particle size was 50 μm for all the carbons tested. Furthermore, CABC unwashed performed better as an adsorbent because it achieved 71.48% removal of cadmium in 60 minutes, while CABC washed achieved 69.46% removal in 120 minutes.
The role of technology transfer & cooperation for the development of wind pow... (theijes)
The Sri Lankan power sector heavily depends on imported fossil-based energy sources and major hydro power plants. Since Sri Lanka's hydro power reserves have already been utilized, the country's power sector is highly vulnerable to price fluctuations in imported fossil energy sources. Recent wind-mapping studies claim that the country possesses several areas estimated to have good-to-excellent wind resources. Against such a backdrop, however, Sri Lanka still lacks the capability to maximize the real potential of wind power based electricity generation. Accordingly, the objective of this research is to study technology transfer and collaboration in the wind power sector in Sri Lanka and to identify a directional strategy to foster the development of a wind turbine and components industry in the country. In order to effectively answer the stated research questions a qualitative approach was adopted: a comprehensive literature review followed by two case studies pertaining to the wind power sector in Sri Lanka. The first case represents the typical technology transfer approach in many developing countries, in which the country acquired wind technology through international trade. The second case is an example of technology licensing, and the study reveals that such integrations have subsequently created opportunities for more local value addition.
Analysis of Bhadra River Surface Water during Rainy Season (theijes)
Water samples were collected from the Bhadra river at four significant points, and various temperature-correlated parameters were analyzed during the rainy season, from 5 July 2014 to 22 August 2014, using standard methods. Water samples were collected from about 10 cm depth, with three replications from each station, during the low and high tides of the day. The study was conducted to establish the present status of the water quality of the Bhadra river and the change of water quality parameters with the change of temperature. The study is significant due to the extensive practice of aquaculture in the adjacent area using the river water; another significant feature is the connection of the river with the Sundarbans river system. The study involves the determination of physical and chemical parameters that are mainly temperature-correlated, such as pH, transparency, salinity, electrical conductivity (EC), total alkalinity, total acidity, dissolved oxygen (DO) and dissolved free carbon dioxide of the surface water at the four locations. The means of the parameters across stations were: temperature 29 °C; pH 7.68; transparency 10.88 cm; salinity 3.18 ppt; electrical conductivity (EC) 4.78 mS/cm; total alkalinity 103.91 mg/L; total acidity 8.4 mg/L; dissolved oxygen (DO) 5.1 mg/L; dissolved free carbon dioxide 3.89 mg/L. According to the results, the parameters showed little deviation from the standard water quality for aquatic habitat in river water.
Experimental Study on Durability Characteristics of High Performance Concrete... (theijes)
High performance concrete (HPC) has developed gradually over the last 15 years with respect to the production of concrete of higher and higher strength. The usage of mineral admixtures in making high performance concrete has enhanced properties such as durability, strength, workability and economy. The scope of the present study is to investigate the effect of mineral admixtures and by-products on the performance of HPC. An effort has been made to concentrate on how the mineral admixture silica fume, through its pozzolanic reaction, and the industrial by-products bottom ash and steel slag, through their hydration reaction, contribute to strength and durability properties. The strength characteristics, such as compressive strength, tensile strength and flexural strength, were investigated to find the optimum replacement of mineral admixture and by-product admixture. HPC with the mineral admixture silica fume at replacement levels of 0%, 5%, 10%, 15% and 20%, and with the industrial by-products bottom ash and steel slag aggregate at replacement levels of 10%, 20%, 30%, 40% and 50%, was studied at the age of 28 days. A total of 15 mixes was created with different material contents, of which 14 were HPC mixes and 1 was a conventional concrete mix. The highest strength was achieved when silica fume replaced cement at 5% and bottom ash and steel slag replaced fine and coarse aggregate at 10%, compared with the other percentages of mixes. The combination mixes can be classified as binary and ternary mixes: binary mixes involved combinations of silica fume and bottom ash (SF+BA), silica fume and steel slag aggregate (SF+SSA), and bottom ash and steel slag aggregate (BA+SSA), while ternary mixes involved the combination of all three materials (SF+BA+SSA) in high performance concrete.
The investigation revealed that the combined use of silica fume, bottom ash and steel slag aggregate improved the mechanical properties of HPC, and thus these three materials may be used as partial replacement materials in making HPC. Durability studies covering acid resistance, salt resistance, sulphate resistance and water absorption were conducted. From the experimental investigation, it was observed that the mineral admixture silica fume and the industrial by-products bottom ash and steel slag aggregate play a vital role in improving strength and durability.
Root Causes Analysis for Lean Maintenance Using Six Sigma Approach (theijes)
The aim of the project is to reduce the rejection level of needle roller bearing assembly using Six Sigma techniques. Six Sigma is a quality improvement tool that reduces defects, minimizes variation and improves the capability of the manufacturing process. The main objective of Six Sigma is to increase the profit margin and improve financial performance by minimizing the product defect rate. Further, it increases customer satisfaction and retention and produces a best-in-class product from best-in-class process performance. The needle roller bearing suffers mainly from Lining Thickness Variation (LTV) defects and bearing fit defects, such as loose fit and tight fit, in the production line. The current rejection level due to lining thickness variation is very high, which leads to money being consumed in the form of rework and rejection of jobs. The aim of the project is therefore to identify the causes of the various assembly defects and their remedies in order to reduce the rejection level in needle roller bearing assembly.
Selection Sort with Improved Asymptotic Time Bounds (theijes)
Sorting and searching are among the most fundamental problems in computer science; sorting is most often used to support searching. One of the most well known sorting algorithms taught in introductory computer science courses is the classical selection sort. While this algorithm is easy to explain and grasp at the introductory level, it is far from an efficient sorting technique, since it requires O(n²) time to sort a list of n numbers. It does so by repeatedly finding the minimum. In this paper we explore the benefit of reducing the search time for the minimum on each pass of the algorithm, and show that we can obtain a worst-case time bound of O(n√n) by making only minor modifications to the input list. Our bound is thus a factor of O(√n) faster than the classical selection sort and other classical sorts such as insertion and bubble sort.
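One way such a bound can be achieved (a sketch of the general block-minima idea, not necessarily the authors' exact modification) is to split the list into roughly √n blocks and cache each block's minimum; each pass then finds the global minimum among the O(√n) cached minima and rescans only one block, for O(√n) work per pass and O(n√n) in total:

```python
import math

def block_selection_sort(a):
    # Split into blocks of size ~sqrt(n) and cache each block's minimum,
    # so every pass costs O(sqrt(n)): a scan over the block minima plus
    # a rescan of the single block the minimum was removed from.
    n = len(a)
    if n < 2:
        return list(a)
    b = max(1, math.isqrt(n))                    # block size ~ sqrt(n)
    blocks = [a[i:i + b] for i in range(0, n, b)]
    mins = [min(blk) for blk in blocks]          # cached per-block minima
    out = []
    for _ in range(n):
        # pick the non-empty block whose cached minimum is smallest
        k = min((i for i, blk in enumerate(blocks) if blk),
                key=lambda i: mins[i])
        out.append(mins[k])
        blocks[k].remove(mins[k])                # O(sqrt(n)) removal
        mins[k] = min(blocks[k]) if blocks[k] else math.inf
    return out
```

With n passes each doing O(√n) work, the total is O(n√n), a factor of O(√n) below the classical O(n²).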
The Effect of Moisture Content on Some Physical and Engineering Properties of... (theijes)
Locust bean is a perennial edible crop and an important source of food that must be processed for preservation and availability throughout the year. Physical and mechanical properties of locust beans are necessary for the design of equipment to handle, transport, process and store the crop. The properties were evaluated as a function of the moisture content of the beans. The locust seeds were tested for size and shape, true density, bulk density, porosity, sphericity, static coefficient of friction on plywood, aluminium and stainless steel, angle of repose and specific heat at moisture contents ranging from 10.50 to 20.76 % (dry basis). The average length, width, thickness and geometric mean diameter of the locust beans were 12.04, 8.36, 5.04 and 7.50 mm respectively, while the true density, bulk density, porosity, surface area and sphericity were 1166.09 kg/m³, 729.90 kg/m³, 37.37 %, 204.25 mm² and 0.67, respectively. The respective values of the static coefficient of friction for plywood, stainless steel and aluminium were 0.56, 0.51 and 0.48, while the angle of repose was 40.17°. The highest coefficient of friction was observed on plywood and the lowest on aluminium. The specific heat was observed to be 3.8 kJ/kg/K at a moisture content of 10.50 %. The information provided in this study will be useful for locust bean processing machine design and fabrication, as well as industrial processing and the structural design of storage bins for the seed.
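As a cross-check on the shape figures quoted above, the standard formulas for geometric mean diameter and sphericity can be applied to the reported mean dimensions (a sketch; values computed this way differ slightly from the reported means, which average per-seed results):

```python
def geometric_mean_diameter(length, width, thickness):
    # Dg = (L * W * T) ** (1/3), the standard grain-shape formula
    return (length * width * thickness) ** (1.0 / 3.0)

def sphericity(length, width, thickness):
    # sphericity = Dg / L; 1.0 would be a perfect sphere
    return geometric_mean_diameter(length, width, thickness) / length

# mean locust bean dimensions reported above, in mm
dg = geometric_mean_diameter(12.04, 8.36, 5.04)   # ~7.98 mm
phi = sphericity(12.04, 8.36, 5.04)               # ~0.66, close to the reported 0.67
```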
Measuring Environmental Performance of Supply Chain (theijes)
Environmental performance is a hot topic for researchers in management science and one of the major concerns of supply chain leaders. To assess this performance, there are increasingly many management tools, and it is appropriate to question their role in the supply chain: do these tools meet real organizational needs, or are they used to promote the supply chain's image in the face of increasingly strong institutional constraints? In this context, many modules and methodologies have been established in the literature to evaluate the environmental performance of supply chains, since it has become an important issue for society. However, few of them analyze environmental impacts. This work therefore presents an integrated methodology to perform this evaluation, based on the issues which significantly affect the environment. We propose a module which allows the assessment of this performance. The module was tested in an automotive supply chain in the north of Morocco.
H.D.L Design for Ultra High Multi Frequency Rate P.R.B.S Generator for Identi... (theijes)
The aim is an HDL design for a multi-frequency-rate (Giga/Tera/Peta/Exa/Zetta hertz or bits per second baud rate) PRBS carrier generator ASIC for ultra-high-speed, long-distance communication in hi-tech smart computing products such as cloud and Internet computing, LTE ASICs, Wi-Fi, Gi-Fi, OFDMA, WCDMA, QCDMA, GPS and WiMAX technologies. Basically, the design contains PRBS generators with different tapped sequences (2^7−1, 2^10−1, 2^15−1, 2^23−1, 2^31−1, etc.) and a multiplexer. These pattern sequences are designated as per CCITT/ITU standards. The soft IP core is designed in the VHDL and Verilog HDL languages, with the design flow implemented in the Xilinx ISE 9.2i IDE. This PRBS generator mainly suits the latest generation of innovative low-power portable smart computing products such as iPhones, tablets, notebooks, pocket multimedia SoC computing, GPS mobile phone cards, GPRS devices and handheld instruments.
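The tapped sequences above correspond to the standard ITU-T PRBS patterns. As an illustrative software model of the smallest one (the actual design is a VHDL/Verilog soft IP core; this Python sketch only mirrors the logic), PRBS-7 is a 7-bit linear-feedback shift register with feedback polynomial x^7 + x^6 + 1 and period 2^7 − 1 = 127 bits:

```python
def prbs7(nbits, seed=0x7F):
    # 7-bit Fibonacci LFSR with taps at stages 7 and 6 (x^7 + x^6 + 1).
    # Any non-zero seed yields the same maximal-length 127-bit cycle.
    state = seed & 0x7F
    assert state != 0, "LFSR seed must be non-zero"
    out = []
    for _ in range(nbits):
        newbit = ((state >> 6) ^ (state >> 5)) & 1   # XOR of stages 7 and 6
        out.append(newbit)
        state = ((state << 1) | newbit) & 0x7F       # shift the feedback bit in
    return out
```

The longer sequences (2^10−1, 2^15−1, ...) follow the same pattern with wider registers and different tap positions.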
The purpose of this paper is twofold. First and foremost, it presents a background narrative on the origins, innovations and applications of novel structural automation technologies and the rarity of experts involved in research, development and practice in this field. The second part of the paper presents a rudimentary framework for a solution addressing this paucity: the creation of an interdisciplinary academic program at PAAET that will be the first in the region to address applied information and communication technologies (ICT) in the design, planning, engineering and management of structural automation projects. In doing so, we also need to define the level of implementation. This field, like all fields in ICT, has been loosely defined, and most applications carry less weight in their implementation than what should be applied. This paper attempts to define an indexing scheme by which we can easily classify such implementations and generate a ranking by which we can safely define their level of “Intelligence”.
Security Attacks And Solutions On Ubiquitous Computing Networks (Ahmad Sharifi)
This document discusses security challenges in ubiquitous computing environments. It begins by defining ubiquitous computing as involving the integration of computing technology into everyday objects and environments. This allows information access from any device at any time, but also increases security risks. The document then outlines some common ubiquitous applications like smart homes. It identifies key security issues like lack of authentication, unauthorized access, and privacy concerns. Finally, it discusses challenges in ubiquitous security including how the expanded computing environment impacts traditional security methods and introduces new privacy and trust issues.
Include at least 250 words in your posting and at least 250 words in... (maribethy2y)
Include at least 250 words in your posting and at least 250 words in your reply. Indicate at least one source or reference in your original post. Please see syllabus for details on submission requirements.
Module 1 Discussion Question
Search "scholar.google.com" for a company, school, or person that has been the target of a network or system intrusion. What information was targeted? Was the attack successful? If so, what changes were made to ensure that this vulnerability was controlled? If not, what mechanisms were in place to protect against the intrusion?
Reply-1(Shravan)
Introduction:
Intrusion detection systems (IDSs) are software or hardware systems that automate the process of monitoring the events occurring in a computer system or network and analyzing them for signs of security problems. As network attacks have increased in number and severity over recent years, intrusion detection systems have become a necessary addition to the security infrastructure of most organizations. This guidance document is intended as a primer in intrusion detection, developed for those who need to understand what security goals intrusion detection mechanisms serve, how to select and configure intrusion detection systems for their specific system and network environments, how to manage the output of intrusion detection systems, and how to integrate intrusion detection capabilities with the rest of the organizational security infrastructure. References to other information sources are also provided for the reader who requires specialized or more detailed advice on specific intrusion detection issues.
In recent years there has been increasing interest in the security of process control and SCADA systems. Moreover, recent computer attacks, such as the Stuxnet worm, have shown that there are groups with the motivation and resources to effectively attack control systems.
While previous work has proposed new security mechanisms for control systems, few have explored the new and fundamentally different research problems of securing control systems as compared with securing traditional information technology (IT) systems. In particular, the sophistication of new malware attacking control systems (malware including zero-day attacks, rootkits created for control systems, and software signed by trusted certificate authorities) has demonstrated that it is very difficult to prevent and detect these attacks based on IT system information alone.
In this paper we show how, by incorporating knowledge of the physical system under control, we can detect computer attacks that change the behavior of the targeted control system. By using knowledge of the physical system we can focus on the final objective of the attack, rather than on the particular mechanisms by which vulnerabilities are exploited, and how ...
A Resiliency Framework For An Enterprise Cloud (Jeff Nelson)
The document summarizes a research paper that proposes a resiliency framework called the Cloud Computing Adoption Framework (CCAF) for enterprise clouds. CCAF includes four major emerging services - software resilience, service components, guidelines, and real case studies - that are designed to improve an organization's security when adopting cloud computing. The framework was validated through a large survey that provided user requirements to guide the system's design and development. CCAF aims to illustrate how software resilience and security can be improved for enterprises moving to the cloud.
Team research paper and project on network vulnerabilities with multiple attacks and defesnses:
Cybersecurity
-For this project, our class was paired into teams to attempt to find vulnerabilities in other teams' networks and to successfully breach them.
-My role in this group was to help breach other teams' networks through different attacks like responder attacks, honeypots, etc.
-The main challenge of this project was finding the vulnerabilities successfully, as the whole team had trouble with each of our different attacks and defenses.
-We learned how to use cybersecurity tools to find vulnerabilities in networks and how to protect against them better. For example, we deployed a honeypot on port 80, so that when an attacker tried to access our fake server we were notified. We also deployed a Palo Alto firewall to create our private and secure network. For attacks, we also used password crackers like John the Ripper. This project taught us how to breach networks as a team.
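The port-80 honeypot described above can be sketched as a minimal TCP listener that logs every connection attempt (hypothetical code, not the team's actual deployment; binding the real port 80 requires elevated privileges, so the port is a parameter here):

```python
import datetime
import socket

def run_honeypot(host="127.0.0.1", port=8080, max_hits=1):
    # Listen on a port that offers no real service: any connection to it
    # is suspicious by definition, so record and report every attempt.
    # A real deployment would bind 0.0.0.0:80 and run indefinitely.
    hits = []
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while len(hits) < max_hits:
            conn, addr = srv.accept()
            with conn:
                stamp = datetime.datetime.now().isoformat()
                hits.append((stamp, addr[0], addr[1]))
                # serve a fake banner so the prober sees "something"
                conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 0\r\n\r\n")
    return hits
```

Each entry in the returned list is a timestamped record of a probe, which a monitoring script could forward as an alert.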
This document provides summaries of 7 IEEE papers from 2012 related to software projects in various domains such as Java, J2ME, J2EE, .NET, MATLAB and NS2. The papers discuss topics such as password security, data provenance, trust-aware routing in wireless sensor networks, content distribution via network coding, detecting insider threats, secure message passing interfaces, and the security of an anonymity system with traceability.
Security against Web Application Attacks Using Ontology Based Intrusion Detec... (IRJET Journal)
The document presents a proposed ontology-based intrusion detection system to provide security against web application attacks. The system aims to address limitations of existing signature-based intrusion detection systems, such as high false positive and negative rates. The proposed system uses an ontology created with Protege to model web application attacks, vulnerabilities, threats and security controls. Rules are also defined to allow the system to predict and classify attacks based on the ontology. The system architecture includes components for system analysis, interface, rule engine, ontology generation and a knowledge base. The system is evaluated on its ability to detect common web attacks like SQL injection, cross-site scripting and buffer overflow attacks.
IRJET- Zombie - Venomous File: Analysis using Legitimate Signature for Securi... (IRJET Journal)
The document discusses a proposed method for detecting viruses and malware that evade existing antivirus software. It uses a combination of analyzing files with VirusTotal's database of known threats and applying natural language processing techniques like suffix trees and TF-IDF to identify malicious patterns in files. An evaluation shows the proposed method can detect viruses that existing antivirus and VirusTotal miss, achieving a 97% accuracy rate in testing.
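The TF-IDF weighting mentioned above is straightforward to sketch (illustrative only; the paper pairs it with suffix trees and VirusTotal lookups, and the opcode-like tokens below are made up):

```python
import math
from collections import Counter

def tfidf_scores(docs):
    # Plain TF-IDF over tokenized documents: tokens that are frequent in
    # one file but rare across the corpus get high weight, which is what
    # makes distinctive (potentially malicious) patterns stand out.
    n = len(docs)
    df = Counter(tok for doc in docs for tok in set(doc))   # document frequency
    scores = []
    for doc in docs:
        tf = Counter(doc)
        scores.append({tok: (tf[tok] / len(doc)) * math.log(n / df[tok])
                       for tok in tf})
    return scores

# hypothetical opcode-like token streams extracted from two files
benign = ["mov", "add", "mov", "ret"]
suspect = ["mov", "xor", "xor", "decrypt_loop"]
scores = tfidf_scores([benign, suspect])
```

Tokens shared by every file score zero, while tokens concentrated in one file score high, giving a simple per-file anomaly signal.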
1. The document discusses intrusion detection systems and proposes a cluster-based intrusion detection system for wireless sensor networks.
2. It proposes a multi-level intrusion detection architecture with detection at both the cluster head and network-wide levels.
3. The proposed system would detect intrusions through anomaly detection and has been evaluated through a survey of 50 experts in the field.
Cyber-Defensive Architecture for Networked Industrial Control Systems (IJEACS)
This paper deals with the inevitable consequence of the convenience and efficiency we gain from operating safety-critical applications as open, networked control systems: the vulnerability of such systems to cyber-attacks. Even with numerous metrics and methods for intrusion detection and mitigation, complete detection and deterrence of internal code flaws and outside cyber-attacks has not been achieved and will not be anytime soon. Considering this ever-present incompleteness of detection and prevention, and the impact and consequences of malfunctions of safety-critical operations caused by cyber incidents, this paper proposes a new computer control system architecture which assures resiliency even under compromised situations. The proposed architecture is centered on the diversification of hardware systems and unidirectional communication for alerting upper layers to suspicious activities. The paper details the architectural structure of the proposed cyber-defensive computer control system architecture for power substation applications and its validation in lab experimentation and on a cybersecurity testbed.
We have evolved an IT system that is ubiquitous, pervasive and integrated into most aspects of our lives. Many of us are working on 4th- and 5th-level refinements in efficiency and functionality, but we stand on the shoulders of those who came before, and this restricts our freedom of action. The prior work has left us with an ecosystem which is the living embodiment of our state of the art. While we work on integration, refinement, broader application and efficiency, the results must move seamlessly into the ecosystem. Fundamental concepts being researched in the lab may one day rebuild the world we all live in; until that happens, we must work within the ecosystem.
Modification data attack inside computer systems: A critical review (CSITiaesprime)
This paper is a review of types of modification data attacks on computer systems, exploring their vulnerabilities and mitigations. Altering information is a kind of cyber-attack in which intruders intercept, alter, steal, or erase critical data on personal computers (PCs) and applications, either through network exploits or by running malicious executable code on the victim's system. One of the most difficult and topical areas in information security is protecting sensitive information and securing devices against any kind of threat. Recent advancements in information technology reveal huge budgets allocated to developing and addressing security threats in order to mitigate them, in settings as varied as the military, business, science, and entertainment. Considering all these concerns, security issues almost always rank first among the most critical concerns of modern times. As a matter of fact, there is no ultimate security solution: although recent developments in security analysis uncover vulnerabilities daily, there are many actors motivated to spend billions of dollars to ensure vulnerabilities remain available for breaches and exploits that penetrate systems and networks to achieve particular interests. In terms of modifying data and information, from old-fashioned attacks to recent cyber ones, all attacks share the same signature: they either control data streams to easily breach system protections or use non-control-data attack approaches. Both methods can, to a large extent, damage applications that rely on decision-making data, user input data, configuration data, or user identity data. In this review paper, we have tried to express trends of vulnerabilities in network protocols' applications.
AN ISP BASED NOTIFICATION AND DETECTION SYSTEM TO MAXIMIZE EFFICIENCY OF CLIE... (IJNSA Journal)
End users are increasingly vulnerable to attacks directed at web browsers, which make the most of the popularity of today's web services. While organizations deploy several layers of security to protect their systems and data against unauthorised access, surveys reveal that a large fraction of end users do not utilize and/or are not familiar with any security tools. End users' hesitation and unfamiliarity with security products contribute vastly to the number of online DDoS attacks and to malware and spam distribution. This work-in-progress paper proposes a design focused on the notion of increased participation by internet service providers in protecting end users. The proposed design takes advantage of three different detection tools to identify the maliciousness of a website's content and alerts users through the Internet Content Adaptation Protocol (ICAP) via an in-browser cross-platform messaging system. The system also incorporates analysis of users' online behaviour to minimize the scanning intervals of the malicious-website database by client honeypots. Findings from our proof-of-concept design and other research indicate that such a design can provide a reliable hybrid detection mechanism while introducing low delay into the user's browsing experience.
This document discusses three methods of software assurance: kernel separation, desktop virtualization, and the Trusted Platform Module (TPM).
Kernel separation (also known as MILS) isolates operating system processes and partitions hardware to separate developer code, system resources, and data objects. This aims to reduce vulnerabilities by compartmentalizing different functions.
Desktop virtualization stores the desktop environment on centralized servers rather than individual devices. This allows for easier maintenance, troubleshooting, and access controls. All user data and customizations can be removed when logging off.
TPMs create encryption keys during the boot process to validate that critical software and firmware have not been modified. This helps detect malware early and takes a proactive approach to
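The boot-time validation relies on hash-chaining measurements into platform configuration registers (PCRs); a schematic sketch of the extend operation follows (illustrative only, not a real TPM API):

```python
import hashlib

def pcr_extend(pcr, measurement):
    # new PCR = H(old PCR || H(measurement)): the chain is order-sensitive,
    # so any modified or reordered boot component changes the final value.
    return hashlib.sha256(pcr + hashlib.sha256(measurement).digest()).digest()

def measure_boot(components):
    # Fold each boot-time component (firmware, bootloader, kernel, ...)
    # into the register; PCRs start zeroed at platform reset.
    pcr = bytes(32)
    for blob in components:
        pcr = pcr_extend(pcr, blob)
    return pcr
```

Comparing the final value against a known-good reference detects tampering with any measured component before the OS fully loads.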
FLOODING ATTACKS DETECTION OF MOBILE AGENTS IN IP NETWORKS (csandit)
This document summarizes a research paper that proposes a new framework for detecting flooding attacks in mobile agent networks. The framework integrates divergence measures like Hellinger distance and Chi-square over a sketch data structure. The sketch data structure is used to derive probability distributions from traffic data in fixed memory. Divergence measures compare the current and prior probability distributions to detect deviations indicating attacks. The performance of detecting attacks while minimizing false alarms is evaluated using real network traces with injected flooding attacks. Experimental results show the proposed approach outperforms existing solutions.
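The Hellinger distance used as one of the divergence measures can be sketched directly (the traffic shares below are made-up examples, not the paper's traces):

```python
import math

def hellinger(p, q):
    # Hellinger distance between two discrete probability distributions:
    # 0 for identical distributions, 1 for distributions with disjoint support.
    return math.sqrt(0.5 * sum((math.sqrt(a) - math.sqrt(b)) ** 2
                               for a, b in zip(p, q)))

baseline = [0.70, 0.20, 0.10]   # normal per-destination traffic shares
flooded = [0.10, 0.10, 0.80]    # a flood skews traffic toward one target
```

An alarm would fire when the distance between the prior and current sketch-derived distributions exceeds a chosen threshold.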
Distributed Digital Artifacts on the Semantic Web (Editor IJCATR)
Distributed digital artifacts incorporate cryptographic hash values into URIs, called trusty URIs, in a distributed environment, building high-quality, verifiable and immutable web resources to prevent the rising man-in-the-middle attack. The greatest challenge of a centralized system is that it gives users no way to check whether data has been modified, and communication is limited to a single server. The solution is the distributed digital artifact system, where resources are distributed among different domains to enable inter-domain communication. Due to emerging developments on the web, attacks have increased rapidly, among which the man-in-the-middle attack (MIMA) is a serious issue that threatens user security. This work tries to prevent MIMA to an extent by providing self-reference and trusty URIs even in a distributed environment. Any manipulation of the data is efficiently identified, and any further access to that data is blocked by informing the user that the uniform location has changed. The system uses self-reference to contain a trusty URI for each resource, a lineage algorithm for generating the seed, and the SHA-512 hash algorithm to ensure security. It is implemented on the semantic web, an extension of the World Wide Web, using RDF (Resource Description Framework) to identify resources. The framework was thus developed to overcome existing challenges by making digital artifacts on the semantic web distributed, enabling communication between different domains across the network securely and thereby preventing MIMA.
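The core trusty-URI idea can be sketched as follows (simplified relative to the actual trusty URI scheme, which adds module and version codes to the hash suffix):

```python
import base64
import hashlib

def trusty_uri(base_uri, content):
    # Append a SHA-512 digest of the resource content to its URI, so any
    # later modification of the content is detectable by re-hashing.
    digest = hashlib.sha512(content).digest()
    tail = base64.urlsafe_b64encode(digest).decode().rstrip("=")
    return f"{base_uri}.{tail}"

def verify(uri, content):
    # Recompute the hash from the fetched content and compare: a
    # man-in-the-middle altering the payload invalidates the URI itself.
    base, _, _tail = uri.rpartition(".")
    return trusty_uri(base, content) == uri
```

Because the hash is part of the identifier, any party holding the URI can verify the resource without trusting the server that delivered it.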
This document describes a proposed vulnerability management system (VMS) that aims to automate the process of scanning software applications to identify vulnerabilities. The proposed system uses a hybrid algorithm approach that incorporates features from existing vulnerability detection tools and algorithms. The algorithm involves five main phases: inspection, scanning, attack detection, analysis, and reporting. The algorithm is intended to increase the accuracy of vulnerability detection compared to existing systems. The proposed VMS system and hybrid algorithm were tested using various vulnerability scanning tools on virtual machines, and results demonstrated that the VMS could automate the vulnerability assessment process and generate reports on detected vulnerabilities with severity levels. The main limitation is that scans using the VMS may take more time than some existing tools.
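The five named phases can be sketched as a simple pipeline over a shared findings record (the phase bodies below are hypothetical placeholders, not the paper's actual hybrid algorithm):

```python
def run_vms(target, phases):
    # Run each phase in order; every phase enriches the shared findings dict.
    findings = {"target": target, "vulns": [], "log": []}
    for name, phase in phases:
        findings = phase(findings)
        findings["log"].append(name)
    return findings

# hypothetical placeholder phases
def inspection(f):
    f["surface"] = ["http"]                                   # enumerate services
    return f

def scanning(f):
    f["vulns"].append({"id": "EX-0001", "severity": 7.5})     # mock scanner hit
    return f

def attack_detection(f):
    f["confirmed"] = [v for v in f["vulns"] if v["severity"] >= 7.0]
    return f

def analysis(f):
    f["max_severity"] = max((v["severity"] for v in f["vulns"]), default=0.0)
    return f

def reporting(f):
    f["report"] = (f"{f['target']}: {len(f['confirmed'])} confirmed, "
                   f"max severity {f['max_severity']}")
    return f

PHASES = [("inspection", inspection), ("scanning", scanning),
          ("attack detection", attack_detection), ("analysis", analysis),
          ("reporting", reporting)]
```

The hybrid aspect described in the paper would slot different existing tools into the scanning and detection phases while keeping this overall flow.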
Similar to Novel Advances in Measuring and Preventing Software Security Weakness: Continuous Trust Restoration (CTR) (20)
A brief introduction to how quadcopters (drones) work. It provides an overview of flight stability, dynamics, the general control system block diagram, and the electronic hardware.
An Internet Protocol address (IP address) is a logical numeric address assigned to every computer, printer, switch, router, tablet, smartphone or other device that is part of a TCP/IP-based network.
Types of IP address:
Dynamic means "constantly changing". Dynamic IP addresses aren't more powerful, but they can change.
Static means staying the same. Static. Stand. Stable. Yes, static IP addresses don't change.
Most IP addresses assigned today by Internet Service Providers are dynamic IP addresses. It's more cost-effective for the ISP and for you.
20CDE09- INFORMATION DESIGN
UNIT I INCEPTION OF INFORMATION DESIGN
Introduction and Definition
History of Information Design
Need of Information Design
Types of Information Design
Identifying audience
Defining the audience and their needs
Inclusivity and Visual impairment
Case study.
Conservation of Taksar through Economic Regeneration (PriyankaKarn3)
This was our 9th-semester Design Studio project, introduced as the Conservation of Taksar Bazar, Bhojpur, an ancient city famous for Taksar, the making of coins. Taksar Bazaar holds a civilization of Newars who shifted from Patan, with huge socio-economic and cultural significance and a settlement about 300 years old. But in the present scenario, Taksar Bazar has lost its charm and importance due to various reasons such as migration, unemployment, the shift of economic activity to Bhojpur, and many more. The scenario was so pitiful that when we went to make inventories, take surveys and study the site, the people and the context, we barely found any youth of our age! Many houses were vacant, and the earthquake had devastated and ruined heritage structures.
Conservation of those heritage structures, ancient marvels, and history was in dire need, so we proposed the Conservation of Taksar through economic regeneration, because the lack of economy was the main reason people left the settlement and the reason for the overall decline.
OCS Training Institute is pleased to co-operate with a global provider of rig inspection/audits, commissioning, compliance & acceptance, as well as engineering for offshore drilling rigs, to deliver Drilling Rig Inspection Workshops (RIW) which teach the inspection & maintenance procedures required to ensure equipment integrity. Candidates learn to implement the relevant standards & understand industry requirements so that they can verify the condition of a rig's equipment & improve safety, thus reducing the number of accidents and protecting the asset.
Unblocking The Main Thread - Solving ANRs and Frozen Frames (Sinan KOZAK)
In the realm of Android development, the main thread is our stage, but too often it becomes a battleground where performance issues arise, leading to ANRs, frozen frames, and sluggish UIs. As we strive for excellence in user experience, understanding and optimizing the main thread becomes essential to prevent these common performance bottlenecks. We have strategies and best practices for keeping the main thread uncluttered. We'll examine the root causes of performance issues and techniques for monitoring and improving main thread health as well as app performance. In this talk, participants will walk away with practical knowledge of enhancing app performance by mastering the main thread. We'll share proven approaches to eliminating real-life ANRs and frozen frames to build apps that deliver a butter-smooth experience.
Natural Is The Best: Model-Agnostic Code Simplification for Pre-trained Large... (YanKing2)
Pre-trained Large Language Models (LLMs) have achieved remarkable successes in several domains. However, code-oriented LLMs are often computationally heavy, scaling quadratically with the length of the input code sequence. Toward simplifying the input program of an LLM, the state-of-the-art approach filters the input code tokens based on the attention scores given by the LLM. But the decision to simplify the input program should not rely on the attention patterns of an LLM, as these patterns are influenced by both the model architecture and the pre-training dataset. Since the model and dataset are part of the solution domain, not the problem domain where the input program belongs, the outcome may differ when the model is trained on a different dataset. We propose SlimCode, a model-agnostic code simplification solution for LLMs that depends on the nature of the input code tokens. In an empirical study on LLMs including CodeBERT, CodeT5, and GPT-4 for two main tasks, code search and summarization, we report that 1) the reduction ratio of code has a near-linear relation with the saving ratio in training time, 2) the impact of categorized tokens on code simplification can vary significantly, 3) the impact of categorized tokens on code simplification is task-specific but model-agnostic, and 4) the above findings hold for the paradigms of prompt engineering and interactive in-context learning; this study can reduce the cost of invoking GPT-4 by 24% per API query. Importantly, SlimCode simplifies the input code with its greedy strategy and can be up to 133 times faster than the state-of-the-art technique, with a significant improvement. This paper calls for a new direction in code-based, model-agnostic code simplification solutions to further empower LLMs.
Software Engineering and Project Management - Introduction to Project Management (Prakhyath Rai)
Introduction to Project Management: Introduction, Project and Importance of Project Management, Contract Management, Activities Covered by Software Project Management, Plans, Methods and Methodologies, some ways of categorizing Software Projects, Stakeholders, Setting Objectives, Business Case, Project Success and Failure, Management and Management Control, Project Management life cycle, Traditional versus Modern Project Management Practices.
Novel Advances in Measuring and Preventing Software Security Weakness: Continuous Trust Restoration (CTR)
1. The International Journal Of Engineering And Science (IJES)
|| Volume || 5 || Issue || 6 || Pages || PP -98-102 || 2016 ||
ISSN (e): 2319 – 1813 ISSN (p): 2319 – 1805
www.theijes.com The IJES Page 98
Novel Advances in Measuring and Preventing Software Security
Weakness: Continuous Trust Restoration (CTR)
Abdulrehman A. Mohamed and Dr. Michael W. Kimwele, PhD
School of Computing and Information Technology (SCIT), College of Pure and Applied Sciences (COPAS), Jomo Kenyatta University of Agriculture and Technology (Mombasa Campus), P. O. Box 94090–80107, Mombasa
School of Computing and Information Technology (SCIT), College of Pure and Applied Sciences (COPAS), Jomo Kenyatta University of Agriculture and Technology (JKUAT), P. O. Box 62000-00200, Nairobi, Kenya
--------------------------------------------------------ABSTRACT-----------------------------------------------------------
Software weaknesses in design, architecture, code and deployment have led to software vulnerabilities exploited by perpetrators. Although countermeasure tools such as patch management systems, firewalls and antivirus software have been developed, perpetrators possess advanced, sophisticated tools such as malware built on crypto-lock and crypto-wall technologies. Current countermeasure technologies are based on detect-and-respond models or risk management frameworks, which are no match for attacker technologies based on speed, such as machine-generated malware, and on precision or stealth, such as command-and-control node malware. Although much ink has been poured on advances in measuring and preventing software weakness under the detect-and-respond concept, this study is motivated to explore state-of-the-art advances specifically on the novel concept of Continuous Trust Restoration (CTR). Continuous Trust Restoration is a process of breaking down the attacker's kill chain of activities and restoring system trust. The CTR concept deploys speed, precision and stealth technologies in random route mutation, random host mutation, hypervisors, trusted boot, software identities and software-defined infrastructure. Moreover, to deploy these technologies the study further explores a common security architectural framework with software metrics such as CVE (Common Vulnerabilities and Exposures), CWE (Common Weakness Enumeration), CVSS (Common Vulnerability Scoring System), CWSS (Common Weakness Scoring System), and CAPEC (Common Attack Pattern Enumeration and Classification). Finally, the study recommends a software security countermeasures research paradigm shift from the current detect-and-respond models to the Continuous Trust Restoration concept, and from risk management frameworks to a Common Security Architectural Framework.
Keywords: Continuous Trust Restoration (CTR), CVE (Common Vulnerabilities and Exposures), CWE (Common Weakness Enumeration), CVSS (Common Vulnerability Scoring System), CWSS (Common Weakness Scoring System), and CAPEC (Common Attack Pattern Enumeration and Classification).
-------------------------------------------------------------------------------------------------------------------------------------
Date of Submission: 17 May 2016 Date of Accepted: 30 June 2016
I. INTRODUCTION
1.1 Background Information
The computer platform paradigm shift from mainframes, personal computers, the web, mobile, smartphones and tablets to the cloud has led to the digital revolution of the Internet of Things. Commercial industries have taken advantage of these technologies to manufacture cheap devices, ranging from wearables that gather personal data into the cloud, to vehicles fitted with Wi-Fi technology. Even though some of these devices were built with automated patching and update systems, the majority were built with no architectural security measures. These software weaknesses have introduced enormous vulnerabilities into cyberspace to be exploited by hackers. As a result, there is urgency in the academic world for research advances in measuring and preventing software weakness. The study is therefore motivated to explore two main research concepts: Continuous Trust Restoration (CTR) and a common security architectural framework.
According to (Boyle, 2015), Continuous Trust Restoration is a process whereby networks continually renew themselves, giving cyber defenders the advantage of speed, stealth and precision in getting ahead of the growing cyber threat from attackers. If an attacker scans the network to build a map, CTR technologies such as random host mutation should be able to restore the system to a state of trust within 24 hours.
It is reported by (R. Martin, 2015) that in today's world of the Internet of Things (IoT), everything is connected and co-dependent; hence, when a system is subverted through an unpatched vulnerability, a misconfiguration or a software weakness, everything becomes susceptible to attack. Currently, vendors use their own security
frameworks for developing devices, which makes it difficult to measure and prevent software weakness. Therefore, research advances in software weakness and measurement are directed towards developing a common security architectural framework.
II. CONTINUOUS TRUST RESTORATION (CTR)
2.1 Introduction
The control of the cyber domain has been a nightmare according to (Boyle, 2015), unlike control of the physical domains: the air domain, which brings the image of the F-35, the most advanced fighter aircraft in the world; the sea domain, which brings images of nuclear submarines running silently at depth while stealthily coordinating surface ships; the land domain, with soldiers equipped with laser designators and precision-guided bombs; and the space domain, which brings a myriad of complex satellite ground station systems. All these domains can be controlled, and they share the common design attributes of speed, precision and stealth, as their assets can move fast, unnoticed, and hit their targets with precision.
The cyber domain toolkit, by contrast, contains machine-generated malware introducing the strategy of speed, and command-and-control node malware with precision and stealth strategies, against which the current detect-and-respond strategies are no match (Bosire & Kimwele, 2015). Therefore, research advances in software weakness extended this idea into a concept called Continuous Trust Restoration (CTR), where the perpetrator's activities are quelled before they mature, using the strategies of speed, precision and stealth.
2.2 Hypervisor
It is elaborated by (Prasad, 2014) that a hypervisor (Virtual Machine Monitor) is a piece of software, firmware or hardware that gives the guest machines (virtual machines) the impression that they are operating on physical hardware. Its main purpose is to allow multiple "machines" to share a single hardware platform. It separates the operating system (OS) from the hardware by taking responsibility for allotting each running OS time with the underlying hardware. It acts as a traffic controller, allocating time on the CPU, memory, GPU and other hardware.
2.2.1 Types of Hypervisor
It is explained by (Sumastre, 2016) that there are two types of hypervisors. Bare-metal (native, or type I) hypervisors run on the host's hardware to control it and to manage the virtual machines on it; examples include Microsoft Hyper-V, VMware ESX/ESXi, Oracle VM Server for x86, KVM and Citrix XenServer. Embedded (hosted, or type II) hypervisors run as software on an operating system such as Windows, Linux or FreeBSD; examples include the Virtage hypervisor, VirtualBox and VMware. Therefore, native hypervisors run directly on the hardware, while a hosted hypervisor needs an operating system to do its work.
Although type I hypervisors are more secure than type II hypervisors, they are a target for hackers because they are designed to control all the resources of the hardware while managing all the virtual machines residing on it. Therefore, the Continuous Trust Restoration concept explores research advancements in developing hypervisors with the strategies of speed, precision and stealth.
2.3 Random Host Mutation (RHM)
It is reported by (Al-Shaer, Duan, & Jafarian, 2012) that RHM is an approach that turns end-hosts into untraceable moving targets by transparently mutating their IP addresses in an intelligent and unpredictable fashion, without sacrificing network integrity, manageability or performance. In RHM, moving-target hosts are assigned virtual IP addresses that change randomly and synchronously in a distributed fashion over time. To prevent disruption of active connections, the IP address mutation is managed by network appliances and is totally transparent to end-hosts. Although RHM can effectively defend against stealthy scanning, many types of worm propagation, and attacks that require reconnaissance for successful launching, IP addresses and ports are still prerequisites to many host and network attacks. Therefore, the Continuous Trust Restoration concept explores research advancements in developing RHM with the strategies of speed, precision and stealth.
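For illustration, the mutation scheme described above can be sketched as follows. The class name, the address pool and the mapping table are our own illustrative assumptions, not the implementation of (Al-Shaer, Duan, & Jafarian, 2012); a real deployment would perform the translation in network appliances, transparently to end-hosts.

```python
import random
import ipaddress

class RandomHostMutator:
    """Sketch of Random Host Mutation (RHM): each real host is reachable
    only through a short-lived virtual IP, re-drawn at random from an
    address pool at every mutation interval."""

    def __init__(self, real_hosts, vip_pool_cidr):
        self.real_hosts = list(real_hosts)
        self.pool = [str(ip) for ip in ipaddress.ip_network(vip_pool_cidr).hosts()]
        self.vip_of = {}
        self.mutate()

    def mutate(self):
        # Draw fresh, non-overlapping virtual IPs; old mappings are discarded,
        # so a network map built by an earlier scan is immediately stale.
        vips = random.sample(self.pool, len(self.real_hosts))
        self.vip_of = dict(zip(self.real_hosts, vips))

    def resolve(self, real_host):
        # The mutation gateway translates the current virtual IP to the real
        # host, keeping the change transparent to end-hosts.
        return self.vip_of[real_host]

rhm = RandomHostMutator(["web01", "db01"], "10.20.0.0/24")
current_vip = rhm.resolve("web01")
rhm.mutate()  # after a mutation interval, the virtual address is re-drawn
```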
2.4 Software identities
It is reported by (Rouse, 2013) that software identity management deals with identifying individuals in a system and controlling their access to resources within that system by associating user rights and restrictions with the established identity. At the core of an identity management system are policies defining which devices and users are allowed on the network and what a user can accomplish, including policy definition, reporting, alerts, alarms and other common management and operations requirements. An alarm might be triggered, for example, when a specific user tries to access a resource for which they do not have permission. Reporting produces an audit log documenting what specific activities were initiated.
The information in digital identities is used by computers to make decisions about how to interact with external agents. It allows a computer to answer two basic questions: which external agent is it interacting with, and has it interacted with this external agent in the past. The information contained in a digital identity allows these questions to be answered without the involvement of human operators. Hackers take advantage of vulnerabilities in the software implementing this concept to launch their attacks. Therefore, the Continuous Trust Restoration concept explores research advancements in developing software identities with the strategies of speed, precision and stealth.
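The policy, audit-log and alarm behaviour described above can be sketched as follows. This is a minimal illustration under our own assumptions (class and policy names are hypothetical), not the design of any particular identity management product.

```python
class IdentityManager:
    """Sketch of identity-based access control: policies bind an established
    identity to permitted resources, every attempt is written to an audit
    log, and a denied attempt raises an alarm."""

    def __init__(self, policies):
        self.policies = policies  # identity -> set of permitted resources
        self.audit_log = []
        self.alarms = []

    def access(self, identity, resource):
        allowed = resource in self.policies.get(identity, set())
        # Reporting: every attempt is recorded, granted or not.
        self.audit_log.append((identity, resource, "granted" if allowed else "denied"))
        if not allowed:
            # Alarm: a user tried to reach a resource outside their policy.
            self.alarms.append(f"{identity} attempted unauthorized access to {resource}")
        return allowed

idm = IdentityManager({"alice": {"payroll"}, "bob": {"wiki"}})
idm.access("alice", "payroll")  # granted and logged
idm.access("bob", "payroll")    # denied, logged, and an alarm is raised
```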
2.5 Random Route Mutation
It is asserted by (Duan, Al-Shaer, & Jafarian, 2013) that Random Route Mutation (RRM) is a technique that randomly changes the routes of multiple flows in a network simultaneously to defend against reconnaissance, eavesdropping and DoS attacks, while preserving end-to-end quality of service (QoS) properties. Although RRM can protect at least 90% of packet flows from attack by adversaries, as compared with static routes, the number of attacks by adversaries that eavesdrop on or launch denial-of-service (DoS) attacks against certain network flows has recently increased drastically. As a result of this gap, the Continuous Trust Restoration concept explores research advancements in developing random route mutation with the strategies of speed, precision and stealth.
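The core idea, random selection among routes subject to a QoS constraint, can be sketched as follows. The function, the latency bound and the candidate routes are illustrative assumptions; the cited work formulates route selection as a constraint-satisfaction problem over flow and network constraints rather than a simple filter.

```python
import random

def pick_route(candidate_routes, max_latency_ms):
    """Sketch of Random Route Mutation: for each flow, choose uniformly at
    random among precomputed routes that satisfy the end-to-end QoS bound,
    so an eavesdropper on any single path sees only a fraction of packets."""
    feasible = [route for route, latency in candidate_routes if latency <= max_latency_ms]
    if not feasible:
        raise ValueError("no route meets the QoS constraint")
    return random.choice(feasible)

# Three candidate paths from A to D with their end-to-end latencies (ms).
routes = [(["A", "B", "D"], 12), (["A", "C", "D"], 18), (["A", "E", "D"], 40)]
route = pick_route(routes, max_latency_ms=20)  # the 40 ms path is never chosen
```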
2.6 Software-Defined Infrastructure (SDI)
It is described by (Piff, 2015) that Software-Defined Infrastructure (SDI) is technical computing infrastructure entirely under the control of software, with no operator or human intervention. It operates independently of any hardware-specific dependencies and is programmatically extensible. The concept refers to the ability to define application requirements (both functional and non-functional) for the infrastructure and to have the physical implementation of the hardware required to deliver those requirements automatically derived and provisioned (Goldman, 2015).
Although SDI has the intelligence to manage the hardware, support workloads, automatically correct issues, and optimize performance to ensure security, adversaries have found vulnerabilities in this intelligent software with which to mount calculated attacks. This creates a gap which can be addressed by the Continuous Trust Restoration concept with the strategies of speed, precision and stealth.
2.7 Trusted boot
According to (Loisel & di Vito, 2015), Trusted Boot is a security feature that leverages the Unified Extensible Firmware Interface (UEFI) to block the loading and operation of any program or driver that has not been signed by an OS-provided key, and thus protects the integrity of the kernel, system files, boot-critical drivers, and even antimalware software. When a rootkit is encountered, UEFI will not allow it to boot; in other words, UEFI protects the pre-OS environment. Additionally, as the system boots, the OS detects whether any of the OS elements have been tampered with and automatically restores the unmodified versions.
Although strong protection is achieved by using read-only memory (ROM) or flash (EEPROM) memory internal to the microcontroller to store the root-of-trust software, adversaries have gained access to various systems through this route and inflicted harm. As a result, the countermeasure should be to employ the Continuous Trust Restoration concept with the strategies of speed, precision and stealth.
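The chain-of-trust mechanism described above can be sketched as follows. For simplicity this sketch compares SHA-256 digests held by a root of trust; real UEFI Secure Boot verifies cryptographic signatures made with OS-provided keys, so the digest comparison here stands in for signature verification, and the stage names and images are hypothetical.

```python
import hashlib

def verify_boot_chain(stages, trusted_hashes):
    """Sketch of a trusted boot chain: a root of trust holds known-good
    digests, and each stage is hashed and checked before it is allowed to
    run, so a tampered stage (e.g. a rootkit) halts the boot."""
    for name, image in stages:
        digest = hashlib.sha256(image).hexdigest()
        if trusted_hashes.get(name) != digest:
            return False, name  # boot refused at the first untrusted stage
    return True, None

# Hypothetical boot images and the digests the root of trust was provisioned with.
firmware = b"uefi-firmware-v2"
kernel = b"kernel-5.x"
trusted = {"firmware": hashlib.sha256(firmware).hexdigest(),
           "kernel": hashlib.sha256(kernel).hexdigest()}

ok, failed_stage = verify_boot_chain([("firmware", firmware), ("kernel", kernel)], trusted)
# ok is True; replacing the kernel bytes would make the check fail at "kernel"
```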
III. COMMON SECURITY ARCHITECTURAL FRAMEWORK
3.1 Introduction
According to (R. Martin, 2015), the idea of standards in software security is more urgent now than ever before, because cyber security is not an individual issue but a communal one, and its mitigation is communal. Therefore, the idea of a Common Security Architectural Framework is core to advances in measuring and preventing software weakness. These advances are witnessed in the collaboration of various stakeholders, both private and governmental. The Consortium for IT Software Quality (CISQ) developed an Automated Source Code Security Measure that predicts the vulnerability of source code to external attack. The measures were based on the Top 25 entries in the Common Weakness Enumeration (CWE) repository, which is managed by the MITRE Corporation. The measure was also endorsed by the Object Management Group (OMG), an international, open-membership, not-for-profit technology standards consortium.
It is further elaborated by (R. Martin, 2015) that at the center of the Common Security Architectural Framework is the MITRE Corporation, a non-profit organization that manages Federally Funded Research and Development Centers (FFRDCs) and supports various governmental organs such as the Department of Defense (DOD) and the National Institute of Standards and Technology (NIST), among others. Therefore, the study is motivated to explore the metrics supported by the MITRE Corporation as areas of advanced research in software measures and weakness. The metrics are CVE (Common Vulnerabilities and Exposures), CWE (Common Weakness
Enumeration), CVSS (Common Vulnerability Scoring System), CWSS (Common Weakness Scoring System),
and CAPEC (Common Attack Pattern Enumeration and Classification).
3.2 CVE (Common Vulnerabilities and Exposures)
According to (MITRE, 2016), Common Vulnerabilities and Exposures (CVE) is a dictionary of common names
(i.e., CVE Identifiers) for publicly known cyber-security vulnerabilities. CVE's common identifiers make it
easier to share data across separate network security databases and tools, and provide a baseline for evaluating
the coverage of an organization’s security tools. CVE was launched in 1999 when most information security
tools used their own databases with their own names for security vulnerabilities. The consequences were
potential gaps in security coverage and no effective interoperability among the disparate databases and tools. In
addition, each tool vendor used different metrics to state the number of vulnerabilities or exposures they detected, which meant there was no standardized basis for evaluating the tools.
The process of creating a CVE Identifier begins with the discovery of a potential security vulnerability. The vulnerability is then assigned a CVE Identifier by a CVE Numbering Authority (CNA) and posted on the CVE List on the CVE website by the CVE Editor. As part of its management of CVE, the MITRE Corporation functions as Editor and Primary CNA. In addition to approving the data sources and product coverage goals for entries on the CVE List, the CVE Editorial Board oversees this process.
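The common identifier format assigned through this process can be validated as sketched below. The format assumed here, `CVE-<four-digit year>-<sequence of four or more digits>`, follows the CVE identifier syntax (sequences longer than four digits have been permitted since the 2014 syntax change); the function name is ours.

```python
import re

# CVE identifier: "CVE-", a four-digit year, then a sequence of 4+ digits.
CVE_ID = re.compile(r"^CVE-(\d{4})-(\d{4,})$")

def parse_cve_id(identifier):
    """Sketch of validating and parsing a CVE Identifier as assigned by a
    CVE Numbering Authority; returns (year, sequence) on success."""
    match = CVE_ID.match(identifier)
    if not match:
        raise ValueError(f"not a well-formed CVE identifier: {identifier!r}")
    year, sequence = match.groups()
    return int(year), int(sequence)

year, seq = parse_cve_id("CVE-2016-0001")  # -> (2016, 1)
```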
3.3 CWE (Common Weakness Enumeration)
It is described by (Rouse, 2013) that Common Weakness Enumeration (CWE) is a universal online dictionary of weaknesses that have been found in computer software. The dictionary is maintained by the MITRE
Corporation and can be accessed free on a worldwide basis. The purpose of CWE is to facilitate the effective
use of tools that can identify, find and resolve bugs, vulnerabilities and exposures in computer software before
the programs are publicly distributed or sold.
As a result, it is also described by (Needham, 2015) that CISQ developed an Automated Source Code Security Measure, and its Automated Quality Characteristic Measures conform to the definitions of the following quality characteristics in ISO/IEC 25010. Security: identifies critical security violations in the source code, drawn from the Top 25 security weaknesses in the Common Weakness Enumeration (CWE) repository. Reliability: identifies critical violations of availability, fault tolerance, and recoverability of software. Performance Efficiency: identifies critical violations of response time, as well as processor, memory, and other resource utilization by the software. Maintainability: identifies critical violations of modularity, architectural compliance, reusability, analyzability, and changeability in software.
3.4 CVSS (Common Vulnerability Scoring System)
It is asserted by (Czagan, 2013) that, the Common Vulnerability Scoring System (CVSS) is a free
and open industry standard for assessing the severity of computer system security vulnerabilities. CVSS
attempts to assign severity scores to vulnerabilities, allowing responders to prioritize responses and resources
according to threat. Scores are calculated based on a formula that depends on several metrics that approximate
ease of exploit and the impact of exploit. Scores range from 0 to 10, with 10 being the most severe. While many
utilize only the CVSS Base score for determining severity, Temporal and Environmental scores also exist, to
factor in availability of mitigations and how widespread vulnerable systems are within an organization,
respectively.
The CVSS assessment measures three areas of concern: Base metrics for qualities intrinsic to a vulnerability, Temporal metrics for characteristics that evolve over the lifetime of a vulnerability, and Environmental metrics for vulnerabilities that depend on a particular implementation or environment. A numerical score is generated for each of these metric groups. A vector string (or simply "vector" in CVSS) represents the values of all the metrics as a block of text.
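The Base metric calculation described above can be sketched in code using the published CVSS v2 base equation; the function name and the choice of example metric values are ours. The inputs are the standard v2 metric weights for access vector (AV), access complexity (AC), authentication (Au), and the confidentiality, integrity and availability impacts (C, I, A).

```python
def cvss2_base_score(av, ac, au, c, i, a):
    """Sketch of the CVSS v2 base-score formula: scores run 0-10, with 10
    being the most severe. Inputs are the standard v2 metric weights."""
    impact = 10.41 * (1 - (1 - c) * (1 - i) * (1 - a))
    exploitability = 20 * av * ac * au
    f_impact = 0 if impact == 0 else 1.176
    return round(((0.6 * impact) + (0.4 * exploitability) - 1.5) * f_impact, 1)

# Standard v2 weights (subset): AV network = 1.0, AC low = 0.71, Au none = 0.704,
# and C/I/A impact: none = 0.0, partial = 0.275, complete = 0.66.
score = cvss2_base_score(av=1.0, ac=0.71, au=0.704, c=0.275, i=0.275, a=0.275)
# -> 7.5, the familiar base score for AV:N/AC:L/Au:N/C:P/I:P/A:P
```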
3.5 CWSS (Common Weakness Scoring System)
According to (B. Martin, 2014) the Common Weakness Scoring System (CWSS) provides a mechanism for
prioritizing software weaknesses in a consistent, flexible, open manner. It is a collaborative, community-based
effort that is addressing the needs of its stakeholders across government, academia, and industry. CWSS is
organized into three metric groups: base finding, attack surface, and environmental. Each group contains
multiple metrics - also known as factors - that are used to compute a CWSS score for a weakness.
Various weakness scoring systems have been used or proposed over the years. Automated tools such as source
code scanners typically perform their own custom scoring; as a result, multiple tools can produce inconsistent
scores for the same weakness. The Common Vulnerability Scoring System (CVSS) is perhaps the most similar
scoring system. However, it has some important limitations that make it difficult to adapt to software security
assessment.
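The way the three CWSS metric groups combine can be sketched as follows. The published CWSS v1.0.1 specification derives each subscore from weighted factors within its group; this sketch takes the subscores as given inputs and shows only how they combine (the Base Finding subscore on a 0-100 scale, scaled by the Attack Surface and Environmental subscores, each in 0-1), so the function name and example values are ours.

```python
def cwss_score(base_finding, attack_surface, environmental):
    """Sketch of how CWSS combines its three metric groups: the Base
    Finding subscore (0-100) is scaled by the Attack Surface and
    Environmental subscores (each 0-1), giving a 0-100 weakness score."""
    assert 0 <= base_finding <= 100
    assert 0 <= attack_surface <= 1 and 0 <= environmental <= 1
    return round(base_finding * attack_surface * environmental, 1)

# A serious weakness on a largely exposed attack surface in a critical environment:
score = cwss_score(base_finding=90.0, attack_surface=0.85, environmental=1.0)
# -> 76.5
```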
3.6 CAPEC (Common Attack Pattern Enumeration and Classification)
According to (Picuira, 2016), CAPEC (Common Attack Pattern Enumeration and Classification) is a community-developed formal list of common attack patterns. Attack patterns are descriptions of common methods for exploiting software, providing the attacker's perspective and guidance on ways to mitigate their effect. They derive from the concept of design patterns, applied in a destructive rather than constructive context, and are generated from in-depth analysis of specific real-world exploit examples. Security-Database uses CVEs along with the appropriate CAPECs, if available.
"CAPEC-compatible" means that a tool, Web site, database, or other security product or service uses CAPEC
names in a manner that allows it to be cross-referenced with other products that employ CAPEC names.
Security-Database is creating a new generation of complete XML feed. The complete XML feedwill enumerate
all known information on vulnerability (CVE, CPE, OVAL ID, CVSS, CWE, CAPEC, CCE, Vendor Patches
etc.)
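CAPEC-compatible cross-referencing can be sketched as follows. The mapping data here is hypothetical illustration only (the CVE identifiers are placeholders, not real entries); in practice the mapping would be populated from feeds such as the XML enumerations mentioned above. CAPEC-66 (SQL Injection) and CAPEC-63 (Cross-Site Scripting) are real CAPEC attack-pattern names.

```python
# Hypothetical CVE -> CAPEC mapping, standing in for a real vulnerability feed.
cve_to_capec = {
    "CVE-2016-0001": ["CAPEC-66"],              # SQL Injection pattern
    "CVE-2016-0002": ["CAPEC-63", "CAPEC-66"],  # XSS and SQL Injection patterns
}

def attack_patterns_for(cve_id):
    """Sketch of 'CAPEC-compatible' cross-referencing: given a vulnerability's
    CVE identifier, return the CAPEC attack-pattern names associated with it,
    so defenses can be reviewed from the attacker's perspective."""
    return cve_to_capec.get(cve_id, [])

patterns = attack_patterns_for("CVE-2016-0002")  # -> ["CAPEC-63", "CAPEC-66"]
```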
IV. RECOMMENDATION
The review revealed a shift away from the traditional reactive detect-and-respond principles to more proactive Continuous Trust Restoration principles. Therefore, the study recommends further exploration of a software security countermeasures research paradigm shift from the current detect-and-respond models to the Continuous Trust Restoration concept, and from risk management frameworks to a Common Security Architectural Framework.
ACKNOWLEDGMENT
I would like to acknowledge and appreciate the material and moral support that I have always received from my
family, colleagues and friends. Special thanks to my lecturer and mentor Dr. Michael W. Kimwele, Deputy
Director, School of Computing and Information Technology (SCIT), College of Pure and Applied Sciences (COPAS), Jomo Kenyatta University of Agriculture and Technology (JKUAT), P. O. Box 62000–00200, Nairobi, Kenya, for his kind words of encouragement and guidance.
REFERENCES
[1]. Al-Shaer, E., Duan, Q., & Jafarian, J. H. (2012). Random Host Mutation for Moving Target Defense. In A. D. Keromytis & R. D.
Pietro (Eds.), Security and Privacy in Communication Networks (pp. 310–327). Springer Berlin Heidelberg. Retrieved from
http://link.springer.com/chapter/10.1007/978-3-642-36883-7_19
[2]. Bosire, A., & Kimwele, M. (2015). Advances in Measuring and Preventing Software Security Weaknesses. International Journal
of Advanced Research in Computer Science and Software Engineering Research Paper, 5(12), 150–154.
[3]. Boyle, V. (2015). The Cutting Edge of Cybersecurity Research. In How Cyber Defenders Can Get Ahead: Speed, Stealth, and Precision. Northrop Grumman, Washington, D.C. Retrieved from
http://www.northropgrumman.com/Capabilities/Cybersecurity/Pages/CSM_Passcode_Cybersecurity_Event.aspx
[4]. Czagan, D. (2013). Common Vulnerability Scoring System - InfoSec Resources. Retrieved June 17, 2016, from
http://resources.infosecinstitute.com/common-vulnerability-scoring-system/
[5]. Duan, Q., Al-Shaer, E., & Jafarian, H. (2013). Efficient Random Route Mutation considering flow and network constraints. In
2013 IEEE Conference on Communications and Network Security (CNS) (pp. 260–268).
http://doi.org/10.1109/CNS.2013.6682715
[6]. Goldman, E. (2015). Where Are You on the Road to Software-Defined Infrastructure? | CIO. Retrieved June 17, 2016, from
http://www.cio.com/article/2931083/infrastructure/where-are-you-on-the-road-to-software-defined-infrastructure.html
[7]. Loisel, Y., & di Vito, S. (2015). Securing the IoT: Part 2 - Secure boot as root of trust. Retrieved June 16, 2016, from
http://www.embedded.com/design/safety-and-security/4438300/Securing-the-IoT--Part-2---Secure-boot-as-root-of-trust-
[8]. Martin, B. (2014). CWE - Common Weakness Scoring System (CWSS). Retrieved June 17, 2016, from
https://cwe.mitre.org/cwss/cwss_v1.0.1.html
[9]. Martin, R. (2015). Latest Advances in Cybersecurity and the NEW CISQ Security Standard | CISQ - Consortium for IT Software
Quality. In Latest Advances in Cybersecurity and the NEW OMG/CISQ Security Standard. Washington, D.C. Retrieved from
http://it-cisq.org/cisq-webcast-latest-advances-in-cybersecurity-and-the-new-cisq-security-standard/
[10]. MITRE. (2016). CVE - About CVE. Retrieved June 17, 2016, from https://cve.mitre.org/about/
[11]. Needham, M. A. (2015). CISQ Announces New Measures for Software Quality. Retrieved June 17, 2016, from
http://www.darkreading.com/application-security/cisq-announces-new-measures-for-software-quality-/d/d-id/1322198
[12]. Picuira, B. (2016). CAPEC Compatibility - Security Database. Retrieved June 17, 2016, from https://www.security-
database.com/about.php?type=capec
[13]. Piff, S. (2015). CIO-Asia - Drive for innovation boosts demand for software-defined technologies: IDC. Retrieved June 17, 2016,
from http://www.cio-asia.com/mgmt/leadership-and-mgmt/drive-for-innovation-boosts-demand-for-software-defined-
technologies-idc/
[14]. Prasad, D. (2014). Comparison Type 1 vs Type 2 Hypervisor ~ GoLinuxHub. Retrieved from
http://www.golinuxhub.com/2014/07/comparison-type-1-vs-type-2-hypervisor.html
[15]. Rouse, M. (2013). What is identity management (ID management) ? - Definition from WhatIs.com. Retrieved June 16, 2016,
from http://searchsecurity.techtarget.com/definition/identity-management-ID-management
[16]. Sumastre, M. G. (2016). Virtualization 101: What is a Hypervisor? Retrieved from https://www.pluralsight.com/blog/it-
ops/what-is-hypervisor