SlideShare a Scribd company logo
Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering
          Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com
                      Vol. 2, Issue4, July-August 2012, pp.1857-1860
        A New Generic Architecture For Time Series Prediction

           *Nageswara Rao Puli, **Nagul Shaik,***M.Kishore Kumar
                                   *Dept. of Computer Science & Engg
                                Nimra Institute of Science and Technology
                                            Vijayawada, India
                            **Asst.Professor, Dept. of Computer Science & Engg
                                Nimra Institute of Science and Technology
                                            Vijayawada, India
                            ***Professor & HOD Dept. of Computer Science & Engg
                                Nimra Institute of Science and Technology
                                            Vijayawada, India



Abstract
         Rapidly evolving businesses generate          xt 1 , xt 2 , data values. The goal is to observe or
massive amounts of time-stamped data sequences
                                                       model the existing data series to enable future
and cause a demand for both univari- ate and
                                                       unknown data values to be forecasted accurately.
multivariate time series forecasting. For such data,
                                                       Examples of data series include financial data series
traditional     predictive    models     based    on
                                                       (stocks, indices, rates, etc.), physically observed data
autoregression are often not sufficient to capture
                                                       series (sunspots, weather, etc.), and mathematical
complex      non-linear     relationships    between
                                                       data series (Fibonacci sequence, integrals of
multidimensional fea- tures and the time series
                                                       differential equations, etc.). The phrase “time series”
outputs. In order to exploit these relationships for
                                                       generically refers to any data series, whether or not
improved time series forecasting while also better
                                                       the data are dependent on a certain time increment.
dealing with a wider variety of prediction
                                                       Throughout the literature, many techniques have been
scenarios, a forecasting system requires a flexible
                                                       implemented to perform time series forecasting. This
and generic architecture to accommodate and tune
                                                       paper will focus on two techniques: neural networks
various individual predictors as well as
                                                       and k-nearest-neighbor. This paper will attempt to
combination methods.
                                                       fill a gap in the abundant neural network time series
         In reply to this challenge, an architecture
                                                       forecasting literature, where testing arbitrary neural
for combined, multilevel time series prediction is
                                                       networks on arbitrarily complex data series is
proposed, which is suitable for many different
                                                       common, but not very enlightening. This paper
universal regressors and combination methods. The
                                                       thoroughly analyzes the responses of specific neural
key strength of this architecture is its ability to
                                                       network configurations to artificial data series, where
build a diversified ensemble of individual
                                                       each data series has a specific characteristic. A better
predictors that form the input to a multilevel
                                                       understanding of what causes the basic neural
selection and fusion process before the final
                                                       network to become an inadequate forecasting
optimised      output    is    obtained.   Excellent
                                                       technique will be gained. In addition, the influence
generalisation ability is achieved due to the highly
                                                       of data preprocessing will be noted. The forecasting
boosted complementarity of indi- vidual models
                                                       performance of k-nearest-neighbor, which is a much
further enforced through crossvalidation-linked
                                                       simpler forecasting technique, will be compared to
training on exclusive data subsets and ensemble
                                                       the neural networks’ performance. Finally, both
output post-processing. In a sample configuration
                                                       techniques will be used to forecast a real data series.
with basic neural network predictors and a mean
combiner, the proposed system has been evaluated
in different scenarios and showed a clear prediction   Difficulties
performance gain.                                               Several difficulties can arise when
                                                       performing time series forecasting. Depending on the
                                                       type of data series, a particular difficulty may or may
Index Terms— Time series forecasting, combining
                                                       not exist. A first difficulty is a limited quantity of
predictors, regression, ensembles, neural networks,
                                                       data. With data series that are observed, limited data
diversity
                                                       may be the foremost difficulty. For example, given a
                                                       company’s stock that has been publicly traded for one
Introduction                                           year, a very limited amount of data are available for
         Time series forecasting, or time series       use by the forecasting technique.
prediction, takes an existing series of data
xt n , , xt 2 , xt 1 , xt and forecasts the
                                                                                               1857 | P a g e
Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering
            Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com
                        Vol. 2, Issue4, July-August 2012, pp.1857-1860
         A second difficulty is noise. Two types of                                         data series.Another application is forecasting
noisy data are (1) erroneous data points and (2)                                            undesirable, yet unavoidable, events to preemptively
components that obscure the underlying form of the                                          lessen their impact. At the time of this writing, the
data series. Two examples of erroneous data are                                             sun’s cycle of storms, called solar maximum, is of
measurement errors and a change in measurement                                              concern because the storms cause technological
methods or metrics. In this paper, we will not be                                           disruptions on Earth. The sunspots data series, which
concerned about erroneous data points. An example                                           is data counting dark patches on the sun and is related
of a component that obscures the underlying form of                                         to the solar storms, shows an eleven-year cycle of
the data series is an additive high-frequency                                               solar maximum activity, and if accurately modeled,
component. The technique used in this paper to                                              can forecast the severity of future activity. While
reduce or remove this type of noise is the moving                                           solar activity is unavoidable, its impact can be
average. The data series , xt 4 , xt 3 , xt 2 , xt 1 , xt                              lessened with appropriate forecasting and proactive
                                                                                            action.
becomes
                                                                                                      Finally, many people, primarily in the
, ( xt4  xt3  xt2 ) / 3, ( xt3  xt2  xt1 ) / 3, ( xt2  xt1  xt ) / 3   financial markets, would like to profit from time
after taking a moving average with an interval i of                                         series forecasting. Whether this is viable is most
three. Taking a moving average reduces the number                                           likely a never-to-be-resolved question. Nevertheless
of data points in the series by i  1 .                                                     many products are available for financial forecasting.
         A third difficulty is nonstationarity, data that                                   Difficulties inherent in time series forecasting and the
do not have the same statistical properties (e.g., mean                                     importance of time series forecasting are presented
and variance) at each point in time. A simple                                               next. Then, neural networks and k-nearest-neighbor
example of a nonstationary series is the Fibonacci                                          are detailed. Section Error! Reference source not
sequence: at every step the sequence takes on a new,                                        found. presents related work.           Section Error!
higher mean value. The technique used in this paper                                         Reference source not found. gives an application
to make a series stationary in the mean is first-                                           level description of the test-bed application, and
                                                                                            Section Error! Reference source not found.
differencing. The data series , xt 3 , xt 2 , xt 1 , xt                                 presents an empirical evaluation of the results
becomes                                                                                     obtained with the application.A time series is a
, ( xt 2  xt 3 ), ( xt 1  xt 2 ), ( xt  xt 1 )                          after      sequence of observations of a random variable.
                                                                                            Hence, it is a stochasticprocess. Examples include the
taking the first-difference. This usually makes a data
                                                                                            monthly demand for a product, the annual
series stationary in the mean. If not, the second-
                                                                                            freshmanenrollment in a department of a university,
difference of the series can be taken. Taking the first-
                                                                                            and the daily volume of flows in a river.Forecasting
difference reduces the number of data points in the
                                                                                            time series data is important component of operations
series by one.
                                                                                            research because thesedata often provide the
         A fourth difficulty is forecasting technique
                                                                                            foundation for decision models. An inventory model
selection. From statistics to artificial intelligence,
                                                                                            requiresestimates of future demands, a course
there are myriad choices of techniques. One of the
                                                                                            scheduling and staffing model for a universityrequires
simplest techniques is to search a data series for
                                                                                            estimates of future student inflow, and a model for
similar past events and use the matches to make a
                                                                                            providing warnings to thepopulation in a river basin
forecast. One of the most complex techniques is to
                                                                                            requires estimates of river flows for the immediate
train a model on the series and use the model to make
                                                                                            future.Time series analysis provides tools for
a forecast. K-nearest-neighbor and neural networks
                                                                                            selecting a model that can be used to forecastof future
are examples of the first and second techniques,
                                                                                            events. Modeling the time series is a statistical
respectively.
                                                                                            problem. Forecasts are used incomputational
                                                                                            procedures to estimate the parameters of a model
1)       Importance
                                                                                            being used to allocatedlimited resources or to
         Time series forecasting has several
                                                                                            describe random processes such as those mentioned
important applications. One application is preventing
                                                                                            above. Timeseries models assume that observations
undesirable events by forecasting the event,
                                                                                            vary according to some probability distributionabout
identifying the circumstances preceding the event,
                                                                                            an underlying function of time.styles are built-in;
and taking corrective action so the event can be
                                                                                            examples of the type styles are provided throughout
avoided. At the time of this writing, the Federal
                                                                                            this document and are identified in italic type, within
Reserve Committee is actively raising interest rates to
                                                                                            parentheses, following the example. PLEASE DO
head off a possible inflationary economic period.
                                                                                            NOT RE-ADJUST THESE MARGINS.
The Committee possibly uses time series forecasting
with many data series to forecast the inflationary
period and then acts to alter the future values of the



                                                                                                                                    1858 | P a g e
Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering
           Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com
                       Vol. 2, Issue4, July-August 2012, pp.1857-1860
                                                              . Each output layer unit performs the calculation in
         In this section and the next, subscripts c, p,       Equation II.1 on its inputs and transfers the result
and n will identify units in the current layer, the           (Oc) to a network output.
previous layer, and the next layer, respectively.
                                                              Equation II.1 Activation function of an output layer
When the network is run, each hidden layer unit
                                                              unit.
performs the calculation in Error! Reference source
                                                              Each output layer unit performs the calculation in
not found. on its inputs and transfers the result (Oc)
                                                              Equation II.1 on its inputs and transfers the result
to the next layer of units.
                                                              (Oc) to a network output.
                                                              Equation II.2 Activation function of an output layer
                                                              unit.


                                                                           P                     
                                                               Oc  hOutput  ic , p wc , p  bc  where hOutput( x )  x
                                                                            p1                  
                                                                          Oc is the output of the current output layer
                                                                unit c, P is the number of units in the previous
               P                                             1
Oc  hHidden   ic , p wc , p  bc  where hHidden ( x )      hidden layer, ic,p is an input to unit c from the
                p1                                       1  ex
                                                                previous hidden layer unit p, wc,p is the weight
                                                                modifying the connection from unit p to unit c, and
Fig: Forwarding the node values                                 bc is the bias. For this research, hOutput(x) is a
                                                                linear activation function1
Oc is the output of the current hidden layer unit c, P is
                                                                K-Nearest-Neighbor
                                                                          In contrast to the complexity of the neural
either the number of units in the previous hidden               network forecasting technique, the simpler k-nearest-
layer or number of network inputs, ic,p is an input to          neighbor forecasting technique is also implemented
unit c from either the previous hidden layer unit p or          and tested. K-nearest-neighbor is simpler because
network input p, wc,p is the weight modifying the               there is no model to train on the data series. Instead,
connection from either unit p to unit c or from input p         the data series is searched for situations similar to the
to unit c, and bc is the bias.                                  current one each time a forecast needs to be made.
In Error! Reference source not found., hHidden(x) is            To make the k-nearest-neighbor process description
                                                                easier, several terms will be defined. The final data
                                                                points of the data series are the reference, and the
                                                                length of the reference is the window size. The data
                                                                series without the last data point is the shortened data
                                                                series. To forecast the data series’ next data point,
                                                                the reference is compared to the first group of data
                                                                points in the shortened data series, called a candidate,
                                                                and an error is computed. Then the reference is
                                                                moved one data point forward to the next candidate
the sigmoid activation function of the unit and is              and another error is computed, and so on. All errors
charted in Error! Reference source not found..                  are stored and sorted.         The smallest k errors
                                                                correspond to the k candidates that closest match the
                                                                Fig: Nearest Neighbor Transformation
Fig: Prediction Graph for Forwarding node

Other types of activation functions exist, but the
sigmoid was implemented for this research. To avoid
saturating the activation function, which makes
training the network difficult, the training data must
be scaled appropriately. Similarly, before training, the
weights and biases are initialized to appropriately
scaled values.                                                         reference. Finally, the forecast will be the
                                                              average of the k data points that follow these
                                                              candidates. Then, to forecast the next data point, the




                                                                                                        1859 | P a g e
Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering
           Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com
                       Vol. 2, Issue4, July-August 2012, pp.1857-1860
process is repeated with the previously forecasted data     [4].  Hebb, D. O. (1949). The Organization of
point appended to the end of the data series.                     Behavior: A Neuropsychological Theory.
                                                                  New York: Wiley & Sons.
II. CONCLUSION                                              [5]. Kingdon, J. (1997). Intelligent Systems and
          Section Error! Reference source not found.              Financial Forecasting. New York: Springer-
introduced time series forecasting, described the work            Verlag.
presented in the typical neural network paper, which        [6]. Lawrence, S., Tsoi, A. C., & Giles, C. L.
justified this paper, and identified several difficulties         (1996). Noisy Time Series Prediction Using
associated with time series forecasting. Among these              Symbolic Representation and Recurrent
difficulties, noisy and nonstationary data were                   Neural Network Grammatical Inference
investigated further in this paper. Section Error!                [Online].                          Available:
Reference source not found. also presented feed-                  http://www.neci.nj.nec.com/homepages/lawr
forward neural networks and backpropagation                       ence/papers/finance-tr96/latex.html [March
training, which was used as the primary time series               27, 2000].
forecasting technique in this paper. Finally, k-nearest-    [7]. McCulloch, W. S., & Pitts, W. H. (1943). A
neighbor was presented as an alternative forecasting              Logical Calculus of the Ideas Imminent in
technique.                                                        Nervous Activity. Bulletin of Mathematical
Section Error! Reference source not found. briefly                Biophysics, 5, 115-133.
discussed previous time series forecasting papers. The      [8]. Minsky, M., & Papert, S. (1969).
most notable of these being the paper by Drossu and               Perceptrons:      An      Introduction       to
Obradovic (1996), who presented compelling research               Computational Geometry. Cambridge, MA:
combining stochastic techniques and neural networks.              MIT Press.
Also of interest were the paper by Geva (1998) and          [9]. Rosenblatt, F. (1962).          Principles of
the book by Kingdon (1997), which took significantly              Neurodynamics: Perceptrons and the Theory
more sophisticated approaches to time series                      of Brain Mechanisms. Washington, D. C.:
forecasting.                                                      Spartan.
Section Error! Reference source not found.                  [10]. Rumelhart, D. E., Hinton, G. E., &
presented Forecaster and went through several                     Williams, R. J. (1986). Learning Internal
important aspects of its design, including parsing data           Representations by Error Propagation. In D.
files, using the Wizard to create networks, training              E. Rumelhart, et al. (Eds.), Parallel
networks, and forecasting using neural networks and               Distributed Processing: Explorations in the
k-nearest-neighbor.                                               Microstructures      of     Cognition,       1:
          Section Error! Reference source not found.              Foundations, 318-362. Cambridge, MA:
presented the crux of the paper. First, the data series           MIT Press.
used in the evaluation were described, and then             [11]. Torrence, C., & Compo, G. P. (1998). A
parameters and procedures used in forecasting were                Practical Guide to Wavelet Analysis
given. Among these was a method for selecting the                 [Online].      Bulletin of the American
number of neural network inputs based on data series              Meteorological     Society.        Available:
characteristics (also applicable to selecting the                 http://paos.colorado.edu/research/wavelets/
window size for k-nearest-neighbor), a training                   [July 2, 2000].
heuristic, and a metric for making quantitative forecast    [12]. Zhang, X., & Thearling, K. (1994). Non-
comparisons. Finally, a variety of charts and tables,             Linear Time-Series Prediction by Systematic
accompanied by many empirical observations, were                  Data Exploration on a Massively Parallel
presented for networks trained heuristically and                  Computer        [Online].          Available:
simply and for k-nearest-neighbor.                                http://www3.shore.net/~kht/text/sfitr/sfitr.ht
                                                                  m [March 27, 2000].
REFERENCES
  [1].   Drossu, R., & Obradovic, Z. (1996). Rapid
         Design of Neural Networks for Time Series
         Prediction. IEEE Computational Science &
         Engineering, Summer 1996, 78-89.
  [2].   Geva, A. (1998). ScaleNet—Multiscale
         Neural-Network Architecture for Time
         Series Prediction. IEEE Transactions on
         Neural Networks, 9(5), 1471-1482.
  [3].   Gonzalez, R. C. & Woods, R. E. (1993).
         Digital Image Processing.     New York:
         Addison-Wesley.


                                                                                                 1860 | P a g e

Recommended for you

A survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor NetworkA survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor Network

This document summarizes various object tracking techniques in wireless sensor networks. It begins with an introduction to wireless sensor networks and object tracking applications. It then classifies network architectures for object tracking into four categories: naive architecture, tree-based architecture, cluster-based architecture, and hybrid architecture. For each category, several representative algorithms are described in terms of how they perform object detection, data transmission, energy efficiency, and other metrics. Overall, most algorithms aim to minimize energy consumption by activating only a subset of sensor nodes for tracking and using techniques like clustering, prediction, and dynamic scheduling of active sensor nodes.

irjet
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...

This document proposes a structureless and efficient data aggregation technique for wireless sensor networks that ensures data integrity with low transmission overhead. It introduces a concept where the base station can recover individual sensor data even after aggregation by cluster heads. This allows the base station to verify data integrity and authenticity, as well as perform any desired aggregation functions. It then proposes a structure-free scheme using intracluster and intercluster encryption and aggregation procedures. This scheme aims to address limitations of previous work such as high transmission costs and inability to query individual data values, while maintaining security and scalability. The document analyzes security and scalability aspects and argues the proposed scheme offers improved performance and efficiency for data aggregation in wireless sensor networks.

sensor networksdata integrityprivacy homomorphism encryption
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDSAn Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS

Avionics Sensor Health Assessment is a sub-discipline of Integrated Vehicle Health Management (IVHM), which relates to the collection of sensor data, distributing it to diagnostics/prognostics algorithms, detecting run-time anomalies, and scheduling maintenance procedures. Real-time availability of the sensor health diagnostics for aircraft (manned or unmanned) subsystems allows pilots and operators to improve operational decisions. Therefore, avionics sensor health assessments are used extensively in the mil-aero domain. As avionics platforms consist of a variety of hardware and software components, standards such as Open System Architecture for Condition-Based Maintenance (OSA-CBM) have emerged to facilitate integration and interoperability. However, OSA-CBM is a platform-independent standard that provides little guidance for avionics sensor health monitoring, which requires onboard health assessment of airborne sensors in real-time. In this paper, we present a distributed architecture for avionics sensor health assessment using the Data Distribution Service (DDS), an Object Management Group (OMG) standard for developing loosely coupled high-performance real-time distributed systems. We use the data-centric publish/subscribe model supported by DDS for data acquisition, distribution, health monitoring, and presentation of diagnostics. We developed a normalized data model for exchanging the sensor and diagnostics information in a global data space in the system. Moreover, Extensible and Dynamic Topic Types (XTypes) specification allows incremental evolution of any subset of system components without disrupting the overall health monitoring system. We believe, the DDS standard and in particular RTI Connext DDS, is a viable technology for implementing OSA-CBM for avionics systems due to its real-time characteristics and extremely low resource requirements. RTI Connext DDS is being used in other major avionics programs, such as FACE™ and UCS. We evaluated our approach to sensor health assessment in a hardware-in-the-loop simulation of an Inertial Measurement Unit (IMU) onboard a simulated General Atomics MQ-9 Reaper UAV. Our proof-of-concept effectively demonstrates real-time health monitoring of avionics sensors using a Bayesian Network –based analysis running on an extremely low-power and lightweight processing unit.

dds avionics sensors health

More Related Content

What's hot

Reliable and Efficient Data Acquisition in Wireless Sensor Network
Reliable and Efficient Data Acquisition in Wireless Sensor NetworkReliable and Efficient Data Acquisition in Wireless Sensor Network
Reliable and Efficient Data Acquisition in Wireless Sensor Network
IJMTST Journal
 
An implementation of recovery algorithm for fault nodes in a wireless sensor ...
An implementation of recovery algorithm for fault nodes in a wireless sensor ...An implementation of recovery algorithm for fault nodes in a wireless sensor ...
An implementation of recovery algorithm for fault nodes in a wireless sensor ...
eSAT Publishing House
 
Semantics in Sensor Networks
Semantics in Sensor NetworksSemantics in Sensor Networks
Semantics in Sensor Networks
Oscar Corcho
 
A survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor NetworkA survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor Network
IRJET Journal
 
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET Journal
 
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDSAn Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS
Sumant Tambe
 
Ijarcet vol-2-issue-2-756-760
Ijarcet vol-2-issue-2-756-760Ijarcet vol-2-issue-2-756-760
Ijarcet vol-2-issue-2-756-760
Editor IJARCET
 
A overview on WAMS/PMU.
A overview on WAMS/PMU.A overview on WAMS/PMU.
A overview on WAMS/PMU.
Rahul Singh
 
40120130406006
4012013040600640120130406006
40120130406006
IAEME Publication
 
E FFICIENT E NERGY U TILIZATION P ATH A LGORITHM I N W IRELESS S ENSOR...
E FFICIENT  E NERGY  U TILIZATION  P ATH  A LGORITHM  I N  W IRELESS  S ENSOR...E FFICIENT  E NERGY  U TILIZATION  P ATH  A LGORITHM  I N  W IRELESS  S ENSOR...
E FFICIENT E NERGY U TILIZATION P ATH A LGORITHM I N W IRELESS S ENSOR...
IJCI JOURNAL
 
IRJET - Detection of False Data Injection Attacks using K-Means Clusterin...
IRJET -  	  Detection of False Data Injection Attacks using K-Means Clusterin...IRJET -  	  Detection of False Data Injection Attacks using K-Means Clusterin...
IRJET - Detection of False Data Injection Attacks using K-Means Clusterin...
IRJET Journal
 
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
IJMTST Journal
 
50120140505014
5012014050501450120140505014
50120140505014
IAEME Publication
 
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
pijans
 
Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...
Sayed Abulhasan Quadri
 
Report on Enhancing the performance of WSN
Report on Enhancing the performance of WSNReport on Enhancing the performance of WSN
Report on Enhancing the performance of WSN
Dheeraj Kumar
 
Throughput analysis of energy aware routing protocol for real time load distr...
Throughput analysis of energy aware routing protocol for real time load distr...Throughput analysis of energy aware routing protocol for real time load distr...
Throughput analysis of energy aware routing protocol for real time load distr...
eSAT Journals
 
Throughput analysis of energy aware routing protocol
Throughput analysis of energy aware routing protocolThroughput analysis of energy aware routing protocol
Throughput analysis of energy aware routing protocol
eSAT Publishing House
 
middleware
middlewaremiddleware
middleware
rajeswarimca
 
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKSHOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
ijaia
 

What's hot (20)

Reliable and Efficient Data Acquisition in Wireless Sensor Network
Reliable and Efficient Data Acquisition in Wireless Sensor NetworkReliable and Efficient Data Acquisition in Wireless Sensor Network
Reliable and Efficient Data Acquisition in Wireless Sensor Network
 
An implementation of recovery algorithm for fault nodes in a wireless sensor ...
An implementation of recovery algorithm for fault nodes in a wireless sensor ...An implementation of recovery algorithm for fault nodes in a wireless sensor ...
An implementation of recovery algorithm for fault nodes in a wireless sensor ...
 
Semantics in Sensor Networks
Semantics in Sensor NetworksSemantics in Sensor Networks
Semantics in Sensor Networks
 
A survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor NetworkA survey on Object Tracking Techniques in Wireless Sensor Network
A survey on Object Tracking Techniques in Wireless Sensor Network
 
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
IRJET-Structure less Efficient Data Aggregation and Data Integrity in Sensor ...
 
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDSAn Extensible Architecture for Avionics Sensor Health Assessment Using DDS
An Extensible Architecture for Avionics Sensor Health Assessment Using DDS
 
Ijarcet vol-2-issue-2-756-760
Ijarcet vol-2-issue-2-756-760Ijarcet vol-2-issue-2-756-760
Ijarcet vol-2-issue-2-756-760
 
A overview on WAMS/PMU.
A overview on WAMS/PMU.A overview on WAMS/PMU.
A overview on WAMS/PMU.
 
40120130406006
4012013040600640120130406006
40120130406006
 
E FFICIENT E NERGY U TILIZATION P ATH A LGORITHM I N W IRELESS S ENSOR...
E FFICIENT  E NERGY  U TILIZATION  P ATH  A LGORITHM  I N  W IRELESS  S ENSOR...E FFICIENT  E NERGY  U TILIZATION  P ATH  A LGORITHM  I N  W IRELESS  S ENSOR...
E FFICIENT E NERGY U TILIZATION P ATH A LGORITHM I N W IRELESS S ENSOR...
 
IRJET - Detection of False Data Injection Attacks using K-Means Clusterin...
IRJET -  	  Detection of False Data Injection Attacks using K-Means Clusterin...IRJET -  	  Detection of False Data Injection Attacks using K-Means Clusterin...
IRJET - Detection of False Data Injection Attacks using K-Means Clusterin...
 
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
Embedding Wireless Intelligent Sensors Based on Compact Measurement for Struc...
 
50120140505014
5012014050501450120140505014
50120140505014
 
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
TWO LEVEL DATA FUSION MODEL FOR DATA MINIMIZATION AND EVENT DETECTION IN PERI...
 
Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...Multi sensor data fusion system for enhanced analysis of deterioration in con...
Multi sensor data fusion system for enhanced analysis of deterioration in con...
 
Report on Enhancing the performance of WSN
Report on Enhancing the performance of WSNReport on Enhancing the performance of WSN
Report on Enhancing the performance of WSN
 
Throughput analysis of energy aware routing protocol for real time load distr...
Throughput analysis of energy aware routing protocol for real time load distr...Throughput analysis of energy aware routing protocol for real time load distr...
Throughput analysis of energy aware routing protocol for real time load distr...
 
Throughput analysis of energy aware routing protocol
Throughput analysis of energy aware routing protocolThroughput analysis of energy aware routing protocol
Throughput analysis of energy aware routing protocol
 
middleware
middlewaremiddleware
middleware
 
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKSHOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
HOME APPLIANCE IDENTIFICATION FOR NILM SYSTEMS BASED ON DEEP NEURAL NETWORKS
 

Viewers also liked

Consideracions sobre escriptura
Consideracions sobre escripturaConsideracions sobre escriptura
Consideracions sobre escriptura
Crp Baix Camp
 
Prototyping is an attitude
Prototyping is an attitudePrototyping is an attitude
Prototyping is an attitude
With Company
 
10 Insightful Quotes On Designing A Better Customer Experience
10 Insightful Quotes On Designing A Better Customer Experience10 Insightful Quotes On Designing A Better Customer Experience
10 Insightful Quotes On Designing A Better Customer Experience
Yuan Wang
 
Learn BEM: CSS Naming Convention
Learn BEM: CSS Naming ConventionLearn BEM: CSS Naming Convention
Learn BEM: CSS Naming Convention
In a Rocket
 
How to Build a Dynamic Social Media Plan
How to Build a Dynamic Social Media PlanHow to Build a Dynamic Social Media Plan
How to Build a Dynamic Social Media Plan
Post Planner
 
SEO: Getting Personal
SEO: Getting PersonalSEO: Getting Personal
SEO: Getting Personal
Kirsty Hulse
 
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika AldabaLightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
ux singapore
 
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job? Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
Stanford GSB Corporate Governance Research Initiative
 

Viewers also liked (8)

Consideracions sobre escriptura
Consideracions sobre escripturaConsideracions sobre escriptura
Consideracions sobre escriptura
 
Prototyping is an attitude
Prototyping is an attitudePrototyping is an attitude
Prototyping is an attitude
 
10 Insightful Quotes On Designing A Better Customer Experience
10 Insightful Quotes On Designing A Better Customer Experience10 Insightful Quotes On Designing A Better Customer Experience
10 Insightful Quotes On Designing A Better Customer Experience
 
Learn BEM: CSS Naming Convention
Learn BEM: CSS Naming ConventionLearn BEM: CSS Naming Convention
Learn BEM: CSS Naming Convention
 
How to Build a Dynamic Social Media Plan
How to Build a Dynamic Social Media PlanHow to Build a Dynamic Social Media Plan
How to Build a Dynamic Social Media Plan
 
SEO: Getting Personal
SEO: Getting PersonalSEO: Getting Personal
SEO: Getting Personal
 
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika AldabaLightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
Lightning Talk #9: How UX and Data Storytelling Can Shape Policy by Mika Aldaba
 
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job? Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
Succession “Losers”: What Happens to Executives Passed Over for the CEO Job?
 

Similar to Kz2418571860

A Literature Review on Rainfall Prediction using different Data Mining Techni...
A Literature Review on Rainfall Prediction using different Data Mining Techni...A Literature Review on Rainfall Prediction using different Data Mining Techni...
A Literature Review on Rainfall Prediction using different Data Mining Techni...
IRJET Journal
 
A Hybrid Deep Neural Network Model For Time Series Forecasting
A Hybrid Deep Neural Network Model For Time Series ForecastingA Hybrid Deep Neural Network Model For Time Series Forecasting
A Hybrid Deep Neural Network Model For Time Series Forecasting
Martha Brown
 
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
Journal For Research
 
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEYRAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
csandit
 
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
Ricardo Viana Vargas
 
Kn2518431847
Kn2518431847Kn2518431847
Kn2518431847
IJERA Editor
 
Kn2518431847
Kn2518431847Kn2518431847
Kn2518431847
IJERA Editor
 
Intelligent methods in load forecasting
Intelligent methods in load forecastingIntelligent methods in load forecasting
Intelligent methods in load forecasting
prj_publication
 
IRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their ApplicationsIRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their Applications
IRJET Journal
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
Alexander Decker
 
Hyperparameters analysis of long short-term memory architecture for crop cla...
Hyperparameters analysis of long short-term memory  architecture for crop cla...Hyperparameters analysis of long short-term memory  architecture for crop cla...
Hyperparameters analysis of long short-term memory architecture for crop cla...
IJECEIAES
 
Project Report -Vaibhav
Project Report -VaibhavProject Report -Vaibhav
Project Report -Vaibhav
Vaibhav Dhattarwal
 
Brema tarigan 09030581721015
Brema tarigan 09030581721015Brema tarigan 09030581721015
Brema tarigan 09030581721015
ferdiandersen08
 
K41018186
K41018186K41018186
K41018186
IJERA Editor
 
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEYRAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
cscpconf
 
Optimal artificial neural network configurations for hourly solar irradiation...
Optimal artificial neural network configurations for hourly solar irradiation...Optimal artificial neural network configurations for hourly solar irradiation...
Optimal artificial neural network configurations for hourly solar irradiation...
IJECEIAES
 
A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecasting
ijctcm
 
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
IJLT EMAS
 
1 s2.0-s0957417410007244-main
1 s2.0-s0957417410007244-main1 s2.0-s0957417410007244-main
1 s2.0-s0957417410007244-main
asimnawaz54
 
Expert systems with applications
Expert systems with applicationsExpert systems with applications
Expert systems with applications
asimnawaz54
 

Similar to Kz2418571860 (20)

A Literature Review on Rainfall Prediction using different Data Mining Techni...
A Literature Review on Rainfall Prediction using different Data Mining Techni...A Literature Review on Rainfall Prediction using different Data Mining Techni...
A Literature Review on Rainfall Prediction using different Data Mining Techni...
 
A Hybrid Deep Neural Network Model For Time Series Forecasting
A Hybrid Deep Neural Network Model For Time Series ForecastingA Hybrid Deep Neural Network Model For Time Series Forecasting
A Hybrid Deep Neural Network Model For Time Series Forecasting
 
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
WIND SPEED & POWER FORECASTING USING ARTIFICIAL NEURAL NETWORK (NARX) FOR NEW...
 
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEYRAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
 
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
Applying Neural Networks and Analogous Estimating to Determine the Project Bu...
 
Kn2518431847
Kn2518431847Kn2518431847
Kn2518431847
 
Kn2518431847
Kn2518431847Kn2518431847
Kn2518431847
 
Intelligent methods in load forecasting
Intelligent methods in load forecastingIntelligent methods in load forecasting
Intelligent methods in load forecasting
 
IRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their ApplicationsIRJET- The Essentials of Neural Networks and their Applications
IRJET- The Essentials of Neural Networks and their Applications
 
Artificial neural network
Artificial neural networkArtificial neural network
Artificial neural network
 
Hyperparameters analysis of long short-term memory architecture for crop cla...
Hyperparameters analysis of long short-term memory  architecture for crop cla...Hyperparameters analysis of long short-term memory  architecture for crop cla...
Hyperparameters analysis of long short-term memory architecture for crop cla...
 
Project Report -Vaibhav
Project Report -VaibhavProject Report -Vaibhav
Project Report -Vaibhav
 
Brema tarigan 09030581721015
Brema tarigan 09030581721015Brema tarigan 09030581721015
Brema tarigan 09030581721015
 
K41018186
K41018186K41018186
K41018186
 
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEYRAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
RAINFALL PREDICTION USING DATA MINING TECHNIQUES - A SURVEY
 
Optimal artificial neural network configurations for hourly solar irradiation...
Optimal artificial neural network configurations for hourly solar irradiation...Optimal artificial neural network configurations for hourly solar irradiation...
Optimal artificial neural network configurations for hourly solar irradiation...
 
A Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather ForecastingA Time Series ANN Approach for Weather Forecasting
A Time Series ANN Approach for Weather Forecasting
 
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
Short Term Load Forecasting: One Week (With & Without Weekend) Using Artifici...
 
1 s2.0-s0957417410007244-main
1 s2.0-s0957417410007244-main1 s2.0-s0957417410007244-main
1 s2.0-s0957417410007244-main
 
Expert systems with applications
Expert systems with applicationsExpert systems with applications
Expert systems with applications
 

Kz2418571860

  • 1. Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com Vol. 2, Issue4, July-August 2012, pp.1857-1860 A New Generic Architecture For Time Series Prediction *Nageswara Rao Puli, **Nagul Shaik,***M.Kishore Kumar *Dept. of Computer Science & Engg Nimra Institute of Science and Technology Vijayawada, India **Asst.Professor, Dept. of Computer Science & Engg Nimra Institute of Science and Technology Vijayawada, India ***Professor & HOD Dept. of Computer Science & Engg Nimra Institute of Science and Technology Vijayawada, India Abstract Rapidly evolving businesses generate xt 1 , xt 2 , data values. The goal is to observe or massive amounts of time-stamped data sequences model the existing data series to enable future and cause a demand for both univari- ate and unknown data values to be forecasted accurately. multivariate time series forecasting. For such data, Examples of data series include financial data series traditional predictive models based on (stocks, indices, rates, etc.), physically observed data autoregression are often not sufficient to capture series (sunspots, weather, etc.), and mathematical complex non-linear relationships between data series (Fibonacci sequence, integrals of multidimensional fea- tures and the time series differential equations, etc.). The phrase “time series” outputs. In order to exploit these relationships for generically refers to any data series, whether or not improved time series forecasting while also better the data are dependent on a certain time increment. dealing with a wider variety of prediction Throughout the literature, many techniques have been scenarios, a forecasting system requires a flexible implemented to perform time series forecasting. This and generic architecture to accommodate and tune paper will focus on two techniques: neural networks various individual predictors as well as and k-nearest-neighbor. This paper will attempt to combination methods. fill a gap in the abundant neural network time series In reply to this challenge, an architecture forecasting literature, where testing arbitrary neural for combined, multilevel time series prediction is networks on arbitrarily complex data series is proposed, which is suitable for many different common, but not very enlightening. This paper universal regressors and combination methods. The thoroughly analyzes the responses of specific neural key strength of this architecture is its ability to network configurations to artificial data series, where build a diversified ensemble of individual each data series has a specific characteristic. A better predictors that form the input to a multilevel understanding of what causes the basic neural selection and fusion process before the final network to become an inadequate forecasting optimised output is obtained. Excellent technique will be gained. In addition, the influence generalisation ability is achieved due to the highly of data preprocessing will be noted. The forecasting boosted complementarity of indi- vidual models performance of k-nearest-neighbor, which is a much further enforced through crossvalidation-linked simpler forecasting technique, will be compared to training on exclusive data subsets and ensemble the neural networks’ performance. Finally, both output post-processing. In a sample configuration techniques will be used to forecast a real data series. with basic neural network predictors and a mean combiner, the proposed system has been evaluated in different scenarios and showed a clear prediction Difficulties performance gain. Several difficulties can arise when performing time series forecasting. Depending on the type of data series, a particular difficulty may or may Index Terms— Time series forecasting, combining not exist. A first difficulty is a limited quantity of predictors, regression, ensembles, neural networks, data. With data series that are observed, limited data diversity may be the foremost difficulty. For example, given a company’s stock that has been publicly traded for one Introduction year, a very limited amount of data are available for Time series forecasting, or time series use by the forecasting technique. prediction, takes an existing series of data xt n , , xt 2 , xt 1 , xt and forecasts the 1857 | P a g e
  • 2. Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com Vol. 2, Issue4, July-August 2012, pp.1857-1860 A second difficulty is noise. Two types of data series.Another application is forecasting noisy data are (1) erroneous data points and (2) undesirable, yet unavoidable, events to preemptively components that obscure the underlying form of the lessen their impact. At the time of this writing, the data series. Two examples of erroneous data are sun’s cycle of storms, called solar maximum, is of measurement errors and a change in measurement concern because the storms cause technological methods or metrics. In this paper, we will not be disruptions on Earth. The sunspots data series, which concerned about erroneous data points. An example is data counting dark patches on the sun and is related of a component that obscures the underlying form of to the solar storms, shows an eleven-year cycle of the data series is an additive high-frequency solar maximum activity, and if accurately modeled, component. The technique used in this paper to can forecast the severity of future activity. While reduce or remove this type of noise is the moving solar activity is unavoidable, its impact can be average. The data series , xt 4 , xt 3 , xt 2 , xt 1 , xt lessened with appropriate forecasting and proactive action. becomes Finally, many people, primarily in the , ( xt4  xt3  xt2 ) / 3, ( xt3  xt2  xt1 ) / 3, ( xt2  xt1  xt ) / 3 financial markets, would like to profit from time after taking a moving average with an interval i of series forecasting. Whether this is viable is most three. Taking a moving average reduces the number likely a never-to-be-resolved question. Nevertheless of data points in the series by i  1 . many products are available for financial forecasting. A third difficulty is nonstationarity, data that Difficulties inherent in time series forecasting and the do not have the same statistical properties (e.g., mean importance of time series forecasting are presented and variance) at each point in time. A simple next. Then, neural networks and k-nearest-neighbor example of a nonstationary series is the Fibonacci are detailed. Section Error! Reference source not sequence: at every step the sequence takes on a new, found. presents related work. Section Error! higher mean value. The technique used in this paper Reference source not found. gives an application to make a series stationary in the mean is first- level description of the test-bed application, and Section Error! Reference source not found. differencing. The data series , xt 3 , xt 2 , xt 1 , xt presents an empirical evaluation of the results becomes obtained with the application.A time series is a , ( xt 2  xt 3 ), ( xt 1  xt 2 ), ( xt  xt 1 ) after sequence of observations of a random variable. Hence, it is a stochasticprocess. Examples include the taking the first-difference. This usually makes a data monthly demand for a product, the annual series stationary in the mean. If not, the second- freshmanenrollment in a department of a university, difference of the series can be taken. Taking the first- and the daily volume of flows in a river.Forecasting difference reduces the number of data points in the time series data is important component of operations series by one. research because thesedata often provide the A fourth difficulty is forecasting technique foundation for decision models. An inventory model selection. From statistics to artificial intelligence, requiresestimates of future demands, a course there are myriad choices of techniques. One of the scheduling and staffing model for a universityrequires simplest techniques is to search a data series for estimates of future student inflow, and a model for similar past events and use the matches to make a providing warnings to thepopulation in a river basin forecast. One of the most complex techniques is to requires estimates of river flows for the immediate train a model on the series and use the model to make future.Time series analysis provides tools for a forecast. K-nearest-neighbor and neural networks selecting a model that can be used to forecastof future are examples of the first and second techniques, events. Modeling the time series is a statistical respectively. problem. Forecasts are used incomputational procedures to estimate the parameters of a model 1) Importance being used to allocatedlimited resources or to Time series forecasting has several describe random processes such as those mentioned important applications. One application is preventing above. Timeseries models assume that observations undesirable events by forecasting the event, vary according to some probability distributionabout identifying the circumstances preceding the event, an underlying function of time.styles are built-in; and taking corrective action so the event can be examples of the type styles are provided throughout avoided. At the time of this writing, the Federal this document and are identified in italic type, within Reserve Committee is actively raising interest rates to parentheses, following the example. PLEASE DO head off a possible inflationary economic period. NOT RE-ADJUST THESE MARGINS. The Committee possibly uses time series forecasting with many data series to forecast the inflationary period and then acts to alter the future values of the 1858 | P a g e
  • 3. Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com Vol. 2, Issue4, July-August 2012, pp.1857-1860 . Each output layer unit performs the calculation in In this section and the next, subscripts c, p, Equation II.1 on its inputs and transfers the result and n will identify units in the current layer, the (Oc) to a network output. previous layer, and the next layer, respectively. Equation II.1 Activation function of an output layer When the network is run, each hidden layer unit unit. performs the calculation in Error! Reference source Each output layer unit performs the calculation in not found. on its inputs and transfers the result (Oc) Equation II.1 on its inputs and transfers the result to the next layer of units. (Oc) to a network output. Equation II.2 Activation function of an output layer unit. P  Oc  hOutput  ic , p wc , p  bc  where hOutput( x )  x  p1  Oc is the output of the current output layer unit c, P is the number of units in the previous P  1 Oc  hHidden   ic , p wc , p  bc  where hHidden ( x )  hidden layer, ic,p is an input to unit c from the  p1  1  ex previous hidden layer unit p, wc,p is the weight modifying the connection from unit p to unit c, and Fig: Forwarding the node values bc is the bias. For this research, hOutput(x) is a linear activation function1 Oc is the output of the current hidden layer unit c, P is K-Nearest-Neighbor In contrast to the complexity of the neural either the number of units in the previous hidden network forecasting technique, the simpler k-nearest- layer or number of network inputs, ic,p is an input to neighbor forecasting technique is also implemented unit c from either the previous hidden layer unit p or and tested. K-nearest-neighbor is simpler because network input p, wc,p is the weight modifying the there is no model to train on the data series. Instead, connection from either unit p to unit c or from input p the data series is searched for situations similar to the to unit c, and bc is the bias. current one each time a forecast needs to be made. In Error! Reference source not found., hHidden(x) is To make the k-nearest-neighbor process description easier, several terms will be defined. The final data points of the data series are the reference, and the length of the reference is the window size. The data series without the last data point is the shortened data series. To forecast the data series’ next data point, the reference is compared to the first group of data points in the shortened data series, called a candidate, and an error is computed. Then the reference is moved one data point forward to the next candidate the sigmoid activation function of the unit and is and another error is computed, and so on. All errors charted in Error! Reference source not found.. are stored and sorted. The smallest k errors correspond to the k candidates that closest match the Fig: Nearest Neighbor Transformation Fig: Prediction Graph for Forwarding node Other types of activation functions exist, but the sigmoid was implemented for this research. To avoid saturating the activation function, which makes training the network difficult, the training data must be scaled appropriately. Similarly, before training, the weights and biases are initialized to appropriately scaled values. reference. Finally, the forecast will be the average of the k data points that follow these candidates. Then, to forecast the next data point, the 1859 | P a g e
  • 4. Nageswara Rao Puli, Nagul Shaik, M.Kishore Kumar / International Journal of Engineering Research and Applications (IJERA) ISSN: 2248-9622 www.ijera.com Vol. 2, Issue4, July-August 2012, pp.1857-1860 process is repeated with the previously forecasted data [4]. Hebb, D. O. (1949). The Organization of point appended to the end of the data series. Behavior: A Neuropsychological Theory. New York: Wiley & Sons. II. CONCLUSION [5]. Kingdon, J. (1997). Intelligent Systems and Section Error! Reference source not found. Financial Forecasting. New York: Springer- introduced time series forecasting, described the work Verlag. presented in the typical neural network paper, which [6]. Lawrence, S., Tsoi, A. C., & Giles, C. L. justified this paper, and identified several difficulties (1996). Noisy Time Series Prediction Using associated with time series forecasting. Among these Symbolic Representation and Recurrent difficulties, noisy and nonstationary data were Neural Network Grammatical Inference investigated further in this paper. Section Error! [Online]. Available: Reference source not found. also presented feed- http://www.neci.nj.nec.com/homepages/lawr forward neural networks and backpropagation ence/papers/finance-tr96/latex.html [March training, which was used as the primary time series 27, 2000]. forecasting technique in this paper. Finally, k-nearest- [7]. McCulloch, W. S., & Pitts, W. H. (1943). A neighbor was presented as an alternative forecasting Logical Calculus of the Ideas Imminent in technique. Nervous Activity. Bulletin of Mathematical Section Error! Reference source not found. briefly Biophysics, 5, 115-133. discussed previous time series forecasting papers. The [8]. Minsky, M., & Papert, S. (1969). most notable of these being the paper by Drossu and Perceptrons: An Introduction to Obradovic (1996), who presented compelling research Computational Geometry. Cambridge, MA: combining stochastic techniques and neural networks. MIT Press. Also of interest were the paper by Geva (1998) and [9]. Rosenblatt, F. (1962). Principles of the book by Kingdon (1997), which took significantly Neurodynamics: Perceptrons and the Theory more sophisticated approaches to time series of Brain Mechanisms. Washington, D. C.: forecasting. Spartan. Section Error! Reference source not found. [10]. Rumelhart, D. E., Hinton, G. E., & presented Forecaster and went through several Williams, R. J. (1986). Learning Internal important aspects of its design, including parsing data Representations by Error Propagation. In D. files, using the Wizard to create networks, training E. Rumelhart, et al. (Eds.), Parallel networks, and forecasting using neural networks and Distributed Processing: Explorations in the k-nearest-neighbor. Microstructures of Cognition, 1: Section Error! Reference source not found. Foundations, 318-362. Cambridge, MA: presented the crux of the paper. First, the data series MIT Press. used in the evaluation were described, and then [11]. Torrence, C., & Compo, G. P. (1998). A parameters and procedures used in forecasting were Practical Guide to Wavelet Analysis given. Among these was a method for selecting the [Online]. Bulletin of the American number of neural network inputs based on data series Meteorological Society. Available: characteristics (also applicable to selecting the http://paos.colorado.edu/research/wavelets/ window size for k-nearest-neighbor), a training [July 2, 2000]. heuristic, and a metric for making quantitative forecast [12]. Zhang, X., & Thearling, K. (1994). Non- comparisons. Finally, a variety of charts and tables, Linear Time-Series Prediction by Systematic accompanied by many empirical observations, were Data Exploration on a Massively Parallel presented for networks trained heuristically and Computer [Online]. Available: simply and for k-nearest-neighbor. http://www3.shore.net/~kht/text/sfitr/sfitr.ht m [March 27, 2000]. REFERENCES [1]. Drossu, R., & Obradovic, Z. (1996). Rapid Design of Neural Networks for Time Series Prediction. IEEE Computational Science & Engineering, Summer 1996, 78-89. [2]. Geva, A. (1998). ScaleNet—Multiscale Neural-Network Architecture for Time Series Prediction. IEEE Transactions on Neural Networks, 9(5), 1471-1482. [3]. Gonzalez, R. C. & Woods, R. E. (1993). Digital Image Processing. New York: Addison-Wesley. 1860 | P a g e