1. International Journal of Engineering Inventions
ISSN: 2278-7461, www.ijeijournal.com
Volume 1, Issue 5 (September 2012) PP: 31-32
Temporal Based Multimedia Data Management System in a
Distributed Environment
Uthaya Kumar M1, Seemakurty Utej2, Vasanth K3, B. Persis Urbana Ivy4
1,2,3 SITE, VIT University, Vellore
4 Asst. Prof. (SG), SITE, VIT University, Vellore
Abstract––Multimedia data management is an application that focuses on the storage, access, indexing, retrieval and
preservation of multimedia data in a distributed environment. However, a reliable method was needed to determine the
validity of multimedia data over time. To overcome this problem, a new temporal database concept is proposed that embeds
two time elements in the application: the transaction time and the valid time, which together provide a schema for managing
historical data across past, present and future. These time elements track the changes and transactions recorded in the
database without deleting previous versions of the data, thus ensuring that users always see up-to-date data.
I. INTRODUCTION
This paper concerns the development of a multimedia data management application that provides an effective way of
accessing information. The application collects, stores, retrieves and preserves data in a distributed environment. It traces
data based on a historical view and records successive versions of the data without overwriting previous versions.
II. DETAILED PROBLEM DEFINITION
Multimedia database management systems (MDMS) were developed for accessing multimedia data. However, these
systems are not efficient at accessing data based on a historical view. They are therefore enhanced here with more
advanced features, providing indexing and classification for multimedia data retrieval using temporal databases.
III. SOLUTION METHODOLOGY
The proposed method provides an efficient way to manage historical data. It consists of two time elements. The first is
the transaction time, which records the actual time at which the data is entered into the database. The second is the valid
time, which specifies the period during which the data is valid. These temporal elements make the management of data
easier and better classified. They also play a major role in monitoring changes to and transactions on the data during the
insertion and updating process.
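The scheme above can be sketched as an append-only store in which every update records a new version stamped with a transaction time and a valid period, while earlier versions are retained. This is a minimal illustration of the idea, not the authors' implementation; the class and method names are hypothetical.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Version:
    data: bytes                    # the multimedia payload (or a reference to it)
    transaction_time: datetime     # when this version was recorded in the database
    valid_from: datetime           # start of the period the data is valid for
    valid_to: Optional[datetime]   # end of validity; None means currently valid

class TemporalStore:
    """Append-only store: updates add new versions, nothing is ever deleted."""

    def __init__(self):
        self._versions: dict[str, list[Version]] = {}

    def upsert(self, key: str, data: bytes, valid_from: datetime,
               valid_to: Optional[datetime] = None) -> None:
        now = datetime.now(timezone.utc)          # transaction time
        history = self._versions.setdefault(key, [])
        if history and history[-1].valid_to is None:
            history[-1].valid_to = valid_from     # close the previous version's validity
        history.append(Version(data, now, valid_from, valid_to))

    def current(self, key: str) -> Optional[Version]:
        history = self._versions.get(key, [])
        return history[-1] if history else None

    def history(self, key: str) -> list[Version]:
        return list(self._versions.get(key, []))
```

Because `upsert` only appends and closes the previous version's valid period, the full change history of every object remains queryable.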
IV. EXPERIMENTAL ANALYSIS
It is demonstrated that the proposed approach is an easy and efficient way to store time-varying information while
keeping the multimedia data up to date. Analysis of the existing approach showed that data was discarded during the
updating process. Hence, the temporal elements are embedded into the database so that data can be accessed based on its
transaction time and valid time.
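Accessing data "based on its valid time" amounts to finding the version whose valid period contains a given instant. A minimal sketch, assuming each version carries `valid_from`/`valid_to` attributes as described above (the helper name is hypothetical):

```python
from datetime import datetime
from typing import Optional

# Each version is a dict with "valid_from", "valid_to" (None = still valid) and "data".
def as_of_valid_time(history: list[dict], t: datetime) -> Optional[dict]:
    """Return the version whose valid period contains t, or None if there is none."""
    for version in history:
        starts = version["valid_from"] <= t
        ends = version["valid_to"] is None or t < version["valid_to"]
        if starts and ends:
            return version
    return None
```

Note the half-open convention (`valid_from <= t < valid_to`), which keeps adjacent versions from overlapping at their shared boundary.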
V. MODELING PRINCIPLES
The two time elements used for tracing time-varying information are the transaction time and the valid time.
1. Transaction time:
This time element records the time of each insertion and update, so the inserted and the updated data each carry their
own transaction time. This gives a clear view of the history of the data.
2. Valid time:
This time element represents the period during which the multimedia data stored in the database is valid. The validity of
the data changes when the data is edited or modified. The valid time comprises two attributes, valid-from and valid-to, and
is combined with a set of operators: equal, before, after, meets and met-by.
Consider a multimedia database containing a set of multimedia data M = {m1, m2, …, mn}. The complete model for the
temporal-based multimedia database is then TEMPORAL(mi ∈ M) ≤ (tt ∩ OP ∩ vt), where tt is the transaction time, OP is a
temporal operator and vt is the valid time.
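The five valid-time operators listed above are interval comparisons (a subset of Allen's interval relations). A minimal sketch, treating a valid period as a (valid_from, valid_to) pair; the function names mirror the paper's operator names but the implementation is our own illustration:

```python
# An interval is a (start, end) pair with start < end; the endpoints may be
# datetimes, numbers, or any mutually comparable values.

def equal(a, b):
    """Both intervals cover exactly the same period."""
    return a == b

def before(a, b):
    """a ends strictly before b starts (with a gap between them)."""
    return a[1] < b[0]

def after(a, b):
    """a starts strictly after b ends."""
    return before(b, a)

def meets(a, b):
    """a ends exactly when b starts."""
    return a[1] == b[0]

def met_by(a, b):
    """a starts exactly when b ends."""
    return meets(b, a)
```

For example, a version valid over (valid_from, valid_to) `meets` its successor exactly when the successor's valid-from equals its valid-to, which is how consecutive versions chain together.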
Figure 1: Conceptual model of the temporal multimedia database. A user query is evaluated against the temporal elements
(transaction time, valid time and temporal operator) of the temporal multimedia database, which returns the results.
VI. RESULTS AND DISCUSSION
The experimental results show that the temporal elements involved in the database, the transaction time and the valid
time, yield high performance in the indexing and classification of data based on current and past versions. These time
elements thus monitor the changes incorporated into the database and ensure that the user receives up-to-date
information.
VII. CONCLUSION
An efficient method was needed to trace the validity of data, given the lack of knowledge about time-varying
information. To overcome this, a temporal database has been developed to discover and monitor historical data based on
past, present and future in an efficient way. This method improves the multimedia data management process by storing
multimedia objects dynamically after changes are made to the current record, without deleting the previous version of the
record.