Search results for: conceptual data model
12580 A Parallel Approach for 3D-Variational Data Assimilation on GPUs in Ocean Circulation Models
Authors: Rossella Arcucci, Luisa D’Amore, Simone Celestino, Giuseppe Scotti, Giuliano Laccetti
Abstract:
This work is the first step in a broader research activity, in collaboration with the Euro Mediterranean Center for Climate Changes, aimed at introducing scalable approaches into Ocean Circulation Models. We discuss the design and implementation of a parallel algorithm for solving the Variational Data Assimilation (DA) problem on Graphics Processing Units (GPUs). The algorithm is based on the fully scalable 3DVar DA model, previously proposed by the authors, which uses a Domain Decomposition approach (we refer to this model as the DD-DA model). We proceed with an incremental porting process consisting of three distinct stages: requirements and source code analysis, incremental development of CUDA kernels, and testing and optimization. Experiments confirm the theoretical performance analysis based on the so-called scale-up factor, demonstrating that the DD-DA model can be suitably mapped onto GPU architectures.
Keywords: Data Assimilation, Parallel Algorithm, GPU architectures, Ocean Models.
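As an illustration of the 3DVar cost function underlying the DA model described above, a minimal NumPy/SciPy sketch follows. It minimises J(x) = (x - xb)ᵀB⁻¹(x - xb) + (Hx - y)ᵀR⁻¹(Hx - y) on a toy problem; all sizes, operators and covariances are invented, and the domain-decomposition and GPU aspects of the paper are not reproduced.

```python
# Toy 3D-Var analysis: minimise the variational cost function
#   J(x) = (x - xb)' B^-1 (x - xb) + (Hx - y)' R^-1 (Hx - y)
import numpy as np
from scipy.optimize import minimize

n, m = 8, 4                          # state size, number of observations
rng = np.random.default_rng(0)
xb = rng.normal(size=n)              # background (prior) state
H = rng.normal(size=(m, n))          # linear observation operator
y = H @ rng.normal(size=n)           # synthetic observations
B_inv = np.eye(n) / 0.5              # inverse background error covariance
R_inv = np.eye(m) / 0.1              # inverse observation error covariance

def J(x):
    dx, dy = x - xb, H @ x - y
    return dx @ B_inv @ dx + dy @ R_inv @ dy

xa = minimize(J, xb, method="L-BFGS-B").x   # analysis (assimilated) state
print(xa)
```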
12579 A Novel Web Metric for the Evaluation of Internet Trends
Authors: Radek Malinský, Ivan Jelínek
Abstract:
Web 2.0 (social networking, blogging and online forums) can serve as a data source for social science research because it contains a vast amount of information from many different users. The volume of that information has been growing at a very high rate, forming a network of heterogeneous data; this makes information difficult to find and therefore of limited use. We have proposed a novel theoretical model for gathering and processing data from Web 2.0 that better reflects the semantic content of web pages. This article deals with the analysis part of the model and its usage for content analysis of blogs. The introductory part of the article describes the methodology for gathering and processing data from blogs. The next part of the article focuses on the evaluation and content analysis of blogs that write about a specific trend.
Keywords: Blog, Sentiment Analysis, Web 2.0, Webometrics.
12578 Data Mining Applied to the Predictive Model of Triage System in Emergency Department
Authors: Wen-Tsann Lin, Yung-Tsan Jou, Yih-Chuan Wu, Yuan-Du Hsiao
Abstract:
The Emergency Department of a medical center in Taiwan cooperated in conducting this research. A predictive model of the triage system was constructed, covering the procedure from selection of parameters to sample screening. 2,000 patient records were chosen randomly by computer. After applying three categories of data mining (Multi-group Discriminant Analysis, Multinomial Logistic Regression, Back-propagation Neural Networks), it was found that Back-propagation Neural Networks can best distinguish the patients' extent of emergency, with an accuracy rate as high as 95.1%. The Back-propagation Neural Network with the highest accuracy rate was built into the triage acuity expert system in this research. Data mining applied to the predictive model of the triage acuity expert system can be updated regularly, both to improve the system and for education and training, and is not affected by subjective factors.
Keywords: Back-propagation Neural Networks, Data Mining, Emergency Department, Triage System.
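For readers wanting a concrete starting point, a back-propagation classifier of the kind the abstract identifies as the best performer can be sketched in a few lines with scikit-learn. The feature matrix and acuity labels below are random placeholders, not the hospital's data, and the network size is arbitrary.

```python
# Sketch: back-propagation neural network for triage acuity classification.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 10))       # placeholder for 2,000 patient records
y = rng.integers(1, 6, size=2000)     # placeholder acuity levels 1-5

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)                   # trained via back-propagation
print("accuracy:", clf.score(X_te, y_te))
```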
12577 Deriving Causal Explanation from Qualitative Model Reasoning
Authors: Alicia Y. C. Tang, Sharifuddin M. Zain, Noorsaadah A. Rahman, Rukaini Abdullah
Abstract:
This paper discusses QRiOM, a qualitative simulator that uses the Qualitative Reasoning (QR) technique and a process-based ontology to model, simulate and explain the behaviour of selected organic reactions. Learning organic reactions requires the application of domain knowledge at an intuitive level, which is difficult to program using a traditional approach. The main objective of QRiOM is to help learners gain a better understanding of fundamental organic reaction concepts, and to improve their conceptual comprehension of the subject by analyzing the multiple forms of explanation generated by the software. This paper focuses on the generation of explanations based on causal theories to explicate various phenomena in chemistry. QRiOM has been tested on three classes of problems related to organic chemistry, with encouraging results. The paper also presents the results of a preliminary evaluation of QRiOM that reveal its explanation capability and usefulness.
Keywords: Artificial intelligence, explanation, ontology, organic reactions, qualitative reasoning, QPT.
12576 Cultural Production and Urban Regeneration: The Case Study of Amphawa District, Thailand
Authors: P. Techaratpong
Abstract:
This research studies the role of cultural production in urban regeneration and argues that cultural production, if properly used, can play a vital role in reviving cities and create substantial positive impacts on them. The argument is elucidated by the case study of Amphawa, a district in Samutsongkram province, Thailand, as an example of the successful use of cultural production. The conceptual framework is based on the model of cultural contributions to regeneration, used to examine the impacts.
The research methodology is qualitative. This study found that cultural production can revive cities into vibrant ones and exert considerable physical, social and economic impacts.
It is suggested that, although there is no one-size-fits-all model, cultural production can be an important initiative for any city transformation if it is appropriately implemented. City planners and authorities ought to consider the local conditions and factors, design a specific plan to fit the city context, and integrate it with other planning.
Keywords: Cultural production, culture, cultural planning, impact, urban regeneration.
12575 CFD Simulation of Fixed Bed Reactor in Fischer-Tropsch Synthesis of GTL Technology
Authors: Sh. Shahhosseini, S. Alinia, M. Irani
Abstract:
In this paper, a 2D simulation of a catalytic fixed bed reactor in Fischer-Tropsch synthesis of GTL technology has been performed using computational fluid dynamics (CFD). Synthesis gas (a mixture of carbon monoxide and hydrogen) was used as feedstock. The reactor was modeled and the model equations were solved employing the finite volume method. The model was validated against experimental data reported in the literature. The comparison showed good agreement between the simulation results and the experimental data. In addition, the model was applied to predict the concentration contours of the reactants and products along the length of the reactor.
Keywords: GTL, Fischer–Tropsch synthesis, Fixed Bed Reactor, CFD simulation.
12574 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model
Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li
Abstract:
Compared with terrestrial networks, the traffic of a spatial information network exhibits both self-similarity and short-correlation characteristics. By studying traffic prediction methods, the resource utilization of a spatial information network can be improved, and such methods provide an important basis for its traffic planning. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximate component with long correlation and detail components with short correlation, and a time-series hybrid prediction model based on wavelet decomposition is proposed to predict the spatial network traffic. Firstly, the original traffic data are decomposed into approximate and detail components using a wavelet decomposition algorithm. According to the tailing-off and cut-off characteristics of each component's autocorrelation and partial autocorrelation functions, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximate component can be modeled by an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction for the original data. The method not only considers the self-similarity of a spatial information network, but also takes into account the short correlation caused by bursty network traffic; it is verified using measured data from a backbone network released by the MAWI working group in 2018. Compared with typical time-series models, the data predicted by the hybrid model are closer to the real traffic data and have a smaller relative root mean square error, making the model more suitable for a spatial information network.
Keywords: Spatial Information Network, Traffic prediction, Wavelet decomposition, Time series model.
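The decompose-model-recombine scheme described above can be sketched with PyWavelets and statsmodels. This is an illustrative simplification on synthetic data, not the authors' code: each wavelet component is reconstructed to full length, modelled with ARIMA for the approximation and ARMA for the details, and the component forecasts are summed.

```python
# Sketch: wavelet-decomposition hybrid forecast on synthetic "traffic" data.
import numpy as np
import pywt
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
traffic = np.cumsum(rng.normal(size=512)) + rng.normal(scale=0.5, size=512)

wavelet, level, horizon = "db4", 3, 10
coeffs = pywt.wavedec(traffic, wavelet, level=level)

forecast = np.zeros(horizon)
for i in range(len(coeffs)):
    # Reconstruct component i alone by zeroing all other coefficient arrays.
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    comp = pywt.waverec(kept, wavelet)[: len(traffic)]
    order = (1, 1, 1) if i == 0 else (1, 0, 1)  # ARIMA for approx., ARMA for details
    fit = ARIMA(comp, order=order).fit()
    forecast += fit.forecast(steps=horizon)

print(forecast)                                 # hybrid forecast of the original series
```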
12573 Knowledge Based Chords Manipulation in MES
Authors: V. Hepsiba Mabel, K. Alagarsamy, Justus S.
Abstract:
Chord formation in Western music notation is an art that a musician acquires over years of learning, yet it remains a question of creativity to find the perfect chord sequence that matches a music score. This work focuses on the process of forming chords using a custom-designed knowledgebase (KB) of a Music Expert System. An optimal Chord-Set for a given music score is arrived at by using the chord pool in the KB and finding the chord match using the Jusic Distance (JD). A Conceptual Graph based knowledge representation model is followed for knowledge storage and retrieval in the knowledgebase.
Keywords: Knowledge, Music, Representation, Knowledgebase, Chord-Set.
12572 Generic Model for Timetabling Problems by Integer Linear Programming Approach
Authors: N. A. H. Aizam, V. Uvaraja
Abstract:
A timetable shows the scheduled times for performing certain tasks, and timetabling is widely used in many sectors such as transportation, education, and production. Difficulties arise in ensuring that all tasks happen at the time and place allocated. Therefore, many researchers have devised programming models to solve scheduling problems from several fields. However, studies on developing a general integer programming model for many timetabling problems remain limited. This work describes the creation of a general model that solves different types of timetabling problems by considering their basic constraints. Initially, the common basic constraints from five different fields were selected and analyzed. A general basic integer programming model was created and then verified using a medium-sized set of randomly generated data closely resembling realistic data. The mathematical software AIMMS, with CPLEX as the solver, was used to solve the model. The model obtained is significant in solving many timetabling problems easily, since it can be adapted to any type of scheduling problem that shares the same basic constraints.
Keywords: AIMMS mathematical software, integer linear programming, scheduling problems, timetabling.
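A minimal integer-programming timetabling formulation with basic constraints of the kind the abstract mentions (every task scheduled exactly once, no double-booked room/slot) might look as follows. The sketch uses the open-source PuLP/CBC toolchain as a stand-in for AIMMS/CPLEX, and the task, slot and room data are invented.

```python
# Sketch: feasibility ILP for a toy timetable with binary assignment variables.
import pulp

tasks = ["T1", "T2", "T3"]
slots = ["Mon9", "Mon10"]
rooms = ["R1", "R2"]

x = pulp.LpVariable.dicts("x", (tasks, slots, rooms), cat="Binary")
prob = pulp.LpProblem("timetable", pulp.LpMinimize)
# Dummy objective: total assignments are fixed by the constraints anyway.
prob += pulp.lpSum(x[t][s][r] for t in tasks for s in slots for r in rooms)

for t in tasks:                                   # each task scheduled exactly once
    prob += pulp.lpSum(x[t][s][r] for s in slots for r in rooms) == 1
for s in slots:                                   # no double-booking of a room
    for r in rooms:
        prob += pulp.lpSum(x[t][s][r] for t in tasks) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for t in tasks:
    for s in slots:
        for r in rooms:
            if x[t][s][r].value() > 0.5:
                print(t, "->", s, r)
```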
12571 An Automation of Check Focusing on CRUD for Requirements Analysis Model in UML
Authors: Shinpei Ogata, Yoshitaka Aoki, Hirotaka Okuda, Saeko Matsuura
Abstract:
A key to the success of high-quality software development is to define valid and feasible requirements specifications. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions on the system by directly operating the mock-up. This paper proposes a support method to check the validity of a data life cycle by using the model checking tool "UPPAAL", focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis model, which is validated by the customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a supportive sales system for textbooks at a university.
Keywords: CRUD, Model Checking, Model Driven Development, Requirements Analysis, Unified Modeling Language, UPPAAL.
12570 A Context-Aware Supplier Selection Model
Authors: Mohammadreza Razzazi, Maryam Bayat
Abstract:
Selection of the best possible set of suppliers has a significant impact on the overall profitability and success of any business. For this reason, it is usually necessary to optimize all business processes and to make use of cost-effective alternatives for additional savings. This paper proposes a new, efficient, context-aware supplier selection model that takes into account possible changes of the environment while significantly reducing selection costs. The proposed model is based on data clustering techniques and draws on certain principles of online algorithms for an optimal selection of suppliers. Unlike common selection models, which re-run the selection algorithm from scratch over the whole environment for every decision-making sub-period, our model considers only the changes and superimposes them on the previously defined best set of suppliers to obtain a new best set. Any recomputation of unchanged elements of the environment is thus avoided, and selection costs are consequently reduced significantly. A numerical evaluation confirms the applicability of this model and shows that it outperforms common static selection models in this field.
Keywords: Supplier Selection, Context-Awareness, Online Algorithms, Data Clustering.
12569 Application of Build-up and Wash-off Models for an East-Australian Catchment
Authors: Iqbal Hossain, Monzur Alam Imteaz, Mohammed Iqbal Hossain
Abstract:
Estimation of stormwater pollutants is a prerequisite for the protection and improvement of the aquatic environment and for appropriate management options. The usual practice for stormwater quality prediction is water quality modeling. However, the accuracy of model predictions depends on the proper estimation of model parameters. This paper presents the estimation of model parameters for a catchment water quality model developed for the continuous simulation of stormwater pollutants from a catchment to the catchment outlet. The model is capable of simulating the accumulation and transportation of the stormwater pollutants suspended solids (SS), total nitrogen (TN) and total phosphorus (TP) from a particular catchment. Rainfall and water quality data were collected for the Hotham Creek Catchment (HTCC), Gold Coast, Australia. Runoff calculations from the developed model were compared with calculated discharges from the widely used hydrological models WBNM and DRAINS. Based on the measured water quality data, the model's water quality parameters were calibrated for the above-mentioned catchment. The calibrated parameters are expected to be helpful for best management practices (BMPs) in the region. Sensitivity analyses of the estimated parameters were performed to assess the impacts of the model parameters on overall model estimations of runoff water quality.
Keywords: Calibration, Model Parameters, Suspended Solids, Total Nitrogen, Total Phosphorus.
12568 A Mixture Model of Two Different Distributions Approach to the Analysis of Heterogeneous Survival Data
Authors: Ülkü Erişoğlu, Murat Erişoğlu, Hamza Erol
Abstract:
In this paper we propose mixtures of two different distributions, namely Exponential-Gamma, Exponential-Weibull and Gamma-Weibull, to model heterogeneous survival data. Various properties of the proposed mixtures of two different distributions are discussed. Maximum likelihood estimates of the parameters are obtained using the EM algorithm. An illustrative example based on real data is also given.
Keywords: Exponential-Gamma, Exponential-Weibull, Gamma-Weibull, EM Algorithm, Survival Analysis.
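As a sketch of how the EM algorithm fits such a two-component mixture, the following fits an Exponential-Gamma mixture to synthetic survival times; the gamma M-step maximises a weighted log-likelihood numerically. All data, starting values and iteration counts are illustrative, not the paper's estimator.

```python
# Sketch: EM for a two-component Exponential-Gamma mixture on synthetic data.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(3)
t = np.concatenate([rng.exponential(1.0, 300), rng.gamma(5.0, 2.0, 300)])

pi, lam, k, theta = 0.5, 1.0, 2.0, 1.0          # mixing weight and parameters
for _ in range(100):
    # E-step: responsibilities of the exponential component.
    f1 = stats.expon.pdf(t, scale=1 / lam)
    f2 = stats.gamma.pdf(t, a=k, scale=theta)
    w = pi * f1 / (pi * f1 + (1 - pi) * f2)
    # M-step: closed form for the exponential, numerical for the gamma.
    pi = w.mean()
    lam = w.sum() / (w * t).sum()               # weighted MLE of exponential rate
    nll = lambda p: -np.sum((1 - w) * stats.gamma.logpdf(t, a=p[0], scale=p[1]))
    k, theta = minimize(nll, [k, theta], bounds=[(1e-3, None)] * 2).x

print(pi, lam, k, theta)
```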
12567 3D Face Modeling based on 3D Dense Morphable Face Shape Model
Authors: Yongsuk Jang Kim, Sun-Tae Chung, Boogyun Kim, Seongwon Cho
Abstract:
A realistic 3D face model represents the pose, illumination, and expression of a face more precisely than a 2D face model, so it can be usefully applied in various applications such as face recognition, games, avatars, and animations. In this paper, we propose a 3D face modeling method based on a 3D dense morphable shape model. The proposed method first constructs a 3D dense morphable shape model from 3D face scan data obtained using a 3D scanner. Next, it extracts and matches facial landmarks from a 2D image sequence containing the face to be modeled, and then reconstructs the 3D vertex coordinates of the landmarks using a factorization-based SfM technique. The proposed method then obtains a 3D dense shape model of the face by fitting the constructed 3D dense morphable shape model to the reconstructed 3D vertices. It also builds a cylindrical texture map from the 2D face image sequence. Finally, it generates a 3D face model by rendering the 3D dense face shape model with the cylindrical texture map. The building process shows that the proposed method is relatively easy, fast and precise.
Keywords: 3D Face Modeling, 3D Morphable Shape Model, 3D Reconstruction, 3D Correspondence.
12566 Formal Verification of Cache System Using a Novel Cache Memory Model
Authors: Guowei Hou, Lixin Yu, Wei Zhuang, Hui Qin, Xue Yang
Abstract:
Formal verification is proposed to ensure the correctness of a design and to make functional verification more efficient. Since the cache plays a vital role in System on Chip (SoC) design, and a cache with a Memory Management Unit (MMU) and cache memory unit makes the state space too large to verify by simulation, a formal verification approach is presented for such a system design. In this paper, a formal model checking verification flow is suggested and a new cache memory model, called the "exhaustive search model", is proposed. Instead of using a large RAM to represent the whole cache memory, the exhaustive search model employs just two cache blocks. For a cache system containing a data cache (Dcache) and an instruction cache (Icache), the Dcache and Icache memory models are established separately using the same mechanism. Finally, the novel model is employed in the verification of a cache that is a module of a custom-built SoC which has been applied in practice. The result shows that the cache system is verified correctly using the exhaustive search model, which makes the verification much more manageable and flexible.
Keywords: Cache system, formal verification, novel model, System on Chip (SoC).
12565 A Purpose Based Usage Access Control Model
Abstract:
As privacy becomes a major concern for consumers and enterprises, much research has focused on privacy-protecting technology in recent years. In this paper, we present a comprehensive approach to usage access control based on the notion of purpose. In our model, purpose information associated with a given data element specifies the intended use of the subjects and objects in the usage access control model. A key feature of our model is that, when an access is requested, the access purpose is checked against the intended purposes for the data item. We propose an approach to representing purpose information to support purpose-based access control. Our proposed solution relies on usage access control (UAC) models as well as on components based on the notions of purpose information used in subjects and objects. Finally, comparisons with related works are analyzed.
Keywords: Purpose, privacy, access control, authorization.
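The core check described above, comparing an access purpose against the intended purposes attached to a data item, reduces to a few lines. The data items and purpose names below are invented, and a full model would also handle purpose hierarchies and prohibited purposes.

```python
# Sketch: purpose-based access check against intended purposes per data item.
INTENDED_PURPOSES = {
    "customer.email": {"billing", "support"},
    "customer.address": {"shipping"},
}

def access_allowed(data_item: str, access_purpose: str) -> bool:
    """Allow access only if the stated purpose is among the intended ones."""
    return access_purpose in INTENDED_PURPOSES.get(data_item, set())

print(access_allowed("customer.email", "marketing"))  # False
print(access_allowed("customer.email", "support"))    # True
```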
12564 Experimental Evaluation of Methane Adsorption on Granular Activated Carbon (GAC) and Determination of Model Isotherm
Authors: M. Delavar, A.A. Ghoreyshi, M. Jahanshahi, M. Irannejad
Abstract:
This study investigates the capacity of granular activated carbon (GAC) for the storage of methane through equilibrium adsorption. An experimental apparatus consisting of a dual adsorption vessel was set up for the measurement of equilibrium adsorption of methane on GAC using the volumetric technique (pressure decay). Experimental isotherms of methane adsorption were determined by measuring the equilibrium uptake of methane at different pressures (0-50 bar) and temperatures (285.15-328.15 K). The experimental data were fitted to the Freundlich and Langmuir equations to determine the model isotherm. The results show that the experimental data are equally well fitted by both model isotherms. Using the experimental data obtained at different temperatures, the isosteric heat of methane adsorption was also calculated with the Clausius-Clapeyron equation from the Sips isotherm model. The results show that decreasing the temperature or increasing the methane uptake by GAC decreases the isosteric heat of methane adsorption.
Keywords: Methane adsorption, Activated carbon, Model isotherm, Isosteric heat.
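A minimal sketch of the isotherm-fitting step follows, using scipy.optimize.curve_fit; the pressure/uptake values are invented stand-ins for the measured methane-on-GAC data, not the paper's measurements.

```python
# Sketch: fitting Langmuir and Freundlich isotherms to equilibrium uptake data.
import numpy as np
from scipy.optimize import curve_fit

P = np.array([1, 5, 10, 20, 30, 40, 50], dtype=float)   # pressure (bar)
q = np.array([0.8, 2.9, 4.6, 6.5, 7.6, 8.2, 8.6])       # uptake, illustrative units

langmuir = lambda P, qm, b: qm * b * P / (1 + b * P)    # q = qm*b*P / (1 + b*P)
freundlich = lambda P, kf, n: kf * P ** (1 / n)         # q = Kf * P^(1/n)

(qm, b), _ = curve_fit(langmuir, P, q, p0=[10, 0.1])
(kf, n), _ = curve_fit(freundlich, P, q, p0=[1, 2])
print(f"Langmuir: qm={qm:.2f}, b={b:.3f};  Freundlich: Kf={kf:.2f}, n={n:.2f}")
# The isosteric heat then follows from Clausius-Clapeyron at fixed uptake:
#   Q_st = -R * d(ln P) / d(1/T), using isotherms measured at several temperatures.
```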
12563 Blockchain-Based Assignment Management System
Authors: Amogh Katti, J. Sai Asritha, D. Nivedh, M. Kalyan Srinivas, B. Somnath Chakravarthi
Abstract:
Today's education system uses Learning Management System (LMS) portals for the scoring and grading of student performance and for maintaining student records, and teachers are instructed to accept assignments through online submissions of .pdf, .doc, .ppt, etc. files. There is a risk of data tampering in traditional portals; we apply a Blockchain model instead of this traditional model to avoid data tampering and also to provide a decentralized mechanism for overall fairness. Blockchain technology is a better-suited and recommended model because of features such as its consensus mechanism, decentralized architecture, cryptographic encryption, and smart contracts on the Ethereum blockchain. The proposed system ensures data integrity and tamper-proof assignment submission and grading, which will be helpful for both students and educators.
Keywords: Education technology, learning management system, decentralized applications, blockchain.
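As a toy illustration of why a hash-linked chain makes tampering detectable, which is the core property the proposed system relies on, consider the following. It is a plain Python sketch, not the Ethereum/smart-contract design, and all submission data are invented.

```python
# Sketch: hash-linked chain of assignment submissions with integrity check.
import hashlib, json, time

def make_block(submission: dict, prev_hash: str) -> dict:
    block = {"submission": submission, "prev_hash": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()
    ).hexdigest()
    return block

chain = [make_block({"student": "S1", "file_digest": "ab12"}, prev_hash="0" * 64)]
chain.append(make_block({"student": "S2", "file_digest": "cd34"}, chain[-1]["hash"]))

def verify(chain) -> bool:
    """Recompute each block's hash and check the links; fails if any block changed."""
    for prev, cur in zip(chain, chain[1:]):
        body = {k: v for k, v in cur.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if cur["prev_hash"] != prev["hash"] or cur["hash"] != digest:
            return False
    return True

print(verify(chain))  # True until any stored block is altered
```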
12562 Development of a Catchment Water Quality Model for Continuous Simulations of Pollutants Build-up and Wash-off
Authors: Iqbal Hossain, Dr. Monzur Imteaz, Dr. Shirley Gato-Trinidad, Prof. Abdallah Shanableh
Abstract:
Estimation of runoff water quality parameters is required to determine appropriate water quality management options. Various models are used to estimate runoff water quality parameters. However, most models provide event-based estimates of water quality parameters for specific sites. The work presented in this paper describes the development of a model that continuously simulates the accumulation and wash-off of water quality pollutants in a catchment. The model allows estimation of pollutant build-up during dry periods and pollutant wash-off during storm events. The model was developed by integrating two individual models: a rainfall-runoff model and a catchment water quality model. The rainfall-runoff model is based on the time-area runoff estimation method. The model allows users to estimate the time of concentration using a range of established methods, and to estimate continuing runoff losses using any of the available estimation methods (i.e., constant, linearly varying or exponentially varying). Pollutant build-up in a catchment was represented by one of three pre-defined functions: power, exponential, or saturation. Similarly, pollutant wash-off was represented by one of three functions: power, rating-curve, or exponential. The developed runoff water quality model was set up to simulate the build-up and wash-off of total suspended solids (TSS), total phosphorus (TP) and total nitrogen (TN). The application of the model was demonstrated using available runoff and TSS field data from road and roof surfaces in the Gold Coast, Australia. The model provided an excellent representation of the field data, demonstrating the simplicity yet effectiveness of the proposed model.
Keywords: Catchment, continuous pollutants build-up, pollutants wash-off, runoff, runoff water quality model.
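The build-up forms named in the abstract (power, exponential, saturation) and an exponential wash-off can be written down directly, as in the sketch below; all coefficients are illustrative, not the calibrated Gold Coast values.

```python
# Sketch: pollutant build-up over dry days and wash-off during a storm event.
import numpy as np

def buildup_power(t_dry, a=5.0, b=0.5):            # B = a * t^b
    return a * t_dry ** b

def buildup_exponential(t_dry, bmax=20.0, k=0.2):  # B = Bmax * (1 - e^(-k*t))
    return bmax * (1 - np.exp(-k * t_dry))

def buildup_saturation(t_dry, bmax=20.0, th=3.0):  # B = Bmax * t / (th + t)
    return bmax * t_dry / (th + t_dry)

def washoff_exponential(b0, runoff_mm, c=0.18):
    """Mass washed off during an event: W = B0 * (1 - e^(-c*q))."""
    return b0 * (1 - np.exp(-c * runoff_mm))

b0 = buildup_exponential(t_dry=7)                  # mass accumulated over 7 dry days
print(washoff_exponential(b0, runoff_mm=12.0))     # mass removed by a 12 mm event
```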
12561 Estimating Shortest Circuit Path Length Complexity
Authors: Azam Beg, P. W. Chandana Prasad, S.M.N.A Senenayake
Abstract:
When binary decision diagrams are formed from uniformly distributed Monte Carlo data for a large number of variables, the complexity of the decision diagrams exhibits a predictable relationship to the number of variables and minterms. In the present work, a neural network model has been used to analyze the pattern of shortest path length for larger numbers of Monte Carlo data points. The neural model shows strong descriptive power for the ISCAS benchmark data, with an RMS error of 0.102 for the shortest path length complexity. The model can therefore be considered a method of predicting path length complexities; this is expected to lead to minimum time complexity for very large-scale integrated circuits and related computer-aided design tools that use binary decision diagrams.
Keywords: Monte Carlo circuit simulation data, binary decision diagrams, neural network modeling, shortest path length estimation.
12560 A Constructive Analysis of the Formation of LGBTQ Families: Where Utopia and Reality Meet
Authors: Panagiotis Pentaris
Abstract:
The issue of social and legal recognition of LGBTQ families is of high importance when exploring the possibility of a family, as is the fact that both society and the individual contribute to the overall recognition of LGBTQ families. This paper is, by methodology, a conceptual discussion of both sides; it uses a method of constructive analysis to expound on this issue. This method aims to broaden conceptual theory and introduce a new relationship between concepts that were previously not associated by evidence. The exploration finds that LGBTQ realities may differ from an international perspective, and that both legal and social rights are critical to self-consciousness and the formation of a family. This paper asserts that internalised and historic oppression of LGBTQ individuals places them, though not always and not everywhere, in a disadvantageous position when it comes to engaging with the potential of forming a family. The paper concludes that lack of social recognition and internalised oppression are key barriers for LGBTQ families.
Keywords: Family, gay, LGBTQ, self-worth, social rights.
12559 Jitter Transfer in High Speed Data Links
Authors: Tsunwai Gary Yip
Abstract:
Phase-locked loops for data links operating at 10 Gb/s or faster are low-phase-noise devices designed to operate with a low-jitter reference clock. Characterization of their jitter transfer function is difficult because the intrinsic noise of the device is comparable to the random noise level in the reference clock signal. A linear model is proposed to account for the intrinsic noise of a PLL. Intrinsic noise data for a PLL for 10 Gb/s links are presented. The jitter transfer function of a PLL in a test chip for 12.8 Gb/s data links was determined in experiments using the 400 MHz reference clock as the source of simultaneous excitations over a wide frequency range. The result shows that the PLL jitter transfer function can be approximated by a second-order linear model.
Keywords: Intrinsic phase noise, jitter in data links, PLL jitter transfer function, high speed clocking in electronic circuits.
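A second-order linear jitter transfer model of the kind the experiment supports can be evaluated as in the sketch below, using the standard second-order closed-loop form H(s) = (2ζωₙs + ωₙ²)/(s² + 2ζωₙs + ωₙ²); the natural frequency and damping values are invented, not the test chip's.

```python
# Sketch: magnitude response of a second-order PLL jitter transfer function.
import numpy as np
from scipy import signal

fn, zeta = 4e6, 0.8                        # natural frequency (Hz), damping (illustrative)
wn = 2 * np.pi * fn
sys = signal.TransferFunction([2 * zeta * wn, wn**2],
                              [1, 2 * zeta * wn, wn**2])

f = np.logspace(4, 8, 200)                 # 10 kHz to 100 MHz
_, mag_db, _ = signal.bode(sys, w=2 * np.pi * f)
print(f"jitter peaking: {mag_db.max():.2f} dB")  # peaking near the loop bandwidth
```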
12558 A Biometric Template Security Approach to Fingerprints Based on Polynomial Transformations
Authors: Ramon Santana
Abstract:
The use of biometric identifiers in the field of information security, access control to resources, and authentication in ATMs and banking, among others, raises great concern about the safety of biometric data. Eight vulnerabilities have been detected in the general architecture of a biometric system, six of which allow obtaining the minutiae template in plain text. The main consequence of obtaining minutiae templates is the loss of the biometric identifier for life. To mitigate these vulnerabilities, several models to protect minutiae templates have been proposed; however, vulnerabilities in the cryptographic security of these models still allow biometric data to be obtained in plain text. In order to increase cryptographic security and ease of reversibility, a minutiae template protection model is proposed. The model aims to provide cryptographic protection and facilitate the reversibility of data using two levels of security. The first level is the data transformation level, which generates data invariant to rotation and translation; this transformation is irreversible. The second level is the evaluation level, where the encryption key is generated and the data are evaluated using a defined evaluation function. The model is aimed at mitigating known vulnerabilities of the proposed models, basing its security on the impossibility of polynomial reconstruction.
Keywords: Fingerprint, template protection, bio-cryptography, minutiae protection.
12557 Artificial Neural Network Prediction for Coke Strength after Reaction and Data Analysis
Authors: Sulata Maharana, B Biswas, Adity Ganguly, Ashok Kumar
Abstract:
In this paper, the requirement for coke quality prediction, its role in blast furnaces, and the model output are explained. By applying the method of Artificial Neural Networks (ANN) using the back-propagation (BP) algorithm, a prediction model has been developed to predict Coke Strength after Reaction (CSR). Important blast furnace functions such as permeability, heat exchange, melting, and reducing capacity are closely connected to coke quality, which in turn depends on coal characterization and coke-making process parameters. The ANN model developed is a useful tool for process experts to adjust control parameters in case of coke quality deviations. The model also makes it possible to predict CSR for new coal blends that are yet to be used in the coke plant. Input data to the model were structured into three modules covering the past two years, and the incremental models thus developed assist in identifying the group causing the deviation of CSR.
Keywords: Artificial Neural Networks, backpropagation, Coke Strength after Reaction, Multilayer Perceptron.
12556 Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influences of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Building on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculating weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: Big data, building-value analysis, machine learning, price prediction.
12555 Algorithm for Determining the Parameters of a Two-Layer Soil Model
Authors: Adekitan I. Aderibigbe, Fakolujo A. Olaosebikan
Abstract:
The parameters of a two-layer soil can be determined by processing resistivity data obtained from resistivity measurements carried out on the soil of interest. The processing usually entails applying the resistivity data as inputs to an optimisation function. This paper proposes an algorithm that utilises the squared error as the optimisation function. Resistivity data from previous works were applied to test the accuracy of the new algorithm, and the results obtained agree closely with results from previous works.
Keywords: Algorithm, earthing, resistivity, two-layer soil-model.
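A sketch of the squared-error approach follows, using the standard two-layer Wenner apparent-resistivity image series, ρₐ = ρ₁[1 + 4Σₙ Kⁿ(1/√(1+(2nh/a)²) − 1/√(4+(2nh/a)²))] with K = (ρ₂−ρ₁)/(ρ₂+ρ₁); the "measured" data here are synthesised, and the series truncation and starting values are illustrative.

```python
# Sketch: estimating two-layer soil parameters (rho1, rho2, h) by minimising
# the squared error between measured and modelled Wenner apparent resistivities.
import numpy as np
from scipy.optimize import minimize

def apparent_rho(a, rho1, rho2, h, nterms=50):
    K = (rho2 - rho1) / (rho2 + rho1)           # reflection coefficient
    n = np.arange(1, nterms + 1)[:, None]       # truncated image series
    series = K**n * (1 / np.sqrt(1 + (2 * n * h / a) ** 2)
                     - 1 / np.sqrt(4 + (2 * n * h / a) ** 2))
    return rho1 * (1 + 4 * series.sum(axis=0))

spacings = np.array([1.0, 2.0, 4.0, 8.0, 16.0])         # probe spacings (m)
measured = apparent_rho(spacings, 120.0, 40.0, 2.5)     # synthetic "field" data

sq_err = lambda p: np.sum((apparent_rho(spacings, *p) - measured) ** 2)
res = minimize(sq_err, x0=[100.0, 50.0, 1.0],
               bounds=[(1, None), (1, None), (0.1, None)])
print(res.x)                                            # recovered (rho1, rho2, h)
```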
12554 Information Quality Evaluation Framework: Extending ISO 25012 Data Quality Model
Authors: Irfan Rafique, Philip Lew, Maissom Qanber Abbasi, Zhang Li
Abstract:
The World Wide Web, coupled with the ever-increasing sophistication of online technologies and software applications, puts greater emphasis on the need for even more sophisticated and consistent quality requirements modeling than traditional software applications. Web sites and Web applications (WebApps) are becoming more information-driven and content-oriented, raising concern about their information quality (InQ). The consistent and consolidated modeling of InQ requirements for WebApps at different stages of the life cycle still poses a challenge. This paper proposes an approach to specifying InQ requirements for WebApps by reusing and extending the ISO 25012:2008(E) data quality model. We also discuss the learnability aspect of information quality for WebApps. The proposed ISO 25012-based InQ framework is a step towards a standardized approach to evaluating WebApps' InQ.
Keywords: Data Quality Model, Information learnability, Information Quality, Web applications.
12553 A Novel Framework for User-Friendly Ontology-Mediated Access to Relational Databases
Authors: Efthymios Chondrogiannis, Vassiliki Andronikou, Efstathios Karanastasis, Theodora Varvarigou
Abstract:
A large amount of data is typically stored in relational databases (DBs), which can efficiently handle user queries intended to elicit the appropriate information from data sources. However, direct access to and use of these data require end users to have an adequate technical background, while they must also cope with the internal data structure and the values presented. Consequently, information retrieval is quite a difficult process even for IT or DB experts, given the limited contribution of relational databases from the conceptual point of view. Ontologies enable users to formally describe a domain of knowledge in terms of concepts and the relations among them, and hence they can be used for unambiguously specifying the information captured by a relational database. However, accessing information residing in a database through ontologies is feasible only if the users are versed in semantic web technologies. To enable users from different disciplines to retrieve the appropriate data, the design of a graphical user interface is necessary. In this work, we present an interactive, ontology-based, semantically enabled web tool that can be used for information retrieval purposes. The tool is entirely based on the ontological representation of the underlying database schema, and it provides a user-friendly environment through which users can graphically form and execute their queries.
Keywords: Ontologies, Relational Databases, SPARQL, Web Interface.
12552 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI
Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi
Abstract:
This study extends the use of the Drainage Area Regionalization (DAR) method to generating synthetic data and calibrating PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff in determining a river yield is subject to various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adopted in Africa have been challenged by a paucity of adequate, relevant and accurate data for parameterization and validation. The purpose of generating synthetic flow is to test a hydrological model in a way that does not suffer from the impact of very low or very high flows, thus allowing one to check whether the model is structurally sound. The physically-based, watershed-scale hydrologic model employed (PyTOPKAPI) was parameterized with GIS pre-processing parameters and remote-sensing hydro-meteorological variables. Validation with the mean annual runoff ratio shows decent graphical agreement between observed and simulated discharge. The Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739 indicate strong model efficiency. Given the impact of current climate variability, water planners now have a tool for flow quantification and sustainable planning purposes.
Keywords: Ungauged Basin, Catchment Characteristics Model, Synthetic data, GIS.
12551 Simulation Data Management Approach for Developing Adaptronic Systems – The W-Model Methodology
Authors: Roland S. Nattermann, Reiner Anderl
Abstract:
Existing process models for the development of mechatronic systems prescribe largely parallel activity in the detailed development, with the various disciplines involved working largely independently of one another. The approach for a new process model presented here further develops existing models for use in the development of adaptronic systems. The approach is based on an intermediate integration and an abstract modeling of the adaptronic system. Based on this system model, a simulation of the global system behavior under external and internal factors or forces is developed. For the intermediate integration a special data management system is used. In the presented approach this data management system has a number of functions that are not part of "normal" PDM functionality; therefore, a concept for a new data management system for the development of adaptronic systems is presented in this paper. This concept divides the functions into six layers. In the first layer a system model is created, which divides the adaptronic system according to its components and the various technical disciplines involved; moreover, the parameters and properties of the system are modeled and linked together with the requirements and the system model. The modeled parameters and properties result in a network which is analyzed in the second layer. From this analysis, necessary adjustments to individual components for specific manipulation of the system behavior can be determined. The third layer contains an automatic abstract simulation of the system behavior. This simulation is a precursor to the network analysis and serves as a filter: through the network analysis and simulation, changes to system components are examined and necessary adjustments to other components are calculated. The remaining layers of the concept address the automatic calculation of system reliability, the "normal" PDM functionality, and the integration of discipline-specific data into the system model. A prototypical implementation of such a data management system, with the addition of automatic system development, is being implemented using the data management system ENOVIA SmarTeam V5 and the simulation system MATLAB.
Keywords: Adaptronic, Data Management, LOEWE-Centre AdRIA.