Search results for: digital terrain models
3165 Self-Sensing versus Reference Air Gaps
Authors: Alexander Schulz, Ingrid Rottensteiner, Manfred Neumann, Michael Wehse, Johann Wassermann
Abstract:
Self-sensing estimates the air gap within an electromagnetic path by analyzing the bearing coil current and/or voltage waveform. The self-sensing concept presented in this paper has been developed within the research project "Active Magnetic Bearings with Supreme Reliability" and is used for position sensor fault detection. In this new concept, the gap calculation is carried out by an all-digital analysis of the digitized coil current and voltage waveforms. Only those time intervals within the PWM period that give the best results are used for the analysis. Additionally, the concept allows the digital compensation of nonlinearities, for example magnetic saturation, without degrading signal quality. This increases the accuracy and robustness of the air gap estimation and also reduces phase delays. In addition to an overview of the developed concept, first measurement results are presented which show the potential of this all-digital self-sensing approach.
Keywords: digital signal analysis, active magnetic bearing, reliability, fault detection.
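The gap estimation principle lends itself to a compact illustration. The sketch below recovers the gap of an idealized actuator from the current slope measured during the PWM on-time; the actuator constants, the signal names and the neglect of coil resistance and saturation are assumptions made here for illustration, not parameters from the paper.

```python
# Illustrative sketch only: estimate the air gap of an idealized magnetic-bearing
# actuator from the digitized coil current slope during the PWM on-time.
# Constants below are hypothetical; resistance, leakage and saturation are
# neglected (the paper compensates such nonlinearities digitally).
import numpy as np

MU0 = 4e-7 * np.pi   # vacuum permeability [H/m]
N_TURNS = 200        # coil turns (assumed)
POLE_AREA = 6e-4     # pole face area [m^2] (assumed)
V_DC = 48.0          # DC-link voltage applied during the PWM on-time [V]

def estimate_gap(current_samples, dt):
    """Estimate the air gap [m] from coil current samples taken while the
    full DC-link voltage is applied (the 'best' interval of the PWM period)."""
    # Slope of the current ramp via a least-squares fit over the interval.
    t = np.arange(len(current_samples)) * dt
    di_dt = np.polyfit(t, current_samples, 1)[0]
    # v = L di/dt with L = mu0 N^2 A / (2 g)  =>  g = mu0 N^2 A / (2 L)
    inductance = V_DC / di_dt
    return MU0 * N_TURNS**2 * POLE_AREA / (2.0 * inductance)

# Example: a synthetic 5-sample current ramp sampled at 1 MHz
ramp = 1.0 + np.linspace(0.0, 0.05, 5)      # amperes
print(f"estimated gap: {estimate_gap(ramp, 1e-6)*1e3:.3f} mm")
```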
3164 A New Time Dependent, High Temperature Analytical Model for the Single-electron Box in Digital Applications
Authors: M.J. Sharifi
Abstract:
Several models have been introduced so far for the single-electron box (SEB), all of which were restricted to the DC response and/or the low-temperature limit. In this paper, we introduce for the first time a time-dependent, high-temperature analytical model for the SEB. The DC behavior of the introduced model is verified against the SIMON simulator, and its time behavior is verified against a recently published paper on the step response of the SEB.
Keywords: Single electron box, SPICE, SIMON, Time-dependent, Circuit model.
3163 Detecting and Measuring Fabric Pills Using Digital Image Analysis
Authors: Dariush Semnani, Hossein Ghayoor
Abstract:
In this paper, a novel method is presented for evaluating fabric pills using digital image processing techniques. The work provides a technique for detecting pills and also measuring their heights, surfaces and volumes. Measuring the intensity of defects by human vision is an inaccurate method for quality control, and this problem motivated the use of digital image processing techniques for detecting defects on the fabric surface. Earlier systems were limited to measuring the surface area of defects, whereas the presented method also measures their height and volume, which leads to a more accurate quality control. An algorithm was developed to first find the pills and then measure their average intensity using the three criteria of height, surface and volume. The results showed a meaningful relation between the number of rotations and the quality of pilled fabrics.
Keywords: 3D analysis, computer vision, fabric, pile, surface evaluation
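As a rough illustration of such a pipeline, the sketch below locates pill-like bright regions in a grayscale fabric image and estimates a surface, height and volume for each; the threshold, the intensity-to-height scale and the file name are assumptions for illustration and are not the paper's actual criteria.

```python
# Minimal sketch: detect pill-like bright regions and estimate their
# surface area, peak height and volume from image intensity.
import numpy as np
from scipy import ndimage
from imageio.v3 import imread

img = imread("fabric_sample.png").astype(float)
if img.ndim == 3:                                  # collapse RGB to grayscale if needed
    img = img.mean(axis=2)

background = ndimage.uniform_filter(img, size=51)  # smooth out the fabric texture
relief = img - background                          # pills stand out as bright bumps
mask = relief > 20.0                               # assumed intensity threshold

labels, n_pills = ndimage.label(mask)
print(f"detected {n_pills} pill candidates")

HEIGHT_PER_GRAY = 0.01   # assumed mm of pile height per gray level
for i in range(1, n_pills + 1):
    region = labels == i
    surface = region.sum()                                  # area in pixels
    height = relief[region].max() * HEIGHT_PER_GRAY         # peak height
    volume = relief[region].sum() * HEIGHT_PER_GRAY         # integrated relief
    print(f"pill {i}: surface={surface}px height={height:.2f}mm volume={volume:.2f}")
```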
3162 An OWL Ontology for CommonKADS Template Knowledge Models
Authors: B. A. Gobin, R. K. Subramanian
Abstract:
This paper gives an overview of how an OWL ontology has been created to represent the template knowledge models defined in CML that are provided by CommonKADS. CommonKADS is a mature knowledge engineering methodology which proposes the use of template knowledge models for knowledge modelling. The aim of developing this ontology is to present the template knowledge models in a knowledge representation language that can be easily understood and shared in the knowledge engineering community. Hence, OWL is used, as it has become a standard for ontologies and already has user-friendly tools for viewing and editing.
Keywords: Ontology, OWL, Template Knowledge Models, CommonKADS
3161 Automated Transformation of 3D Point Cloud to Building Information Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Petar Penchev
Abstract:
The digital era has revolutionized architectural practices, with Building Information Modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research presents a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data — a collection of data points in space, typically produced by 3D scanners — into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historical preservation.
Keywords: Algorithmic modeling, Building Information Modeling, point cloud, reconstruction.
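One early stage of such a pipeline, detecting a planar element (a wall or floor) in the raw point cloud, can be sketched with a simple RANSAC plane fit as below; the thresholds and the synthetic data are illustrative only, and the paper's algorithms go much further (openings, parametric elements, BIM export).

```python
# Sketch: RANSAC plane detection in a point cloud, a typical first step
# before classifying planes as walls/floors and exporting BIM elements.
import numpy as np

def ransac_plane(points, n_iter=500, dist_thresh=0.02, seed=0):
    """Return (normal, d, inlier_mask) of the best plane n.x + d = 0."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iter):
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:
            continue                       # degenerate (collinear) sample
        normal /= norm
        d = -normal @ sample[0]
        dist = np.abs(points @ normal + d)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (normal, d)
    return best_plane[0], best_plane[1], best_inliers

# Synthetic "wall" (plane x = 2 m) with noise, plus random clutter points
rng = np.random.default_rng(1)
wall = np.c_[np.full(2000, 2.0) + rng.normal(0, 0.005, 2000),
             rng.uniform(0, 5, 2000), rng.uniform(0, 3, 2000)]
clutter = rng.uniform(0, 5, (500, 3))
normal, d, inliers = ransac_plane(np.vstack([wall, clutter]))
print("plane normal:", np.round(normal, 2), "inliers:", inliers.sum())
```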
3160 Comparison of Three Turbulence Models in Wear Prediction of Multi-Size Particulate Flow through Rotating Channel
Authors: Pankaj K. Gupta, Krishnan V. Pagalthivarthi
Abstract:
The present work compares the performance of three turbulence modeling approaches (based on the two-equation k-ε model) in predicting erosive wear in multi-size dense slurry flow through a rotating channel. All three turbulence models include a rotation modification to the production term in the turbulent kinetic-energy equation. The two-phase flow field, obtained numerically using a Galerkin finite element methodology, relates the local flow velocity and concentration to the wear rate via a suitable wear model. The wear models for both the sliding wear and impact wear mechanisms account for the particle size dependence. The predicted wear rates from the three turbulence models are compared for a large number of cases spanning operating parameters such as rotation rate, solids concentration, flow rate and particle size distribution. The root-mean-square error between the FE-generated data and the correlation between maximum wear rate and the operating parameters is found to be less than 2.5% for all three models.
Keywords: Rotating channel, maximum wear rate, multi-size particulate flow, k-ε turbulence models.
3159 Modeling of Normal and Atherosclerotic Blood Vessels using Finite Element Methods and Artificial Neural Networks
Authors: K. Kamalanand, S. Srinivasan
Abstract:
Analysis of blood vessel mechanics in normal and diseased conditions is essential for disease research, medical device design and treatment planning. In this work, 3D finite element models of a normal vessel and an atherosclerotic vessel with 50% plaque deposition were developed and meshed using tetrahedral elements. The models were simulated using actual blood pressure signals. Based on the transient analysis performed on the developed models, parameters such as total displacement, strain energy density and entropy per unit volume were obtained. These parameters were then used to develop artificial neural network models for analyzing normal and atherosclerotic blood vessels. In this paper, the objectives of the study, the methodology and the significant observations are presented.
Keywords: Blood vessel, atherosclerosis, finite element model, artificial neural networks
3158 Net Regularity and Its Ethical Implications on Internet Stakeholders
Authors: Nourhan Elshenawi
Abstract:
Net Neutrality (NN) is the principle of treating all online data the same, without any prioritization of some over others. A research gap in current scholarship about violations of NN and the subsequent ethical concerns paves the way for the following research question: to what extent do violations of NN entail ethical concerns and implications for Internet stakeholders? To answer this question, Net Regularity (NR), the term this paper introduces for such violations, is examined using the two major action-based ethical theories, Kantian and Utilitarian, across the relevant Internet stakeholders. First, some necessary IT background is provided on how the Internet works and who the key stakeholders are. Following the IT background, the relationship between the stakeholders, users, Internet Service Providers (ISPs) and content providers is discussed and illustrated. Then, violations of NN that are currently occurring without attracting any attention from the general public are examined from an ethical perspective under the term NR. Afterwards, the current scholarship on NN and its violations, which is written mainly from economic and sociopolitical perspectives, is discussed to highlight the lack of ethical analysis of the issue. Before moving on to the ethical analysis, however, websites are presented as digital entities that are affected by NR, and their happiness is measured using functionalism. The analysis concludes that NR leads to unethical treatment of Internet stakeholders from the perspective of both theories. Finally, the current digital divide in the world is presented in order to better illustrate the implications of NR. These implications point to a new Internet divide that will take place between individuals within society. By answering the research question through ethical analysis, the paper attempts to shed some light on the issue of NR and the kind of society it would lead to: NR would not just produce a divided society, but divided individuals separated by something greater than distance, the Internet.
Keywords: Digital divide, digital entities, digital ontology, net neutrality, internet ethics, internet law, internet service providers, websites as beings.
3157 ICT Education: Digital History Learners
Authors: Lee Bih Ni, Elvis Fung
Abstract:
This article reviews the new generation of students in order to understand their expectations and attitudes. These students work on school projects, creative work, educational software and digital resources, use social networking tools to communicate with friends, and take part in competitions. Today's students have been described as the new millennium students. They use information and communication technology (ICT) more creatively and innovatively at home than at school, because at home ICT is used for purposes that differ from those it usually serves at school. They also collaborate and communicate more effectively when they are at home. Most children enter school already bringing basic skills in using ICT, often more advanced than most schools expect. Many teachers can help students; many, however, still work in a "traditional" way, without a computer, and do not see the benefits of using computers in the education system or the new social computing networks that shape how young people will learn and work in the future.
Keywords: ICT Education, Digital History.
3156 Interoperability in Component Based Software Development
Authors: M. Madiajagan, B. Vijayakumar
Abstract:
Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate reuse of off-the-shelf components. The focus of these models is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns: specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on an assembly of components available on a local area network or on the net. These components must be located and identified in terms of their available services and communication protocols before any request is made. The first part of the article introduces the basic concepts of components and middleware, the following sections describe different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.
Keywords: Interoperability, component packaging, communication technology, heterogeneous platform, component interface, middleware.
3155 Electricity Price Forecasting: A Comparative Analysis with Shallow-ANN and DNN
Authors: Fazıl Gökgöz, Fahrettin Filiz
Abstract:
Electricity prices have characteristics such as high volatility, nonlinearity and high frequency that make forecasting quite difficult. Electricity prices are volatile but not random, so it is possible to identify patterns from historical data. Intelligent decision-making requires accurate price forecasting for market traders, retailers, and generation companies. So far, many shallow-ANN (artificial neural network) models have been published in the literature and have shown adequate forecasting results. In recent years, neural networks with many hidden layers, referred to as DNNs (deep neural networks), have been used in the machine learning community. The goal of this study is to investigate the electricity price forecasting performance of shallow-ANN and DNN models for the Turkish day-ahead electricity market. The forecasting accuracy of the models has been evaluated with publicly available data from that market, and both the shallow-ANN and DNN approaches give successful results. Historical load, price and weather temperature data are used as input variables for the models. The data set includes power consumption measurements gathered between January 2016 and December 2017 with one-hour resolution. Forecasting studies have been carried out comparatively with shallow-ANN and DNN models for the Turkish electricity market over this period. The main contribution of this study is the investigation of different shallow-ANN and DNN models in the field of electricity price forecasting. All models are compared in terms of their MAE (mean absolute error) and MSE (mean square error) results. The DNN models give better forecasting performance than the shallow-ANN models; the best five MAE results for the DNN models are 0.346, 0.372, 0.392, 0.402 and 0.409.
Keywords: Deep learning, artificial neural networks, energy price forecasting, Turkey.
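A hedged sketch of the comparison idea follows: a shallow (one hidden layer) MLP against a deeper one, scored by MAE and MSE on lagged price features. The synthetic series, layer sizes and lag length are illustrative assumptions, not the architectures or the market data used in the study.

```python
# Sketch: shallow vs. deeper MLP regressors for hourly price forecasting,
# compared by MAE and MSE on a held-out tail of the series.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 700)
price = 40 + 10 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 2, hours.size)

LAGS = 24                                                    # previous day as input
X = np.array([price[t - LAGS:t] for t in range(LAGS, len(price))])
y = price[LAGS:]
split = int(0.8 * len(y))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

models = {
    "shallow-ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
    "DNN": MLPRegressor(hidden_layer_sizes=(64, 64, 32), max_iter=500, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: MAE={mean_absolute_error(y_te, pred):.3f} "
          f"MSE={mean_squared_error(y_te, pred):.3f}")
```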
3154 Human Pose Estimation using Active Shape Models
Authors: Changhyuk Jang, Keechul Jung
Abstract:
Human pose estimation can be carried out using Active Shape Models. Existing techniques that apply Active Shape Models to human-body research, such as human detection, primarily model the silhouette of the human body. Such techniques cannot accurately estimate poses involving the two arms and legs, because the silhouette only represents the outer contour of the body. To solve this problem, we model the human body as a stick figure, or "skeleton". The skeleton model can represent a wide variety of human poses. To obtain effective estimation results, we applied background subtraction and a modified matching algorithm of the original Active Shape Models in the fitting process. The model was built from 600 human-body images and has 17 landmark points which indicate body joints and key features of the pose. The maximum number of iterations for the fitting process was 30, and the execution time was less than 0.03 s.
Keywords: Active shape models, skeleton, pose estimation.
3153 Efficient Copy-Move Forgery Detection for Digital Images
Authors: Somayeh Sadeghi, Hamid A. Jalab, Sajjad Dadkhah
Abstract:
Due to the availability of powerful image processing software and the improvement of human computer knowledge, it has become easy to tamper with images. Manipulation of digital images in fields such as courts of law and medical imaging creates a serious problem nowadays. Copy-move forgery is one of the most common types of forgery, in which some part of the image is copied and pasted onto another part of the same image to cover an important scene. In this paper, a copy-move forgery detection method based on the Fourier transform is proposed. First, the image is divided into blocks of the same size and the Fourier transform is performed on each block. Similarity of the Fourier transforms of different blocks then provides an indication of a copy-move operation. The experimental results show that the proposed method runs in reasonable time and works well for both grayscale and colour images, and that its computational complexity is reduced by the use of the Fourier transform.
Keywords: Copy-Move forgery, Digital Forensics, Image Forgery.
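The block-matching idea can be sketched compactly: describe every sliding block by the magnitude of its 2D Fourier transform, sort the descriptors lexicographically, and flag near-identical neighbours as copy-move candidates. The block size, quantisation step and file name below are illustrative assumptions rather than the paper's settings.

```python
# Sketch: block-wise FFT descriptors, lexicographic sorting, and matching of
# near-identical descriptors as copy-move candidates.
import numpy as np
from imageio.v3 import imread

img = imread("suspect.png").astype(float)
if img.ndim == 3:
    img = img.mean(axis=2)

B, STEP = 16, 4                       # block size and sliding step (assumed)
feats, coords = [], []
for r in range(0, img.shape[0] - B + 1, STEP):
    for c in range(0, img.shape[1] - B + 1, STEP):
        block = img[r:r + B, c:c + B]
        mag = np.abs(np.fft.fft2(block))[:B // 2, :B // 2]   # low-frequency magnitudes
        feats.append(np.round(mag.ravel() / 8.0))            # coarse quantisation
        coords.append((r, c))
feats = np.array(feats)
order = np.lexsort(feats.T[::-1])     # lexicographic sort of the descriptors

MIN_OFFSET = 2 * B                    # ignore trivially overlapping blocks
for a, b in zip(order[:-1], order[1:]):
    if np.array_equal(feats[a], feats[b]):
        (r1, c1), (r2, c2) = coords[a], coords[b]
        if abs(r1 - r2) + abs(c1 - c2) >= MIN_OFFSET:
            print(f"possible copy-move pair: {coords[a]} <-> {coords[b]}")
```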
3152 Perceptions of Educators on the Learners’ Youngest Age for the Introduction of ICTs in Schools: A Personality Theory Approach
Authors: K. E. Oyetade, S. D. Eyono Obono
Abstract:
Age ratings are very helpful in providing parents with relevant information for the purchase and use of digital technologies by their children; the absence of defined age ratings for the use of ICTs by children in schools is therefore a major concern. This problem motivates the present study, whose aim is to examine the factors affecting the perceptions of educators on the learners’ youngest age for the introduction of ICTs in schools. This aim is pursued through two types of research objectives: the identification and design of theories and models on age ratings, and the empirical testing of such theories and models in a survey of educators from the Camperdown district of the South African KwaZulu-Natal province. A questionnaire was used to collect the survey data, and its validity and reliability were checked in SPSS prior to descriptive and correlational quantitative analysis. The main hypothesis of this research, as claimed by existing studies, is an association between the demographics of educators, their personality, and their perceptions on the learners’ youngest age for the introduction of ICTs in schools; the present study, however, looks at personality from three dimensions: self-actualized personalities, fully functioning personalities, and healthy personalities. The hypothesis was fully confirmed by the empirical study, except for the demographic factor, where only the educators’ grade or class was found to be associated with their personality.
Keywords: Age ratings, Educators, E-learning, Personality Theories.
3151 Digital Image Watermarking in the Wavelet Transform Domain
Authors: Kamran Hameed, Adeel Mumtaz, S.A.M. Gilani
Abstract:
In this paper, we start by characterizing the most important and distinguishing features of wavelet-based watermarking schemes, based on a study of the large number of algorithms proposed in the literature. The application scenario of copyright protection is considered and, building on the experience gained, two representative watermarking schemes are implemented. A detailed comparison and the obtained results are presented and discussed. We conclude that Joo's [1] technique is more robust to standard noise attacks than Dote's [2] technique.
Keywords: Digital image, Copyright protection, Watermarking, Wavelet transform.
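For orientation, the sketch below implements a generic additive wavelet-domain watermark (not the specific Joo or Dote scheme): a pseudo-random pattern is added to the detail coefficients of a one-level DWT and later detected by correlation. The wavelet, the embedding strength and the image name are assumptions.

```python
# Sketch: additive watermark in the horizontal detail band of a one-level DWT,
# detected by correlating a suspect image's detail band with the secret pattern.
import numpy as np
import pywt
from imageio.v3 import imread

cover = imread("lena_gray.png").astype(float)
cA, (cH, cV, cD) = pywt.dwt2(cover, "haar")      # one-level 2-D DWT

rng = np.random.default_rng(42)                  # the RNG seed acts as the secret key
watermark = rng.standard_normal(cH.shape)
ALPHA = 2.0                                      # embedding strength (assumed)

cH_marked = cH + ALPHA * watermark               # embed in horizontal details
marked = pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")

# Detection: decompose the suspect image and correlate its detail band with
# the secret watermark; the correlation should be noticeably higher than for
# an unmarked image.
_, (cH_suspect, _, _) = pywt.dwt2(marked, "haar")
corr = np.corrcoef(cH_suspect.ravel(), watermark.ravel())[0, 1]
print(f"detection correlation: {corr:.3f}")
```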
3150 Personal Digital Assistants for Fieldwork Training in College Campus
Authors: Takaharu Miyoshi, Tadahiko Higuchi
Abstract:
Education supported by mobile computers has been practised widely for some time, and teachers have attempted to use mobile computers and to find concrete subjects for students' fieldwork training in college education. The purpose of this research is to develop software for Personal Digital Assistants (PDAs) to conduct fieldwork on our campus, and to report on a fieldwork class using PDAs in the curriculum of the Department of Regional Environment Studies.
Keywords: Development of software for PDA, fieldwork training, computer supported education, experiential learning.
3149 Teachers Learning about Sustainability while Co-Constructing Digital Games
Authors: M. Daskolia, C. Kynigos, N. Yiannoutsou
Abstract:
Teaching and learning about sustainability is a pedagogical endeavour with various innate difficulties and increased demands. Higher education has a dual role to play in addressing this challenge: to identify and explore innovative approaches and tools for addressing the complex and value-laden nature of sustainability in more meaningful ways, and to help teachers to integrate these approaches into their practice through appropriate professional development programs. The study reported here was designed and carried out within the context of a Masters course in Environmental Education. Eight teachers were collaboratively engaged in reconstructing a digital game microworld which was deliberately designed by the researchers to be questioned and evoke critical discussion on the idea of ‘sustainable city’. The study was based on the design-based research method. The findings indicate that the teachers’ involvement in processes of co-constructing the microworld initiated discussion and reflection upon the concepts of sustainability and sustainable lifestyles.
Keywords: sustainability, sustainable lifestyles, constructionism, environmental education, digital games, teacher training
3148 Simulation of Reactive Distillation: Comparison of Equilibrium and Nonequilibrium Stage Models
Authors: Asfaw Gezae Daful
Abstract:
In the present study, two distinctly different approaches are followed for modeling a reactive distillation column: the equilibrium stage model and the nonequilibrium stage model. These models are simulated with computer code developed in the present study in MATLAB. In the equilibrium stage model, the vapor and liquid phases are assumed to be in equilibrium and allowance is made for finite reaction rates, whereas in the nonequilibrium stage model simultaneous mass transfer and reaction rates are considered. The simulated results are validated against experimental data reported in the literature. The results of the equilibrium and nonequilibrium models are compared in terms of concentration, temperature and reaction rate profiles in a reactive distillation column for methyl tert-butyl ether (MTBE) production. Both models show similar trends for the concentration, temperature and reaction rate profiles, but the nonequilibrium model predictions are higher and closer to the experimental values reported in the literature.
Keywords: Reactive Distillation, Equilibrium model, Nonequilibrium model, Methyl Tert-Butyl Ether
3147 Enhancing Pedagogical Practices in Online Arabic Language Instruction: Challenges, Opportunities, and Strategies
Authors: Salah Algabli
Abstract:
As online learning takes center stage, Arabic language instructors face the imperative to adapt their practices for the digital realm. This study investigates the experiences of online Arabic instructors to unveil the pedagogical opportunities and challenges this format presents. Utilizing a transcendental phenomenological approach with 15 diverse participants, the research shines a light on the unique realities of online language teaching at the university level, specifically in the United States. The study proposes theoretical and practical solutions to maximize the benefits of online language learning while mitigating its challenges. Recommendations cater to instructors, researchers, and program coordinators, paving the way for enhancing the quality of online Arabic language education. The findings highlight the need for pedagogical approaches tailored to the online environment, ultimately shaping a future where both instructors and learners thrive in this digital landscape.
Keywords: Online Arabic language learning, pedagogical opportunities and challenges, online Arabic teachers, online language instruction, digital pedagogy.
3146 Classification of Business Models of Italian Bancassurance by Balance Sheet Indicators
Authors: Andrea Bellucci, Martina Tofi
Abstract:
The aim of this paper is to analyze the business models of bancassurance in Italy for the life business. The life insurance business is very developed in the Italian market, and bank branches hold 80% of the market share. Given its maturity, the life insurance market needs to consolidate its organizational form to allow for the development of the non-life business, which nowadays collects few premiums but represents a great opportunity to enlarge the market share of bancassurance by using its strength in the distribution channel, while the market share of independent agents is decreasing. Starting with the main business models of bancassurance for the life business, this paper analyzes the performance of life companies in the Italian market by means of balance sheet indicators and the main discriminant variables of business models. The study observes trends from 2013 to 2015 for the Italian market by exploiting a database managed by Associazione Nazionale delle Imprese di Assicurazione (ANIA). The applied approach is a bottom-up analysis that starts with variables and indicators to define the classification of business models. The statistical classification algorithm proposed by Ward is employed to design the business model profiles. The result of the analysis is a representation of the main business models, each characterized by its profile of indicators. In this way, an unsupervised analysis is developed; it is limited by its judgmental dimension, based on the researchers' opinion, but it makes it possible to obtain a design of effective business models.
Keywords: Balance sheet indicators, Bancassurance, business models, ward algorithm.
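The clustering step described above can be sketched in a few lines: standardise a handful of balance-sheet indicators per company and group the companies with Ward's hierarchical algorithm. The indicator names and figures below are invented for illustration; the paper works on the ANIA database for 2013 to 2015.

```python
# Sketch: Ward's hierarchical clustering of companies by balance-sheet indicators.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

companies = ["Life A", "Life B", "Life C", "Life D", "Life E", "Life F"]
# columns: premiums/assets, unit-linked share, expense ratio (hypothetical values)
indicators = np.array([
    [0.18, 0.70, 0.030],
    [0.20, 0.65, 0.028],
    [0.05, 0.20, 0.060],
    [0.06, 0.25, 0.055],
    [0.12, 0.45, 0.040],
    [0.11, 0.48, 0.042],
])

Z = linkage(zscore(indicators, axis=0), method="ward")   # Ward's algorithm
labels = fcluster(Z, t=3, criterion="maxclust")          # cut into 3 business models
for name, label in zip(companies, labels):
    print(f"{name}: business model {label}")
```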
3145 The Strengths and Limitations of the Statistical Modeling of Complex Social Phenomenon: Focusing on SEM, Path Analysis, or Multiple Regression Models
Authors: Jihye Jeon
Abstract:
This paper analyzes the conceptual frameworks of three statistical methods: multiple regression, path analysis, and structural equation models. When establishing a research model for the statistical modeling of complex social phenomena, it is important to know the strengths and limitations of these three statistical models. This study explores the character, strengths, and limitations of each modeling approach and suggests some strategies for accurately explaining or predicting the causal relationships among variables. In particular, common mistakes in research modeling are discussed with reference to studies of depression and mental health.
Keywords: Multiple regression, path analysis, structural equation models, statistical modeling, social and psychological phenomenon.
3144 Mathematical Rescheduling Models for Railway Services
Authors: Zuraida Alwadood, Adibah Shuib, Norlida Abd Hamid
Abstract:
This paper presents a review of past studies on mathematical models for rescheduling passenger railway services, as part of delay management in the event of railway disruptions. Most of the models highlighted aim to minimize the service delays experienced by passengers during service disruptions. Integer programming (IP) and mixed-integer programming (MIP) models are critically discussed, focusing on the modelling approach, decision variables, sets and parameters. Some of them have been tested on real-life data of railway companies worldwide, while a few have been validated on fictitious data. Based on the selected literature on train rescheduling, this paper assists researchers in model formulation by providing comprehensive analyses of model building. These analyses can help in the development of new rescheduling strategies, or enhance existing rescheduling models to make them more powerful or more applicable with shorter computing time.
Keywords: Mathematical modelling, Mixed-integer programming, Railway rescheduling, Service delays.
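To make the MIP formulations reviewed above concrete, the toy sketch below schedules two trains through a shared single-track block and chooses a passing order and departure times that minimise passenger-weighted delay, using PuLP with its default CBC solver. The times, weights and headway are invented and do not come from any of the reviewed papers.

```python
# Sketch: a tiny MIP for delay management on one shared single-track block.
from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, value

sched = {"T1": 10, "T2": 12}        # scheduled departures into the block (min)
run = {"T1": 8, "T2": 8}            # running time through the block (min)
pax = {"T1": 200, "T2": 150}        # passengers on board (delay weights)
HEADWAY, BIG_M = 3, 1000

prob = LpProblem("rescheduling", LpMinimize)
dep = {t: LpVariable(f"dep_{t}", lowBound=sched[t]) for t in sched}
delay = {t: LpVariable(f"delay_{t}", lowBound=0) for t in sched}
t1_first = LpVariable("t1_first", cat=LpBinary)   # ordering decision

prob += lpSum(pax[t] * delay[t] for t in sched)   # objective: weighted total delay
for t in sched:
    prob += delay[t] >= dep[t] - sched[t]
# Single-track block: one train must clear it (plus headway) before the other enters.
prob += dep["T2"] >= dep["T1"] + run["T1"] + HEADWAY - BIG_M * (1 - t1_first)
prob += dep["T1"] >= dep["T2"] + run["T2"] + HEADWAY - BIG_M * t1_first

prob.solve()
for t in sched:
    print(t, "departs at", value(dep[t]), "with delay", value(delay[t]))
```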
3143 The Classification Model for Hard Disk Drive Functional Tests under Sparse Data Conditions
Authors: S. Pattanapairoj, D. Chetchotsak
Abstract:
This paper proposes classification models to be used as a proxy for the hard disk drive (HDD) functional test, which requires more than two weeks to classify an HDD's status as either "Pass" or "Fail". These models were constructed using a committee network consisting of a number of single neural networks. The paper also includes a method, called the "enforce learning method", to address the sparseness of data for the failed parts. Our results reveal that the classification models constructed with the proposed method perform well under sparse data conditions; since these models classify an HDD in a few seconds, they could be used as a substitute for the HDD functional tests.
Keywords: Sparse data, Classifications, Committee network
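One plausible reading of this setup is sketched below: several independently initialised networks vote on Pass/Fail, and the scarce Fail examples are oversampled before training as a stand-in for the "enforce learning method". The data, network sizes and class counts are synthetic assumptions.

```python
# Sketch: a committee of neural networks with minority-class oversampling.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X_pass = rng.normal(0.0, 1.0, (980, 10))        # plentiful passing drives
X_fail = rng.normal(1.5, 1.0, (20, 10))         # sparse failing drives
X = np.vstack([X_pass, X_fail])
y = np.array([0] * 980 + [1] * 20)

# Oversample the minority (Fail) class so each committee member sees it often.
fail_idx = rng.choice(np.where(y == 1)[0], size=300, replace=True)
X_bal = np.vstack([X, X[fail_idx]])
y_bal = np.concatenate([y, y[fail_idx]])

committee = [MLPClassifier(hidden_layer_sizes=(16,), max_iter=800, random_state=s)
             .fit(X_bal, y_bal) for s in range(5)]

def committee_predict(samples):
    votes = np.array([m.predict(samples) for m in committee])
    return (votes.mean(axis=0) >= 0.5).astype(int)   # majority vote

print("predicted labels:", committee_predict(X[:5]))
```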
3142 Improved Zero Text Watermarking Algorithm against Meaning Preserving Attacks
Authors: Jalil Z., Farooq M., Zafar H., Sabir M., Ashraf E.
Abstract:
The Internet is largely composed of textual content, and a huge volume of digital content is circulated over the Internet daily. The ease of information sharing and reproduction has made it difficult to preserve authors' copyright. Digital watermarking emerged after 1993 as a solution to the problem of copyright protection for plain text. In this paper, we propose a zero text watermarking algorithm based on the occurrence frequencies of non-vowel ASCII characters and words for the copyright protection of plain text. The embedding algorithm uses the frequencies of non-vowel ASCII characters and words to generate a specialized author key. The extraction algorithm uses this key to extract the watermark and hence identify the original copyright owner. Experimental results illustrate the effectiveness of the proposed algorithm on text subjected to meaning-preserving attacks performed by five independent attackers.
Keywords: Copyright protection, Digital watermarking, Document authentication, Information security, Watermark.
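The zero-watermarking idea, in which no text is modified and an author key is instead derived from character statistics, can be illustrated with the minimal sketch below; the key construction and the overlap score are simplifications given only to convey the principle, not the paper's algorithm.

```python
# Sketch: derive an author key from non-vowel letter frequencies (no text is
# changed), then compare the key against a suspect copy.
from collections import Counter

VOWELS = set("aeiou")

def author_key(text, top_n=8):
    letters = [ch for ch in text.lower() if ch.isalpha() and ch not in VOWELS]
    counts = Counter(letters)
    total = sum(counts.values())
    # key = the top-N non-vowel letters with their relative frequencies
    return {ch: round(n / total, 3) for ch, n in counts.most_common(top_n)}

def similarity(key, suspect_text):
    suspect = author_key(suspect_text, top_n=len(key))
    shared = set(key) & set(suspect)
    return len(shared) / len(key)      # crude overlap score in [0, 1]

original = "The quick brown fox jumps over the lazy dog. " * 20
tampered = original.replace("quick", "swift")   # a meaning-preserving attack
key = author_key(original)
print("registered key:", key)
print("match after attack:", similarity(key, tampered))
```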
3141 Imputing Missing Data in Electronic Health Records: A Comparison of Linear and Non-Linear Imputation Models
Authors: Alireza Vafaei Sadr, Vida Abedi, Jiang Li, Ramin Zand
Abstract:
Missing data is a common challenge in medical research and can lead to biased or incomplete results. When data bias leaks into models, it further exacerbates health disparities; biased algorithms can lead to misclassification and to reduced resource allocation and monitoring, as part of prevention strategies, for certain minorities and vulnerable segments of patient populations, which in turn further reduces the data footprint from the same population, creating a vicious cycle. This study compares the performance of six imputation techniques, grouped into linear and non-linear models, on two different real-world electronic health record (EHR) datasets representing 17,864 patient records. The mean absolute percentage error (MAPE) and root mean squared error (RMSE) are used as performance metrics, and the results show that the linear models outperformed the non-linear models on both metrics. These results suggest that linear models can sometimes be the optimal choice for imputing laboratory variables, in terms of imputation efficiency and the uncertainty of the predicted values.
Keywords: EHR, Machine Learning, imputation, laboratory variables, algorithmic bias.
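The comparison protocol can be sketched compactly: hide a fraction of known values, impute them with a linear estimator (Bayesian ridge) and a non-linear one (random forest) inside scikit-learn's IterativeImputer, and score RMSE and MAPE on the hidden entries. The synthetic table below stands in for the EHR laboratory variables; the paper's six techniques are not reproduced here.

```python
# Sketch: mask known values, impute with a linear and a non-linear estimator,
# and score the imputations with RMSE and MAPE on the masked entries.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.linear_model import BayesianRidge
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n, p = 1000, 6
latent = rng.normal(size=(n, 2))
X_true = latent @ rng.normal(size=(2, p)) + 0.1 * rng.normal(size=(n, p)) + 5.0

mask = rng.random(X_true.shape) < 0.15            # hide 15% of the values
X_missing = X_true.copy()
X_missing[mask] = np.nan

models = {
    "linear (BayesianRidge)": BayesianRidge(),
    "non-linear (RandomForest)": RandomForestRegressor(n_estimators=50, random_state=0),
}
for name, est in models.items():
    imputed = IterativeImputer(estimator=est, random_state=0).fit_transform(X_missing)
    err = imputed[mask] - X_true[mask]
    rmse = np.sqrt(np.mean(err ** 2))
    mape = np.mean(np.abs(err / X_true[mask])) * 100
    print(f"{name}: RMSE={rmse:.3f}  MAPE={mape:.2f}%")
```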
3140 Feasibility of the Evolutionary Algorithm using Different Behaviours of the Mutation Rate to Design Simple Digital Logic Circuits
Authors: Konstantin Movsovic, Emanuele Stomeo, Tatiana Kalganova
Abstract:
The evolutionary design of electronic circuits, or evolvable hardware, is a discipline that allows the user to obtain the desired circuit design automatically, with the circuit configuration under the control of evolutionary algorithms. Several researchers have used evolvable hardware to design electrical circuits. Every time a particular algorithm is selected to carry out the evolution, all of its parameters, such as mutation rate, population size and selection mechanism, must be tuned in order to achieve the best results during the evolution process. This paper investigates the ability of an evolution strategy to evolve digital logic circuits based on programmable logic array structures when different mutation rates are used. Several mutation rates (fixed and variable) are analyzed and compared with each other to identify the most appropriate choice for the evolution of combinational logic circuits. The experimental results outlined in this paper are important as they can be used by researchers who need to use evolutionary algorithms to design digital logic circuits.
Keywords: Evolvable hardware, evolutionary algorithm, digital logic circuit, mutation rate.
3139 High Dynamic Range Resampling for Software Radio
Authors: Arthur David Snider, Laiq Azam
Abstract:
The classic problem of recovering arbitrary values of a band-limited signal from its samples has an added complication in software radio applications: the resampling calculations inevitably fold aliases of the analog signal back into the original bandwidth. The phenomenon is quantified by the spurious-free dynamic range (SFDR). We demonstrate how a novel application of the Remez (Parks-McClellan) algorithm permits optimal signal recovery and SFDR, far surpassing state-of-the-art resamplers.
Keywords: Sampling methods, Signal sampling, Digital radio, Digital-analog conversion.
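For context, the sketch below shows the underlying tooling: an anti-aliasing low-pass filter designed with the Remez (Parks-McClellan) exchange algorithm and applied in a rational-rate resampler. The band edges, tap count and 3/2 rate change are illustrative; the paper's contribution lies in how such designs are driven to maximise the SFDR, which is not reproduced here.

```python
# Sketch: Remez-designed low-pass filter used for rational-rate resampling.
import numpy as np
from scipy import signal

FS = 48_000            # input sample rate [Hz]
UP, DOWN = 3, 2        # resample by 3/2 -> 72 kHz
NUMTAPS = 101

# Low-pass at the tighter of the two Nyquist limits, designed at the
# intermediate rate FS*UP where the filtering actually happens.
cutoff = min(FS, FS * UP / DOWN) / 2.0
h = signal.remez(NUMTAPS,
                 bands=[0, 0.9 * cutoff, 1.1 * cutoff, FS * UP / 2],
                 desired=[1, 0],
                 fs=FS * UP)

t = np.arange(4096) / FS
x = np.sin(2 * np.pi * 5_000 * t)                  # 5 kHz test tone
y = signal.upfirdn(UP * h, x, up=UP, down=DOWN)    # polyphase-style resampling
print(len(x), "->", len(y), "samples")
```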
3138 Inefficiency of Data Storing in Physical Memory
Authors: Kamaruddin Malik Mohamad, Sapiee Haji Jamel, Mustafa Mat Deris
Abstract:
Memory forensics is important in digital investigation. The forensics is based on the data stored in physical memory, which involves memory management and processing time. However, current forensic tools do not consider efficiency in terms of storage management and processing time. This paper shows the high redundancy of data found in physical memory, which causes inefficiency in processing time and memory management. The experiment was done using the Borland C compiler on Windows XP with 512 MB of physical memory.
Keywords: Digital Evidence, Memory Forensics.
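A small sketch of how such redundancy can be measured offline is given below: hash a physical memory dump page by page and count duplicate pages. The dump file name and the 4 KiB page size are assumptions; the original experiment used a Borland C program on the live system rather than an offline dump.

```python
# Sketch: count duplicate 4 KiB pages in a physical memory dump.
import hashlib
from collections import Counter

PAGE_SIZE = 4096
counts = Counter()

with open("memdump.raw", "rb") as f:
    while True:
        page = f.read(PAGE_SIZE)
        if not page:
            break
        counts[hashlib.sha256(page).hexdigest()] += 1

total = sum(counts.values())
duplicates = total - len(counts)
print(f"{total} pages, {duplicates} redundant ({100 * duplicates / total:.1f}%)")
```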
3137 Comparison of Different k-NN Models for Speed Prediction in an Urban Traffic Network
Authors: Seyoung Kim, Jeongmin Kim, Kwang Ryel Ryu
Abstract:
A database is available that records average traffic speeds measured at five-minute intervals for all the links in the traffic network of a metropolitan city. While models learned from these data that can predict future traffic speed would be beneficial for applications such as car navigation systems, building predictive models for every link becomes a nontrivial job if the number of links in a given network is huge. An advantage of adopting k-nearest neighbor (k-NN) as the predictive model is that it does not require any explicit model building. Instead, k-NN takes a long time to make a prediction, because it needs to search for the k nearest neighbors in the database at prediction time. In this paper, we investigate how much we can speed up k-NN in making traffic speed predictions by reducing the amount of data to be searched, without a significant sacrifice of prediction accuracy. The rationale is that it may suffice to look only at the more recent data, because traffic patterns not only repeat daily or weekly but also change over time. In our experiments, we build several different k-NN models employing different sets of features, namely the current and past traffic speeds of the target link and of the neighboring links up- and downstream of it. The performances of these models are compared by measuring the average prediction accuracy and the average time taken to make a prediction using various amounts of data.
Keywords: Big data, k-NN, machine learning, traffic speed prediction.
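One such model is sketched below: a k-NN regressor that predicts the next five-minute speed of a target link from the recent speeds of the link itself and two neighbours, timed for two different training-set sizes to mimic the "recent data only" idea. The synthetic speed series and the parameter choices are illustrative assumptions, not the metropolitan database or the paper's feature sets.

```python
# Sketch: k-NN speed prediction from lagged speeds of a link and its neighbours,
# comparing accuracy and prediction time for two training-set sizes.
import time
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
T = 20_000                                                   # five-minute intervals
base = 50 + 15 * np.sin(2 * np.pi * np.arange(T) / 288)      # daily pattern
speeds = np.stack([base + rng.normal(0, 3, T) for _ in range(3)])  # target + 2 neighbours

LAGS = 6                                                     # last 30 minutes of each link
X = np.array([speeds[:, t - LAGS:t].ravel() for t in range(LAGS, T - 1)])
y = speeds[0, LAGS + 1:]                                     # target link, next interval

for train_size in (5_000, 15_000):                           # "recent only" vs. more history
    knn = KNeighborsRegressor(n_neighbors=10).fit(X[-train_size - 1000:-1000],
                                                  y[-train_size - 1000:-1000])
    start = time.perf_counter()
    pred = knn.predict(X[-1000:])
    elapsed = time.perf_counter() - start
    mae = np.abs(pred - y[-1000:]).mean()
    print(f"train={train_size}: MAE={mae:.2f} km/h, predict time={elapsed*1000:.1f} ms")
```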
3136 A Hybrid System of Hidden Markov Models and Recurrent Neural Networks for Learning Deterministic Finite State Automata
Authors: Pavan K. Rallabandi, Kailash C. Patidar
Abstract:
In this paper, we present a learning algorithm that uses a hybrid architecture combining the most popular sequence recognition models, Recurrent Neural Networks (RNNs) and Hidden Markov Models (HMMs). In order to improve sequence and pattern recognition and classification performance through a hybrid neural-symbolic approach, a gradient descent learning algorithm is developed that uses the Real-Time Recurrent Learning algorithm of the Recurrent Neural Network to process the knowledge represented in trained Hidden Markov Models. The developed hybrid algorithm is implemented with automata theory as a sample test bed, and its performance is demonstrated and evaluated on learning deterministic finite state automata.
Keywords: Hybrid systems, Hidden Markov Models, Recurrent neural networks, Deterministic finite state automata.