Search results for: ground truth data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26691


24801 Probabilistic Study of Impact Threat to Civil Aircraft and Realistic Impact Energy

Authors: Ye Zhang, Chuanjun Liu

Abstract:

In-service aircraft are exposed to different types of threats, e.g., bird strike, ground vehicle impact, runway debris, or even lightning strike. To satisfy damage tolerance design requirements, the designer has to understand the threat level for the different types of aircraft structures, whether metallic or composite. For composite structures, exposure to low-velocity impacts may produce serious internal damage, such as delaminations and matrix cracks, without leaving a visible mark on the impacted surface. This internal damage can significantly reduce the load-carrying capacity of the structure. The semi-probabilistic method provides a practical approximation for establishing the impact-threat-based energy cut-off level used in the damage tolerance evaluation of aircraft components. Thus, the probabilistic distribution of impact threat and the realistic impact energy cut-off level are essential prerequisites for the certification of aircraft composite structures. A new survey of impact threat to in-service civil aircraft has recently been carried out based on field records covering around 500 civil aircraft (mainly single-aisle) and more than 4.8 million flight hours. In total, 1,006 damages caused by low-velocity impact events were screened out from more than 8,000 records covering impact dents, scratches, corrosion, delaminations, cracks, etc. The dependency of the impact threat on the location of the aircraft structure and its structural configuration was analyzed. Although the survey focused mainly on metallic structures, the resulting low-energy impact data are believed to be representative of civil aircraft in general, since the service environments and maintenance operations are independent of the structural materials. The probability of impact damage occurrence (Po) and the probability of impact energy exceedance (Pe) are the two key parameters describing the statistical distribution of impact threat. From the impact damage events in the survey, Po is estimated as 2.1×10⁻⁴ per flight hour. For the calculation of Pe, a numerical model was developed in the commercial FEA software ABAQUS to back-estimate the impact energy from the visible damage characteristics. The relationship between visible dent depth and impact energy was established and validated by drop-weight impact experiments. Based on the survey results, Pe was calculated and assumed to follow a log-linear relationship versus impact energy. Taking the product of the two probabilities, it is reasonable and conservative to assume Pa = Po × Pe = 10⁻⁵, which indicates that low-velocity impact events are about as likely as Limit Load events. Combining this assumed Pa with the Po and Pe obtained from the field survey, the cut-off level of realistic impact energy was estimated to be 34 J. In summary, a new survey of civil aircraft field records was carried out to investigate the probabilistic distribution of impact threat. Based on the data, the two probabilities Po and Pe were obtained, and under a conservative assumption for Pa, the cut-off level of realistic impact energy was determined, which is potentially applicable to the damage tolerance certification of future civil aircraft.
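The cut-off derivation above can be illustrated with a short Python sketch. Only Po = 2.1e-4 per flight hour, Pa = 1e-5, and the log-linear form of Pe come from the abstract; the exceedance observations and fitted coefficients below are invented for illustration only.

import numpy as np

# Hypothetical exceedance observations (energy in J, fraction of events exceeding E);
# these numbers are illustrative only, not the survey data from the paper.
energy = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
pe_obs = np.array([0.60, 0.35, 0.20, 0.11, 0.06])

# Fit the assumed log-linear model: log10(Pe) = a - b * E
slope, intercept = np.polyfit(energy, np.log10(pe_obs), 1)
a, b = intercept, -slope

Po = 2.1e-4          # probability of impact damage occurrence per flight hour (from the survey)
Pa = 1.0e-5          # conservative combined probability assumed in the paper
pe_target = Pa / Po  # exceedance probability required at the cut-off energy

E_cut = (a - np.log10(pe_target)) / b
print(f"Estimated cut-off energy: {E_cut:.1f} J")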

Keywords: composite structure, damage tolerance, impact threat, probabilistic

Procedia PDF Downloads 308
24800 Emotional Artificial Intelligence and the Right to Privacy

Authors: Emine Akar

Abstract:

The majority of privacy-related regulation has traditionally focused on concepts that are perceived to be well-understood or easily describable, such as certain categories of data and personal information or images. In the past century, such regulation appeared reasonably suitable for its purposes. However, technologies such as AI, combined with ever-increasing capabilities to collect, process, and store “big data”, not only require calibration of these traditional understandings but may require re-thinking of entire categories of privacy law. In the presentation, it will be explained, against the background of various emerging technologies under the umbrella term “emotional artificial intelligence”, why modern privacy law will need to embrace human emotions as potentially private subject matter. This argument can be made on a jurisprudential level, given that human emotions can plausibly be accommodated within the various concepts that are traditionally regarded as the underlying foundation of privacy protection, such as, for example, dignity, autonomy, and liberal values. However, the practical reasons for regarding human emotions as potentially private subject matter are perhaps more important (and very likely more convincing from the perspective of regulators). In that respect, it should be regarded as alarming that, according to most projections, the usefulness of emotional data to governments and, particularly, private companies will not only lead to radically increased processing and analysing of such data but, concerningly, to an exponential growth in the collection of such data. In light of this, it is also necessary to discuss options for how regulators could address this emerging threat.

Keywords: AI, privacy law, data protection, big data

Procedia PDF Downloads 88
24799 Develop a Conceptual Data Model of Geotechnical Risk Assessment in Underground Coal Mining Using a Cloud-Based Machine Learning Platform

Authors: Reza Mohammadzadeh

Abstract:

The major challenges in geotechnical engineering in underground spaces arise from uncertainties and different probabilities. The collection, collation, and collaboration of existing data, so that they can be incorporated in analysis and design for a given prospect evaluation, would be a reliable and practical problem-solving method under uncertainty. Machine learning (ML) is a subfield of artificial intelligence in statistical science which applies different techniques (e.g., regression, neural networks, support vector machines, decision trees, random forests, genetic programming, etc.) to data in order to learn and improve from them automatically, without being explicitly programmed, and to make decisions and predictions. In this paper, a conceptual database schema of geotechnical risks in underground coal mining based on a cloud system architecture has been designed. A new approach to risk assessment using a three-dimensional risk matrix supported by the level of knowledge (LoK) has been proposed in this model. Subsequently, the stages of the model's workflow methodology are described. In order to train the data and deploy the LoK models, an ML platform has been implemented. IBM Watson Studio, as a leading data science tool and data-driven cloud-integrated ML platform, is employed in this study. As a use case, a data set of geotechnical hazards and risk assessments in underground coal mining was prepared to demonstrate the performance of the model, and the results are outlined accordingly.

Keywords: data model, geotechnical risks, machine learning, underground coal mining

Procedia PDF Downloads 274
24798 Classification of Poverty Level Data in Indonesia Using the Naïve Bayes Method

Authors: Anung Style Bukhori, Ani Dijah Rahajoe

Abstract:

Poverty poses a significant challenge in Indonesia, requiring an effective analytical approach to understand and address this issue. In this research, we applied the Naïve Bayes classification method to examine and classify poverty data in Indonesia. The main focus is on classifying data using RapidMiner, a powerful data analysis platform. The analysis process involves splitting the data to train and test the classification model. First, we collected and prepared a poverty dataset that includes various factors such as education, employment, and health. The experimental results indicate that the Naïve Bayes classification model can provide accurate predictions regarding the risk of poverty. The use of RapidMiner in the analysis process offers flexibility and efficiency in evaluating the model's performance. The classification produces several values that serve as the standard for classifying poverty data in Indonesia using Naïve Bayes. The accuracy obtained is 40.26%, with a recall of 35.94% for the moderate class, 63.16% for the high class, and 38.03% for the low class. The precision is 58.97% for the moderate class, 17.39% for the high class, and 58.70% for the low class.
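The study performed the classification in RapidMiner; as a rough, hypothetical Python sketch of the same split-train-evaluate workflow with a Naïve Bayes classifier (the file name and column names below are placeholders, not the study's actual dataset):

import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score, classification_report

# Hypothetical poverty dataset with numeric feature columns; the names are
# illustrative placeholders, not the fields of the Indonesian dataset used in the study.
df = pd.read_csv("poverty.csv")                # columns: education, employment, health, poverty_level
X = df[["education", "employment", "health"]]
y = df["poverty_level"]                        # classes: low / moderate / high

# Split into training and test sets, mirroring the data-splitting step done in RapidMiner.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

model = GaussianNB().fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, pred))
print(classification_report(y_test, pred))     # per-class precision and recall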

Keywords: poverty, classification, naïve bayes, Indonesia

Procedia PDF Downloads 56
24797 Influencing Factors of School Enterprise Cooperation: An Exploratory Study in Chinese Vocational Nursing Education

Authors: Xiao Chen, Alice Ho, Mabel Tie, Xiaoheng Xu

Abstract:

Background and Significance of the Study: School-enterprise cooperation has been the cornerstone of vocational education in China and many other countries. Researchers and policymakers have paid much attention to ensuring the implementation and improving the quality of school-enterprise cooperation. However, many problems still exist at the implementation level of the cooperation. On the one hand, enterprises lack the motivation to participate in the cooperation. On the other hand, there is a lack of effective guidance and management during the cooperation. Furthermore, the current literature focuses greatly on policy recommendations at the national level while failing to provide a detailed practical understanding of how school-enterprise cooperation is carried out on the ground. With emerging social problems, such as the aging population in China, there is an increasing need for diverse nursing services and better nursing quality. Methodology: To gain a deeper understanding of the influencing factors of the implementation of school-enterprise cooperation, this work conducted 37 exploratory interviews in four Chinese cities, spanning the first to the fourth tier, with hospital department directors, vocational school deans, nurses, and vocational students. Multiple critical policy documents that founded the current vocational education system in China were analyzed, along with the data collected from the interviews. Major Findings: Based on the policy and interview analyses, this work reveals a set of influencing factors for school-enterprise cooperation implementation. Findings from each region contribute to an overall model of influencing factors for implementing school-enterprise cooperation in vocational nursing education in China, which leads to practical insights for policy recommendation. The key influencing factors are found at the policy, hospital, school, and social levels, and corresponding practical policy recommendations are presented. Moving forward, further research on the implementation of school-enterprise cooperation in specific industries will become increasingly critical to improving the effectiveness of educational policies and the quality of vocational education.

Keywords: nursing, policy recommendation, school-enterprise cooperation, vocational education

Procedia PDF Downloads 116
24796 Web Search Engine Based Naming Procedure for Independent Topic

Authors: Takahiro Nishigaki, Takashi Onoda

Abstract:

In recent years, the amount of document data has been increasing with the spread of the Internet. Many methods have been studied for extracting topics from large document collections. We previously proposed Independent Topic Analysis (ITA) to extract mutually independent topics from large document data such as newspaper articles. ITA extracts independent topics from the document data by using Independent Component Analysis. Each topic produced by ITA is represented by a set of words. However, such a set of words can be quite different from the topic the user imagines. For example, the top five words with high independence for one topic are as follows: Topic1 = {"scor", "game", "lead", "quarter", "rebound"}. This Topic 1 is considered to represent the topic "SPORTS", but the topic name "SPORTS" has to be attached by the user; ITA cannot name topics. Therefore, in this research, we propose a method that uses a web search engine to obtain, from the word sets produced by Independent Topic Analysis, topic names that are easy for people to understand. In particular, we issue the set of topical words as a search query, and the title of the home page returned by the search result is taken as the topic name. We also apply the proposed method to real data and verify its effectiveness.
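A minimal sketch of the pipeline described above, under the assumption that ITA can be approximated by ICA over a TF-IDF representation; search_top_title() is a placeholder for whatever web search API is available and is intentionally left undefined:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import FastICA
import numpy as np

def extract_independent_topics(docs, n_topics=5, top_k=5):
    """Rough sketch of ITA: ICA over a TF-IDF term-document representation."""
    vec = TfidfVectorizer(stop_words="english")
    X = vec.fit_transform(docs).toarray()          # documents x terms
    ica = FastICA(n_components=n_topics, random_state=0)
    ica.fit(X)
    terms = np.array(vec.get_feature_names_out())
    topics = []
    for comp in ica.components_:                   # one independent component per topic
        top_words = terms[np.argsort(np.abs(comp))[::-1][:top_k]]
        topics.append(list(top_words))
    return topics

def name_topic(word_set):
    """Query a web search engine with the topic's word set and use the title of
    the returned page as the topic name. search_top_title() is a placeholder for
    the search API, not a real function defined here."""
    return search_top_title(" ".join(word_set))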

Keywords: independent topic analysis, topic extraction, topic naming, web search engine

Procedia PDF Downloads 119
24795 Estimating the Life-Distribution Parameters of Weibull-Life PV Systems Utilizing Non-Parametric Analysis

Authors: Saleem Z. Ramadan

Abstract:

In this paper, a model is proposed to determine the life-distribution parameters of the useful-life region of PV systems, utilizing a combination of non-parametric and linear regression analyses of the failure data of these systems. The results showed that this method is dependable for analyzing failure-time data for such reliable systems when data are scarce.
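The abstract does not spell out the exact procedure; one common way to combine a non-parametric step with linear regression for Weibull-distributed lifetimes is median-rank regression, sketched below with purely illustrative failure times (not data from the paper):

import numpy as np

def weibull_mrr(failure_times):
    """Median-rank regression sketch: non-parametric plotting positions plus a
    linear fit on the Weibull probability plot. Returns (shape beta, scale eta)."""
    t = np.sort(np.asarray(failure_times, dtype=float))
    n = len(t)
    i = np.arange(1, n + 1)
    F = (i - 0.3) / (n + 0.4)                  # Bernard's median-rank approximation
    x = np.log(t)
    y = np.log(-np.log(1.0 - F))               # linearized Weibull CDF
    beta, c = np.polyfit(x, y, 1)              # y = beta*x - beta*ln(eta)
    eta = np.exp(-c / beta)
    return beta, eta

# Illustrative failure times (years); not data from the paper.
beta, eta = weibull_mrr([3.1, 5.4, 6.8, 8.2, 9.9, 12.5])
print(f"shape beta = {beta:.2f}, scale eta = {eta:.1f} years")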

Keywords: masking, bathtub model, reliability, non-parametric analysis, useful life

Procedia PDF Downloads 562
24794 Preliminary Design of Maritime Energy Management System: Naval Architectural Approach to Resolve Recent Limitations

Authors: Seyong Jeong, Jinmo Park, Jinhyoun Park, Boram Kim, Kyoungsoo Ahn

Abstract:

Energy management in the maritime industry is being driven by economics and by new legislative actions taken by the International Maritime Organization (IMO) and the European Union (EU). In response, various performance monitoring methodologies and data collection practices have been examined by different stakeholders. While many assorted advancements in operation and technology are applicable, their adoption in the shipping industry remains limited. This slow uptake can be attributed to many different barriers, such as data analysis problems, misreported data, and feedback problems. This study presents a conceptual design of an energy management system (EMS) and proposes a methodology to resolve these limitations (e.g., data normalization using naval architectural evaluation, management of misrepresented data, and feedback from shore to ship through management of performance analysis history). We expect this system to allow even short-term charterers to assess ship performance properly and implement sustainable fleet control.

Keywords: data normalization, energy management system, naval architectural evaluation, ship performance analysis

Procedia PDF Downloads 449
24793 Time-Dependent Density Functional Theory of an Oscillating Electron Density around a Nanoparticle

Authors: Nilay K. Doshi

Abstract:

A theoretical probe describing the excited energy states of the electron density surrounding a nanoparticle (NP) is presented. An electromagnetic (EM) wave interacts with an NP much smaller than the incident wavelength. The plasmon that oscillates locally around the NP comprises excited conduction electrons. The system is based on the Jellium model of a cluster of metal atoms. The Hohenberg-Kohn (HK) equations and the variational Kohn-Sham (KS) scheme have been used to obtain the NP electron density in the ground state. Furthermore, time-dependent density functional theory (TDDFT) is used to treat the excited states in a density functional theory (DFT) framework. The non-interacting fermionic kinetic energy is shown to be a functional of the electron density. The time-dependent potential is written as the sum of the nuclear potential and the incoming EM field. This view of the quantum oscillation of the electron density is a part of the localized surface plasmon resonance.
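For context, the time-dependent Kohn-Sham scheme referred to above is conventionally written (in atomic units) as

$$ i\,\partial_t\,\varphi_j(\mathbf{r},t) = \Big[-\tfrac{1}{2}\nabla^2 + v_{\mathrm{nuc}}(\mathbf{r}) + v_{\mathrm{ext}}(\mathbf{r},t) + \int \frac{n(\mathbf{r}',t)}{|\mathbf{r}-\mathbf{r}'|}\,d\mathbf{r}' + v_{\mathrm{xc}}[n](\mathbf{r},t)\Big]\varphi_j(\mathbf{r},t), \qquad n(\mathbf{r},t)=\sum_j |\varphi_j(\mathbf{r},t)|^2, $$

where v_nuc is the nuclear (Jellium background) potential and v_ext(r, t) represents the incident EM field. This is the standard textbook form implied by the abstract, not an equation reproduced from the paper.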

Keywords: electron density, energy, electromagnetic, DFT, TDDFT, plasmon, resonance

Procedia PDF Downloads 332
24792 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a dedicated database. Airports GIS allows authorized users to submit changes to airport data. The verified data is used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move from the paper ALP to a digital form. The first phase of development of the eALP was completed recently, and it was tested at a few pilot-program airports across different regions. We conducted a gap analysis and noticed that a lot of development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that a significant amount of programming is needed to move from out-of-the-box ArcGIS to a heavily customized ArcGIS, which will be discussed. The capability of the ArcGIS viewer to display essential features, such as runways, taxiways, and the perpendicular distance between them, will also be discussed. An enterprise-level workflow that incorporates the coordination process among different lines of business will be highlighted.

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 416
24791 Anisotropic Total Fractional Order Variation Model in Seismic Data Denoising

Authors: Jianwei Ma, Diriba Gemechu

Abstract:

In seismic data processing, attenuation of random noise is the basic step for improving data quality for the further application of seismic data in exploration and development in the gas and oil industry. The signal-to-noise ratio also largely determines the quality of seismic data; this factor affects the reliability as well as the accuracy of the seismic signal during interpretation for different purposes in different companies. To use seismic data for further application and interpretation, we need to improve the signal-to-noise ratio while attenuating random noise effectively. To improve the signal-to-noise ratio and attenuate seismic random noise while preserving important features and information in the seismic signal, we introduce an anisotropic total fractional-order denoising algorithm. The anisotropic total fractional-order variation model, defined in the space of fractional-order bounded variation, is proposed as a regularization in seismic denoising. The split Bregman algorithm is employed to solve the minimization problem of the anisotropic total fractional-order variation model, and the corresponding denoising algorithm for the proposed method is derived. We test the effectiveness of the proposed method on synthetic and real seismic data sets, and the denoised results are compared with F-X deconvolution and the non-local means denoising algorithm.
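For orientation, a generic sketch of the kind of functional such a model minimizes (the usual anisotropic fractional-order TV form, stated here as an assumption rather than the exact functional from the paper) is

$$\min_{u}\ \|D_x^{\alpha}u\|_1 + \|D_y^{\alpha}u\|_1 + \frac{\lambda}{2}\|u-f\|_2^2,$$

where f is the noisy seismic section, D_x^α and D_y^α are fractional-order derivative operators of order α along the two directions, and λ balances regularization against data fidelity. Split Bregman then introduces auxiliary variables d_x ≈ D_x^α u and d_y ≈ D_y^α u and alternates between a quadratic update for u, a shrinkage (soft-thresholding) step for d_x and d_y, and updates of the Bregman variables until convergence.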

Keywords: anisotropic total fractional order variation, fractional order bounded variation, seismic random noise attenuation, split Bregman algorithm

Procedia PDF Downloads 207
24790 Wastewater Treatment Using Sodom Apple Tree in Arid Regions

Authors: D. Oulhaci, M. Zehah, S. Meguellati

Abstract:

Collected by the sewerage network, wastewater contains many polluting elements coming from households and from commercial, industrial, and agricultural activities. If discharged as such into the natural environment, these waters pollute it; hence the need to convey them to a treatment plant, where they undergo several treatment phases before discharge. The objective of this study is to highlight the purification performance of the "Sodom apple tree", a very common shrub in the Djanet and Illizi regions of Algeria. As experimental material, we used small buckets filled with sand over a gravel substrate, in which we sowed seeds and let them grow for a few weeks. Water is supplied under a horizontal subsurface flow regime. The urban wastewater used had undergone preliminary treatment. The water obtained after purification is collected, using a tap, in a container placed under the bucket. The comparison between the inlet and outlet waters showed that the presence of the Sodom apple tree helps reduce the pollution parameters at significant rates: 81% for COD, 84% for BOD, 95% for suspended matter (SM), 82% for NO₂⁻, and 85% for NO₃⁻, so that the treated water can be released into the environment without risk of pollution.

Keywords: arid zone, pollution, purification, re-use, wastewater

Procedia PDF Downloads 80
24789 Analysis and Prediction of the Behavior of the Landslide at Ain El Hammam, Algeria Based on the Second Order Work Criterion

Authors: Zerarka Hizia, Akchiche Mustapha, Prunier Florent

Abstract:

The landslide of Ain El Hammam (AEH) is characterized by complex geology and a high hydrogeological hazard. AEH's perpetual reactivation compels us to look closely at its triggers and to better understand the mechanisms of its evolution in mass and in depth. This study builds a numerical model in the Plaxis software to simulate influencing factors such as precipitation, unsaturated conditions, and pore pressure fluctuations. For a finer analysis of instabilities, we use Hill's criterion, based on the sign of the second-order work, which is the most appropriate material stability criterion for non-associated elastoplastic materials. The results of this type of calculation allow us, in theory, to predict the shape and position of the slip surface(s) liable to produce ground movements of the slope, before the rupture given by the Mohr-Coulomb plastic limit is reached. To validate the numerical model, an analysis of inclinometer measurements is performed to confirm the direction of movement and the kinematics of the sliding mechanism of AEH's slope.
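For reference, Hill's criterion invoked above can be stated in its standard textbook form (not reproduced from the paper): a material point is stable as long as the second-order work

$$d^2W = \mathrm{d}\boldsymbol{\sigma} : \mathrm{d}\boldsymbol{\varepsilon} = \sum_{i,j} \mathrm{d}\sigma_{ij}\,\mathrm{d}\varepsilon_{ij} > 0$$

for every admissible pair of incremental stress and strain; loading directions for which d²W ≤ 0 signal potential instability, which for non-associated elastoplastic materials can occur strictly before the Mohr-Coulomb plastic limit is reached.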

Keywords: landslide, second order work, precipitation, inclinometers

Procedia PDF Downloads 179
24788 NSBS: Design of a Network Storage Backup System

Authors: Xinyan Zhang, Zhipeng Tan, Shan Fan

Abstract:

The first layer of defense against data loss is backup data. This paper implements an agent-based network backup system built on a tripartite construction of backup agent, storage-server agent, and backup-server agent, and realizes snapshots and a hierarchical index in the NSBS. The system separates control commands from the data flow and balances the system load, thereby improving the efficiency of system backup and recovery. The test results show that the agent-based network backup system can effectively improve task-level concurrency, allocate network bandwidth reasonably, reduce the performance cost of system backup, and improve data recovery efficiency by 20%.

Keywords: agent, network backup system, three architecture model, NSBS

Procedia PDF Downloads 459
24787 A t-SNE and UMAP Based Neural Network Image Classification Algorithm

Authors: Shelby Simpson, William Stanley, Namir Naba, Xiaodi Wang

Abstract:

Both t-SNE and UMAP are state-of-the-art tools that predominantly preserve local structure, that is, they group neighboring data points together, which provides a very informative visualization of heterogeneity in the data. In this research, we develop a t-SNE and UMAP based neural network image classification algorithm that embeds the original dataset into a corresponding low-dimensional dataset as a preprocessing step and then uses this embedded dataset as the input to a specially designed neural network classifier for image classification. In our experiments, we use the Fashion-MNIST data set, a labeled data set of images of clothing objects. t-SNE and UMAP are used for dimensionality reduction of the data set and thus produce low-dimensional embeddings. We then feed the embeddings from t-SNE and UMAP into two neural networks. The accuracy of the models from the two neural networks is compared to that of a dense neural network that does not use an embedding as input, to show which model can classify the images of clothing objects more accurately.
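A minimal sketch of this pipeline in Python, assuming the umap-learn and scikit-learn packages are available; the subsample size and layer sizes are illustrative choices, not the architecture used in the paper:

import umap                                          # from the umap-learn package
from sklearn.datasets import fetch_openml
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Fashion-MNIST via OpenML; a subsample keeps the sketch fast.
X, y = fetch_openml("Fashion-MNIST", version=1, return_X_y=True, as_frame=False)
X, y = X[:10000] / 255.0, y[:10000]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Baseline: dense classifier trained directly on the raw pixels.
raw_clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=100, random_state=0).fit(X_tr, y_tr)

# UMAP embedding as a preprocessing step, then the same kind of classifier on the embedding.
reducer = umap.UMAP(n_components=10, random_state=0).fit(X_tr)
emb_clf = MLPClassifier(hidden_layer_sizes=(128,), max_iter=300, random_state=0).fit(
    reducer.transform(X_tr), y_tr)

print("raw pixels :", accuracy_score(y_te, raw_clf.predict(X_te)))
print("UMAP embed :", accuracy_score(y_te, emb_clf.predict(reducer.transform(X_te))))
# A t-SNE variant would embed with sklearn.manifold.TSNE, noting that TSNE has no
# transform() for unseen data, so train and test points would be embedded jointly.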

Keywords: t-SNE, UMAP, fashion MNIST, neural networks

Procedia PDF Downloads 198
24786 An Online Adaptive Thresholding Method to Classify Google Trends Data Anomalies for Investor Sentiment Analysis

Authors: Duygu Dere, Mert Ergeneci, Kaan Gokcesu

Abstract:

Google Trends data has gained increasing popularity in applications of behavioral finance, decision science, and risk management. Because of Google's wide range of use, the Trends statistics provide significant information about investor sentiment and intention, which can be used as decisive factors in corporate and risk management. However, an anomaly, i.e., a significant increase or decrease in a certain query, cannot be detected by state-of-the-art computational applications because of the random baseline noise of the Trends data, which is modelled as additive white Gaussian noise (AWGN). Since the baseline noise power changes gradually over time, an adaptive thresholding method is required to track and learn the baseline noise for correct classification. To this end, we introduce an online method to classify meaningful deviations in Google Trends data. Through extensive experiments, we demonstrate that our method can successfully classify various anomalies for plenty of different data.
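One simple way to realize such an online adaptive threshold (a sketch of the general idea, not the paper's exact soft-minimum algorithm) is to track the baseline mean and noise power with exponential moving averages and flag samples that deviate by more than k adaptive standard deviations:

import numpy as np

def online_anomaly_flags(series, alpha=0.05, k=3.0):
    """Sketch of online adaptive thresholding: learn the baseline mean and noise
    power with exponential moving averages and flag samples deviating by more
    than k adaptive standard deviations. Not the paper's exact algorithm."""
    mean, var = series[0], 1.0
    flags = []
    for x in series:
        deviation = x - mean
        flags.append(abs(deviation) > k * np.sqrt(var))   # anomaly decision for this sample
        # Update the learned baseline and noise power only afterwards (online setting).
        mean = (1 - alpha) * mean + alpha * x
        var = (1 - alpha) * var + alpha * deviation ** 2
    return flags

# Illustrative usage on synthetic AWGN with a jump (not real Google Trends data):
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(50, 2, 200), rng.normal(80, 2, 20)])
print(sum(online_anomaly_flags(data)), "samples flagged")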

Keywords: adaptive data processing, behavioral finance, convex optimization, online learning, soft minimum thresholding

Procedia PDF Downloads 167
24785 Contribution of the Study of Inclusion Fluids to the Knowledge of the Conditions of Formation of the Layers with SN-W of Central Hoggar, Algeria

Authors: J. Bouguebrine, L. Bouabsa

Abstract:

The study area is located in central Hoggar and contains the most important tin-tungsten (Sn-W) deposits and showings of the Hoggar metallogenic province. These deposits are always associated with the post-orogenic Pan-African rare-metal granite magmatism (GMR), emplaced as circumscribed granitic bodies of relatively small size or as microgranite dykes. The areas studied are Tounine, Aléméda, Hanana-Hananére, Tim Amzi, and El Karoussa. Geochemical data processing shows a peraluminous character, rich in Li-F and rare metals (RM). Stockscheider-type pegmatites, greisen formations, and Sn-W mineralization accompany these granites. The Sn-W mineralization, expressed particularly well in quartz veins and greisens, is spatially and genetically linked to the magmatism specific to the white feldspar-topaz granites (GMR), both coarse-grained and microgranular. The mineral paragenesis is primarily made up of wolframite and cassiterite. The gangue minerals are quartz, topaz, lithium-bearing micas, and fluorite. A microthermometric study of fluid inclusions related to the white feldspar-topaz granite of Hanana, the topaz of Hananére, the microgranite of Aléméda, and the quartz veins of Tounine (Tiftazouine) and Tim Amzi allows the fluids associated with these deposits to be characterized. The study shows an abundance of aqueous inclusions, and three types of fluids were identified: hot, saline fluids rich in volatile elements, particularly CO2; followed by aqueo-carbonic fluids, less hot and moderately saline, with average homogenization temperatures (Th) of 300°C and 180°C respectively; and finally aqueous fluids, very weakly saline (≤1 wt% NaCl eq.) and distinctly colder. Depths were estimated from the diagram of Haas (1971) in the H2O-NaCl system, with the following results: aqueous inclusions (L and Lw) correspond to depths of about 50 to 500 m; aqueo-carbonic inclusions (Lcw and Lwc) correspond to depths on the order of 600 to 1200 m; carbonic inclusions (Vcw) correspond to depths of about 1400 to 1800 m.

Keywords: fluid inclusion microthermometry, cassiterite, wolframite, rare-metal granites, Central Hoggar

Procedia PDF Downloads 405
24784 Energy Efficient Assessment of Energy Internet Based on Data-Driven Fuzzy Integrated Cloud Evaluation Algorithm

Authors: Chuanbo Xu, Xinying Li, Gejirifu De, Yunna Wu

Abstract:

Energy Internet (EI) is a new form that deeply integrates the Internet and the entire energy process from production to consumption. The assessment of energy efficiency performance is of vital importance for the long-term sustainable development of EI projects. Although the newly proposed fuzzy integrated cloud evaluation algorithm considers the randomness of uncertainty, it relies too much on the experience and knowledge of experts. Fortunately, the enrichment of EI data has enabled the utilization of data-driven methods. Therefore, the main purpose of this work is to assess the energy efficiency of a park-level EI by combining a data-driven method with the fuzzy integrated cloud evaluation algorithm. Firstly, the indicators for energy efficiency are identified through a literature review. Secondly, an artificial neural network (ANN)-based data-driven method is employed to cluster the values of the indicators. Thirdly, the energy efficiency of the EI project is calculated through the fuzzy integrated cloud evaluation algorithm. Finally, the applicability of the proposed method is demonstrated by a case study.
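As a rough illustration of the cloud-model building block used in such an evaluation, the sketch below implements the standard forward normal cloud generator from the numerical characteristics (Ex, En, He); the grade value and parameters are invented for illustration and are not taken from the paper:

import numpy as np

def forward_normal_cloud(Ex, En, He, n_drops=1000, rng=None):
    """Standard forward normal cloud generator: produce cloud drops (x, membership)
    from expectation Ex, entropy En, and hyper-entropy He. This shows only the
    cloud-model building block, not the paper's full fuzzy integrated evaluation."""
    rng = rng or np.random.default_rng()
    En_prime = rng.normal(En, He, n_drops)            # randomized entropy for each drop
    x = rng.normal(Ex, np.abs(En_prime))              # drop positions
    membership = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2))
    return x, membership

# Illustrative use: cloud for a hypothetical "energy efficiency" grade centered at 0.8.
x, mu = forward_normal_cloud(Ex=0.8, En=0.05, He=0.01)
print(x[:5], mu[:5])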

Keywords: energy efficient, energy internet, data-driven, fuzzy integrated evaluation, cloud model

Procedia PDF Downloads 202
24783 Graph Based Traffic Analysis and Delay Prediction Using a Custom Built Dataset

Authors: Gabriele Borg, Alexei Debono, Charlie Abela

Abstract:

There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data that is continuously being sourced and converts it into useful information related to the traffic problem on Maltese roads. The scope of this paper is to provide a methodology to create a custom dataset (MalTra - Malta Traffic) compiled from multiple participants at various locations across the island, in order to identify the most common routes taken and expose the main areas of activity. Such use of big data underlies various technologies referred to as Intelligent Transportation Systems (ITSs), and it has been concluded that there is significant potential in utilising such data sources on a nationwide scale. Furthermore, a series of graph neural network traffic prediction models is run to compare MalTra to large-scale traffic datasets.
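As an illustration of how such location traces can expose the most common routes, the sketch below builds a weighted directed graph of road-segment traversals with networkx; the trip records and segment names are invented placeholders, not the MalTra schema:

import networkx as nx
from collections import Counter

# Hypothetical trips: each is an ordered list of road-segment identifiers derived
# from participants' location traces. The names are placeholders, not MalTra fields.
trips = [
    ["msida", "gzira", "sliema"],
    ["msida", "gzira", "st_julians"],
    ["birkirkara", "msida", "gzira", "sliema"],
]

G = nx.DiGraph()
route_counts = Counter()

for trip in trips:
    route_counts[tuple(trip)] += 1                     # frequency of each full route
    for a, b in zip(trip, trip[1:]):                   # accumulate traversals per segment pair
        w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Most common routes and the busiest links, exposing the main areas of activity.
print(route_counts.most_common(2))
print(sorted(G.edges(data="weight"), key=lambda e: -e[2])[:3])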

Keywords: graph neural networks, traffic management, big data, mobile data patterns

Procedia PDF Downloads 131
24782 Learning Compression Techniques on Smart Phone

Authors: Farouk Lawan Gambo, Hamada Mohammad

Abstract:

Data compression shrinks files into fewer bits than their original representation. It is particularly advantageous on the internet because the smaller a file is, the faster it can be transferred. However, most of the concepts in data compression are abstract in nature, making them difficult for some students (engineers in particular) to digest. This paper studies the learning preferences of engineering students, who tend to have strong active, sensing, visual, and sequential learning preferences. It also reviews the three shifts that technology-aided learning has experienced, in which mobile learning is considered the form of learning that will integrate the other forms of the education process. Lastly, we propose the design and implementation of a mobile learning application, using a software engineering methodology, to enhance the traditional teaching and learning of data compression techniques.

Keywords: data compression, learning preference, mobile learning, multimedia

Procedia PDF Downloads 448
24781 Effect of Different Contact Rollers on the Surface Texture during the Belt Grinding Process

Authors: Amine Hamdi, Sidi Mohammed Merghache, Brahim Fernini

Abstract:

During abrasive machining of hard steels by belt grinding, the finished surface texture is influenced by the pressure between the abrasive belt and the workpiece; this pressure is the force applied by the contact roller on the workpiece. Therefore, the contact roller has an important role and has a direct impact on process efficiency. The objective of this article is to study and compare the influence of different contact rollers on the belt ground surface texture. The quality of the surface texture is characterized by eight roughness parameters (Ra, Rz, Rp, Rv, Rsk, Rku, Rsm, and Rdq) and five parameters of the bearing area curve (Rpk, Rk, Rvk, Mr1, and Mr2). The results of the experimental tests indicate a better surface texture obtained by the PA 6 polyamide roller (hardness 60 Shore D) compared to that obtained with other rollers of the same hardness or of different hardness. Simultaneously, optimum medium pressure between the belt and the workpiece allows chip removal without fracturing the abrasive grains. This generates a good surface texture.

Keywords: belt grinding, contact roller, pressure, abrasive belt, surface texture

Procedia PDF Downloads 184
24780 Investigation of Delivery of Triple Play Services

Authors: Paramjit Mahey, Monica Sharma, Jasbinder Singh

Abstract:

Fiber-based access networks can deliver performance that can support the increasing demand for high-speed connections. One of the new technologies that has emerged in recent years is the Passive Optical Network. This paper aims to show the simultaneous delivery of triple play services (data, voice, and video). A comparative investigation of the suitability of various data rates is presented. It is demonstrated that as the data rate increases, the number of users that can be accommodated decreases due to an increase in the bit error rate.

Keywords: BER, PON, TDMPON, GPON, CWDM, OLT, ONT

Procedia PDF Downloads 541
24779 Nazca: A Context-Based Matching Method for Searching Heterogeneous Structures

Authors: Karine B. de Oliveira, Carina F. Dorneles

Abstract:

Structure-level matching is the problem of combining elements of a structure, which can be represented as entities, classes, XML elements, web forms, and so on. This is a challenge due to the large number of distinct representations of semantically similar structures. This paper describes a structure-based matching method applied to search for different representations in data sources, considering the similarity between elements of two structures and the data source context. Using real data sources, we conducted an experimental study comparing our approach with our baseline implementation and with another important schema matching approach. We demonstrate that our proposal reaches higher precision than the baseline.

Keywords: context, data source, index, matching, search, similarity, structure

Procedia PDF Downloads 364
24778 Studies on the Solubility of Oxygen in Water Using a Hose to fill the Air with Different Shapes

Authors: Wichan Lertlop

Abstract:

This research studies the solubility of oxygen in water when air is supplied through hoses of different shapes. The objectives are to compare the amount of dissolved oxygen obtained with the differently shaped aeration hoses after aerating for 5 minutes, 10 minutes, and 30 minutes, and to compare the persistence of the dissolved oxygen for the differently shaped hoses 30 minutes and 60 minutes after aeration. The population used in this study was pond water from Rajabhat University, collected in February 2014. The equipment consisted of: 1. an aerator; 2. hoses of three different shapes for delivering the air, namely a pyramid shape with its base on the bottom of the water tank, a rectangular shape laid flat on the bottom of the tank, and a vertical pipe shape; 3. a dissolved oxygen meter, calibrated against a standard; 4. a clock for timing; and 5. three water tanks, each 39 cm wide, 51 cm long, and 32 cm high.

Keywords: aeration, dissolved oxygen, different shapes

Procedia PDF Downloads 310
24777 Seizure Effects of FP Bearings on the Seismic Reliability of Base-Isolated Systems

Authors: Paolo Castaldo, Bruno Palazzo, Laura Lodato

Abstract:

This study deals with the seizure effects of friction pendulum (FP) bearings on the seismic reliability of a 3D base-isolated nonlinear structural system, designed according to the Italian seismic code (NTC08). The isolated system consists of a 3D reinforced concrete superstructure, an r.c. substructure, and the FP devices, described by employing a velocity-dependent model. The seismic input uncertainty is considered as a random variable relevant to the problem, by employing a set of natural seismic records selected in compliance with the seismic hazard of L'Aquila (Italy) as provided by NTC08. Several nonlinear dynamic analyses considering the three components of each ground motion have been performed with the aim of evaluating the seismic reliability of the superstructure, substructure, and isolation level, also taking into account the seizure event of the isolation devices. Finally, a design solution aimed at increasing the seismic robustness of base-isolated systems with FPS is analyzed.

Keywords: FP devices, seismic reliability, seismic robustness, seizure

Procedia PDF Downloads 413
24776 Spatially Random Sampling for Retail Food Risk Factors Study

Authors: Guilan Huang

Abstract:

In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food restaurants and full service restaurants for tracking changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized the spatial random sampling method by considering the financial position and availability of FDA resources, and how we enriched the restaurant data with location information. The location information of restaurants provides an opportunity to quantitatively determine random samples within non-governmental units (e.g., 240 kilometers around each data collector). Spatial analysis could also optimize data collectors' work plans and resource allocation. A spatial analytics and processing platform helped us handle the spatial random sampling challenges. Our method fits the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
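A minimal sketch of the sampling step described above, assuming restaurant records carry latitude/longitude: keep establishments within a fixed radius of a data collector (240 km in the abstract's example) and draw a simple random sample. The field names and records are placeholders, not the FDA dataset schema.

import math
import random

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def sample_restaurants(restaurants, collector, radius_km=240, n=30, seed=0):
    """Spatially random sample: keep only establishments within radius_km of the
    data collector's location, then draw a simple random sample of size n."""
    in_range = [r for r in restaurants
                if haversine_km(r["lat"], r["lon"], collector["lat"], collector["lon"]) <= radius_km]
    random.seed(seed)
    return random.sample(in_range, min(n, len(in_range)))

# Illustrative records; field names are placeholders, not the FDA dataset schema.
restaurants = [{"id": 1, "lat": 38.9, "lon": -77.0}, {"id": 2, "lat": 40.7, "lon": -74.0}]
collector = {"lat": 39.0, "lon": -76.9}
print(sample_restaurants(restaurants, collector, n=1))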

Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling

Procedia PDF Downloads 350
24775 Active Flutter Suppression of Sports Aircraft Tailplane by Supplementary Control Surface

Authors: Aleš Kratochvíl, Svatomír Slavík

Abstract:

The paper presents aircraft flutter suppression by active damping of a supplementary control surface at the trailing edge. A mathematical model of a thin oscillating airfoil with a pilot-driven control surface is developed. A supplementary control surface driven by a control law is then added, and active damping of flutter by several control laws is presented. A structural model of the tailplane, with aerodynamic strip theory based on the airfoil model, is developed by a finite element method. An optimization of the stiffness parameters is carried out to match the structural model to the results of a ground vibration test of a small sport airplane. The implementation of the supplementary control surface driven by the control law is presented, and the active damping of the tailplane model is shown.

Keywords: active damping, finite element method, flutter, tailplane model

Procedia PDF Downloads 292
24774 Automatic MC/DC Test Data Generation from Software Module Description

Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau

Abstract:

Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that is highly recommended or required for safety-critical software coverage. Therefore, many testing standards include this criterion and require it to be satisfied at a particular level of testing (e.g., validation and unit levels). However, a considerable amount of time is needed to meet those requirements. In this paper, we propose to automate MC/DC test data generation. Thus, we present an approach to automatically generate MC/DC test data from software module descriptions written in a dedicated language. We introduce a new merging approach that provides high MC/DC coverage for the description with only a small number of test cases.
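For readers unfamiliar with the criterion, here is a minimal, hypothetical illustration (not taken from the paper's dedicated language or module descriptions) showing an MC/DC-adequate test set for the decision A and (B or C):

# Minimal MC/DC illustration for the decision d = A and (B or C).
# Each pair below differs in exactly one condition and flips the decision outcome,
# demonstrating that the toggled condition independently affects the decision.
decision = lambda A, B, C: A and (B or C)

mcdc_pairs = {
    "A": ((True, True, False), (False, True, False)),   # toggling A flips d
    "B": ((True, True, False), (True, False, False)),   # toggling B flips d
    "C": ((True, False, True), (True, False, False)),   # toggling C flips d
}

for cond, (t1, t2) in mcdc_pairs.items():
    assert decision(*t1) != decision(*t2), cond          # outcome changes as required
# The union of the vectors above, {TTF, FTF, TFT, TFF}, achieves MC/DC for this
# decision with only four test cases.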

Keywords: domain-specific language, MC/DC, test data generation, safety-critical software coverage

Procedia PDF Downloads 441
24773 Blockchain-Based Approach on Security Enhancement of Distributed System in Healthcare Sector

Authors: Loong Qing Zhe, Foo Jing Heng

Abstract:

A variety of data files are now available on the internet due to the advancement of technology across the globe. As more and more data are uploaded to the internet, people are becoming increasingly concerned that their private data, particularly medical health records, are being compromised and sold to others for money. Hence, the accessibility and confidentiality of patients' medical records have to be protected through electronic means. Blockchain technology is introduced to offer patients security against adversaries or unauthorised parties. In the blockchain network, only authorised personnel or organisations that have been validated as nodes may share information and data. For any change within the network, including adding a new block or modifying existing information about a block, a two-thirds majority vote is required to confirm its legitimacy. Additionally, a consortium permissioned blockchain will connect all the entities within the same community. Consequently, all medical data in the network can be safely shared with all authorised entities. Also, synchronization can be performed within the cloud since the data is real-time. This paper discusses an efficient method for storing and sharing electronic health records (EHRs). It also examines the framework of roles within the blockchain and proposes a new approach to maintaining EHRs with keyword indexes for searching patients' medical records while ensuring data privacy.
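A minimal sketch of the hash-linked block structure underlying such a ledger; the payload fields (patient ID, record hash, keywords) are assumptions for illustration, not the paper's schema, and consensus and access control are omitted:

import hashlib, json, time

def make_block(prev_hash, record_index, payload):
    """Minimal hash-linked block: any change to the payload or to an earlier block
    changes this block's hash, which is how the chain resists tampering."""
    block = {
        "index": record_index,
        "timestamp": time.time(),
        "payload": payload,                 # e.g. {"patient_id": ..., "ehr_hash": ..., "keywords": [...]}
        "prev_hash": prev_hash,
    }
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    return block

genesis = make_block("0" * 64, 0, {"note": "genesis"})
b1 = make_block(genesis["hash"], 1, {"patient_id": "P-001",
                                     "ehr_hash": hashlib.sha256(b"record bytes").hexdigest(),
                                     "keywords": ["cardiology", "2024"]})
print(b1["prev_hash"] == genesis["hash"])   # chain link verified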

Keywords: healthcare sectors, distributed system, blockchain, electronic health records (EHR)

Procedia PDF Downloads 191
24772 Demographic Factors Influencing Employees’ Salary Expectations and Labor Turnover

Authors: M. Osipova

Abstract:

Thanks to the development of information technologies, every sphere of the economy is becoming more and more data-centric, as people generate huge datasets containing information on every aspect of their lives. Applying research on such data to human resources management makes it possible to obtain otherwise scarce statistics on the state of the labor market, including salary expectations and the typical career behavior of potential employees, and this information can become a reliable basis for management decisions. This article presents the results of career behavior research based on freely accessible resume data. The information used for the study is much broader than what is usually used in human resources surveys, which is why there is enough data for statistically significant results even in subgroup analyses.

Keywords: human resources management, salary expectations, statistics, turnover

Procedia PDF Downloads 349