Search results for: data transfer
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27239

25439 The Role of Synthetic Data in Aerial Object Detection

Authors: Ava Dodd, Jonathan Adams

Abstract:

The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques used to meet accuracy requirements. The research reveals that synthetic data represent another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.

Keywords: computer vision, machine learning, synthetic data, YOLOv4

Procedia PDF Downloads 221
25438 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks

Authors: K. Indra Gandhi

Abstract:

Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development was not considered a separate entity in this data collection process, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven, and application-driven in nature. This implies that there is a tremendous need for separation of concerns from the software development perspective. A layered approach for developing the data acquisition design based on Model Driven Development (MDD) has been proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel has been proposed that enables reusability and the realization of the software as an adaptable component for WSN systems. Further, observation of user perception indicates that the proposed model helps improve the programmer's productivity by realizing the collaborative system involved.

Keywords: data acquisition, model-driven development, separation of concern, wireless sensor networks

Procedia PDF Downloads 434
25437 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network

Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka

Abstract:

Wireless Sensor Networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs. Reducing energy consumption while increasing the amount of generated data is a great challenge. In this paper, we have implemented two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which mobile sinks use Prim's algorithm to collect data. The authors have implemented the concepts common to both protocols, such as deployment of mobile sinks, generation of the visiting schedule, and collection of data from cluster members. The authors have compared the performance of both protocols using statistics based on performance parameters such as delay, packet drop, packet delivery ratio, available energy, and control overhead. The authors conclude by showing that EEDG is more efficient than the IAR protocol, but with a few limitations, including unaddressed issues such as redundancy removal, idle listening, and the mobile sink's pause/wait state at the node. In future work, we plan to concentrate on these limitations to arrive at a new energy-efficient protocol that will help improve the lifetime of the WSN.
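For readers unfamiliar with the routing step of the second protocol, the sketch below shows how Prim's algorithm can build a minimum-cost routing tree over rendezvous/cluster-head nodes for a mobile sink. The node coordinates and the Euclidean link cost are illustrative assumptions, not data or code from the paper.

# Illustrative sketch (not the authors' implementation): Prim's algorithm building a
# minimum-cost routing tree that a mobile sink could follow. Node positions and the
# Euclidean cost metric are assumptions made only for this example.
import math

nodes = {0: (0, 0), 1: (30, 10), 2: (15, 40), 3: (50, 35), 4: (60, 5)}  # node_id -> (x, y)

def cost(a, b):
    (xa, ya), (xb, yb) = nodes[a], nodes[b]
    return math.hypot(xa - xb, ya - yb)

def prim_routing_tree(root=0):
    """Return tree edges (parent, child) connecting all nodes at minimum total cost."""
    in_tree = {root}
    edges = []
    while len(in_tree) < len(nodes):
        # pick the cheapest edge leaving the partial tree (greedy Prim step)
        parent, child = min(
            ((p, c) for p in in_tree for c in nodes if c not in in_tree),
            key=lambda pc: cost(*pc),
        )
        in_tree.add(child)
        edges.append((parent, child))
    return edges

print(prim_routing_tree())  # e.g. [(0, 1), (0, 2), ...] depending on the layout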

Keywords: aggregation, consumption, data gathering, efficiency

Procedia PDF Downloads 497
25436 Status and Results from EXO-200

Authors: Ryan Maclellan

Abstract:

EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay utilizing 175 kg of enriched liquid xenon in an ultra-low background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1x10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility: the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.

Keywords: double-beta, Majorana, neutrino, neutrinoless

Procedia PDF Downloads 413
25435 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model

Authors: Amit R. Bhende, G. K. Awari

Abstract:

Remaining useful life (RUL) prediction is one of the key technologies needed to realize prognostics and health management, which is being widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data at three different conditions, from run to failure, are used. A RUL prediction model is built separately for each condition. Feed-forward back-propagation neural network models are developed for prediction modeling.
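As a concrete illustration of the modelling step, the sketch below trains a small feed-forward network (trained by back-propagation) to map degradation features to RUL. The synthetic features, targets, and the choice of scikit-learn are assumptions made only for illustration; the paper builds one such model per operating condition.

# Minimal sketch of a feed-forward back-propagation network for RUL regression.
# Features, targets, and the scikit-learn implementation are assumptions; the paper
# trains a separate model for each of the three run-to-failure conditions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((200, 2))              # placeholder degradation features (e.g. RMS, kurtosis)
y = 100.0 * (1.0 - X[:, 0])           # synthetic RUL target, only to make the sketch runnable

scaler = StandardScaler().fit(X)
model = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                     solver="adam", max_iter=2000, random_state=0)
model.fit(scaler.transform(X), y)
print(model.predict(scaler.transform(X[:3])))   # predicted RUL for the first samples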

Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis

Procedia PDF Downloads 434
25434 Spatio-Temporal Data Mining with Association Rules for Lake Van

Authors: Tolga Aydin, M. Fatih Alaeddinoğlu

Abstract:

Throughout history, people have made estimates and inferences about the future by using their past experiences. Developing information technologies and improvements in database management systems make it possible to extract useful information from the knowledge in hand for strategic decisions. Therefore, different methods have been developed. Data mining by association rule learning is one such method. The Apriori algorithm, one of the well-known association rule learning algorithms, is not commonly used on spatio-temporal data sets. However, it is possible to embed time and space features into the data sets and make the Apriori algorithm a suitable data mining technique for learning spatio-temporal association rules. Lake Van, the largest lake in Turkey, is a closed basin. This feature causes the volume of the lake to increase or decrease as a result of changes in the amount of water it holds. In this study, evaporation, humidity, lake altitude, amount of rainfall, and temperature parameters recorded in the Lake Van region throughout the years are used by the Apriori algorithm, and a spatio-temporal data mining application is developed to identify overflows and newly formed soil regions (underflows) occurring in the coastal parts of Lake Van. Identifying possible reasons for overflows and underflows may be used to alert experts to take precautions and make the necessary investments.
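The sketch below illustrates the kind of preprocessing the abstract describes: measurements are discretized into items, season and coastal zone are embedded as additional items, and Apriori mines frequent combinations. The attribute names, thresholds, and the use of the mlxtend library are assumptions for illustration, not the study's actual data pipeline.

# Sketch of mining spatio-temporal itemsets with Apriori after discretization.
# Attribute names, values, and the use of mlxtend are illustrative assumptions.
import pandas as pd
from mlxtend.frequent_patterns import apriori

# Each row is one month/station; boolean items come from discretized measurements,
# with time (season) and space (coastal zone) embedded as extra items.
records = pd.DataFrame({
    "rain_high":       [1, 1, 0, 1, 0, 1],
    "evaporation_low": [1, 0, 0, 1, 1, 1],
    "season_spring":   [1, 1, 0, 1, 0, 0],
    "zone_north":      [1, 0, 1, 1, 0, 1],
    "lake_level_up":   [1, 1, 0, 1, 0, 1],
}, dtype=bool)

frequent = apriori(records, min_support=0.5, use_colnames=True)
print(frequent)   # association rules (e.g. rain_high & season_spring -> lake_level_up)
                  # can then be derived from these frequent itemsets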

Keywords: apriori algorithm, association rules, data mining, spatio-temporal data

Procedia PDF Downloads 372
25433 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries: Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools, one of which is the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 for infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. This paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal which is a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to have access to datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses the use of open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industries, Innovation, and Infrastructure). As of 6th August 2021, a wide cross-section of 8,192 users had been created, 2,262 datasets had been downloaded, and 817 maps had been created from the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit on new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, this paper reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
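For orientation, the snippet below shows how datasets on such a CKAN-based portal can be listed and searched through CKAN's standard Action API. The portal URL is a placeholder, not the project's actual address.

# Hedged sketch: querying a CKAN-based data portal through CKAN's Action API.
# The portal URL is a placeholder assumption, not the address of the described portal.
import requests

PORTAL = "https://example-data-portal.org"   # placeholder

def list_datasets():
    """Return the names of all datasets published on the portal."""
    resp = requests.get(f"{PORTAL}/api/3/action/package_list", timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    return payload["result"] if payload.get("success") else []

def search_datasets(query):
    """Full-text search over dataset metadata (e.g. 'settlements', 'boundaries')."""
    resp = requests.get(f"{PORTAL}/api/3/action/package_search",
                        params={"q": query}, timeout=30)
    resp.raise_for_status()
    return resp.json()["result"]["results"]

if __name__ == "__main__":
    print(len(list_datasets()), "datasets published")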

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 97
25432 Process Data-Driven Representation of Abnormalities for Efficient Process Control

Authors: Hyun-Woo Cho

Abstract:

Unexpected operational events or abnormalities of industrial processes have a serious impact on the quality of the final product of interest. In terms of statistical process control, fault detection and diagnosis of processes is one of the essential tasks needed to run the process safely. In this work, a nonlinear representation of process measurement data is presented and evaluated using a simulation process. The effect of using different representation methods on diagnosis performance is tested in terms of computational efficiency and data handling. The results have shown that the nonlinear representation technique produced more reliable diagnosis results and outperforms linear methods. The use of a data filtering step improved computational speed and diagnosis performance for the test data sets. The presented scheme differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space. Thus, this scheme helps to reduce the sensitivity of empirical models to noise.
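The abstract does not name the specific nonlinear representation; kernel PCA is shown below as one common choice for projecting process data into a reduced nonlinear space and flagging abnormal samples there. The simulated data, variables, and monitoring threshold are illustrative assumptions.

# Sketch of fault detection in a reduced nonlinear space. Kernel PCA is an assumed
# stand-in for the unnamed nonlinear technique; data and thresholds are illustrative.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)
normal = rng.normal(0.0, 1.0, size=(300, 10))        # normal operating data
faulty = rng.normal(0.0, 1.0, size=(50, 10)) + 3.0   # shifted "fault" data

kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(normal)
ref_scores = kpca.transform(normal)
ref_var = ref_scores.var(axis=0)

def monitoring_statistic(scores):
    # Hotelling-type statistic in the reduced space (diagonal approximation)
    return ((scores ** 2) / ref_var).sum(axis=1)

threshold = np.percentile(monitoring_statistic(ref_scores), 99)
flags = monitoring_statistic(kpca.transform(faulty)) > threshold
print(f"{flags.mean():.0%} of faulty samples flagged")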

Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces

Procedia PDF Downloads 245
25431 Protection of Old Grant Communal Properties of Minorities in Cantonments of Pakistan: Issues and Problems

Authors: Nayer Fardows, Zarash Nayer, Sarah Nayer Jaffar, Daud Nayer

Abstract:

This paper analyses the issues related to communal properties of minorities in the cantonment areas of Pakistan, allotted in the mid-eighteenth century by the British Government to facilitate soldiers. These properties were old grants on which churches, institutes, hospitals, and residences were built. The ownership of these properties remained with the British Government, but after the creation of Pakistan, changes making the Government of Pakistan the landlord of the properties disturbed the inheritors, who remained only holders of occupancy. The Government of Pakistan issued a policy in 1997 to convert the status of old grant properties to regular leases. However, heavy taxes and the high courts' decisions made it difficult to solve the issue. The study was conducted on six old grant properties of Edwardes College, Peshawar cantonment, situated in Khyber Pakhtunkhwa, Pakistan. The paper is descriptive research with a qualitative approach, collecting data through government rules, acts, ordinances, and decisions of the high courts. The results lead to three aspects: 1) the holder-of-occupancy status of old grant properties in cantonments is similar to the allotment of other properties by the government; 2) the imposition of heavy taxes on conversion of a property from old grant to regular lease restricts inheritors from further construction or transfer; 3) the higher courts' ban on conversion of communal properties contradicts the government's conversion policy. The paper recommends that the Government of Pakistan maintain the status quo for communal properties that fall within the old grant.

Keywords: British Government, communal properties, cantonment, old grant, institutions

Procedia PDF Downloads 144
25430 Examination of Teacher Candidates' Attitudes Towards Disabled Individuals' Employment in Terms of Various Variables

Authors: Tuna Şahsuvaroğlu

Abstract:

Disability is a concept that has been the subject of many studies in the national and international literature, with its social, sociological, political, anthropological, and economic dimensions as well as its individual and social consequences. A disabled person is defined as a person who has difficulties in adapting to social life and meeting daily needs due to loss of physical, mental, spiritual, sensory, or social abilities to various degrees, either from birth or for any reason later, and who is in need of protection, care, rehabilitation, counseling, and support services. The industrial revolution and the rapid industrialization it brought about led to an increase in the rate of disabilities resulting from work accidents, in addition to congenital disabilities. This increase has resulted in disabled people being included in the employment policies of nations as a disadvantaged group. Although the participation of disabled individuals in the workforce is of great importance both for increasing their quality of life and for their integration with society, and although disabled individuals are willing to participate in the workforce, they encounter many problems. One of these problems is the negative attitudes and prejudices that develop in society towards the employment of disabled individuals. One of the most powerful ways to turn these negative attitudes and prejudices into positive ones is education. Education is a way of guiding societies and transferring existing social characteristics to future generations. This can be maintained thanks to teachers, who are among the most dynamic parts of society and act as the locomotive of education, driven by the need to give direction, to transfer knowledge, and, basically, to help and teach. For this reason, there is a strong relationship between the teaching profession and the attitudes formed in society towards the employment of disabled individuals, as they can influence each other. Therefore, the purpose of this study is to examine teacher candidates' attitudes towards the employment of disabled individuals in terms of various variables. The participants of the study consist of 665 teacher candidates studying in various departments at Marmara University Faculty of Education in the 2022-2023 academic year. The descriptive survey model, a type of general survey model, was used in this study, as it intends to determine the attitudes of teacher candidates towards the employment of disabled individuals in terms of different variables. The Attitude Scale Towards Employment of Disabled People was used to collect data. The data were analyzed according to the variables of age, gender, marital status, department, and whether there is a disabled relative in the family, and the findings were discussed in the context of further research.

Keywords: teacher candidates, disabled, attitudes towards the employment of disabled people, attitude scale towards the employment of disabled people

Procedia PDF Downloads 65
25429 Effect of the Fluid Temperature on the Crude Oil Fouling in the Heat Exchangers of Algiers Refinery

Authors: Rima Harche, Abdelkader Mouheb

Abstract:

The Algiers refinery, like all other refineries, suffers from the problem of plugging of its heat exchanger tubes. For this reason, an experimental study of this phenomenon was undertaken on site on the heat exchanger cell E101 (E101 CBA and E101 EDF), intended for heating the crude before its fractionation, which is exposed to fouling on the tube side. The exchangers are of the shell-and-tube type with a floating head. Each cell is made up of three heat exchangers laid out in series.
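The abstract does not reproduce the quantity monitored, but the fouling resistance listed in the keywords is conventionally obtained from the overall heat transfer coefficient; a standard definition (assumed here, not quoted from the paper) is

R_f(t) = \frac{1}{U_{\text{fouled}}(t)} - \frac{1}{U_{\text{clean}}}, \qquad U = \frac{Q}{A\,\Delta T_{\text{lm}}},

where U is computed from the measured duty Q, the exchange area A, and the logarithmic mean temperature difference ΔT_lm.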

Keywords: fouling, fluid temperature, oil, tubular heat exchanger, fouling resistance, modeling, heat transfer coefficient

Procedia PDF Downloads 430
25428 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of the software quality data. A dataset with very few instances of the minority outcome categories leads to an inefficient learning process, and a classification model developed from the imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas a majority of classes may be non-change-prone. This study explores various alternatives for adeptly handling the imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics while dealing with the imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using a resampling method and robust performance measures.
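As an illustration of one of the resampling alternatives such studies evaluate, the sketch below oversamples the minority class with SMOTE and scores the model with a robust measure (ROC AUC). The synthetic dataset and the classifier choice are placeholders, not the paper's actual setup or MetaCost configuration.

# Sketch: SMOTE oversampling (imbalanced-learn) plus a robust performance measure.
# Synthetic data and the random forest are placeholders, not the study's setup.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# balance the training split only; the test split keeps its natural imbalance
X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)
clf = RandomForestClassifier(random_state=0).fit(X_bal, y_bal)

print("ROC AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))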

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 418
25427 Novel AdoMet Analogs as Tools for Nucleic Acids Labeling

Authors: Milda Nainyte, Viktoras Masevicius

Abstract:

Biological methylation is a methyl group transfer from S-adenosyl-L-methionine (AdoMet) onto N-, C-, O- or S-nucleophiles in DNA, RNA, proteins or small biomolecules. The reaction is catalyzed by enzymes called AdoMet-dependent methyltransferases (MTases), which represent more than 3% of the proteins in the cell. As a general mechanism, the methyl group from AdoMet replaces a hydrogen atom of the nucleophilic center, producing methylated DNA and S-adenosyl-L-homocysteine (AdoHcy). Recently, DNA methyltransferases have been used for the sequence-specific, covalent labeling of biopolymers. Two types of MTase-catalyzed labeling of biopolymers are known, referred to as two-step and one-step. During two-step labeling, an alkylating fragment is transferred onto DNA in a sequence-specific manner, and then the reporter group, such as biotin, is attached for selective visualization using suitable coupling chemistries. This approach to labeling is quite difficult, and the chemical coupling does not always proceed at 100%, but in the second step a variety of reporter groups can be selected, which gives this labeling method its flexibility. In one-step labeling, the AdoMet analog is designed with the reporter group already attached to the functional group. Thus, the one-step labeling method would be a more convenient tool for labeling of biopolymers, since it avoids additional chemical reactions and the selection of reaction conditions; time costs would also be reduced. However, an effective AdoMet analog appropriate for one-step labeling of biopolymers and containing a cleavable bond, required to reduce interference with PCR, is still not known. To expand the practical utility of this important enzymatic reaction, cofactors with activated sulfonium-bound side-chains have been produced and can serve as surrogate cofactors for a variety of wild-type and mutant DNA and RNA MTases, enabling covalent attachment of these chains to their target sites in DNA, RNA or proteins (the approach named methyltransferase-directed Transfer of Activated Groups, mTAG). Compounds containing the hex-2-yn-1-yl moiety have proved to be efficient alkylating agents for labeling of DNA. Herein we describe synthetic procedures for the preparation of N-biotinoyl-N’-(pent-4-ynoyl)cystamine, starting from the coupling of cystamine with pentynoic acid and finally attaching biotin as a reporter group. The synthesis of the first AdoMet-based cofactor containing a cleavable reporter group and appropriate for one-step labeling was developed.

Keywords: AdoMet analogs, DNA alkylation, cofactor, methyltransferases

Procedia PDF Downloads 193
25426 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks

Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode

Abstract:

The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, the data they harvest are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes them prone to various confidentiality and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data. However, their specificity towards particular attacks, together with the resource constraints and heterogeneity of WSNs, makes most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempt at a wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionalities than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies the attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.

Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, blackhole attack, access control

Procedia PDF Downloads 83
25425 Quality of Age Reporting from Tanzania 2012 Census Results: An Assessment Using Whipple’s Index, Myer’s Blended Index, and Age-Sex Accuracy Index

Authors: A. Sathiya Susuman, Hamisi F. Hamisi

Abstract:

Background: Many socio-economic and demographic data are attributed by age and sex. However, a variety of irregularities and misstatements are noted with respect to age-related data, and less so for sex data because of the biological differences between the genders. Noting the misstatement/misreporting of age data despite its significant importance in demographic and epidemiological studies, this study aims at assessing the quality of the 2012 Tanzania Population and Housing Census results. Methods: Data for the analysis were downloaded from the Tanzania National Bureau of Statistics. Age heaping and digit preference were measured using summary indices, viz., Whipple's index, Myers' blended index, and the Age-Sex Accuracy Index. Results: The recorded Whipple's index for both sexes was 154.43; males had the lowest index, of about 152.65, while females had the highest index, of about 156.07. For Myers' blended index, the preferences were at digits '0' and '5' while avoidances were at digits '1' and '3' for both sexes. Finally, the Age-Sex Accuracy Index stood at 59.8, where the sex ratio score was 5.82 and the age ratio scores were 20.89 and 21.4 for males and females respectively. Conclusion: The evaluation of the 2012 PHC data using these demographic techniques has qualified the data as inaccurate as a result of systematic heaping and digit preferences/avoidances. Thus, innovative methods in data collection, along with measuring and minimizing errors using statistical techniques, should be used to ensure the accuracy of age data.
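For readers who want to reproduce the headline figure, the sketch below computes Whipple's index in its standard form (ages 23-62, preference for terminal digits 0 and 5); the population counts used are placeholders, not the census data.

# Standard Whipple's index computation; `pop` maps single-year ages to counts.
# The example counts are placeholders, not the 2012 census figures.
def whipples_index(pop):
    """~100 means no heaping on digits 0 and 5; 500 means total preference."""
    total = sum(pop.get(a, 0) for a in range(23, 63))          # ages 23..62
    heaped = sum(pop.get(a, 0) for a in range(25, 61, 5))      # ages 25, 30, ..., 60
    return 100.0 * heaped / (0.2 * total)

example = {a: 1000 for a in range(23, 63)}
example.update({a: 1800 for a in range(25, 61, 5)})            # exaggerated heaping
print(round(whipples_index(example), 1))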

Keywords: age heaping, digit preference/avoidance, summary indices, Whipple’s index, Myer’s index, age-sex accuracy index

Procedia PDF Downloads 474
25424 Energy Budget Equation of Superfluid HVBK Model: LES Simulation

Authors: M. Bakhtaoui, L. Merahi

Abstract:

The reliability of the filtered HVBK model is now investigated via large eddy simulations of freely decaying isotropic superfluid turbulence. For homogeneous turbulence at very high Reynolds numbers, comparison of the terms in the spectral kinetic energy budget equation indicates that, in the energy-containing range, the production and energy transfer effects become significant, whereas dissipation does not. In the inertial range, where the two fluids are perfectly locked, the mutual friction may be neglected with respect to the other terms. The LES results for the other terms of the energy balance are also presented.
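The abstract does not reproduce the budget equation itself; schematically, the filtered spectral kinetic energy budget compared in such studies can be written (as an assumed generic form, not the paper's exact expression) as

\frac{\partial E(k,t)}{\partial t} = P(k,t) + T(k,t) - D(k,t) + F_{\text{mf}}(k,t),

with P the production, T the nonlinear (and subgrid) transfer, D the viscous dissipation of the normal component, and F_mf the mutual friction exchange between the normal and superfluid components specific to the HVBK description.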

Keywords: superfluid turbulence, HVBK, energy budget, Large Eddy Simulation

Procedia PDF Downloads 373
25423 Competences for Learning beyond the Academic Context

Authors: Cristina Galván-Fernández

Abstract:

Students differentiate the different contexts of their lives, such as employment, hobbies, or studies. In higher education, it is necessary to transfer experiential knowledge to theory and vice versa. However, it is difficult to get students to use their personal experiences and social readings to produce their learning evidence. In a study with 178 education students from Chile and Spain, we have used an e-portfolio system and a methodology for 4 years with the aims of helping them to: 1) self-regulate their learning process and 2) use social networks and professional experiences to produce learning evidence. These two objectives have been monitored through interviews with the same students at different moments and two questionnaires. The results of this study show that students recognize ownership of their learning and progress in planning and reflecting on their own learning.

Keywords: competences, e-portfolio, higher education, self-regulation

Procedia PDF Downloads 299
25422 Artificial Neural Networks Application on Nusselt Number and Pressure Drop Prediction in Triangular Corrugated Plate Heat Exchanger

Authors: Hany Elsaid Fawaz Abdallah

Abstract:

This study presents a new artificial neural network (ANN) model to predict the Nusselt number and pressure drop for turbulent flow in a triangular corrugated plate heat exchanger for forced air and turbulent water flow. An experimental investigation was performed to create a new dataset for the Nusselt number and pressure drop values in the following range of dimensionless parameters: plate corrugation angles from 0° to 60°, Reynolds number from 10000 to 40000, pitch-to-height ratio from 1 to 4, and Prandtl number from 0.7 to 200. Based on the ANN performance graph, the three-layer structure with {12-8-6} hidden neurons has been chosen. The training procedure includes back-propagation with bias and weight adjustment, evaluation of the loss function for the training and validation datasets, and feed-forward propagation of the input parameters. The linear function was used at the output layer as the activation function, while for the hidden layers the rectified linear unit activation function was utilized. In order to accelerate the ANN training, the loss function minimization is achieved by the adaptive moment estimation algorithm (ADAM). The "MinMax" normalization approach was utilized to avoid an increase in training time due to drastic differences in the loss function gradients with respect to the values of the weights. Since the test dataset is not used for the ANN training, a cross-validation technique is applied to the ANN network using the new data. This procedure was repeated until loss function convergence was achieved or for 4000 epochs with a batch size of 200 points. The program code was written in Python 3 using open-source ANN libraries such as scikit-learn, TensorFlow, and Keras. Mean average percent errors of 9.4% for the Nusselt number and 8.2% for the pressure drop have been achieved for the ANN model, i.e., higher accuracy compared to the generalized correlations. The performance validation of the obtained model was based on a comparison of predicted data with the experimental results, yielding excellent accuracy.
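A minimal reconstruction of the described network is sketched below: three ReLU hidden layers of 12, 8, and 6 neurons, a linear two-output layer for the Nusselt number and pressure drop, the Adam optimizer, and MinMax-scaled inputs. The placeholder data and the exact feature ordering are assumptions, not the experimental dataset.

# Sketch of the described {12-8-6} architecture in Keras. Placeholder data and the
# feature ordering (angle, Re, pitch/height, Pr) are assumptions for illustration.
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from tensorflow import keras

rng = np.random.default_rng(0)
X = rng.random((500, 4))    # corrugation angle, Reynolds number, pitch/height, Prandtl
y = rng.random((500, 2))    # targets: Nusselt number, pressure drop

X_scaled = MinMaxScaler().fit_transform(X)   # "MinMax" input normalization

model = keras.Sequential([
    keras.layers.Input(shape=(4,)),
    keras.layers.Dense(12, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(6, activation="relu"),
    keras.layers.Dense(2, activation="linear"),   # linear output layer: Nu and dP
])
model.compile(optimizer="adam", loss="mse")       # ADAM loss-function minimization
model.fit(X_scaled, y, batch_size=200, epochs=10, validation_split=0.2, verbose=0)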

Keywords: artificial neural networks, corrugated channel, heat transfer enhancement, Nusselt number, pressure drop, generalized correlations

Procedia PDF Downloads 86
25421 Natural Convection between Two Parallel Wavy Plates

Authors: Si Abdallah Mayouf

Abstract:

In this work, the effects of the wavy surface on free convection heat transfer boundary layer flow between two parallel wavy plates have been studied numerically. The two plates are considered at a constant temperature. The equations and the boundary conditions are discretized by the finite difference scheme and solved numerically using the Gauss-Seidel algorithm. The important parameters in this problem are the amplitude of the wavy surfaces and the distance between the two wavy plates. Results are presented as velocity profiles, temperature profiles and local Nusselt number according to the important parameters.
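To make the numerical procedure concrete, the sketch below applies the same finite-difference discretization with Gauss-Seidel iteration to a simplified analogue: a steady temperature field between two flat isothermal plates. The wavy geometry, the coupled momentum equations, and the grid size are omitted or assumed for brevity.

# Simplified analogue of the paper's numerics: finite-difference Laplace equation for
# temperature between two isothermal plates, solved by Gauss-Seidel iteration.
# The wavy-wall geometry and coupled momentum equations are deliberately omitted.
import numpy as np

nx, ny = 21, 21
T = np.zeros((ny, nx))
T[0, :], T[-1, :] = 1.0, 0.0            # hot lower plate, cold upper plate (dimensionless)

for sweep in range(2000):                # Gauss-Seidel: updated neighbours used in place
    max_change = 0.0
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            new = 0.25 * (T[j, i - 1] + T[j, i + 1] + T[j - 1, i] + T[j + 1, i])
            max_change = max(max_change, abs(new - T[j, i]))
            T[j, i] = new
    if max_change < 1e-6:
        break

# local wall gradient ~ local Nusselt number at the centre of the lower plate
dy = 1.0 / (ny - 1)
print("wall temperature gradient:", (T[0, nx // 2] - T[1, nx // 2]) / dy)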

Keywords: free convection, wavy surface, parallel plates, fluid dynamics

Procedia PDF Downloads 306
25420 Numerical Modelling of Skin Tumor Diagnostics through Dynamic Thermography

Authors: Luiz Carlos Wrobel, Matjaz Hribersek, Jure Marn, Jurij Iljaz

Abstract:

Dynamic thermography has been clinically proven to be a valuable diagnostic technique for skin tumor detection as well as for other medical applications such as breast cancer diagnostics, diagnostics of vascular diseases, fever screening, dermatological and other applications. Thermography for medical screening can be done in two different ways, observing the temperature response under steady-state conditions (passive or static thermography), and by inducing thermal stresses by cooling or heating the observed tissue and measuring the thermal response during the recovery phase (active or dynamic thermography). The numerical modelling of heat transfer phenomena in biological tissue during dynamic thermography can aid the technique by improving process parameters or by estimating unknown tissue parameters based on measured data. This paper presents a nonlinear numerical model of multilayer skin tissue containing a skin tumor, together with the thermoregulation response of the tissue during the cooling-rewarming processes of dynamic thermography. The model is based on the Pennes bioheat equation and solved numerically by using a subdomain boundary element method which treats the problem as axisymmetric. The paper includes computational tests and numerical results for Clark II and Clark IV tumors, comparing the models using constant and temperature-dependent thermophysical properties, which showed noticeable differences and highlighted the importance of using a local thermoregulation model.
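For reference, the Pennes bioheat equation on which the model is based reads, in its commonly used form (the paper's temperature-dependent thermoregulation terms are not reproduced here),

\rho c \frac{\partial T}{\partial t} = \nabla \cdot \left( k \nabla T \right) + \rho_b c_b \omega_b \left( T_a - T \right) + q_m,

where ρ, c, and k are the tissue density, specific heat, and thermal conductivity, ω_b is the blood perfusion rate, ρ_b and c_b are the blood properties, T_a is the arterial temperature, and q_m is the metabolic heat generation.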

Keywords: boundary element method, dynamic thermography, static thermography, skin tumor diagnostic

Procedia PDF Downloads 105
25419 Building Information Management in the Context of Urban Spaces: Analysis of Current Use and Possibilities

Authors: Lucie Jirotková, Daniel Macek, Andrea Palazzo, Veronika Malinová

Abstract:

Currently, the implementation of 3D models in the construction industry is gaining popularity. Countries around the world are developing their own modelling standards and implementing the use of 3D models in their individual permitting processes. Another theme that needs to be addressed is public building spaces and their subsequent maintenance, where the use of the BIM methodology naturally applies. A significant benefit of the implementation of Building Information Management is the information transfer. The 3D model contains not only the spatial representation of item shapes but also various parameters assigned to the individual elements, which are easily traceable, mainly because they are all stored in one place in the BIM model. However, it is important to keep the data in the models up to date to achieve usability of the model throughout the life cycle of the building. It is now becoming standard practice to use BIM models in the construction of buildings; however, the building's surroundings are very often neglected. Especially in large-scale development projects, the public space around buildings is often handed over to municipalities, which obtain ownership and are in charge of its maintenance. A 3D model of the building surroundings would include both the above-ground visible elements of the development and the underground parts, such as the technological facilities of water features, electricity lines for public lighting, etc. The paper shows the possibilities of such a model in terms of the information needed for the handover of premises, subsequent maintenance, and decision making. The attributes and spatial representation of the individual elements make the model a reliable foundation for the creation of "Smart Cities". The paper analyses the current use of the BIM methodology and presents state-of-the-art possibilities for development.

Keywords: BIM model, urban space, BIM methodology, facility management

Procedia PDF Downloads 123
25418 Retrofitting of Historical Structures in Van City

Authors: Eylem Güzel, Mustafa Gülen

Abstract:

Historical structures are among the most important symbols of a country, linking the past with the future. In order to transfer them in their present condition to the next generations, maintaining these historical structures is one of our main tasks. The seismic performance of historical structures damaged by earthquakes can be enhanced by repair and retrofitting applications. However, repair and retrofitting applications for historical structures are more complicated than for conventional structures. For this reason, they need much more attention in repair and retrofitting applications to preserve the spirit of the historical structures. In this study, the present condition of selected historical structures built in the city of Van, which has a very rich historical heritage, is presented, and the necessity of repair and retrofitting applications for historical structures is discussed in detail.

Keywords: historical structures, repair, retrofitting, Van city

Procedia PDF Downloads 353
25417 Model for Introducing Products to New Customers through Decision Tree Using Algorithm C4.5 (J-48)

Authors: Komol Phaisarn, Anuphan Suttimarn, Vitchanan Keawtong, Kittisak Thongyoun, Chaiyos Jamsawang

Abstract:

This article analyzes insurance information containing data on customers' decisions when purchasing a life insurance pay package. The data were analyzed in order to present new customers with the Life Insurance Perfect Pay package that meets their needs as much as possible. The basic data of the insurance pay package were collected for data mining, thus reducing the scattering of information. The data were then classified in order to obtain a decision model, or decision tree, using the algorithm C4.5 (J-48). For the classification, WEKA tools are used to form the model, and testing datasets are used to test the decision tree for accurate decisions. The validation of this model in classification showed that 68.43% of predictions were accurate while 31.25% were errors. The same set of data was then tested with other models, i.e., Naive Bayes and ZeroR. The results showed that the J-48 method could predict more accurately. Therefore, the researchers applied the decision tree in writing the program used to introduce the product to new customers, to support customers in their decision to purchase the insurance package that meets the new customers' needs as much as possible.
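As a rough illustration of the workflow, the sketch below trains and validates an entropy-based decision tree. The study itself uses WEKA's J-48 (C4.5); scikit-learn's tree is only a stand-in, and the synthetic features are placeholders for the insurance attributes.

# Rough stand-in for the described workflow. The paper uses WEKA's J-48 (C4.5);
# scikit-learn's entropy-based tree is used here only to illustrate train/validate
# steps, and the generated features are placeholders for the insurance attributes.
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=400, n_features=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

tree = DecisionTreeClassifier(criterion="entropy", max_depth=4, random_state=0)
tree.fit(X_tr, y_tr)
print("validation accuracy:", accuracy_score(y_te, tree.predict(X_te)))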

Keywords: decision tree, data mining, customers, life insurance pay package

Procedia PDF Downloads 425
25416 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review

Authors: Faisal Muhibuddin, Ani Dijah Rahajoe

Abstract:

This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research. A detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications in crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.

Keywords: data mining, classification algorithm, naïve Bayes, k-means clustering, k-nearest neighbor, crime, data analysis, systematic literature review

Procedia PDF Downloads 62
25413 Improvement of Power Quality in Electrical Networks Using STATCOM

Authors: A. R. Alesaadi

Abstract:

Flexible AC transmission system (FACTS) devices have an important role in expanded electrical transmission networks. These devices can provide control of one or more AC transmission system parameters to enhance controllability and increase power transfer capability. In this paper, the effect of these devices on the reliability of electrical networks is studied, and it is shown that the use of FACTS devices can significantly improve the reliability of power networks and the power quality in electrical networks.

Keywords: FACTS devices, power networks, power quality, STATCOM

Procedia PDF Downloads 667
25414 Reducing System Delay to Definitive Care for STEMI Patients: A Simulation of Two Different Strategies in the Brugge Area, Belgium

Authors: E. Steen, B. Dewulf, N. Müller, C. Vandycke, Y. Vandekerckhove

Abstract:

Introduction: The care for a ST-elevation myocardial infarction (STEMI) patient is time-critical. Reperfusion therapy within 90 minutes of initial medical contact is mandatory to improve the outcome. Primary percutaneous coronary intervention (PCI), without previous fibrinolytic treatment, is the preferred reperfusion strategy in patients with STEMI, provided it can be performed within guideline-mandated times. Aim of the study: During a one-year period (January 2013 to December 2013), the files of all consecutive STEMI patients urgently referred from non-PCI facilities for primary PCI were reviewed. Special attention was given to a subgroup of patients with prior out-of-hospital medical contact generated by the 112 system. In an effort to reduce out-of-hospital system delay to definitive care, a change in pre-hospital 112 dispatch strategies is proposed for these time-critical patients. Actual time recordings were compared with travel time simulations for two suggested scenarios. The first scenario (SC1) involves the decision by the on-scene ground EMS (GEMS) team to transport the out-of-hospital diagnosed STEMI patient straight to a PCI centre, bypassing the nearest non-PCI hospital. The other strategy (SC2) explored the potential role of helicopter EMS (HEMS), where the on-scene GEMS team requests a PCI-centre-based HEMS team for immediate medical transfer to the PCI centre. Methods and Results: 49 (29.1% of all) STEMI patients were referred to our hospital for emergency PCI by a non-PCI facility. One file was excluded because of insufficient data collection. Within this analysed group of 48 secondary referrals, 21 patients had an out-of-hospital medical contact generated by the 112 system. The other 27 patients presented at the referring emergency department without prior contact with the 112 system. The table below shows the actual time data from first medical contact to definitive care as well as the simulated possible gain of time for both suggested strategies. The PCI team was always alerted upon departure from the referring centre, excluding further in-hospital delay. Time simulation tools were similar to those used by the 112 dispatch centre. Conclusion: Our data analysis confirms prolonged reperfusion times in the case of secondary emergency referrals for STEMI patients, even with the use of HEMS. In our setting there was no statistical difference in gain of time between the two suggested strategies, both reducing the secondary-referral-generated delay by about one hour, thereby offering all patients PCI within the guideline-mandated time. However, immediate HEMS activation by the on-scene ground EMS team for transport purposes is preferred, as this ensures faster availability of the local GEMS team for its community. In case these options are not available and the guideline-mandated times for primary PCI are expected to be exceeded, primary fibrinolysis should be considered in a non-PCI centre.

Keywords: STEMI, system delay, HEMS, emergency medicine

Procedia PDF Downloads 318
25413 Assessing Supply Chain Performance through Data Mining Techniques: A Case of Automotive Industry

Authors: Emin Gundogar, Burak Erkayman, Nusret Sazak

Abstract:

Providing effective management performance throughout the whole supply chain is a critical issue and hard to put into practice. The proper evaluation of integrated data may yield accurate information. Analysing the supply chain data through OLAP (On-Line Analytical Processing) technologies may provide a multi-angle view of the work and its consolidation. In this study, association rules and classification techniques are applied to measure the supply chain performance metrics of an automotive manufacturer in Turkey. The main criteria and important rules are determined. A comparison of the results of the algorithms is presented.

Keywords: supply chain performance, performance measurement, data mining, automotive

Procedia PDF Downloads 512
25412 The Use of a Ceramic Stove Cover and Its Gap on the Efficiency of a Water Boiling System

Authors: Agung Sugeng Widodo

Abstract:

A water boiling system (WBS) using a conventional gas stove (CGS) is relatively inefficient unless its mechanism is considered. In this study, the addition of a ceramic stove cover (CSC) to a CGS and the gap between the CSC and the pan have been assessed. Parameters such as the energy produced by the fuel, the CSC temperature, and the water temperature were used to analyze the performance of the CGS. The gap was varied from 1 to 7 mm in steps of 1 mm. The results showed that a CSC is able to increase the performance of a CGS significantly. At a fuel rate of 0.75 l/m, the best efficiency of the CGS was obtained at a gap of 4 mm. The best efficiency obtained in this study was 46.4%, due to the optimum condition achieved simultaneously in the convection and radiation heat transfer processes of the heating system. The CSC also showed good characteristics for containing heat release at the initial stage of the WBS.
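The abstract does not state how the efficiency was defined; a common definition for water boiling tests, assumed here and neglecting evaporation losses, is

\eta = \frac{m_w \, c_p \, (T_{\text{boil}} - T_{\text{initial}})}{m_{\text{fuel}} \, LHV},

where m_w and c_p are the mass and specific heat of the heated water and LHV is the lower heating value of the fuel.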

Keywords: WBS, CSC, CGS, efficiency, gap

Procedia PDF Downloads 266
25411 Multimodal Data Fusion Techniques in Audiovisual Speech Recognition

Authors: Hadeer M. Sayed, Hesham E. El Deeb, Shereen A. Taie

Abstract:

In the big data era, we are facing a diversity of datasets from different sources in different domains that describe a single life event. These datasets consist of multiple modalities, each of which has a different representation, distribution, scale, and density. Multimodal fusion is the concept of integrating information from multiple modalities into a joint representation with the goal of predicting an outcome through a classification or regression task. In this paper, multimodal fusion techniques are classified into two main classes: model-agnostic techniques and model-based approaches. The paper provides a comprehensive study of recent research in each class and outlines the benefits and limitations of each of them. Furthermore, the audiovisual speech recognition task is presented as a case study of multimodal data fusion approaches, and the open issues arising from the limitations of the current studies are discussed. This paper can be considered a powerful guide for researchers interested in the field of multimodal data fusion and audiovisual speech recognition in particular.
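To make the two fusion families concrete, the sketch below contrasts early fusion (one classifier on concatenated modality features) with a simple late-fusion scheme (averaging per-modality predicted probabilities). Random features stand in for real audio and visual representations; the classifier choice is an assumption.

# Minimal illustration of two model-agnostic fusion styles: early fusion vs. late fusion.
# Random features stand in for audio (e.g. MFCC) and visual (e.g. lip-region) inputs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
audio = rng.normal(size=(n, 20))
visual = rng.normal(size=(n, 30))
y = (audio[:, 0] + visual[:, 0] > 0).astype(int)

idx_tr, idx_te = train_test_split(np.arange(n), random_state=0)

# Early fusion: a single classifier on the joint (concatenated) representation
joint = np.hstack([audio, visual])
early = LogisticRegression(max_iter=1000).fit(joint[idx_tr], y[idx_tr])

# Late fusion: one classifier per modality, decisions combined afterwards
clf_a = LogisticRegression(max_iter=1000).fit(audio[idx_tr], y[idx_tr])
clf_v = LogisticRegression(max_iter=1000).fit(visual[idx_tr], y[idx_tr])
late_prob = 0.5 * (clf_a.predict_proba(audio[idx_te])[:, 1]
                   + clf_v.predict_proba(visual[idx_te])[:, 1])

print("early fusion accuracy:", early.score(joint[idx_te], y[idx_te]))
print("late fusion accuracy:", ((late_prob > 0.5).astype(int) == y[idx_te]).mean())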

Keywords: multimodal data, data fusion, audio-visual speech recognition, neural networks

Procedia PDF Downloads 109
25410 Knowledge-Driven Decision Support System Based on Knowledge Warehouse and Data Mining by Improving Apriori Algorithm with Fuzzy Logic

Authors: Pejman Hosseinioun, Hasan Shakeri, Ghasem Ghorbanirostam

Abstract:

In recent years, we have seen an increasing importance of research on knowledge sources, decision support systems, data mining, and the procedure of knowledge discovery in databases, and it is considered that each of these aspects affects the others. In this article, we have merged the information source and the knowledge source to suggest a knowledge-based system, within the limits of management, based on the storing and restoring of knowledge to manage information and improve decision making and resources. We have used data mining and the Apriori algorithm in the procedure of knowledge discovery. One of the problems of the Apriori algorithm is that a user should specify the minimum support threshold for the regularities. Imagine that a user wants to apply the Apriori algorithm to a database with millions of transactions. Certainly, the user does not have the necessary knowledge of all existing transactions in that database, and therefore cannot specify a suitable threshold. Our purpose in this article is to improve the Apriori algorithm. To achieve this goal, we use fuzzy logic to put the data into different clusters before applying the Apriori algorithm to the data in the database, and we also try to suggest the most suitable threshold to the user automatically.
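The sketch below illustrates the overall idea in a simplified form: transactions are grouped first, and a minimum-support threshold is suggested automatically for each group. K-means is used only as a crisp stand-in for the paper's fuzzy clustering, and the data and the threshold heuristic are illustrative assumptions.

# Simplified sketch: cluster transactions first, then suggest a per-cluster minimum
# support automatically. K-means is a crisp stand-in for the fuzzy clustering the
# article describes; the data and the median heuristic are illustrative assumptions.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 200 transactions over 8 items, encoded as a 0/1 matrix
transactions = (rng.random((200, 8)) < rng.uniform(0.1, 0.6, size=8)).astype(int)

clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(transactions)

for c in range(3):
    group = transactions[clusters == c]
    item_support = group.mean(axis=0)              # support of each single item
    suggested = float(np.median(item_support))     # heuristic: median single-item support
    print(f"cluster {c}: {len(group)} transactions, suggested min_support = {suggested:.2f}")
    # Apriori would then be run on `group` using `suggested` as its minimum support.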

Keywords: decision support system, data mining, knowledge discovery, data discovery, fuzzy logic

Procedia PDF Downloads 334