Search results for: stirling machine

813 Automatic Verification Technology of Virtual Machine Software Patch on IaaS Cloud

Authors: Yoji Yamato

Abstract:

In this paper, we propose an automatic verification technology for software patches in user virtual environments on IaaS Cloud, in order to decrease the verification costs of patches. IaaS services have spread in recent years, and many users can customize virtual machines on IaaS Cloud like their own private servers. For software patches to the OS or middleware installed on virtual machines, users need to apply and verify these patches by themselves, which increases users' operation costs. Our proposed method replicates user virtual environments, extracts verification test cases for the user virtual environments from a test case DB, distributes patches to the virtual machines on the replicated environments, and conducts those test cases automatically on the replicated environments. We have implemented the proposed method on OpenStack using Jenkins and confirmed its feasibility. Using the implementation, we confirmed that our proposed idea of two-tier abstraction of software functions and test cases reduces the effort of test case creation. We also evaluated the automatic verification performance of environment replication, test case extraction, and test case execution.
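
The verification flow described above can be outlined in a short sketch. The snippet below is a minimal, hypothetical illustration of the pipeline (replicate the environment, pull matching test cases, apply the patch, run the tests); every helper here is a placeholder standing in for OpenStack replication, the test case DB lookup, and Jenkins-driven test execution, not the authors' implementation.

```python
# Hypothetical sketch of the patch-verification pipeline described in the abstract.
# All helpers are placeholders; none of this is the authors' code.

def replicate_environment(user_env):
    return dict(user_env)                    # stand-in for cloning the VM environment

def fetch_test_cases(test_case_db, software):
    return [tc for tc in test_case_db if tc["target"] in software]

def apply_patch(replica, patch):
    replica.setdefault("patches", []).append(patch)

def run_test(replica, test_case):
    return test_case["check"](replica)       # a Jenkins job would run the real test here

def verify_patch(user_env, patch, test_case_db):
    replica = replicate_environment(user_env)
    tests = fetch_test_cases(test_case_db, user_env["software"])
    apply_patch(replica, patch)
    results = {tc["name"]: run_test(replica, tc) for tc in tests}
    return all(results.values()), results

if __name__ == "__main__":
    env = {"software": ["apache", "mysql"]}
    db = [{"name": "apache_starts", "target": "apache", "check": lambda r: True}]
    print(verify_patch(env, {"id": "patch-001"}, db))
```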

Keywords: OpenStack, cloud computing, automatic verification, Jenkins

Procedia PDF Downloads 454
812 A Study of Traditional Mode in the Framework of Sustainable Urban Transportation

Authors: Juanita, B. Kombaitan, Iwan Pratoyo Kusumantoro

Abstract:

The traditional mode is a non-motorized vehicle powered by human or animal power. The objective of the study was to define a strategy for using traditional modes within the framework of sustainable urban transport in support of urban tourism activities. The study does not include modified modes that use engine power, such as the motorized tricycles often called ‘bentor’ in Indonesia. The use of non-motorized traditional modes in Indonesia has begun to shift, and they are gradually being displaced by vehicles converted to engine propulsion. One effort to revive the use of traditional modes is through tourism activities. Strategies for the use of traditional modes within the framework of sustainable urban transport are seen from three dimensions: social, economic, and environmental. The social dimension relates to accessibility and livability, the economic dimension relates to how traditional modes can promote products and tourist attractions, while the environmental dimension relates to the needs of users and groups with respect to safety and comfort. The traditional mode is rarely noticed by policy makers, and public opinion on its use needs attention. The involvement of policy makers, stakeholders, and the community is needed in developing sustainable traditional-mode strategies in support of urban tourism activities.

Keywords: traditional mode, sustainable, urban, transportation

Procedia PDF Downloads 236
811 Massively Parallel Sequencing Improved Resolution for Paternity Testing

Authors: Xueying Zhao, Ke Ma, Hui Li, Yu Cao, Fan Yang, Qingwen Xu, Wenbin Liu

Abstract:

Massively parallel sequencing (MPS) technologies allow high-throughput sequencing analyses at a relatively affordable price and have gradually been applied to forensic casework. MPS technology identifies short tandem repeat (STR) loci based on sequence, so repeat motif variation within STRs can be detected, which may help to infer the origin of a mutation in some cases. Here, we report on a case with a three-step mismatch (D18S51) in a family trio based on both capillary electrophoresis (CE) and MPS typing. The alleles of the alleged father (AF) are [AGAA]₁₇AGAG[AGAA]₃ and [AGAA]₁₅. The mother’s alleles are [AGAA]₁₉ and [AGAA]₉AGGA[AGAA]₃. The questioned child’s (QC) alleles are [AGAA]₁₉ and [AGAA]₁₂. Given that the sequence variants in the repeat regions of the AF and the mother are not observed in the QC’s alleles, the QC’s allele [AGAA]₁₂ was likely inherited from the AF’s allele [AGAA]₁₅ by the loss of three [AGAA] repeats. In addition, two new alleles of D18S51 observed in this study, [AGAA]₁₇AGAG[AGAA]₃ and [AGAA]₉AGGA[AGAA]₃, have not been reported before. All the results in this study were verified using Sanger-type sequencing. In summary, the MPS typing method can offer valuable information for forensic genetics research and play a promising role in paternity testing.

Keywords: family trios analysis, forensic casework, ion torrent personal genome machine (PGM), massively parallel sequencing (MPS)

Procedia PDF Downloads 275
810 A Review on Parametric Optimization of Casting Processes Using Optimization Techniques

Authors: Bhrugesh Radadiya, Jaydeep Shah

Abstract:

In the Indian foundry industry, there is a need for defect-free castings with minimum production cost and short lead time. Casting defects are a major issue on the foundry shop floor, increasing the rejection rate of castings and the wastage of materials. Various groups of parameters influence the casting process, such as molding machine related parameters, green sand related parameters, cast metal related parameters, mold related parameters, and shake-out related parameters. Mold related parameters have the greatest influence on casting defects in the sand casting process. This paper reviews castings produced by a foundry in which shrinkage and blow holes were the major defects; the analysis identified that mold related parameters such as mold temperature, pouring temperature, and runner size were not properly set in the sand casting process. These parameters were optimized using different optimization techniques such as the Taguchi method, response surface methodology, genetic algorithms, and the teaching-learning-based optimization (TLBO) algorithm. It is concluded that the teaching-learning-based optimization algorithm gives better results than the other optimization techniques.
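
As a rough illustration of the TLBO algorithm highlighted above, the sketch below minimizes a placeholder defect-cost function over mold temperature, pouring temperature, and runner size. The bounds, ideal setting, and cost function are invented for illustration only; they are not the parameters or data of the reviewed studies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder cost: pretend the defect rate grows with distance from an (unknown) ideal setting.
ideal = np.array([210.0, 1400.0, 25.0])              # hypothetical mold temp, pour temp, runner size
def defect_cost(x):
    return float(np.sum(((x - ideal) / ideal) ** 2))

lower = np.array([150.0, 1300.0, 10.0])
upper = np.array([300.0, 1500.0, 40.0])
pop = lower + rng.random((20, 3)) * (upper - lower)   # 20 learners, 3 process parameters
cost = np.array([defect_cost(x) for x in pop])

for _ in range(100):
    teacher = pop[np.argmin(cost)]
    mean = pop.mean(axis=0)
    for i in range(len(pop)):
        # Teacher phase: move the learner toward the teacher, away from the class mean.
        tf = rng.integers(1, 3)                        # teaching factor, 1 or 2
        cand = np.clip(pop[i] + rng.random(3) * (teacher - tf * mean), lower, upper)
        if defect_cost(cand) < cost[i]:
            pop[i], cost[i] = cand, defect_cost(cand)
        # Learner phase: interact with a random peer.
        j = rng.integers(len(pop))
        if j == i:
            continue
        direction = pop[j] - pop[i] if cost[j] < cost[i] else pop[i] - pop[j]
        cand = np.clip(pop[i] + rng.random(3) * direction, lower, upper)
        if defect_cost(cand) < cost[i]:
            pop[i], cost[i] = cand, defect_cost(cand)

print("best setting:", pop[np.argmin(cost)], "cost:", cost.min())
```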

Keywords: casting defects, genetic algorithm, parametric optimization, Taguchi method, TLBO algorithm

Procedia PDF Downloads 700
809 Design and Implementation of a Platform for Adaptive Online Learning Based on Fuzzy Logic

Authors: Budoor Al Abid

Abstract:

Educational systems are increasingly provided as open online services, providing guidance and support for individual learners. To adapt such learning systems, a proper evaluation must be made. This paper builds an evaluation model, the Fuzzy C-Means Adaptive System (FCMAS), based on data mining techniques to assess the difficulty of questions. The following steps are implemented. First, a dataset from an online international learning system (slepemapy.cz) is used; the dataset contains over 1,300,000 records with 9 features covering students, questions, and answers, together with feedback evaluation. Next, normalization is applied as a preprocessing step. Then, the fuzzy c-means (FCM) clustering algorithm is used to adapt the difficulty of the questions. The result is data labeled into three clusters according to the highest membership weight (easy, intermediate, difficult); the FCM algorithm assigns a label to every question. A Random Forest (RF) classifier is then constructed on the clustered dataset, using 70% of the data for training and 30% for testing, and achieves a 99.9% accuracy rate. This approach improves the adaptive e-learning system because it depends on student behavior and gives more accurate results in the evaluation process than an evaluation system that depends on feedback only.
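
The clustering-then-classification pipeline can be sketched as follows. This is a minimal illustration on synthetic data, with a hand-rolled fuzzy c-means and a scikit-learn random forest; the feature values and cluster count are assumptions, and nothing here is the authors' code or dataset, only the 3-cluster labeling and 70/30 split mirror the description above.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

def fuzzy_c_means(X, c=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(max_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / dist ** (2.0 / (m - 1.0))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U

# Synthetic question features (e.g., response time, error rate) standing in for the real data.
rng = np.random.default_rng(1)
X = rng.random((1000, 2))

_, U = fuzzy_c_means(X, c=3)
labels = U.argmax(axis=1)              # easy / intermediate / difficult by highest membership

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=1)
rf = RandomForestClassifier(n_estimators=100, random_state=1).fit(X_train, y_train)
print("test accuracy:", accuracy_score(y_test, rf.predict(X_test)))
```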

Keywords: machine learning, adaptive, fuzzy logic, data mining

Procedia PDF Downloads 163
808 Nilsson Model Performance in Estimating Bed Load Sediment, Case Study: Tale Zang Station

Authors: Nader Parsazadeh

Abstract:

The variety of bed sediment load relationships, insufficient information and data, and the influence of river conditions make selecting an optimum relationship for a given river extremely difficult. Hence, in order to select the best formula, the bed load equations should be evaluated, the affecting factors need to be scrutinized, and the equations should be verified; re-evaluation may also be needed. In this research, the bed sediment load of the Dez Dam at Tal-e Zang Station has been studied. After reviewing the available references, the most common formulae were selected, including Meyer-Peter and Müller, and MS Excel was used to compute and evaluate the data. Then, 52 series of data already measured at the station were re-analyzed, and the sediment bed load was determined. (1) The calculated bed load obtained by the different equations showed a great difference from the measured data. (2) The percentage of predictions with a discrepancy ratio r between 0.5 and 2.00 was 0% for all equations except the Nilsson and Shields equations, for which it was 61.5% and 59.6%, respectively. (3) By reviewing the results and discarding probably erroneous measurements (by human or machine), one may use the Nilsson equation, due to its r value higher than 1, as an effective equation for estimating the bed load at Tal-e Zang Station, in order to support activities that depend on an estimate of the bed sediment load. Also, since only a few studies have been conducted so far, these results may be of assistance to operators and consulting companies.
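
The score reported in point (2) is commonly computed as the fraction of samples whose discrepancy ratio r (calculated load divided by measured load) falls between 0.5 and 2. A minimal sketch of that computation is shown below; the numbers are placeholders, not the station's data.

```python
import numpy as np

# Placeholder values: measured bed load and the load calculated by one formula (same units).
measured   = np.array([12.0, 30.0, 8.0, 55.0, 20.0])
calculated = np.array([10.0, 70.0, 9.0, 20.0, 25.0])

r = calculated / measured                       # discrepancy ratio per sample
within_band = (r >= 0.5) & (r <= 2.0)           # predictions within a factor of two
score = 100.0 * within_band.mean()

print("discrepancy ratios:", np.round(r, 2))
print(f"share within 0.5-2.0 band: {score:.1f}%")
```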

Keywords: bed load, empirical relationship, sediment, Tale Zang Station

Procedia PDF Downloads 342
807 FLIME - Fast Low Light Image Enhancement for Real-Time Video

Authors: Vinay P., Srinivas K. S.

Abstract:

Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. The algorithm should take considerably less than 41 milliseconds in order to process a real-time video feed at 24 frames per second, and even less for a video at 30 or 60 frames per second. The paper presents a fast and efficient solution with two main advantages: it has the potential to be used on a real-time video feed, and it can be used in low compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. A custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the enhanced images produced by different methods is shown.
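
A plausible numpy-only rendering of such a three-step pipeline is sketched below: a gamma curve as the RGB mapping function, a gray-world color balance, and a percentile contrast stretch. The specific functions (gamma, gray-world, 1st/99th percentile stretch) are assumptions chosen for illustration; the paper's actual mapping function and parameters may differ.

```python
import numpy as np

def enhance_low_light(img, gamma=0.5):
    """img: HxWx3 uint8 RGB frame. Returns an enhanced uint8 frame."""
    x = img.astype(np.float32) / 255.0

    # Step 1: simple per-pixel mapping of input RGB to output RGB (here a gamma curve).
    x = x ** gamma

    # Step 2: gray-world color balance - scale each channel so the channel means are equal.
    means = x.reshape(-1, 3).mean(axis=0) + 1e-6
    x = np.clip(x * (means.mean() / means), 0.0, 1.0)

    # Step 3: contrast adjustment by stretching between the 1st and 99th percentiles.
    lo, hi = np.percentile(x, (1, 99))
    x = np.clip((x - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

    return (x * 255.0).astype(np.uint8)

if __name__ == "__main__":
    frame = (np.random.rand(480, 640, 3) * 40).astype(np.uint8)   # fake dark frame
    out = enhance_low_light(frame)
    print(out.shape, out.dtype, out.min(), out.max())
```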

Keywords: low light image enhancement, real-time video, computer vision, machine learning

Procedia PDF Downloads 165
806 Short-Term Operation Planning for Energy Management of Exhibition Hall

Authors: Yooncheol Lee, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

This paper deals with the establishment of a short-term operational plan for the air conditioning of an exhibition hall, for efficient energy management. The short-term operational plan is composed of a time series of operational schedules, which we search for using genetic algorithms. Establishing an operational schedule should take into account the future trends of the variables affecting the exhibition hall environment. To reflect continuously changing factors such as external temperature and occupancy, short-term operational plans should be updated in real time. However, it takes too much time to evaluate a short-term operational plan using EnergyPlus, a building simulation tool, so it is difficult to update the operational plan in real time. To evaluate the short-term operational plan quickly, we designed prediction models based on machine learning with fast evaluation speed. These models, created by learning from past operational data, are accurate and fast. The collection of operational data and the verification of operational plans were done using EnergyPlus. Experimental results show that the proposed method can save energy compared to the reactive control method.
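
The core idea, replacing the slow simulator with a learned surrogate inside the schedule search, can be roughly sketched as below. The surrogate model, the schedule encoding, and the fake energy data are all placeholders for illustration; in the real system the surrogate would be trained on logged EnergyPlus runs.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Stand-in for logged (schedule, energy) pairs that would come from EnergyPlus runs.
horizon = 24                                   # hourly setpoint offsets for one day
X_hist = rng.random((500, horizon))
y_hist = X_hist.sum(axis=1) + rng.normal(0, 0.1, 500)   # fake energy consumption

surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_hist, y_hist)

# Simple genetic search over candidate schedules, scored by the fast surrogate.
pop = rng.random((40, horizon))
for _ in range(50):
    cost = surrogate.predict(pop)
    parents = pop[np.argsort(cost)[:20]]                       # keep the better half
    children = (parents[rng.integers(20, size=20)] +
                parents[rng.integers(20, size=20)]) / 2.0      # crossover by averaging
    children += rng.normal(0, 0.05, children.shape)            # mutation
    pop = np.vstack([parents, np.clip(children, 0.0, 1.0)])

best = pop[np.argmin(surrogate.predict(pop))]
print("predicted energy of best schedule:", surrogate.predict(best[None, :])[0])
```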

Keywords: exhibition hall, energy management, predictive model, simulation-based optimization

Procedia PDF Downloads 305
805 Laying Hens' Feed Fortified with Pectin, Xanthan Gum and Guar Gum Aims to Reduce the Cholesterol in Muscle and Egg Yolk

Authors: Novia Dwi Prabandari, Diah Ayu Asmarani

Abstract:

Soluble fiber can accelerate the metabolism of cholesterol. Pectin and gums have been used as additives, stabilizers, and emulsifiers. Pectin supplementation in laying hens can reduce the cholesterol content in egg yolk and muscle. Therefore, this laying hens’ feed is a regular chicken feed enriched with soluble fiber (pectin, xanthan gum, and guar gum) to produce eggs and muscle with lower cholesterol than usual. The ingredients are mixed in the ratio of concentrate 45%, corn flour 25%, soybean meal 20%, and soluble fiber extract 10%. Once all the ingredients are mixed, the mixture is dried at a temperature below 80 °C. It is then put through the grinding machine, which produces a circular shape through holes 2-3 mm in diameter; after drying, the water content of the feed is less than 14%. Eggs from laying hens fed with the soluble fiber fortified feed will have lower cholesterol levels than eggs from regular feed. The same holds for the cholesterol content in the muscle, because chicken feed fortified with soluble fiber accelerates cholesterol metabolism and reduces cholesterol deposition in the chicken. This kind of laying hens’ feed produces eggs with high protein content that can be consumed more freely by people who have hypercholesterolemia.

Keywords: pectin, xanthan gum, guar gum, laying hen, cholesterol

Procedia PDF Downloads 406
804 Heavy Metal Contamination in Soils: Detection and Assessment Using Machine Learning Algorithms Based on Hyperspectral Images

Authors: Reem El Chakik

Abstract:

The levels of heavy metals in agricultural lands in Lebanon have been witnessing a noticeable increase in the past few years, due to increased anthropogenic pollution sources. Heavy metals pose a serious threat to the environment because they are non-biodegradable and persistent, accumulating to dangerous levels in the soil. Besides the traditional laboratory and chemical analysis methods, hyperspectral imaging (HSI) has proven its efficiency in the rapid detection of heavy metal contamination. In Lebanon, continuous environmental monitoring, including monitoring of the levels of heavy metals in agricultural soils, is lacking, due in part to the high cost of analysis. Hence, this proposed research aims to define the current national status of heavy metal contamination in agricultural soil and to evaluate the effectiveness of using HSI in the detection of heavy metals in contaminated agricultural fields. To achieve the two main objectives of this study, soil samples were collected from different areas throughout the country and were analyzed for heavy metals using atomic absorption spectrophotometry (AAS). The results were compared to those obtained with the HSI technique, applied using a HySpex SWIR-384 camera. The results showed that Lebanese agricultural soils contain high contamination levels of Zn, and that the more clayey the soil is, the lower its reflectance.

Keywords: agricultural soils in Lebanon, atomic absorption spectrophotometer, hyperspectral imaging, heavy metals contamination

Procedia PDF Downloads 78
803 Covid-19, Diagnosis with Computed Tomography and Artificial Intelligence, in a Few Simple Words

Authors: Angelis P. Barlampas

Abstract:

Target: SARS-CoV-2 is still a threat. AI software could be useful for categorizing the disease into different severities and indicating the extent of the lesions. Materials and methods: AI is a new, revolutionary technique that uses powerful computerized systems to do what a human being does, but more rapidly and more easily, as accurately and diagnostically safely as the original medical report and, in certain circumstances, even better, saving time and helping the health system to overcome problems such as work overload and human fatigue. Results: An effort is made to describe to the inexperienced reader (see figures), as simply as possible, how an artificial intelligence system diagnoses computed tomography pictures. First, the computerized machine learns the physiologic patterns of lung parenchyma by being fed with images of normally structured lung tissue. Having learned to recognize normal structures, it can then easily identify the pathologic ones, as their images do not fit the known normal patterns. It is the same as when someone spends their free time reading magazines with quizzes, such as <> and <>. General conclusion: AI mimics the physiological processes of the human mind, but it does so more efficiently and rapidly and provides results in a few seconds, whereas an experienced radiologist needs many days to do that or, even worse, is unable to accomplish such a huge task.

Keywords: covid-19, artificial intelligence, automated imaging, CT, chest imaging

Procedia PDF Downloads 31
802 Inter Laboratory Comparison with Coordinate Measuring Machine and Uncertainty Analysis

Authors: Tugrul Torun, Ihsan A. Yuksel, Sinem On Aktan, Taha K. Veziroglu

Abstract:

In the quality control processes of some industries, the usage of CMMs has increased in recent years. Consequently, CMMs play an important role in the acceptance or rejection of manufactured parts, and it is important to be able to make decisions by performing fast measurements. According to the related technical drawing and its tolerances, measurement uncertainty should also be considered during assessment. Since uncertainty calculation is difficult and time-consuming, most companies ignore the uncertainty value in their routine inspection method. Although studies on measurement uncertainty for CMMs have been carried out in recent years, there is still no generally applicable method for analyzing task-specific measurement uncertainty. There is a standard series for calculating measurement uncertainty (ISO 15530), but it is not practical for routine industrial measurement. In this study, an inter-laboratory comparison test has been carried out at ROKETSAN A.Ş. with all dimensional inspection units. The reference part used is traceable to the national metrology institute TÜBİTAK UME. Each unit measured the reference part according to the related technical drawings, and the task-specific measurement uncertainty was calculated with the related parameters. From the measurement results and uncertainty values, the En values were calculated.
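
For reference, the En (normalized error) number used to judge a laboratory's result against the reference value is conventionally computed as En = (x_lab − x_ref) / sqrt(U_lab² + U_ref²), with |En| ≤ 1 taken as satisfactory. A minimal sketch with placeholder values:

```python
import math

def en_number(x_lab, x_ref, U_lab, U_ref):
    """Normalized error: expanded uncertainties U are combined in quadrature."""
    return (x_lab - x_ref) / math.sqrt(U_lab**2 + U_ref**2)

# Placeholder example: one unit's diameter result vs. the reference value (mm).
En = en_number(x_lab=25.012, x_ref=25.008, U_lab=0.006, U_ref=0.004)
print(f"En = {En:.2f} ->", "satisfactory" if abs(En) <= 1 else "unsatisfactory")
```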

Keywords: coordinate measurement, CMM, comparison, uncertainty

Procedia PDF Downloads 173
801 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework

Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe

Abstract:

This paper focuses on a cost-effective storage architecture using a fog and cloud data storage gateway, and presents the design of a framework for the data privacy model and a data analytics framework for real-time analysis using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that when two or more data privacy models are combined, we tend to obtain stronger privacy for our data. The fog storage gateway has several advantages over traditional cloud storage: our results show that fog has reduced latency/delay, lower bandwidth consumption, and lower energy usage when compared with cloud storage; therefore, fog storage will help to lessen excessive cost. The paper dwells mainly on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. The paper also shows the major system components and their framework specifications. Lastly, the overall research system architecture is shown, including its structure and interrelationships.

Keywords: IoT, fog, cloud, data analysis, data privacy

Procedia PDF Downloads 64
800 Detection of Cyberattacks on the Metaverse Based on First-Order Logic

Authors: Sulaiman Al Amro

Abstract:

There are currently considerable challenges concerning data security and privacy, particularly in relation to modern technologies. This includes the virtual world known as the Metaverse, which consists of a virtual space that integrates various technologies and is therefore susceptible to cyber threats such as malware, phishing, and identity theft. This has led recent studies to propose the development of Metaverse forensic frameworks and the integration of advanced technologies, including machine learning, for intrusion detection and security. In this context, the application of first-order logic offers a formal and systematic approach to defining the conditions of cyberattacks, thereby contributing to the development of effective detection mechanisms. In addition, formalizing the rules and patterns of cyber threats has the potential to enhance the overall security posture of the Metaverse and, thus, the integrity and safety of this virtual environment. The current paper focuses on the primary actions employed by avatars in potential attacks, using Interval Temporal Logic (ITL) and behavior-based detection to detect an avatar’s abnormal activities within the Metaverse. The research established that the proposed framework attained an accuracy of 92.307%, with the experimental results demonstrating the efficacy of ITL, including its superior performance in addressing the threats posed by avatars within the Metaverse domain.
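
To make the logic-based idea concrete, the sketch below encodes one hypothetical behavior rule, "an avatar that performs more than N restricted actions within a time window is flagged", as a simple predicate evaluated over an event log. The rule, thresholds, and event format are invented for illustration and are not the paper's ITL formulas.

```python
from collections import deque

# Hypothetical event log: (timestamp_seconds, avatar_id, action)
events = [
    (1, "av1", "enter_private_space"),
    (2, "av1", "clone_asset"),
    (3, "av1", "clone_asset"),
    (4, "av2", "chat"),
    (5, "av1", "clone_asset"),
]

RESTRICTED = {"clone_asset", "enter_private_space"}

def flag_abnormal(events, window=10, threshold=3):
    """Flag avatars with more than `threshold` restricted actions inside any `window` seconds."""
    recent, flagged = {}, set()
    for t, avatar, action in sorted(events):
        if action not in RESTRICTED:
            continue
        q = recent.setdefault(avatar, deque())
        q.append(t)
        while q and t - q[0] > window:
            q.popleft()
        if len(q) > threshold:
            flagged.add(avatar)
    return flagged

print(flag_abnormal(events))   # expected: {'av1'}
```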

Keywords: security, privacy, metaverse, cyberattacks, detection, first-order logic

Procedia PDF Downloads 10
799 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics

Authors: Mikheil Kalmakhelidze

Abstract:

The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem, deterministic and probabilistic. Using a simple Bayes classifier in many cases produces useful results, but sometimes the results turn out to be poor. In recent years, several successful approaches to specific RL problems have been made with neural network algorithms, including the single-layer perceptron, the multilayer back-propagation network, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a,b) depends on the truth value of the corresponding formula A(a,b) in a canonical FDL model. As a main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10,000 entries and also compare it to classical RL solving approaches. The results are more accurate than the standard probabilistic approach.
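
Independently of the fuzzy description logic modeling, the neural-network side of record linkage can be illustrated as below: each candidate pair (a, b) is turned into a vector of similarity scores, and a small network predicts match or non-match. The features, the synthetic pairs, and the MLP configuration are assumptions for illustration, not the paper's model.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic pair features: [name similarity, birth-date match, address similarity] in [0, 1].
n = 2000
X = rng.random((n, 3))
# Hypothetical ground truth: pairs with high overall similarity are true matches.
y = (X.mean(axis=1) + rng.normal(0, 0.05, n) > 0.7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(X_train, y_train)

print("pair-classification accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```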

Keywords: description logic, fuzzy logic, neural networks, record linkage

Procedia PDF Downloads 245
798 Robot Navigation and Localization Based on the Rat’s Brain Signals

Authors: Endri Rama, Genci Capi, Shigenori Kawahara

Abstract:

The ability of a mobile robot to navigate autonomously in its environment is very important. Despite the advances in technology, robot self-localization and goal-directed navigation in complex environments are still challenging tasks. In this article, we propose a novel method for robot navigation based on rat brain signals (local field potentials). It is well known that rats accurately and rapidly navigate in a complex space by localizing themselves in reference to the surrounding environmental cues. As the first step toward incorporating the rat’s navigation strategy into robot control, we analyzed the rats’ strategies while they navigated a multiple Y-maze and recorded local field potentials (LFPs) simultaneously from three brain regions. Next, we processed the LFPs, and the extracted features were used as input to an artificial neural network to predict the rat’s next location, especially at the decision-making moment in Y-junctions. We developed an algorithm by which the robot learned to imitate the rat’s decision-making by mapping the rat’s brain signals onto its own actions. Finally, the robot learned to integrate its internal states as well as external sensors in order to localize and navigate in the complex environment.

Keywords: brain-machine interface, decision-making, mobile robot, neural network

Procedia PDF Downloads 274
797 Analysis of Noodle Production Process at Yan Hu Food Manufacturing: Basis for Production Improvement

Authors: Rhadinia Tayag-Relanes, Felina C. Young

Abstract:

This study was conducted to analyze the noodle production process at Yan Hu Food Manufacturing as a basis for production improvement. The study utilized the PDCA approach and record review to gather data for August to October of calendar year 2019 on the noodle products miki, canton, and misua. Causal-comparative research was used in this study; it attempts to establish cause-effect relationships among the variables. Descriptive statistics and correlation were both used to analyze the data gathered. The study found that miki, canton, and misua production have different cycle times for each production set, different production outputs in every set of the production process, and different amounts of wastage. The company has not yet established its allowable rejection/wastage rate; instead, this paper used a 1% wastage limit. The researchers recommend the following: the machines used for each process of the noodle products must be consistently maintained and monitored; all production operators should be assessed by statistically checking their performance based on output and machine performance; a root cause analysis should be conducted to find solutions; and the recording system for the input and output of the noodle production process should be improved to eliminate poor recording of data.

Keywords: production, continuous improvement, process, operations, PDCA

Procedia PDF Downloads 22
796 Advancement of Computer Science Research in Nigeria: A Bibliometric Analysis of the Past Three Decades

Authors: Temidayo O. Omotehinwa, David O. Oyewola, Friday J. Agbo

Abstract:

This study aims to gather a proper perspective of the development landscape of Computer Science research in Nigeria. Therefore, a bibliometric analysis of 4,333 bibliographic records of Computer Science research in Nigeria in the last 31 years (1991-2021) was carried out. The bibliographic data were extracted from the Scopus database and analyzed using VOSviewer and the bibliometrix R package through the biblioshiny web interface. The findings of this study revealed that Computer Science research in Nigeria has a growth rate of 24.19%. The most developed and well-studied research areas in the Computer Science field in Nigeria are machine learning, data mining, and deep learning. The social structure analysis result revealed that there is a need for improved international collaborations. Sparsely established collaborations are largely influenced by geographic proximity. The funding analysis result showed that Computer Science research in Nigeria is under-funded. The findings of this study will be useful for researchers conducting Computer Science related research. Experts can gain insights into how to develop a strategic framework that will advance the field in a more impactful manner. Government agencies and policymakers can also utilize the outcome of this research to develop strategies for improved funding for Computer Science research.

Keywords: bibliometric analysis, biblioshiny, computer science, Nigeria, science mapping

Procedia PDF Downloads 77
795 Building Safety Through Real-time Design Fire Protection Systems

Authors: Mohsin Ali Shaikh, Song Weiguo, Muhammad Kashan Surahio, Usman Shahid, Rehmat Karim

Abstract:

When the area of a structure that is threatened by a disaster affects personal safety, the effectiveness of disaster prevention, evacuation, and rescue operations can be summarized by three assessment indicators: personal safety, property preservation, and attribution of responsibility. These indicators are applicable regardless of the disaster that affects the building. People need to get out of the hazardous area and to a safe place as soon as possible because there's no other way to respond. The results of the tragedy are thus closely related to how quickly people are advised to evacuate and how quickly they are rescued. This study considers present fire prevention systems to address catastrophes and improve building safety. It proposes the methods of Prevention Level for Deployment in Advance and Spatial Transformation by Human-Machine Collaboration. We present and prototype a real-time fire protection system architecture for building disaster prevention, evacuation, and rescue operations. The design encourages the use of simulations to check the efficacy of evacuation, rescue, and disaster prevention procedures throughout the planning and design phase of the structure.

Keywords: prevention level, building information modeling, quality management system, simulated reality

Procedia PDF Downloads 24
794 Eco-Drive Predictive Analytics

Authors: Sharif Muddsair, Eisels Martin, Giesbrecht Eugenie

Abstract:

As society develops, the demand for the movement of people also increases gradually. The various modes of transport have impacts to different extents, depending mainly on technical and operating conditions. Up-to-date telematics systems offer the transport industry a revolutionary opportunity: appropriate use of these systems can help to substantially improve efficiency. Vehicle monitoring and fleet tracking are among the services used for improving the efficiency and effectiveness of utility vehicles. There are many telematics systems which may contribute to eco-driving. Generally, they can be grouped according to their role in the driving cycle: before driving - eco-route selection; while driving - advanced driver assistance; after driving - remote analysis. Our point of interest lies in the third group [after driving - remote analysis]. Telematics systems (TS) make it possible to record driving patterns in real time and analyze the data later on, so that driver-classification-specific hints (fast driver, slow driver, aggressive driver, ...) are given to promote an eco-friendly driving style. Together with the growing number of vehicles and the development of information technology, telematics has become an ‘active’ research subject in IT and the car industry. Telematics has gone a long way from providing navigation solutions and assisting the driver to becoming an integral part of the vehicle. Today’s telematics ensures the safety, comfort, and convenience of the driver.

Keywords: internet of things, iot, connected vehicle, cv, ts, telematics services, ml, machine learning

Procedia PDF Downloads 274
793 Image Inpainting Model with Small-Sample Size Based on Generative Adversary Network and Genetic Algorithm

Authors: Jiawen Wang, Qijun Chen

Abstract:

The performance of most machine-learning methods for image inpainting depends on the quantity and quality of the training samples. However, it is very expensive or even impossible to obtain a great number of training samples in many scenarios. In this paper, an image inpainting model based on a generative adversarial network (GAN) is constructed for cases when the number of training samples is small. Firstly, a feature extraction network (F-net) is incorporated into the GAN to exploit the available information in the image to be inpainted. The weighted sum of the extracted feature and random noise acts as the input to the generative network (G-net). The proposed network can be trained well even when the sample size is very small. Secondly, in the completion phase for each damaged image, a genetic algorithm is designed to search for an optimized noise input for the G-net; based on this optimized input, the parameters of the G-net and F-net are further fine-tuned (once the completion of a given damaged image ends, the parameters are restored to the original values obtained in the training phase) to generate an image patch that not only fills the missing part of the damaged image smoothly but also has visual semantics.
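
The per-image completion step, a genetic search over the noise input of a fixed generator, can be sketched as follows. The generator here is a stand-in function and the fitness is a simple reconstruction error on the known pixels; the actual G-net, F-net, and fine-tuning step from the paper are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

LATENT = 16
def generator(z):
    """Placeholder for the trained G-net: maps a latent vector to an 8x8 image patch."""
    W = np.sin(np.outer(np.arange(64), np.arange(LATENT)))        # fixed fake weights
    return (W @ z).reshape(8, 8)

damaged = generator(rng.normal(size=LATENT))       # pretend this is the ground-truth patch
mask = rng.random((8, 8)) > 0.4                    # True where pixels are known (not missing)

def fitness(z):
    # Reconstruction error on the *known* pixels only; missing pixels are filled by the generator.
    return np.mean((generator(z)[mask] - damaged[mask]) ** 2)

pop = rng.normal(size=(30, LATENT))
for _ in range(200):
    scores = np.array([fitness(z) for z in pop])
    parents = pop[np.argsort(scores)[:10]]                         # select the fittest
    children = parents[rng.integers(10, size=20)].copy()
    children += rng.normal(0, 0.1, children.shape)                 # mutate
    pop = np.vstack([parents, children])

best = pop[np.argmin([fitness(z) for z in pop])]
print("best reconstruction error on known pixels:", fitness(best))
```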

Keywords: image inpainting, generative adversary nets, genetic algorithm, small-sample size

Procedia PDF Downloads 100
792 Simulation of Particle Damping in Boring Tool Using Combined Particles

Authors: S. Chockalingam, U. Natarajan, D. M. Santhoshsarang

Abstract:

Particle damping is a more promising vibration-attenuating technique for boring tools than other types of damping, with minimal effect on the strength, rigidity, and stiffness ratio of the machine tool structure. Due to the cantilever nature of the boring tool holder in operation, it suffers chatter when the slenderness ratio of the tool is increased. In this study, copper-stainless steel (SS) particles were packed inside the boring tool to act as a damper. The damper suppresses chatter generated during machining and also improves the machining efficiency of the tool at a higher slenderness ratio. In the first approach to particle damping, combined Cu-SS particles were packed inside the vibrating tool, whereas copper and stainless steel particles were selected separately and packed inside another tool, and their effectiveness was analyzed in this simulation. This study evaluates, through finite element simulation, the efficiency of boring tools equipped with particles such as copper, stainless steel, and a combination of both. The newly modified boring tool holder with particle damping was simulated using ANSYS 12.0, with and without particles. The aim of this study is to enhance the structural rigidity through particle damping, thus avoiding the occurrence of resonance in the boring tool during machining.

Keywords: boring bar, copper-stainless steel, chatter, particle damping

Procedia PDF Downloads 424
791 Supervisor Controller-Based Colored Petri Nets for Deadlock Control and Machine Failures in Automated Manufacturing Systems

Authors: Husam Kaid, Abdulrahman Al-Ahmari, Zhiwu Li

Abstract:

This paper develops a robust deadlock control technique for shared and unreliable resources in automated manufacturing systems (AMSs) based on structural analysis and colored Petri nets, consisting of three steps. The first step involves using strict minimal siphon control to create a live (deadlock-free) system that does not consider resource failure. The second step uses an approach based on colored Petri nets, in which all monitors designed in the first step are merged into a single monitor. The third step addresses the deadlock control problems caused by resource failures. For all resource failures in the Petri net model, a common recovery subnet based on colored Petri nets is proposed. The common recovery subnet is added to the system obtained in the second step to make the system reliable. The proposed approach is evaluated using an AMS from the literature. The results show that the proposed approach can be applied to an unreliable complex Petri net model, has a simpler structure and less computational complexity, and can obtain one common recovery subnet to model all resource failures.

Keywords: automated manufacturing system, colored Petri net, deadlocks, siphon

Procedia PDF Downloads 101
790 Exploratory Study of the Influencing Factors for Hotels' Competitors

Authors: Asma Ameur, Dhafer Malouche

Abstract:

Research on hotel competitiveness is an essential phase of the marketing strategy for any hotel. Knowing a hotel's competitors helps the hotelier grasp its position in the market and helps customers make the right choice when picking a hotel. Thus, competitiveness is an important indicator that can be influenced by various factors. In fact, competitiveness, the ability to cope with competition, remains a difficult and complex concept to define and to exploit. The purpose of this article is therefore to carry out an exploratory study to calculate a competitiveness indicator for hotels. Further, this paper makes it possible to determine the criteria with a direct or indirect effect on the image and perception of a hotel. The present research looks into the right model for hotel competitiveness. For this reason, we exploit different theoretical contributions in the field of machine learning. We use statistical techniques such as principal component analysis (PCA) to reduce the dimensions, as well as other statistical modeling techniques. This paper presents a survey covering the techniques and methods used in hotel competitiveness research. Furthermore, this study allows us to deduce the significant variables that influence the determination of a hotel’s competitors. Lastly, the experiments discussed in this article show that hotel competitors are influenced by several factors to different degrees.
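
As a small illustration of the dimensionality-reduction step mentioned above, the sketch below standardizes a set of hypothetical hotel indicators (price, rating, review count, ...) and applies PCA with scikit-learn; the variables and data are invented, and the actual indicator construction in the study may differ.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical indicators per hotel: price, star rating, review score, review count, occupancy.
X = np.column_stack([
    rng.normal(120, 40, 200),     # average nightly price
    rng.integers(1, 6, 200),      # star rating
    rng.normal(7.5, 1.2, 200),    # online review score
    rng.poisson(300, 200),        # number of reviews
    rng.uniform(0.4, 0.95, 200),  # occupancy rate
])

X_std = StandardScaler().fit_transform(X)          # put all indicators on the same scale
pca = PCA(n_components=2).fit(X_std)
scores = pca.transform(X_std)                      # each hotel summarized by two components

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("first hotel's component scores:", np.round(scores[0], 3))
```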

Keywords: competitiveness, e-reputation, hotels' competitors, online hotel’ review, principal component analysis, statistical modeling

Procedia PDF Downloads 85
789 Violence Detection and Tracking on Moving Surveillance Video Using Machine Learning Approach

Authors: Abe Degale D., Cheng Jian

Abstract:

Violent action recognition is crucial when creating automated video surveillance systems. In recent years, hand-crafted feature detectors have been the primary method for achieving violence detection, such as the recognition of fighting activity. Researchers have also looked into learning-based representational models. On benchmark datasets created especially for the detection of violent sequences in sports and movies, these methods produced good accuracy results. However, the Hockey dataset's videos with surveillance camera motion present challenges for these algorithms in learning discriminating features. Deep representation-based methods have shown success in image recognition and human activity detection challenges. For the purpose of detecting violent images and identifying aggressive human behaviours, this research proposes a deep representation-based model using the transfer learning idea. The results show that the suggested approach outperforms state-of-the-art accuracy levels by learning the most discriminating features, attaining 99.34% and 99.98% accuracy on the Hockey and Movies datasets, respectively.
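
A common way to set up the kind of transfer learning named in the keywords is to start from a torchvision Faster R-CNN pretrained on COCO and replace its box-prediction head for the new classes (here, background plus a hypothetical "violent person" class). The sketch below follows that standard torchvision pattern; the class count and training details are assumptions, not the authors' exact configuration.

```python
import torch
import torchvision
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor

# Load a Faster R-CNN pretrained on COCO and swap its head for a 2-class problem.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
in_features = model.roi_heads.box_predictor.cls_score.in_features
model.roi_heads.box_predictor = FastRCNNPredictor(in_features, num_classes=2)

# Fine-tuning: the backbone keeps its pretrained weights, the new head is trained.
params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(params, lr=0.005, momentum=0.9, weight_decay=0.0005)

# One dummy training step to show the expected input/target format.
model.train()
images = [torch.rand(3, 480, 640)]
targets = [{"boxes": torch.tensor([[50.0, 60.0, 200.0, 300.0]]),
            "labels": torch.tensor([1])}]
loss_dict = model(images, targets)
loss = sum(loss_dict.values())
optimizer.zero_grad()
loss.backward()
optimizer.step()
print({k: round(v.item(), 3) for k, v in loss_dict.items()})
```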

Keywords: violence detection, faster R-CNN, transfer learning, surveillance video

Procedia PDF Downloads 64
788 Data-Driven Approach to Predict Inpatient's Estimated Discharge Date

Authors: Ayliana Dharmawan, Heng Yong Sheng, Zhang Xiaojin, Tan Thai Lian

Abstract:

To facilitate discharge planning, doctors are presently required to assign an Estimated Discharge Date (EDD) for each patient admitted to the hospital. This assignment of the EDD is largely based on the doctor’s judgment, which can be difficult for cases that are complex or relatively new to the doctor. It is hypothesized that a data-driven approach would help doctors make accurate estimations of the discharge date. Making use of routinely collected data on inpatient discharges between January 2013 and May 2016, a predictive model was developed using machine learning techniques to predict the length of stay (and hence the EDD) of inpatients at the point of admission. The predictive performance of the model was compared to that of the clinicians using accuracy measures. Overall, the best performing model was able to predict the EDD with a 38% improvement in Average Squared Error (ASE) compared to the first EDD determined by the present method. Important predictors of the EDD include the provisional diagnosis code, the patient’s age, the attending doctor at admission, the medical specialty at admission, the accommodation type, and the patient's mean length of stay in the past year. The predictive model can be used as a tool to accurately predict the EDD.
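
The prediction target and comparison metric can be illustrated with a brief sketch: a regressor is trained on admission-time features to predict length of stay, the EDD is the admission date plus the prediction, and ASE (mean squared error) is compared against a baseline. The features and data below are synthetic placeholders, not the hospital's variables.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Synthetic admission-time features: age, specialty code, accommodation type, prior mean LOS.
n = 3000
X = np.column_stack([
    rng.integers(18, 95, n),
    rng.integers(0, 12, n),
    rng.integers(0, 3, n),
    rng.gamma(2.0, 2.0, n),
])
los_days = 2 + 0.05 * X[:, 0] + 1.5 * X[:, 3] + rng.gamma(2.0, 1.0, n)   # fake length of stay

X_tr, X_te, y_tr, y_te = train_test_split(X, los_days, test_size=0.3, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

pred = model.predict(X_te)
baseline = np.full_like(y_te, y_tr.mean())          # stand-in for the first EDD baseline
print("model ASE:   ", round(mean_squared_error(y_te, pred), 2))
print("baseline ASE:", round(mean_squared_error(y_te, baseline), 2))
# The EDD for a new patient would be: admission_date + timedelta(days=round(model.predict(x)[0]))
```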

Keywords: inpatient, estimated discharge date, EDD, prediction, data-driven

Procedia PDF Downloads 142
787 Control Flow around NACA 4415 Airfoil Using Slot and Injection

Authors: Imine Zakaria, Meftah Sidi Mohamed El Amine

Abstract:

One of the most vital aerodynamic organs of a flying machine is the wing, which allows it to fly efficiently. The flow around the wing is very sensitive to changes in the angle of attack. Beyond a certain value, the boundary layer separates from the upper surface, which causes instability and a total degradation of aerodynamic performance called stall. Controlling the flow around an airfoil has therefore become a research concern in the aeronautics field. There are two techniques for controlling the flow around a wing to improve its aerodynamic performance: passive and active control. Blowing and suction are among the active techniques that control boundary layer separation around an airfoil. Their objective is to give energy to the air particles in the boundary layer separation zones and to create vortex structures that homogenize the velocity near the wall and allow control. Blowing and suction have long been used as flow control actuators around obstacles; in 1904, Prandtl applied permanent blowing to a cylinder to delay boundary layer separation. In the present study, several numerical investigations have been developed to predict the turbulent flow around an aerodynamic profile. A CFD code was used for several angles of attack in order to validate the present work against the literature in the case of a clean profile. The variation of the lift coefficient CL with the momentum coefficient

Keywords: CFD, control flow, lift, slot

Procedia PDF Downloads 155
786 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetic

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase relation values of each attribute were used to analyze and choose the attributes that might influence the examiner's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
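
The comparison protocol described, several classifiers evaluated by cross-validation on accuracy, precision, recall, F1, and AUC-ROC, can be sketched with scikit-learn as below. The three example models and the synthetic data are placeholders; the study itself compares eight algorithms on the Kaggle dataset.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier

# Synthetic stand-in for the health-indicator dataset (binary diabetes label).
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.8, 0.2], random_state=0)

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
models = {
    "logreg": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "grad_boost": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    cv = cross_validate(model, X, y, cv=5, scoring=scoring)
    summary = {m: round(cv[f"test_{m}"].mean(), 3) for m in scoring}
    print(name, summary)
```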

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 39
785 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study presents a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. It can be affirmed that speech recognition facilitates its users and helps them perform their daily routine tasks in a more convenient and effective manner. This research presents an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the key challenge affecting speech recognition. In order to overcome these issues, different statistical models have been developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is the most efficient and reliable method used in speech recognition.

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance

Procedia PDF Downloads 445
784 The Reflection Framework to Enhance the User Experience for Cultural Heritage Spaces’ Websites in Post-Pandemic Times

Authors: Duyen Lam, Thuong Hoang, Atul Sajjanhar, Feifei Chen

Abstract:

With emerging interactive technology applications helping users connect with cultural artefacts in new ways, the cultural heritage sector gains significantly. The issues of interactive apps can be tested via several techniques, including usability surveys and usability evaluations. The most severe usability problems for museums' interactive technologies commonly involve interaction, control, and navigation processes. This study confirms the low immersive quality of audio guides for navigating exhibitions and for engaging visitors in the virtual environment, which are the most vital features of new interactive technologies such as AR and VR. In addition, our usability surveys and heuristic evaluations disclosed many usability issues of these interactive technologies relating to interaction functions. Additionally, we used the Wayback Machine to examine what interactive apps and technologies were deployed on these websites while physical visits were limited due to the COVID-19 pandemic lockdowns. Based on those inputs, we propose a reflection framework to enhance the UX in the cultural heritage domain, with detailed guidelines.

Keywords: framework, user experience, cultural heritage, interactive technology, museum, COVID-19 pandemic, usability survey, heuristic evaluation, guidelines

Procedia PDF Downloads 23