Search results for: S/R machine
612 Design of SAE J2716 Single Edge Nibble Transmission Digital Sensor Interface for Automotive Applications
Authors: Jongbae Lee, Seongsoo Lee
Abstract:
Modern sensors often embed a small digital controller for sensor control, value calibration, and signal processing. These sensors require digital data communication with host microprocessors, but conventional digital communication protocols are too costly for low-price sensors. The SAE J2716 SENT (single edge nibble transmission) protocol transmits digital waveforms directly instead of complicated analog modulated signals. In this paper, a SENT interface is designed in Verilog HDL (hardware description language) and implemented on an FPGA (field-programmable gate array) evaluation board. The designed SENT interface consists of a frame encoder/decoder, configuration register, tick period generator, CRC (cyclic redundancy check) generator/checker, and TX/RX (transmission/reception) buffer. The frame encoder/decoder is implemented as a finite state machine and controls the whole SENT interface. The configuration register contains various parameters such as operation mode, tick length, CRC option, pause pulse option, and number of data nibbles. The tick period generator generates tick signals from the input clock. The CRC generator/checker generates or checks the CRC in the SENT data frame. The TX/RX buffer stores transmitted/received data. The designed SENT interface can send or receive digital data at 25~65 kbps with a 3 us tick. Synthesized in a 0.18 um fabrication technology, it occupies about 2,500 gates.
Keywords: digital sensor interface, SAE J2716, SENT, Verilog HDL
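The CRC generator/checker described above can be prototyped in software before committing it to Verilog. A minimal sketch, assuming the CRC-4 polynomial x^4 + x^3 + x^2 + 1 and seed 0b0101 commonly cited for SAE J2716 (verify against the standard before relying on these values):

```python
def crc4_sent(nibbles, seed=0b0101):
    """Bitwise CRC-4 over a sequence of 4-bit data nibbles.

    Assumes polynomial x^4 + x^3 + x^2 + 1 (feedback taps 0b1101) and
    seed 0b0101; real SENT frames also carry status and pause pulses
    not modeled here.
    """
    crc = seed
    for nib in nibbles:
        for i in (3, 2, 1, 0):                     # MSB-first bits of the nibble
            fb = ((crc >> 3) & 1) ^ ((nib >> i) & 1)
            crc = (crc << 1) & 0xF
            if fb:
                crc ^= 0b1101                      # taps x^3 + x^2 + 1
    return crc

def check_frame(data_nibbles, crc_nibble):
    """Receiver-side check: recompute and compare."""
    return crc4_sent(data_nibbles) == crc_nibble
```

In hardware, the same update reduces to a 4-bit shift register with XOR feedback taps, which is why the generator/checker costs only a handful of the interface's roughly 2,500 gates.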
Procedia PDF Downloads 305
611 Using Wearable Device with Neuron Network to Classify Severity of Sleep Disorder
Authors: Ru-Yin Yang, Chi Wu, Cheng-Yu Tsai, Yin-Tzu Lin, Wen-Te Liu
Abstract:
Background: Sleep breathing disorder (SDB) is a condition characterized by recurrent episodes of airway obstruction, leading to intermittent hypoxia and sleep fragmentation. However, the procedures for SDB severity examination remain complicated and costly. Objective: The objective of this study is to establish a simplified examination method for SDB using a respiratory impedance pattern sensor combined with signal processing and a machine learning model. Methodologies: We recorded heart rate variability by electrocardiogram and the respiratory pattern by impedance. After polysomnography (PSG) was performed, with SDB diagnosed by the apnea and hypopnea index (AHI), we calculated the episodes with absence of flow and the arousal index (AI) from the device record. Subjects were divided into training and testing groups. A neural network was used to establish a prediction model to classify the severity of SDB by the AI, episodes, and body profiles. The performance was evaluated by classification in the testing group compared with PSG. Results: In this study, we enrolled 66 subjects (male/female: 37/29; age: 49.9±13.2) with a diagnosis of SDB in a sleep center in Taipei City, Taiwan, from 2015 to 2016. The accuracy from the confusion matrix on the test group by the neural network is 71.94%. Conclusion: Based on the models, we established a prediction model for SDB by means of the wearable sensor. With more cases incoming and more training, this system may be used to rapidly and automatically screen the risk of SDB in the future.
Keywords: sleep breathing disorder, apnea and hypopnea index, body parameters, neural network
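The reported 71.94% accuracy comes from a confusion matrix on the test group. As a minimal, self-contained illustration (the severity labels and counts below are invented, not the study's data), accuracy can be recovered from such a matrix as follows:

```python
def confusion_matrix(y_true, y_pred, labels):
    """Rows are true classes, columns are predicted classes."""
    idx = {c: i for i, c in enumerate(labels)}
    m = [[0] * len(labels) for _ in labels]
    for t, p in zip(y_true, y_pred):
        m[idx[t]][idx[p]] += 1
    return m

def accuracy(m):
    """Fraction of samples on the diagonal (correct classifications)."""
    return sum(m[i][i] for i in range(len(m))) / sum(map(sum, m))

labels = ["mild", "moderate", "severe"]            # hypothetical severity classes
y_true = ["mild", "mild", "severe", "moderate"]    # invented example data
y_pred = ["mild", "severe", "severe", "moderate"]
acc = accuracy(confusion_matrix(y_true, y_pred, labels))
```

Beyond overall accuracy, the off-diagonal cells show which severity classes the network confuses, which matters for a screening tool.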
Procedia PDF Downloads 150
610 A Controlled Natural Language Assisted Approach for the Design and Automated Processing of Service Level Agreements
Authors: Christopher Schwarz, Katrin Riegler, Erwin Zinser
Abstract:
The management of outsourcing relationships between IT service providers and their customers proves to be a critical issue that has to be stipulated by means of Service Level Agreements (SLAs). Since service requirements differ from customer to customer, SLA content and language structures vary largely, standardized SLA templates may not be usable, and automated processing of SLA content is not possible. Hence, SLA management is usually a time-consuming and inefficient manual process. To overcome these challenges, this paper presents an innovative and ITIL V3-conformant approach for automated SLA design and management using controlled natural language in enterprise collaboration portals. The proposed novel concept is based on a self-developed controlled natural language that follows a subject-predicate-object approach to specify well-defined SLA content structures that act as templates for customized contracts and support automated SLA processing. The derived results eventually enable IT service providers to automate several SLA request, approval, and negotiation processes by means of workflows and business rules within an enterprise collaboration portal. The illustrated prototypical realization gives evidence of the practical relevance in service-oriented scenarios as well as the high flexibility and adaptability of the presented model. Thus, the prototype enables the automated creation of well-defined, customized SLA documents, providing a knowledge representation that is both human-understandable and machine-processable.
Keywords: automated processing, controlled natural language, knowledge representation, information technology outsourcing, service level management
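A subject-predicate-object controlled natural language is machine-processable precisely because every clause has a fixed structure. The sketch below parses one hypothetical SLA clause pattern; the grammar, field names, and example sentence are illustrative assumptions, not the authors' actual language:

```python
import re

# Hypothetical clause grammar: "<Subject> shall <verb phrase> <object>
# [within <value> <unit>]." -- illustrative only, not the authors' CNL.
RULE = re.compile(
    r"^(?P<subject>[A-Z][\w ]*?) (?P<predicate>shall [a-z ]+?) "
    r"(?P<object>[\w %]+?)(?: within (?P<value>\d+) (?P<unit>\w+))?\.$"
)

def parse_sla_clause(text):
    """Return the clause fields as a dict, or None if it is not valid CNL."""
    m = RULE.match(text)
    return m.groupdict() if m else None

clause = parse_sla_clause("The provider shall resolve incidents within 4 hours.")
```

Because each clause parses into a fixed set of fields, downstream workflow and business-rule engines can act on the extracted values directly, which is what makes the automated approval and negotiation processes possible.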
Procedia PDF Downloads 433
609 Status of Production, Distribution and Determinants of Biomass Briquette Acceptability in Kampala, Uganda
Authors: David B. Kisakye, Paul Mugabi
Abstract:
Biomass briquettes have been identified as a plausible and close alternative to commonly used energy fuels such as charcoal and firewood, whose prices are escalating due to the dwindling natural resource base. However, briquettes do not seem to be as popular as would be expected. This study assessed the production, distribution, and acceptability of briquettes in the Kampala district. A total of 60 respondents, 50 of whom were briquette users and 10 briquette producers, were sampled from five divisions of Kampala district to evaluate consumer acceptability and preference for briquette type and shape. Households and institutions were identified to be the major consumers of briquettes, while community-based organizations were the major distributors of briquettes. The Chi-square test of independence showed a significant association between briquette acceptability and the briquette attributes of substitutability and low cost (p < 0.05). The Kruskal-Wallis test showed that low-income class people preferred non-carbonized briquettes. Gender, marital status, and income level also caused variation in preference for spherical, stick, and honeycomb briquettes (p < 0.05). The major challenges faced by briquette users in Kampala were the production of a lot of ash, frequent crushing, and limited access to briquettes. The producers of briquettes were mainly challenged by regular machine breakdown, raw material scarcity, and poor carbonizing units. It was concluded that briquettes have a market and are generally accepted in Kampala. However, user preferences need to be taken into account by briquette producers, suitable cookstoves should be made available to users, and standards are needed to ensure the quality of briquettes.
Keywords: consumer acceptability, biomass residues, briquettes, briquette producers, distribution, fuel, marketability, wood fuel
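The chi-square test of independence used above can be reproduced from a contingency table of counts. A minimal sketch (the 2x2 table below is invented, not the study's survey data); the resulting statistic would then be compared against the chi-square critical value for the computed degrees of freedom:

```python
def chi_square_independence(table):
    """Chi-square statistic and degrees of freedom for an r x c count table."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    n = sum(row_tot)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_tot[i] * col_tot[j] / n      # expected count under independence
            chi2 += (obs - exp) ** 2 / exp
    dof = (len(table) - 1) * (len(col_tot) - 1)
    return chi2, dof

# Invented 2x2 table: rows = accepts/rejects briquettes,
# columns = perceives low cost / does not.
chi2, dof = chi_square_independence([[20, 10], [5, 25]])
```

For a 2x2 table dof = 1, and a statistic above the 5% critical value of about 3.84 would, as in the study, indicate a significant association.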
Procedia PDF Downloads 144
608 Theoretical-Experimental Investigations on Free Vibration of Glass Fiber/Polyester Composite Conical Shells Containing Fluid
Authors: Tran Ich Thinh, Nguyen Manh Cuong
Abstract:
Free vibrations of partially fluid-filled composite truncated conical shells are investigated using the Dynamic Stiffness Method (DSM), or Continuous Element Method (CEM), based on the First Order Shear Deformation Theory (FSDT) and non-viscous incompressible fluid equations. Numerical examples are given for analyzing the natural frequencies and harmonic responses of clamped-free conical shells partially and completely filled with fluid. To compare with the theoretical results, detailed experimental results have been obtained on the free vibration of clamped-free conical shells partially filled with water by using a multi-vibration measuring machine (DEWEBOOK-DASYLab 5.61.10). Three glass fiber/polyester composite truncated cones with a larger-end radius of 285 mm, thickness of 2 mm, and cone lengths along the generators of 285 mm, 427.5 mm and 570 mm, with semi-vertex angles of 27, 14 and 9 degrees respectively, were used, and the filling ratio of the contained water was 0, 0.25, 0.50, 0.75 and 1.0. The results calculated by the proposed computational model for the studied composite conical shells are in good agreement with experiments. The obtained results indicate that fluid filling can significantly reduce the natural frequencies of composite conical shells. Parametric studies including circumferential wave number, fluid depth and cone angles are carried out.
Keywords: dynamic stiffness method, experimental study, free vibration, fluid-shell interaction, glass fiber/polyester composite conical shell
Procedia PDF Downloads 500
607 Exclusive Value Adding by iCenter Analytics on Transient Condition
Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata
Abstract:
During decades of Baker Hughes (BH) iCenter experience, it has been demonstrated that, in addition to conventional insights on equipment steady operation conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime saving, and predictive maintenance. Our work shows examples from the BH iCenter experience to introduce the advantages and features of using transient condition analytics: (i) Operation under critical engine conditions: e.g., high level or high change rate of temperature, pressure, flow, vibration, etc., that would not be reachable in normal operation, (ii) Management of dedicated sub-systems or components, many of which are often bottlenecks for reliability and maintenance, (iii) Indirect detection of anomalies in the absence of instrumentation, (iv) Repetitive sequences: if data is properly processed, the engineering features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance, (v) Engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions. They are contributing to exclusive value adding in the following areas: (i) Reliability improvement, (ii) Startup reliability improvement, (iii) Predictive maintenance, (iv) Repair/overhaul cost reduction. Illustrative examples for each of the above areas are presented in our study, focusing on challenges and adopted techniques ranging from purely statistical approaches to the implementation of machine learning algorithms. The obtained results demonstrate how value is obtained using transient condition analytics in the BH iCenter experience.
Keywords: analytics, diagnostics, monitoring, turbomachinery
Procedia PDF Downloads 74
606 Enhancing Email Security: A Multi-Layered Defense Strategy Approach and an AI-Powered Model for Identifying and Mitigating Phishing Attacks
Authors: Anastasios Papathanasiou, George Liontos, Athanasios Katsouras, Vasiliki Liagkou, Euripides Glavas
Abstract:
Email remains a crucial communication tool due to its efficiency, accessibility and cost-effectiveness, enabling rapid information exchange across global networks. However, the global adoption of email has also made it a prime target for cyber threats, including phishing, malware and Business Email Compromise (BEC) attacks, which exploit its integral role in personal and professional realms in order to perform fraud and data breaches. To combat these threats, this research advocates for a multi-layered defense strategy incorporating advanced technological tools such as anti-spam and anti-malware software, machine learning algorithms and authentication protocols. Moreover, we developed an artificial intelligence model specifically designed to analyze email headers and assess their security status. This AI-driven model examines various components of email headers, such as "From" addresses, "Received" paths and the integrity of SPF, DKIM and DMARC records. Upon analysis, it generates comprehensive reports that indicate whether an email is likely to be malicious or benign. This capability empowers users to identify potentially dangerous emails promptly, enhancing their ability to avoid phishing attacks, malware infections and other cyber threats.
Keywords: email security, artificial intelligence, header analysis, threat detection, phishing, DMARC, DKIM, SPF, AI model
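The header components listed above can be inspected with standard tooling before any model is applied. A toy sketch using Python's stdlib email parser, where the scoring rule (one point per failed SPF/DKIM/DMARC result in Authentication-Results) and the sample message are illustrative assumptions, not the authors' AI model:

```python
from email import message_from_string

# Hypothetical raw message; real mail would come from the mail server.
RAW = """\
From: support@example.com
Received: from mail.example.com (mail.example.com [203.0.113.7])
Authentication-Results: mx.example.net; spf=pass; dkim=fail; dmarc=fail
Subject: Invoice attached

See attachment.
"""

def header_risk_score(raw):
    """Toy scoring: count failed SPF/DKIM/DMARC results in the
    Authentication-Results header. Rule and weighting are illustrative."""
    msg = message_from_string(raw)
    results = msg.get("Authentication-Results", "").lower()
    return sum(f"{mech}=fail" in results for mech in ("spf", "dkim", "dmarc"))

score = header_risk_score(RAW)
```

In the paper's setting, features like these (plus the "From"/"Received" path consistency) would feed the AI model rather than a fixed rule.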
Procedia PDF Downloads 61
605 Recognition and Counting Algorithm for Sub-Regional Objects in a Handwritten Image through Image Sets
Authors: Kothuri Sriraman, Mattupalli Komal Teja
Abstract:
In this paper, a novel algorithm is proposed for the recognition of hulls in handwritten images that may be of irregular, digit, or character shape. Objects and internal objects are quite difficult to extract when the structure of the image contains a bulk of clusters. The estimation results are easily obtained by identifying the sub-regional objects using the SASK algorithm. The focus is mainly on recognizing the number of internal objects that exist in a given image, so that the result is shadow-free and error-free. The hard clustering and density clustering process of the obtained image rough set is used to recognize the differentiated internal objects, if any. Finding the internal hull regions involves three steps: pre-processing, boundary extraction, and finally applying the hull detection system. Detecting sub-regional hulls can increase machine learning capability in the detection of characters, and it can also be extended to hull recognition even for irregular-shaped objects, such as black holes in space exploration with their intensities. Layered hulls are those having structured layers inside, which is useful in military services and traffic applications to identify the number of vehicles or persons. The proposed SASK algorithm is helpful in identifying such regions and can be useful in the decision process (to clear traffic, or to identify the number of persons on the opponent's side in a war).
Keywords: chain code, hull regions, Hough transform, hull recognition, layered outline extraction, SASK algorithm
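The SASK algorithm itself is not detailed in the abstract, but a standard convex hull routine (Andrew's monotone chain) is a common building block for the boundary-extraction and hull-detection steps described above; the points below are an invented stand-in for extracted boundary pixels:

```python
def convex_hull(points):
    """Andrew's monotone-chain convex hull, O(n log n); points are (x, y) tuples."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]   # counter-clockwise, endpoints not repeated

# Interior points like (1, 1) are discarded; only the outer hull remains.
hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
```

Running this per sub-region, and nesting it for layered hulls, gives the kind of per-region hull counts the paper uses for object counting.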
Procedia PDF Downloads 350
604 A Communication Signal Recognition Algorithm Based on Holder Coefficient Characteristics
Authors: Hui Zhang, Ye Tian, Fang Ye, Ziming Guo
Abstract:
Communication signal modulation recognition technology is one of the key technologies in the field of modern information warfare. At present, communication signal automatic modulation recognition methods are mainly divided into two major categories. One is the maximum likelihood hypothesis testing method based on decision theory; the other is a statistical pattern recognition method based on feature extraction. The most commonly used is the statistical pattern recognition method, which includes feature extraction and classifier design. With the increasingly complex electromagnetic environment of communications, how to effectively extract the features of various signals at low signal-to-noise ratio (SNR) is a hot topic for scholars in various countries. To solve this problem, this paper proposes a feature extraction algorithm for communication signals based on the improved Holder cloud feature, and an extreme learning machine (ELM), which addresses the real-time requirements of modern warfare, is used to classify the extracted features. The algorithm extracts the digital features of the improved cloud model without deterministic information in a low-SNR environment and uses the improved cloud model to obtain more stable Holder cloud features, improving the performance of the algorithm. This algorithm addresses the problem that a simple feature extraction algorithm based on the Holder coefficient feature is difficult to recognize at low SNR, and it also has better recognition accuracy. Simulation results show that the approach in this paper still achieves a good classification result at low SNR; even when the SNR is -15 dB, the recognition accuracy still reaches 76%.
Keywords: communication signal, feature extraction, Holder coefficient, improved cloud model
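The underlying Holder coefficient feature comes from Hölder's inequality: for conjugate exponents p and q with 1/p + 1/q = 1, the normalized inner product of two non-negative sequences is bounded by 1. A minimal sketch of that base feature (with p = 2 it reduces to cosine similarity; the paper's improved cloud-model step is not reproduced here):

```python
def holder_coefficient(f, g, p=2.0):
    """Normalized inner product bounded by Holder's inequality.

    p and its conjugate q satisfy 1/p + 1/q = 1; sequences are assumed
    non-negative (e.g. magnitudes of signal samples or spectra).
    """
    q = p / (p - 1.0)
    num = sum(a * b for a, b in zip(f, g))
    den = (sum(a ** p for a in f) ** (1 / p)) * (sum(b ** q for b in g) ** (1 / q))
    return num / den
```

Computing this coefficient between a received signal and fixed reference sequences yields low-dimensional features that a classifier such as an ELM can separate; varying p gives a family of such features.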
Procedia PDF Downloads 157
603 Performance Study of Classification Algorithms for Consumer Online Shopping Attitudes and Behavior Using Data Mining
Authors: Rana Alaa El-Deen Ahmed, M. Elemam Shehab, Shereen Morsy, Nermeen Mekawie
Abstract:
With the growing popularity and acceptance of e-commerce platforms, users face an ever-increasing burden in actually choosing the right product from the large number of online offers. Thus, techniques for personalization and shopping guides are needed by users. For a pleasant and successful shopping experience, users need to know easily which products to buy with high confidence. Since selling a wide variety of products has become easier due to the popularity of online stores, online retailers are able to sell more products than a physical store. The disadvantage is that customers might not find the products they need. In this research, the customer will be able to find the products he is searching for, because recommender systems are used on some e-commerce web sites. A recommender system learns from information about customers and products and provides appropriate personalized recommendations to customers to find the needed product. In this paper, eleven classification algorithms are comparatively tested to find the best classifier fit for consumer online shopping attitudes and behavior in the experimented dataset. The WEKA knowledge analysis tool, an open-source data mining workbench used for comparing conventional classifiers, was used in this research. Using the data mining tool (WEKA) with the experimented classifiers, the results show that Decision Table and Filtered Classifier give the highest accuracy, while Classification Via Clustering and SimpleCart give the lowest.
Keywords: classification, data mining, machine learning, online shopping, WEKA
Procedia PDF Downloads 352
602 Normal Meniscal Extrusion Using Ultrasonography during the Different Range of Motion Running Head: Sonography for Meniscal Extrusion
Authors: Arash Sharafat Vaziri, Leila Aghaghazvini, Soodeh Jahangiri, Mohammad Tahami, Roham Borazjani, Mohammad Naghi Tahmasebi, Hamid Rabie, Hesan Jelodari Mamaghani, Fardis Vosoughi, Maryam Salimi
Abstract:
Aims: It is essential to know the normal extrusion measures in order to detect pathological ones. In this study, we aimed to define normal reference values for meniscal extrusion in normal knees during different ranges of motion. Methods: The amount of extrusion of the anterior and posterior portions of the menisci among twenty-one asymptomatic volunteers (42 knees) was tracked at 0, 45, and 90 degrees of knee flexion using an ultrasound machine. Repeated measures analysis of variance (ANOVA) was used to show the interaction between the amounts of meniscal extrusion and the different degrees of knee flexion. Results: The anterior portion of the lateral menisci at full knee extension (0.59±1.40) and the posterior portion of the medial menisci during 90° flexion (3.06±2.36) showed the smallest and the highest mean amounts of extrusion, respectively. The normal average amounts of anterior extrusion were 1.12±1.17 mm and 0.99±1.34 mm for the medial and lateral menisci, respectively. The posterior meniscal normal extrusions significantly increased in both the medial and lateral menisci during the survey (F = 20.250 and 11.298; both p-values < 0.001), as they were measured at 2.37±2.16 mm and 1.53±2.18 mm, respectively. Conclusion: The medial meniscus can extrude 1.74±1.84 mm normally, while this amount was 1.26±1.82 mm for the lateral meniscus. These measures commonly increased with higher knee flexion. Likewise, the posterior portion showed more extrusion than the anterior portion on both sides.
Keywords: meniscal extrusion, ultrasonography, knee
Procedia PDF Downloads 91
601 Determination of the Stability of Haloperidol Tablets and Phenytoin Capsules Stored in the Inpatient Dispensary System (Swisslog) by the Respective HPLC and Raman Spectroscopy Assay
Authors: Carol Yue-En Ong, Angelina Hui-Min Tan, Quan Liu, Paul Chi-Lui Ho
Abstract:
A public general hospital in Singapore has recently implemented an automated unit-dose machine, Swisslog, in their inpatient dispensary, with the objective of reducing human error and improving patient safety. However, a concern about stability arises as tablets are removed from their original packaging (bottled loose tablets/capsules) and repackaged into individual, clear plastic wrappers as unit doses in the system. Drugs that are light-sensitive and hygroscopic would be more susceptible to degradation, as the wrapper does not offer full protection. Hence, this study was carried out to examine the stability of haloperidol tablets and phenytoin capsules, which are light-sensitive and hygroscopic, respectively. Validated HPLC-UV assays were first established for quantification of these two compounds. The medications involved were put in the Swisslog and sampled every week for one month. The collected data was analysed and showed no degradation over time. This study also explored an alternative approach for drug stability determination: Raman spectroscopy. The advantage of Raman spectroscopy is its high time efficiency and non-destructive nature. The results suggest that drug degradation can indeed be detected using Raman microscopy, but further research is needed to establish this approach for quantification or qualification of compounds. NanoRam®, a portable Raman spectroscope, was also used alongside Raman microscopy but was unsuccessful in detecting degradation in this study.
Keywords: drug stability, haloperidol, HPLC, phenytoin, Raman spectroscopy, Swisslog
Procedia PDF Downloads 350
600 Neural Network Supervisory Proportional-Integral-Derivative Control of the Pressurized Water Reactor Core Power Load Following Operation
Authors: Derjew Ayele Ejigu, Houde Song, Xiaojing Liu
Abstract:
This work presents a particle swarm optimization-trained neural network (PSO-NN) supervisory proportional-integral-derivative (PID) control method to monitor the pressurized water reactor (PWR) core power for safe operation. The proposed control approach is implemented on the transfer function of the PWR core, which is computed from the state-space model. The PWR core state-space model is designed from the neutronics, thermal-hydraulics, and reactivity models using perturbation around the equilibrium value. The proposed control approach computes the control rod speed to maneuver the core power to track the reference in a closed-loop scheme. The particle swarm optimization (PSO) algorithm is used to train the neural network (NN) and to tune the PID simultaneously. The controller performance is examined using the integral absolute error, integral time absolute error, integral square error, and integral time square error functions, and the stability of the system is analyzed using the Bode diagram. The simulation results indicated that the controller shows satisfactory performance in controlling and tracking the load power effectively and smoothly compared to the PSO-PID control technique. This study will benefit the design of supervisory controllers for control applications in nuclear engineering research fields.
Keywords: machine learning, neural network, pressurized water reactor, supervisory controller
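The supervisory layer tunes the gains, but the inner loop is a conventional discrete PID. A minimal sketch of such a loop driving a toy plant (the gains, time step, and integrating plant below are invented for illustration, not the PWR core model):

```python
class PID:
    """Textbook discrete PID; a supervisory NN would retune kp, ki, kd online."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt                 # accumulate error
        deriv = (err - self.prev_err) / self.dt        # backward difference
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a toy integrating plant toward 100% power (illustrative, not the core).
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
power = 0.0
for _ in range(200):
    u = pid.step(100.0, power)     # u plays the role of control rod speed
    power += u * pid.dt            # crude plant: the command integrates into power
```

The paper replaces hand-picked gains like these with values selected by the PSO-trained network, judged by the IAE/ITAE/ISE/ITSE criteria.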
Procedia PDF Downloads 157
599 An Optimal Path for Virtual Reality Education using Association Rules
Authors: Adam Patterson
Abstract:
This study analyzes the self-reported experiences of virtual reality users to develop insight into an optimal learning path for education within virtual reality. This research uses a sample of 1000 observations to statistically define factors influencing (i) immersion level and (ii) motion sickness rating for virtual reality experience respondents of college age. This paper recommends an efficient duration for each virtual reality session, to minimize sickness and maximize engagement, utilizing modern machine learning methods such as association rules. The goal of this research, in augmentation with previous literature, is to inform logistical decisions relating to the implementation of pilot instruction for virtual reality at the collegiate level. Future research will include a Randomized Control Trial (RCT) to quantify the effect of virtual reality education on student learning outcomes and engagement measures. Current research aims to maximize the treatment effect within the RCT by optimizing the learning benefits of virtual reality. Results suggest significant gender heterogeneity in the likelihood of reporting motion sickness. Females are 1.7 times more likely than males to report high levels of motion sickness resulting from a virtual reality experience. Regarding duration, respondents were 1.29 times more likely to select the lowest level of motion sickness after an engagement lasting between 24.3 and 42 minutes. Conversely, respondents engaging for 42 to 60 minutes were 1.2 times more likely to select the higher levels of motion sickness.
Keywords: applications and integration of e-education, practices and cases in e-education, systems and technologies in e-education, technology adoption and diffusion of e-learning
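Association rules of the kind used above score candidate rules such as "session length 42-60 min → high motion sickness" by support, confidence, and lift. A minimal sketch over invented session records (not the study's 1000 observations):

```python
def rule_metrics(transactions, antecedent, consequent):
    """Support, confidence and lift of the rule antecedent -> consequent.

    transactions, antecedent and consequent are sets of items.
    """
    n = len(transactions)
    n_a = sum(antecedent <= t for t in transactions)
    n_c = sum(consequent <= t for t in transactions)
    n_both = sum((antecedent | consequent) <= t for t in transactions)
    support = n_both / n                  # P(A and C)
    confidence = n_both / n_a             # P(C | A)
    lift = confidence / (n_c / n)         # P(C | A) / P(C)
    return support, confidence, lift

# Invented session records: duration bucket plus reported sickness level.
sessions = [
    {"42-60min", "high_sickness"},
    {"42-60min", "low_sickness"},
    {"24-42min", "low_sickness"},
    {"42-60min", "high_sickness"},
]
support, confidence, lift = rule_metrics(sessions, {"42-60min"}, {"high_sickness"})
```

A lift above 1 means the antecedent raises the probability of the consequent, which is the sense in which the study's duration buckets make certain sickness levels "1.2 times more likely".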
Procedia PDF Downloads 69
598 The Analysis of Emergency Shutdown Valves Torque Data in Terms of Its Use as a Health Indicator for System Prognostics
Authors: Ewa M. Laskowska, Jorn Vatn
Abstract:
Industry 4.0 focuses on digital optimization of industrial processes. The idea is to use extracted data in order to build a decision support model enabling use of those data for real-time decision making. In terms of predictive maintenance, the desired decision support tool would be a model enabling prognostics of a system's health based on the current condition of the considered equipment. Within the area of system prognostics and health management, a commonly used health indicator is the Remaining Useful Lifetime (RUL) of a system. Because the RUL is a random variable, it has to be estimated based on available health indicators. Health indicators can be of different types and come from different sources. They can be process variables, equipment performance variables, data related to the number of experienced failures, etc. The aim of this study is the analysis of performance variables of emergency shutdown valves (ESVs) used in the oil and gas industry. ESVs are inspected periodically, and at each inspection the torque and time of valve operation are registered. The data will be analyzed by means of machine learning or statistical analysis. The purpose is to investigate whether the available data could be used as a health indicator for a prognostic purpose. The second objective is to examine the most efficient way to incorporate the data into a predictive model. The idea is to check whether the data can be applied in the form of explanatory variables in a Markov process, or whether other stochastic processes would be more convenient for building an RUL model based on the information coming from the registered data.
Keywords: emergency shutdown valves, health indicator, prognostics, remaining useful lifetime, RUL
Procedia PDF Downloads 91
597 Study of the Performances of an Environmental Concrete Based on Recycled Aggregates and Marble Waste Fillers Addition
Authors: Larbi Belagraa, Miloud Beddar, Abderrazak Bouzid
Abstract:
The construction sector's need for concrete is still increasing. However, the shortage of natural aggregate resources could be a problem for the concrete industry, in addition to the negative environmental impact of demolition wastes. Recycling aggregate from construction and demolition (C&D) waste presents a major interest for users and researchers of concrete, since this constituent can occupy more than 70% of concrete volume. The aim of the study herein is to assess the effect of sulfate-resistant cement combined with a local mineral addition of marble waste fillers on the mechanical behavior of a recycled aggregate concrete (RAC). Physical and mechanical properties of RAC, including the density and the flexural and compressive strengths, were studied. Non-destructive test methods (pulse velocity, rebound hammer) were performed. The results obtained were compared to a crushed aggregate concrete (CAC) using the normal compressive testing machine method. The optimal content of 5% marble fillers showed an improvement for both test methods (compression, flexure, and NDT). Non-destructive methods (ultrasonic and rebound hammer tests) can be used to assess the strength of RAC, but a correction coefficient is required to obtain a value similar to the compressive strength given by the compression tests. The study emphasizes that these waste materials can be successfully and economically utilized as additional inert filler in RAC formulations with similar performance compared to a conventional concrete.
Keywords: marble waste fillers, mechanical strength, natural aggregate, non-destructive testing (NDT), recycled aggregate concrete
Procedia PDF Downloads 313
596 Image Ranking to Assist Object Labeling for Training Detection Models
Authors: Tonislav Ivanov, Oleksii Nedashkivskyi, Denis Babeshko, Vadim Pinskiy, Matthew Putman
Abstract:
Training a machine learning model for object detection that generalizes well is known to benefit from a training dataset with diverse examples. However, training datasets usually contain many repeats of common examples of a class and lack rarely seen examples. This is due to the process commonly used during human annotation, where a person proceeds sequentially through a list of images, labeling a sufficiently high total number of examples. Instead, the method presented involves an active process where, after the initial labeling of several images is completed, the next subset of images for labeling is selected by an algorithm. This process of algorithmic image selection and manual labeling continues in an iterative fashion. The algorithm used for the image selection is a deep learning algorithm, based on the U-shaped architecture, which quantifies the presence of unseen data in each image in order to find images that contain the most novel examples. Moreover, the location of the unseen data in each image is highlighted, aiding the labeler in spotting these examples. Experiments performed using semiconductor wafer data show that labeling a subset of the data curated by this algorithm resulted in a model with better performance than a model produced from sequentially labeling the same amount of data. Also, similar performance is achieved compared to a model trained on exhaustive labeling of the whole dataset. Overall, the proposed approach results in a dataset that has a diverse set of examples per class as well as more balanced classes, which proves beneficial when training a deep learning model.
Keywords: computer vision, deep learning, object detection, semiconductor
Procedia PDF Downloads 137
595 Biaxial Fatigue Specimen Design and Testing Rig Development
Authors: Ahmed H. Elkholy
Abstract:
An elastic analysis is developed to obtain the distribution of stresses, strains, bending moment and deformation for a thin, hollow, variable-thickness cylindrical specimen subjected to different biaxial loadings. The specimen was subjected to a combination of internal pressure, axial tensile loading and external pressure. Several axial-to-circumferential stress ratios were investigated in detail. The analytical model was then validated using experimental results obtained from a test rig under several biaxial loadings. Based on the preliminary results obtained, the specimen was then modified geometrically to ensure uniform strain distribution through its wall thickness and along its gauge length. The new design of the specimen has a higher buckling strength and a maximum value of equivalent stress according to the maximum distortion energy theory. A cyclic function generator of the standard servo-controlled, electro-hydraulic testing machine is used to generate a specific signal shape (sine, square, ...) at a certain frequency. The two independent controllers of the electronic circuit cause an independent movement of each servo-valve piston. The movement of each piston pressurizes the upper and lower sides of the actuators alternately, so the specimen is subjected to axial and diametral loads independently of each other. The hydraulic system has two different pressures: one pressure is responsible for the axial stress produced in the specimen, and the other for the tangential stress. Changing the ratio of the two pressures changes the stress ratios accordingly. The only restriction on the maximum stress obtained is the capacity of the testing system and specimen instability due to buckling.
Keywords: biaxial, fatigue, stress, testing
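Before the full variable-thickness elastic analysis, the target stress ratios can be estimated from classic thin-walled cylinder formulas. A sketch assuming t << r and mid-wall membrane stresses (textbook relations, not the paper's model):

```python
import math

def thin_wall_stresses(p_int, p_ext, axial_force, r, t):
    """Approximate mid-wall membrane stresses in a thin-walled cylinder (t << r).

    Classic textbook formulas: hoop = dp*r/t,
    axial = dp*r/(2t) + F/(2*pi*r*t), with dp = p_int - p_ext.
    """
    dp = p_int - p_ext
    hoop = dp * r / t
    axial = dp * r / (2.0 * t) + axial_force / (2.0 * math.pi * r * t)
    return axial, hoop

# Example: 10 MPa internal pressure only, r = 50 mm, t = 2 mm.
# Without axial force, axial/hoop = 0.5, the classic pressure-vessel ratio;
# the rig's independent axial load and external pressure shift this ratio.
axial, hoop = thin_wall_stresses(10e6, 0.0, 0.0, r=0.05, t=0.002)
```

Adjusting the axial force and the internal/external pressure difference independently, as the two-pressure hydraulic circuit does, sweeps the axial-to-circumferential stress ratio through the values investigated in the paper.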
Procedia PDF Downloads 130
594 An In-Depth Conceptual Framework for the Development of Prosthetic Hands: Emphasizing Transradial Prostheses
Authors: Touil Issam, Bouraghda Skander
Abstract:
The human hand is a vital yet intricate organ, essential for tasks ranging from grasping to executing fine motor skills. It serves as the most advanced and natural interface for interaction between humans and their surroundings. Upper-limb deficiencies, caused by conditions such as illness, accidents, or congenital factors, are prevalent worldwide. These deficiencies are categorized into seven types: partial hand, wrist disarticulation, transradial, elbow disarticulation, transhumeral, shoulder disarticulation, and forequarter, with transradial amputations being the most common and often well-suited for prosthetic hands. Advancements in technology and healthcare services have amplified the need for affordable, user-friendly, and functional prosthetic hands capable of restoring essential hand and finger functions. As a critical subset of medical robotics, prosthetic hands have seen notable design and development progress. However, challenges remain in achieving widespread user acceptance and satisfaction, highlighting the need for a holistic approach to their design and implementation. This study aims to consolidate the various factors involved in the development of prosthetic hands, focusing particularly on transradial prostheses. It considers all types of prosthetic hands, whether actively powered, passively powered, or non-powered. By presenting a comprehensive concept map, we aim to integrate these factors into a cohesive framework, guiding the development of prosthetic hands that offer enhanced functionality, improved user acceptance, and better alignment with user needs.
Keywords: prosthetic hands, user-centered design, human-machine interaction design, assistive technologies, medical robotics
Procedia PDF Downloads 3
593 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects
Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed
Abstract:
Multiscale entropy (MSE) analysis is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study is aimed at investigating electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting various coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects were taken from the publicly available machine learning repository of the University of California Irvine (UCI), acquired using 64 electrodes. The MSE analysis was performed on the EEG data acquired from all the electrodes of alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic curve was computed to find the degree of separation between the groups. The mean ranks of MSE values at all time scales for all electrodes were higher for control subjects than for alcoholic subjects. Higher mean ranks represent higher complexity and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3, and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significance at different time scales. Likewise, the highest accuracy and separation were obtained at the central region (C3 and C4) and front polar regions (P3, O1, F3, F7, F8, and T8), while other electrodes such as Fp1, Fp2, P4, and F4 showed no significant results.
Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test (MMT), receiver operating characteristic (ROC) curve, complexity analysis
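The two steps of MSE — coarse-graining the signal at each scale, then computing sample entropy on each coarse-grained sequence — can be sketched in plain Python. This is a minimal, O(n²) reference implementation of the standard definitions, not the study's analysis pipeline; the tolerance `r` is given in signal units (commonly chosen as 0.2 times the standard deviation of the original signal):

```python
import math

def coarse_grain(signal, scale):
    # Average consecutive non-overlapping windows of length `scale`.
    n = len(signal) // scale
    return [sum(signal[i * scale:(i + 1) * scale]) / scale for i in range(n)]

def sample_entropy(ts, m=2, r=0.2):
    # SampEn = -ln(A/B): B counts template pairs of length m matching
    # within tolerance r (Chebyshev distance), A the same for length m+1.
    def count_matches(length):
        templates = [ts[i:i + length] for i in range(len(ts) - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count
    b = count_matches(m)
    a = count_matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")
    return -math.log(a / b)

def multiscale_entropy(signal, max_scale=3, m=2, r=0.2):
    return [sample_entropy(coarse_grain(signal, s), m, r)
            for s in range(1, max_scale + 1)]

# A perfectly regular alternating signal has low entropy at every scale.
mse = multiscale_entropy([0.0, 1.0] * 30, max_scale=2)
```

Comparing such per-scale entropy vectors between groups is what the Mann-Whitney rank test in the abstract operates on.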
Procedia PDF Downloads 376
592 Tensile Behavior of Oil Palm Fiber Concrete (OPFC) with Different Fiber Volume
Authors: Khairul Zahreen Mohd Arof, Rahimah Muhamad
Abstract:
Oil palm fiber (OPF) is a fibrous material produced from the waste of the palm oil industry, which is suitable to be used in the construction industry. The application of OPF in concrete can reduce material costs and enhance concrete behavior. The dog-bone test provides significant results for investigating the behavior of fiber reinforced concrete under tensile loading. It is able to provide the stress-strain profile, modulus of elasticity, stress at the cracking point, and total crack width. In this research, dog-bone tests have been conducted to analyze the total crack width, stress-strain profile, and modulus of elasticity of OPFC. Specimens are in a dog-bone shape, with a long section in the middle that is narrow compared to the ends, to ensure cracks occur only within this notch. Tests were instrumented using a Shimadzu 300 kN universal testing machine, a linear variable differential transformer, and two strain gauges. A total of nine specimens with fiber volume fractions of 0.75%, 1.00%, and 1.25% have been tested to analyze the behavior under tensile loading. Also, three plain concrete specimens have been tested as controls. The tensile tests of all specimens were carried out at a concrete age exceeding 28 days. The results show that OPFC is able to reduce the total crack width. In addition, OPFC has a higher cracking stress than plain concrete. The study shows plain concrete can be improved with the addition of OPF.
Keywords: cracks, crack width, dog-bone test, oil palm fiber concrete
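Extracting the cracking-point stress from a dog-bone load record reduces to simple arithmetic over the notch cross-section. The sketch below uses an illustrative load-drop heuristic to locate first cracking; the threshold, the helper names, and all numbers are assumptions for illustration, not the study's data-reduction procedure:

```python
def tensile_stress(load_n, width_mm, thickness_mm):
    # Nominal tensile stress in MPa over the notch cross-section.
    return load_n / (width_mm * thickness_mm)

def cracking_point(loads_n, strains, width_mm, thickness_mm, drop=0.1):
    """Return (stress, strain) at first cracking, taken here as the first
    point where load falls by more than `drop` relative to the running peak."""
    peak = loads_n[0]
    for load, strain in zip(loads_n, strains):
        if load < (1.0 - drop) * peak:
            return tensile_stress(load, width_mm, thickness_mm), strain
        peak = max(peak, load)
    return tensile_stress(peak, width_mm, thickness_mm), strains[-1]

# Illustrative record: load climbs, then drops sharply at first cracking.
loads = [0.0, 2000.0, 4000.0, 6000.0, 5000.0, 3000.0]
strains = [0.0, 1e-4, 2e-4, 3e-4, 3.2e-4, 3.5e-4]
stress, strain = cracking_point(loads, strains, width_mm=100.0, thickness_mm=50.0)
```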
Procedia PDF Downloads 344
591 A Human Centered Design of an Exoskeleton Using Multibody Simulation
Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann
Abstract:
Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. However, during preliminary design, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. Therefore, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data are ground reaction forces, electromyography (EMG) data, and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, resulting contact forces and human joint forces, which are compared with admissible values with regard to human physiology. Furthermore, it provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.
Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation
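The inverse-dynamics step at the core of such a simulation can be illustrated on a drastically simplified model — a single uniform rigid link pivoting at a joint — where the joint torque follows directly from the measured kinematics (a textbook sketch, not the dummy model's multibody formulation):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def joint_torque(mass, length, theta, theta_ddot):
    """Inverse dynamics of a uniform rod pivoting at one end:
    tau = I*theta_ddot + m*g*(L/2)*sin(theta), with I = m*L^2/3.
    theta is measured from the vertical (hanging) position."""
    inertia = mass * length ** 2 / 3.0
    gravity_term = mass * G * (length / 2.0) * math.sin(theta)
    return inertia * theta_ddot + gravity_term

# Holding the link horizontally (theta = 90 deg) with zero acceleration:
tau_static = joint_torque(mass=2.0, length=0.4, theta=math.pi / 2, theta_ddot=0.0)
```

In the full platform, the same principle is applied per joint of the calibrated dummy model, with and without the exoskeleton mock-up attached, to obtain the joint-load comparison described above.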
Procedia PDF Downloads 164
590 Artificial Intelligence Based Abnormality Detection System and Real Valu™ Product Design
Authors: Junbeom Lee, Jaehyuck Cho, Wookyeong Jeong, Jonghan Won, Jungmin Hwang, Youngseok Song, Taikyeong Jeong
Abstract:
This paper investigates and analyzes meta-learning technologies that use multiple cameras to monitor and detect abnormal human behavior in real time in the healthcare field. Advances in artificial intelligence and computer vision technologies have confirmed that cameras can be useful for individual health monitoring and abnormal behavior detection. Through this, it is possible to establish a system that can respond early by automatically detecting abnormal behavior of vulnerable individuals, such as patients and the elderly. In this paper, we use a technique called meta-learning to analyze image data collected from cameras and develop a commercial product to determine abnormal behavior. Meta-learning applies machine learning algorithms to help systems learn and adapt quickly to new real data. Through this, the accuracy and reliability of the abnormal behavior discrimination system can be improved. In addition, this study proposes a meta-learning-based abnormal behavior detection system that includes steps such as data collection and preprocessing, feature extraction and selection, and classification model development. Various healthcare scenarios and experiments are used to analyze the performance of the proposed system and demonstrate its advantages over other existing methods. Through this study, we present the possibility that camera-based meta-learning technology can be useful for monitoring and detecting abnormal behavior in the healthcare area.
Keywords: artificial intelligence, abnormal behavior, early detection, health monitoring
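One common building block of meta-learning classifiers — nearest-centroid (prototypical-network-style) inference over a small support set, which lets a system adapt to new behavior classes from a few examples — can be sketched as follows. The 2-D "embeddings" stand in for feature vectors of camera frames; all names and values are illustrative, not the paper's model:

```python
def centroid(vectors):
    # Component-wise mean of a list of equal-length tuples.
    n = len(vectors)
    return tuple(sum(v[i] for v in vectors) / n for i in range(len(vectors[0])))

def classify(query, support):
    """Assign `query` to the class whose support-set centroid is nearest
    (squared Euclidean distance), as in prototypical-network inference."""
    prototypes = {label: centroid(vecs) for label, vecs in support.items()}
    return min(prototypes,
               key=lambda lbl: sum((q - p) ** 2
                                   for q, p in zip(query, prototypes[lbl])))

# Tiny support set: embeddings of "normal" vs. "abnormal" behavior clips.
support = {
    "normal":   [(0.1, 0.2), (0.0, 0.1)],
    "abnormal": [(0.9, 0.8), (1.0, 1.0)],
}
label = classify((0.85, 0.9), support)
```

Because only the support set changes, adding a new behavior category requires a handful of labeled clips rather than retraining — the "adapt quickly to new real data" property the abstract attributes to meta-learning.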
Procedia PDF Downloads 89
589 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques
Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet
Abstract:
5G and 6G technology offers enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in 5G/6G architecture. In this paper, we propose the integration of free space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, free-space optical communication has been gaining popularity as an effective alternative to the limited availability of radio frequency (RF) spectrum. FSO is attractive due to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to satisfy the full potential of 5G/6G technology, offering speeds of 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to actualize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. Moreover, we propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple fiber Bragg grating (FBG) sensors using only the spectrum of the FBGs obtained from a real experiment.
Keywords: optical sensor, artificial intelligence, Internet of Things, free-space optics
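The quantity the deep learning model estimates — the peak wavelength of an FBG reflection spectrum — has a simple non-ML baseline: parabolic interpolation around the maximum sample. The sketch below uses synthetic spectrum data and is offered as a reference point, not as the paper's estimator:

```python
def fbg_peak_wavelength(wavelengths, powers):
    """Estimate the peak wavelength by fitting a parabola through the
    maximum sample and its two neighbours (assumes uniform spacing)."""
    i = max(range(1, len(powers) - 1), key=lambda k: powers[k])
    y0, y1, y2 = powers[i - 1], powers[i], powers[i + 1]
    step = wavelengths[1] - wavelengths[0]
    # Vertex offset of the parabola through the three points, in samples.
    offset = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)
    return wavelengths[i] + offset * step

# Symmetric synthetic FBG peak centred at 1550.0 nm, 0.1 nm sample spacing.
wl = [1549.8, 1549.9, 1550.0, 1550.1, 1550.2]
pw = [0.1, 0.6, 1.0, 0.6, 0.1]
peak = fbg_peak_wavelength(wl, pw)
```

A learned estimator becomes valuable precisely where this baseline degrades — overlapping spectra from multiplexed FBGs, which is the multi-sensor case the abstract targets.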
Procedia PDF Downloads 64
588 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective due to the evolving patterns of fraud in recent times. Against such a background, the present study probes into distinct methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, like gradient boosting, random forests, and neural networks. To this end, we recommend integrating all these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the challenge, we designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. With the use of actual financial statistics, we compare the performance of our model in accuracy, speed, and adaptability versus conventional models. The results of this study illustrate the strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain. They underline the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
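A minimal unsupervised baseline of the kind such an ensemble would subsume — flagging transactions whose amount deviates strongly from the observed history — can be sketched as a z-score test. The threshold and the synthetic amounts are illustrative; real systems would use richer features and models like the isolation-forest or GAN-based detectors discussed above:

```python
import math

def zscore_flags(amounts, threshold=3.0):
    """Flag transactions whose amount is more than `threshold` standard
    deviations from the mean -- a toy stand-in for the unsupervised
    anomaly-detection stage."""
    n = len(amounts)
    mean = sum(amounts) / n
    var = sum((a - mean) ** 2 for a in amounts) / n
    std = math.sqrt(var)
    if std == 0:
        return [False] * n
    return [abs(a - mean) / std > threshold for a in amounts]

# Twenty routine card payments and one outsized transfer.
history = [20.0, 25.0, 22.0, 18.0, 24.0] * 4 + [5000.0]
flags = zscore_flags(history, threshold=3.0)
```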
Procedia PDF Downloads 44
587 The Methodology of Hand-Gesture Based Form Design in Digital Modeling
Authors: Sanghoon Shim, Jaehwan Jung, Sung-Ah Kim
Abstract:
As digital technology develops, studies on the TUI (tangible user interface), which links the physical environment utilizing the human senses with the virtual environment through the computer, are actively being conducted. In addition, there has been a tremendous advance in computational design through the use of computer-aided design techniques, which enable optimized decision-making through machine learning and parallel comparison of alternatives. However, while a complex design that responds to user requirements or performance can emerge through the intuition of the designer, it is difficult to actualize the emerged design by the designer's ability alone. Ancillary tools, such as Gaudí's sandbag models, can be instruments to reinforce and evolve ideas that emerge from designers. With the advent of many commercial tools that support 3D objects, designers' intentions are easily reflected in their designs, but the degree to which they are reflected depends on the designer's proficiency with the design tools. This study implements an environment in which form can be shaped with the designer's own fingers in the initial design phase of complex building design. Leap Motion is used as a sensor to recognize the hand motions of the designer, which are converted into digital information to realize an environment that can be linked in real time in virtual reality (VR). In addition, the implemented design can be linked with Rhino™, a 3D authoring tool, and its plug-in Grasshopper™ in real time. As a result, it is possible to design through the senses using the TUI, and it can serve as a tool for assisting designer intuition.
Keywords: design environment, digital modeling, hand gesture, TUI, virtual reality
Procedia PDF Downloads 368
586 The Research of the Relationship between Triathlon Competition Results with Physical Fitness Performance
Authors: Chen Chan Wei
Abstract:
The purpose of this study was to investigate the impact of a 1500 m swim, a 10000 m run, VO2 max, and body fat on Olympic distance triathlon competition performance. The subjects were thirteen college triathletes with endurance training, with an average age, height, and weight of 20.61±1.04 years (mean ± SD), 171.76±8.54 cm, and 65.32±8.14 kg, respectively. All subjects were required to take the 1500 m swim and 10000 m run tests, undergo VO2 max and body fat measurements, and participate in the Olympic distance triathlon competition. First, the 1500 m swim test was taken in a standardized 50 m pool with a depth of 2 m, and the 10000 m run test on a standardized 400 m track. After three days, VO2 max was tested with the MetaMax 3B, and body fat was measured with a DEXA machine. After two weeks, all 13 subjects joined the Olympic distance triathlon competition at the 2016 New Taipei City Asian Cup. The relationships between the 1500 m swim, 10000 m run, VO2 max, and body fat tests and Olympic distance triathlon competition performance were evaluated using Pearson's product-moment correlation. The results show that the 10000 m run and body fat had a significant positive correlation with Olympic distance triathlon performance (r=.830, .768), but VO2 max had a significant negative correlation with Olympic distance triathlon performance (r=-.735). In conclusion, for improved non-drafting Olympic distance triathlon performance, triathletes should focus more on running than swimming training, and VO2 max can be measured to predict triathlon performance. Also, managing body fat can improve Olympic distance triathlon performance. In addition, swimming performance was not significantly correlated with Olympic distance triathlon performance, possibly because the 2016 New Taipei City Asian Cup age group race was not a drafting competition. The swimming leg is the shortest component of Olympic distance triathlons. Therefore, in a non-drafting competition, swimming ability is not significantly correlated with overall performance.
Keywords: triathletes, olympic, non-drafting, correlation
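Pearson's product-moment correlation, the statistic used throughout the analysis above, can be computed as follows. The run and finish times here are made-up illustrative values, not the study's measurements:

```python
import math

def pearson_r(xs, ys):
    # Pearson product-moment correlation coefficient of two samples.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative: 10000 m run times (min) vs. triathlon finish times (min).
# A positive r means slower runners also finish the triathlon slower.
run_times = [34.0, 36.5, 38.0, 40.5, 42.0]
tri_times = [118.0, 121.0, 126.0, 131.0, 135.0]
r = pearson_r(run_times, tri_times)
```

Since both performance measures are times, a positive correlation (as with the reported r = .830 for the run) indicates that the two slow down together.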
Procedia PDF Downloads 250
585 Moderation in Temperature Dependence on Counter Frictional Coefficient and Prevention of Wear of C/C Composites by Synthesizing SiC around Surface and Internal Vacancies
Authors: Noboru Wakamoto, Kiyotaka Obunai, Kazuya Okubo, Toru Fujii
Abstract:
The aim of this study is to moderate the temperature dependence of the frictional coefficient between counter surfaces and to reduce the wear of C/C composites at low temperature. To modify the C/C composites, silica (SiO2) powders were added into the phenolic resin used as the carbon precursor. The preform plate of the precursor of the C/C composites was prepared by the conventional filament winding method. The C/C composite plates were obtained by carbonizing the preform plate at 2200 °C under an argon atmosphere. At that time, silicon carbide (SiC) was synthesized around the surfaces and the internal vacancies of the C/C composites. The frictional coefficient on the counter surfaces and the specific wear volumes of the C/C composites were measured with our developed pin-on-disk type friction test machine. The XRD results indicated that SiC was synthesized in the body of the C/C composite fabricated by the current method. The results of the friction tests showed that the coefficient of friction of the unmodified C/C composites depended on temperature as the test condition was changed. In contrast, the frictional coefficient of the C/C composite modified with SiO2 powders was almost constant at about 0.27 when the temperature condition was changed from room temperature (RT) to 300 °C. The specific wear rate decreased from 25×10⁻⁶ mm²/N to 0.1×10⁻⁶ mm²/N. Observations of the surfaces after the friction tests showed that the frictional surface of the modified C/C composites was covered with a film produced by the friction. This study found that synthesizing SiC around the surface and internal vacancies of C/C composites was effective in moderating the temperature dependence of the frictional coefficient and reducing the wear of C/C composites.
Keywords: C/C composites, friction coefficient, wear, SiC
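The two quantities reported from the pin-on-disk tests reduce to simple ratios of the measured forces and volumes. The sketch below shows this arithmetic with illustrative numbers (chosen so the coefficient matches the ~0.27 reported above; they are not the paper's raw measurements, and the specific wear rate here uses the common mm³/(N·m) definition):

```python
def friction_coefficient(friction_force_n, normal_load_n):
    # Coefficient of friction from a pin-on-disk measurement.
    return friction_force_n / normal_load_n

def specific_wear_rate(wear_volume_mm3, normal_load_n, sliding_distance_m):
    # Specific wear rate: volume lost per unit load per unit sliding distance.
    return wear_volume_mm3 / (normal_load_n * sliding_distance_m)

# Illustrative pin-on-disk run.
mu = friction_coefficient(friction_force_n=13.5, normal_load_n=50.0)
k = specific_wear_rate(wear_volume_mm3=0.25, normal_load_n=50.0,
                       sliding_distance_m=1000.0)
```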
Procedia PDF Downloads 345
584 TimeTune: Personalized Study Plans Generation with Google Calendar Integration
Authors: Chevon Fernando, Banuka Athuraliya
Abstract:
The purpose of this research is to provide a solution to students' time management, which often becomes an issue because students must balance their studies with personal commitments. "TimeTune," an AI-based study planner that helps organize study timeframes by incorporating modern machine learning algorithms with calendar applications, is unveiled as the proposed solution. The research is focused on the development of LSTM models that connect to the Google Calendar API to generate learning paths fitted to a student's unique daily life and study history. A key finding of this research is the success in building an LSTM model to predict optimal study times, which, integrated with real-time data from Google Calendar, generates timetables automatically in a personalized and customized manner. The methodology encompasses Agile development practices and Object-Oriented Analysis and Design (OOAD) principles, focusing on user-centric design and iterative development. By adopting this method, students can significantly reduce the tension associated with poor study habits and time management. In conclusion, "TimeTune" represents an advanced step in personalized education technology. Its application of ML algorithms and calendar integration offers students a practical way to manage their studies, promising stress reduction and a better balance between academic and personal life.
Keywords: personalized learning, study planner, time management, calendar integration
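The LSTM building block such a planner relies on can be sketched as a single-cell forward step for scalar inputs in plain Python. The weights here are random and the input sequence (e.g., normalized daily study hours) is made up; this shows only the standard cell equations, not the trained TimeTune model or its Google Calendar integration:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One LSTM cell step for scalar input/state.
    w holds per-gate weights: (w_x, w_h, bias) for gates i, f, o, g."""
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])    # input gate
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])    # forget gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])    # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2])  # candidate
    c = f * c_prev + i * g          # new cell state
    h = o * math.tanh(c)            # new hidden state
    return h, c

random.seed(0)
weights = {k: [random.uniform(-1, 1) for _ in range(3)] for k in "ifog"}
h, c = 0.0, 0.0
for x in [0.2, 0.5, 0.9]:           # illustrative normalized study-hour sequence
    h, c = lstm_step(x, h, c, weights)
```

In a real planner, the hidden state would feed a regression head predicting the suitability of candidate study slots, which are then written back to the calendar.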
Procedia PDF Downloads 49
583 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit
Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana
Abstract:
Internet of Things (IoT) devices and edge computing have become among the most discussed innovations, with the potential to improve and disrupt traditional business and industry alike. New challenges, such as the COVID-19 pandemic, have posed a danger to the workforce and to business processes. Along with the drastically changed business landscape left in the aftermath of the global pandemic, and the looming threats of a global energy crisis, global warming, and heated global politics that could lead to a new Cold War, emerging technologies such as edge computing and specially designed visual processing units present great opportunities for business. The literature reviewed covers how the Internet of Things and this disruptive wave will affect business, explaining how these new events impact current business and how businesses need to adapt to changes in the market and the world. Benchmark tests of newer consumer devices, such as Internet of Things devices equipped with edge computing hardware, show how efficiency can be increased, reducing the risk posed by current and looming crises. Throughout the paper, we explain the technologies leading present developments and the current situation, and why these technologies will be innovations that change traditional practice, through brief introductions to technologies such as cloud computing, edge computing, and the Internet of Things.
Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification
Procedia PDF Downloads 156