Search results for: multivariate time series data
34668 The Role of Synthetic Data in Aerial Object Detection
Authors: Ava Dodd, Jonathan Adams
Abstract:
The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application and deploying the computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose and detail the processes, tools, and techniques used to meet accuracy requirements. The research reveals that synthetic data represents another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.
Keywords: computer vision, machine learning, synthetic data, YOLOv4
Procedia PDF Downloads 225

34667 Time Truncated Group Acceptance Sampling Plans for Exponentiated Half Logistic Distribution
Authors: Srinivasa Rao Gadde
Abstract:
In this article, we consider group acceptance sampling plans for the exponentiated half logistic distribution when the life test is truncated at a pre-specified time. It is assumed that the index parameter of the exponentiated half logistic distribution is known. The design parameters, namely the number of groups and the acceptance number, are obtained by satisfying the producer's and consumer's risks at the specified quality levels in terms of medians and 10th percentiles, under the assumption that the termination time and the number of items in each group are pre-fixed. Finally, an example is given to illustrate the methodology.
Keywords: group acceptance sampling plan, operating characteristic, consumer's and producer's risks, truncated life test
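For illustration, the operating-characteristic calculation behind such plans can be sketched with a binomial model. This is a minimal sketch assuming the common rule that the lot is accepted only when every one of the g groups shows at most c failures before the truncation time; the exact rule is not spelled out in the abstract.

```python
from math import comb

def group_acceptance_prob(p, r, g, c):
    """Probability of accepting the lot under a group sampling plan:
    g groups of r items each are put on a truncated life test, and the
    lot is accepted only if every group has at most c failures.
    p is the probability that an item fails before the truncation time."""
    # P(at most c failures in one group of r items), binomial model
    per_group = sum(comb(r, i) * p**i * (1 - p)**(r - i) for i in range(c + 1))
    return per_group ** g

# Example: 5% failure probability, 4 groups of 5 items, acceptance number 1
pa = group_acceptance_prob(0.05, 5, 4, 1)
```

Plotting this probability against p gives the operating characteristic curve from which producer's and consumer's risks are read off.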
Procedia PDF Downloads 340

34666 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Authors: K. Indra Gandhi
Abstract:
Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development has not been treated as a separate entity in this data collection process, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven, and application-driven in nature. This implies that there is a tremendous need for separation of concerns from the software development perspective. A layered approach for developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on the layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and realization of the software as an adaptable component for WSN systems. Further, a study of users' perception indicates that the proposed model helps improve the programmer's productivity by realizing the collaborative system involved.
Keywords: data acquisition, model-driven development, separation of concern, wireless sensor networks
Procedia PDF Downloads 434

34665 Precise CNC Machine for Multi-Tasking
Authors: Haroon Jan Khan, Xian-Feng Xu, Syed Nasir Shah, Anooshay Niazi
Abstract:
CNC machines are not only used on a large scale but have also become a prominent necessity among households and smaller businesses. Printed circuit boards manufactured by the chemical process are not only risky and unsafe but also expensive and time-consuming. A precise 3-axis CNC machine has been developed which not only fabricates PCBs but has also been used for multiple tasks simply by changing the materials and tools used, making it versatile. The machine takes data from CAM software. The TB-6560 controller is used in the CNC machine to drive motion along the X, Y, and Z axes. The machine is efficient in automatic drilling, engraving, and cutting.
Keywords: CNC, G-code, CAD, CAM, Proteus, FLATCAM, Easel
Procedia PDF Downloads 160

34664 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam’s Three Dimensional Stockpile Modeling Needs
Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
Unmanned aerial vehicle (UAV) technology offers cost efficiency and data retrieval time advantages. Technologies such as UAV, GNSS, and LiDAR can be combined so that each covers the others' deficiencies. This integrated system aims to increase the accuracy of calculating the volume of the land stockpiles of PT. Garam (a salt company). UAV imagery is used to obtain geometric data and capture the textures that characterize the structure of objects. This study uses the Taror 650 Iron Man drone with four propellers, which can fly for 15 minutes. The image acquisitions processed in software can be classified using photogrammetry and Structure from Motion point cloud principles. LiDAR acquisition enables the creation of point clouds, three-dimensional models, digital surface models, contours, and orthomosaics with high accuracy. A drawback of LiDAR is that its coordinate positions have only a local reference. Therefore, the researchers use GNSS, LiDAR, and drone multi-sensor technology to map the stockpiles of salt on open land and in warehouses, a survey PT. Garam carries out twice a year, where the previous process used terrestrial methods and manual calculations with sacks. LiDAR needs to be combined with a UAV to overcome acquisition limitations, because a ground scan only covers the right and left sides of an object, especially when applied to a salt stockpile. The UAV is flown to assist data acquisition with wide coverage, with the 200-gram LiDAR system integrated so that an optimal viewing angle can be maintained during the flight. Using LiDAR for low-cost mapping surveys will make it easier for surveyors and academics to obtain fairly accurate data at a more economical price. As a survey tool, LiDAR is available at a low price, around 999 USD, and the device can produce detailed data.
Therefore, to minimize operational costs, surveyors can use low-cost LiDAR, GNSS, and UAV equipment at a price of around 638 USD. The data generated by this sensor is a visualization of an object's shape in three dimensions. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS generates position data as latitude and longitude coordinates. These data yield X, Y, and Z values to help georeference the detected objects. The research also produces LiDAR data that can detect objects, including the height of the entire environment at that location. The acquired data are calibrated with pitch, roll, and yaw to obtain the vertical height of the existing contours. The study conducted an experiment on the roof of a building with a radius of approximately 30 meters.
Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour
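The pitch/roll/yaw calibration step described above amounts to rotating each sensor-frame point into a level frame before heights are read off. A minimal sketch, assuming a Z-Y-X Euler convention (the abstract does not specify which convention is used):

```python
from math import cos, sin, radians

def rotate_rpy(point, roll, pitch, yaw):
    """Rotate a sensor-frame point (x, y, z) into a level frame using
    roll, pitch, and yaw angles in degrees (Z-Y-X convention, an
    assumed convention; the abstract only states that the data are
    calibrated with pitch, roll, and yaw)."""
    x, y, z = point
    r, p, w = radians(roll), radians(pitch), radians(yaw)
    # Rotation about the x axis (roll)
    y, z = y * cos(r) - z * sin(r), y * sin(r) + z * cos(r)
    # Rotation about the y axis (pitch)
    x, z = x * cos(p) + z * sin(p), -x * sin(p) + z * cos(p)
    # Rotation about the z axis (yaw)
    x, y = x * cos(w) - y * sin(w), x * sin(w) + y * cos(w)
    return (x, y, z)
```

After this rotation the z component of each point is a true vertical height, which is what the contour extraction needs.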
Procedia PDF Downloads 95

34663 Structural and Electrochemical Characterization of Columnar-Structured Mn-Doped Bi26Mo10O69-d Electrolytes
Authors: Maria V. Morozova, Zoya A. Mikhaylovskaya, Elena S. Buyanova, Sofia A. Petrova, Ksenia V. Arishina, Robert G. Zaharov
Abstract:
The present work is devoted to the investigation of two series of doped bismuth molybdates: Bi₂₆-₂ₓMn₂ₓMo₁₀O₆₉-d and Bi₂₆Mo₁₀-₂yMn₂yO₆₉-d. The complex oxides were synthesized by conventional solid state technology and by the co-precipitation method. The products were identified by powder diffraction. The powders and ceramic samples were examined by means of densitometry, laser diffraction, and electron microscopy. The porosity of the ceramic materials was estimated using the hydrostatic method. Electrical conductivity measurements were carried out using the impedance spectroscopy method.
Keywords: bismuth molybdate, columnar structures, impedance spectroscopy, oxygen ionic conductors
Procedia PDF Downloads 436

34662 Torrefaction of Spelt Husks to Increase Its Fuel Properties
Authors: Abubakar Halidu, Paul E. Bilsborrow, Anh N. Phan
Abstract:
Torrefaction refers to the moderate pyrolysis of biomass at temperatures between 200 and 300 °C in an oxygen-free environment to boost its heating value, grindability, and storability. The process can also be used as a pre-treatment for other thermochemical processes. The torrefaction of spelt husks was carried out at temperatures of 200, 250, and 300 °C in an inert nitrogen environment with a heating rate of 20 °C min⁻¹ and residence times of 15–60 min. We examined the influence of torrefaction temperature and residence time. The results indicated that increasing the torrefaction temperature increased the higher heating value (HHV) and improved grindability. Spelt husks torrefied at 300 °C for 15 minutes exhibited the highest HHV at 30.88 MJ kg⁻¹, compared to 17.56 MJ kg⁻¹ for non-torrefied spelt husks.
Keywords: grindability, higher heating value, residence time, temperature, torrefaction
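A quick way to read the reported HHV figures is through the enhancement factor, which is commonly combined with the mass yield to give the energy yield of a torrefaction run. The sketch below uses the abstract's HHV values; the mass-yield figure is a hypothetical placeholder, not a result of the study.

```python
def enhancement_factor(hhv_torrefied, hhv_raw):
    """Ratio of torrefied to raw higher heating value (HHV)."""
    return hhv_torrefied / hhv_raw

def energy_yield(mass_yield, hhv_torrefied, hhv_raw):
    """Standard torrefaction metric: the fraction of the feedstock's
    energy retained = mass yield x HHV enhancement factor."""
    return mass_yield * enhancement_factor(hhv_torrefied, hhv_raw)

# HHVs reported in the abstract (MJ/kg); the 0.60 mass yield is hypothetical
factor = enhancement_factor(30.88, 17.56)
yield_300C = energy_yield(0.60, 30.88, 17.56)
```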
Procedia PDF Downloads 184

34661 Exposure Assessment to Airborne Particulate Matter in Agriculture
Authors: K. Rumchev, S. Gilbey
Abstract:
Airborne particulate matter is a known hazard to human health, with a considerable body of evidence linking agricultural dust exposure to adverse health effects in exposed populations. It is also known that agricultural workers are exposed to high levels of soil dust and other types of airborne particulate matter within the farming environment. The aim of this study was to examine exposure to agricultural dust among farm workers during the seeding season. Twenty-one wheat-belt farms consented to participate in the study, with 30 workers monitored for dust exposure whilst seeding or undertaking seeding-associated tasks. Each farm was visited once, and farmers were asked to wear a personal air sampler for a 4-hour sampling period. Simultaneous real-time tractor-cabin air quality monitoring was also undertaken. Data for this study were collected using real-time aerosol dust monitors to determine in-cabin PM exposure to five size fractions (total, PM10, respirable, PM2.5, and PM1), and personal sampling was undertaken to establish individual exposure to inhalable and respirable dust concentrations. The study established a significant difference between personal exposures and simultaneous real-time in-cabin exposures for both inhalable and respirable fractions. No significant difference was shown between in-cabin and personal inhalable dust concentrations during seeding and spraying tasks, although both in-cabin and personal concentrations were two times greater for seeding than for spraying. Future research should focus on educating and providing farm owners and workers with more information on adopting safe work practices to minimise harmful exposure to agricultural dust.
Keywords: agriculture, air quality, Australia, particulate matter
Procedia PDF Downloads 217

34660 Dynamics of Light Induced Current in 1D Coupled Quantum Dots
Authors: Tokuei Sako
Abstract:
Laser-induced current in a quasi-one-dimensional nanostructure has been studied using a model of a few electrons confined in a 1D electrostatic potential, coupled to electrodes at both ends and subjected to a pulsed laser field. The time propagation of the one- and two-electron wave packets has been calculated by integrating the time-dependent Schrödinger equation directly with the symplectic integrator method on a uniform Fourier grid. The temporal behavior of the resultant light-induced current in the studied systems is discussed with respect to the lifetime of the quasi-bound states formed when the static bias voltage is applied.
Keywords: pulsed laser field, nanowire, electron wave packet, quantum dots, time-dependent Schrödinger equation
Procedia PDF Downloads 357

34659 Variations in Breast Aesthetic Reconstruction Rates between Asian and Caucasian Patients Post Mastectomy in a UK Tertiary Breast Referral Centre: A Five-Year Institutional Review
Authors: Wisam Ismail, Chole Wright, Elizabeth Baker, Cathy Tait, Mohamed Salhab, Richard Linforth
Abstract:
Background: Post-mastectomy breast reconstruction is an important treatment option for women with breast cancer, with psychosocial, emotional, and quality-of-life benefits. Despite this, Asian patients are one-fifth as likely as Caucasian patients to undergo reconstruction after mastectomy. Aim: This study aimed to assess the difference in breast reconstruction rates between Asian and Caucasian patients treated at Bradford Teaching Hospitals between May 2011 and December 2015. The long-term goal is to equip healthcare professionals to improve breast cancer treatment outcomes by increasing breast reconstruction rates in this sub-population. Methods: All patients undergoing mastectomy were identified using a prospectively collected departmental database. Further data were obtained via retrospective electronic case note review. Bradford's city population was about 530,000 by the end of 2015, with White ethnic groups making up 67.44% of the city's population and Asian ethnic groups 26.83% (UK population census). The majority of the Asian population speaks Urdu, hence an Urdu-speaking breast care nurse was employed throughout this period to facilitate communication and patient decision-making and to deliver a better understanding of the reconstruction options and pathways. Statistical analysis was undertaken using the SAS program. Patients were stratified by age, self-reported ethnicity, axillary surgery, and reconstruction. Relative odds were calculated using univariate and multivariate logistic regression analyses with adjustment for known confounders. Results: 506 patients underwent mastectomy over 5 years: 72 (14%) Asian v. 434 (85%) Caucasian. The overall median age was 64 years (SD 1.1); the Asian median age was 62 (SD 0.9) versus 65 (SD 1.2) for Caucasian patients. The total axillary clearance rate was 30% (42% Asian v. 30% Caucasian).
The overall reconstruction rate was 126 patients (28.9%). Only 6 of 72 Asian patients (8.3%) underwent breast reconstruction versus 121 of 434 Caucasian patients (28%) (p < 0.04), odds ratio 0.68 (95% confidence interval 0.57-0.79). Conclusions: There is a significant difference in post-mastectomy reconstruction rates between Asian and Caucasian patients. This difference is likely to be multi-factorial. Higher rates of axillary clearance in Asian patients might suggest later disease presentation and/or higher rates of subsequent adjuvant therapy, both of which can impact the suitability of breast reconstruction. Strategies aimed at reducing racial disparities in breast reconstruction should include symptom awareness to enable earlier presentation and facilitated communication to ensure informed decision-making.
Keywords: aesthetic, Asian, breast, reconstruction
Procedia PDF Downloads 276

34658 A Motion Dictionary to Real-Time Recognition of Sign Language Alphabet Using Dynamic Time Warping and Artificial Neural Network
Authors: Marcio Leal, Marta Villamil
Abstract:
Computational recognition of sign languages aims to allow greater social and digital inclusion of deaf people through interpretation of their language by computer. This article presents a model for recognizing two of the global parameters of sign languages: hand configuration and hand movement. Hand motion is captured through infrared technology, and its joints are mapped into a virtual three-dimensional space. A Multilayer Perceptron (MLP) neural network was used to classify hand configurations, while Dynamic Time Warping (DTW) recognizes hand motion. Beyond the recognition method itself, we provide a dataset of hand configurations and motion captures built with the help of professionals fluent in sign languages. Although this technology can be used to translate signs from any sign dictionary, Brazilian Sign Language (Libras) was used as a case study. Finally, the model presented in this paper achieved a recognition rate of 80.4%.
Keywords: artificial neural network, computer vision, dynamic time warping, infrared, sign language recognition
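The motion-matching step can be illustrated with the classic DTW recurrence. This is a minimal 1-D sketch; the study's actual inputs would be sequences of 3-D joint positions, and the hand-configuration channel is handled separately by the MLP.

```python
def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences:
    the minimum cumulative pointwise cost over all monotone alignments,
    so time-shifted or time-stretched copies of a movement match well."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = best cumulative cost aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]
```

In a recognizer, the captured trajectory is compared by DTW against every entry of the motion dictionary and the nearest entry wins.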
Procedia PDF Downloads 217

34657 Performance Evaluation of Discrete Fourier Transform Algorithm Based PMU for Wide Area Measurement System
Authors: Alpesh Adeshara, Rajendrasinh Jadeja, Praghnesh Bhatt
Abstract:
Implementation of advanced technologies requires sophisticated instruments that deal with the operation, control, restoration, and protection of a rapidly growing power system network under normal and abnormal conditions. Presently, Phasor Measurement Units (PMUs) are widely applied in real-time operation, monitoring, control, and analysis of the power system network, as they eliminate various limitations of the Supervisory Control and Data Acquisition (SCADA) systems conventionally used in power systems. PMU data are rapidly gaining importance for both online and offline analysis. The Wide Area Measurement System (WAMS) has been developed as a new technology through the use of multiple PMUs in a power system. The present paper proposes a MATLAB-based PMU model using the Discrete Fourier Transform (DFT) algorithm and evaluates its operation under different contingencies. A two-bus system with a PMU-based WAMS network is presented as a case study.
Keywords: global positioning system (GPS), phasor measurement unit (PMU), wide area measurement system (WAMS), discrete Fourier transform (DFT), phasor data concentrator (PDC)
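The core of a DFT-based PMU is the one-cycle phasor estimate, which can be sketched as follows. This is the textbook full-cycle DFT correlation, not the paper's exact MATLAB implementation.

```python
from math import cos, sin, pi, atan2, hypot

def dft_phasor(samples):
    """One-cycle DFT phasor estimate: N samples spanning exactly one
    fundamental cycle are correlated with a cosine/sine pair to give
    the magnitude and phase angle of the fundamental component."""
    n = len(samples)
    re = (2 / n) * sum(s * cos(2 * pi * k / n) for k, s in enumerate(samples))
    im = (2 / n) * sum(s * -sin(2 * pi * k / n) for k, s in enumerate(samples))
    magnitude = hypot(re, im)   # peak value of the fundamental
    angle = atan2(im, re)       # phase angle in radians
    return magnitude, angle

# 16 samples of cos(theta + 30 degrees) over one fundamental cycle
samples = [cos(2 * pi * k / 16 + pi / 6) for k in range(16)]
magnitude, angle = dft_phasor(samples)   # ~ (1.0, pi/6)
```

GPS time-stamping of these per-cycle phasors is what lets a WAMS compare angles measured at different buses.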
Procedia PDF Downloads 496

34656 Electromyography Pattern Classification with Laplacian Eigenmaps in Human Running
Authors: Elnaz Lashgari, Emel Demircan
Abstract:
Electromyography (EMG) is one of the most important interfaces between humans and robots for rehabilitation. Decoding this signal helps to recognize muscle activation and convert it into smooth motion for robots. Detecting each muscle's pattern during walking and running is vital for improving the quality of a patient's life. In this study, EMG data from 10 muscles in 10 subjects at 4 different speeds were analyzed. EMG signals are nonlinear, with high dimensionality. To deal with this challenge, we extracted features in the time-frequency domain and used manifold learning and the Laplacian Eigenmaps algorithm to find the intrinsic features that represent the data in a low-dimensional space. We then used a Bayesian classifier to identify various patterns of EMG signals for different muscles across a range of running speeds. The best result, obtained for the vastus medialis muscle, was 97.87±0.69% sensitivity and 88.37±0.79% specificity, with 97.07±0.29% accuracy, using the Bayesian classifier. The results of this study provide important insight into human movement and its application to robotics research.
Keywords: electromyography, manifold learning, ISOMAP, Laplacian Eigenmaps, locally linear embedding
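The feature-extraction stage can be illustrated with a few standard surface-EMG measures. The study used time-frequency features; the time-domain features below are a simpler stand-in for illustration, not the study's feature set.

```python
from math import sqrt

def emg_features(signal):
    """A few standard surface-EMG time-domain features computed over
    one analysis window of raw samples."""
    n = len(signal)
    mav = sum(abs(x) for x in signal) / n                          # mean absolute value
    rms = sqrt(sum(x * x for x in signal) / n)                     # root mean square
    zc = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)   # zero crossings
    wl = sum(abs(b - a) for a, b in zip(signal, signal[1:]))       # waveform length
    return {"mav": mav, "rms": rms, "zc": zc, "wl": wl}
```

Vectors of such features, one per window, are what a manifold-learning step like Laplacian Eigenmaps would then embed in a low-dimensional space.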
Procedia PDF Downloads 363

34655 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The question that motivates this work is how many devote themselves to discovering something in the world of science, where much has been discerned and revealed, but at the same time much remains unknown. Methods: The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. In correlation to the given key, the string is divided into several groups of substrings, each of length k characters. The next step encodes each substring from the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it outperforms the other methods in terms of execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
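The steps above can be sketched as follows. The alphabet handling, the wrap rule, and the treatment of non-letter characters are assumptions where the abstract is silent, so this is an illustrative reading rather than the authors' exact scheme.

```python
def latin_djokovic(text, key, decrypt=False):
    """Sketch of the scheme described above: split the string into
    substrings of `key` characters and Caesar-shift each one, with the
    shift incremented per substring and wrapped back to `key` once it
    exceeds b+1 = key+4. Only lowercase a-z is shifted; other
    characters pass through unchanged (an assumption)."""
    a, b = key, key + 3
    chunks = [text[i:i + key] for i in range(0, len(text), key)]
    shift, out = key, []
    for chunk in chunks:
        s = -shift if decrypt else shift
        out.append("".join(
            chr((ord(c) - 97 + s) % 26 + 97) if c.islower() else c
            for c in chunk))
        shift += 1
        if shift > b + 1:   # wrap back to the initial value
            shift = a
    return "".join(out)
```

Decryption simply replays the same shift sequence with negated shifts, so a round trip recovers the plaintext.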
Procedia PDF Downloads 103

34654 On Dynamic Chaotic S-BOX Based Advanced Encryption Standard Algorithm for Image Encryption
Authors: Ajish Sreedharan
Abstract:
Security in the transmission and storage of digital images is important in today's image communications and confidential video conferencing. Due to the increasing use of images in industrial processes, it is essential to protect confidential image data from unauthorized access. The Advanced Encryption Standard (AES) is a well-known block cipher that has several advantages in data encryption; however, it is not suitable for real-time applications. This paper presents modifications to the Advanced Encryption Standard to achieve a high level of security and better image encryption. The modifications are made by adjusting the ShiftRow transformation and using a dynamic chaotic S-box. In AES, the SubBytes, ShiftRow, and MixColumns transformations by themselves would provide no security because they do not use the key; in the dynamic chaotic S-box based AES, the SubBytes step provides security because the S-box is constructed from the key. Experimental results verify that the proposed modification to the image cryptosystem is highly secure from the cryptographic viewpoint. The results also show that, compared to the original AES encryption algorithm, the modified algorithm gives better encryption results in terms of security against statistical attacks.
Keywords: advanced encryption standard (AES), dynamic chaotic S-box, image encryption, security analysis, ShiftRow transformation
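One common way to build a key-dependent S-box from a chaotic map is sketched below using the logistic map. The paper does not disclose its exact construction, so both the choice of map and the ranking scheme here are assumptions.

```python
def chaotic_sbox(x0=0.7, r=3.99):
    """Sketch of a key-dependent S-box from a chaotic map: x0 (derived
    from the key) seeds a logistic map in its chaotic regime, and the
    ranking of the 256 iterates defines a permutation of 0..255."""
    x = x0
    samples = []
    for _ in range(256):
        x = r * x * (1 - x)   # logistic map iteration
        samples.append(x)
    # Position i receives the rank of its chaotic sample
    order = sorted(range(256), key=lambda i: samples[i])
    sbox = [0] * 256
    for rank, idx in enumerate(order):
        sbox[idx] = rank
    return sbox
```

Because the permutation depends on x0, different keys yield different S-boxes, which is the property the modified SubBytes step relies on.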
Procedia PDF Downloads 437

34653 A Dataset of Program Educational Objectives Mapped to ABET Outcomes: Data Cleansing, Exploratory Data Analysis and Modeling
Authors: Addin Osman, Anwar Ali Yahya, Mohammed Basit Kamal
Abstract:
Datasets or collections are becoming important assets in themselves, and they can now be accepted as a primary intellectual output of research. The quality and usage of a dataset depend mainly on the context under which it has been collected, processed, analyzed, validated, and interpreted. This paper presents a collection of program educational objectives mapped to student outcomes, collected from self-study reports prepared by 32 engineering programs accredited by ABET. The manual mapping (classification) of this data is a notoriously tedious, time-consuming process; in addition, it requires experts in the area, who are mostly unavailable. The operational settings under which the collection was produced are described. The collection has been cleansed and preprocessed, some features have been selected, and preliminary exploratory data analysis has been performed so as to illustrate the properties and usefulness of the collection. Finally, the collection has been benchmarked using nine of the most widely used supervised multi-label classification techniques (Binary Relevance, Label Powerset, Classifier Chains, Pruned Sets, Random k-Label Sets, Ensemble of Classifier Chains, Ensemble of Pruned Sets, Multi-Label k-Nearest Neighbors, and Back-Propagation Multi-Label Learning). The techniques have been compared to each other using well-known measurements (Accuracy, Hamming Loss, Micro-F, and Macro-F). The Ensemble of Classifier Chains and Ensemble of Pruned Sets achieved encouraging performance compared to the other multi-label classification methods tested, while the Classifier Chains method showed the worst performance.
To recap, the benchmark has achieved promising results by utilizing preliminary exploratory data analysis performed on the collection, proposing new trends for research, and providing a baseline for future studies.
Keywords: ABET, accreditation, benchmark collection, machine learning, program educational objectives, student outcomes, supervised multi-label classification, text mining
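Two of the comparison measures are easy to state precisely. A minimal sketch for predictions expressed as binary label vectors:

```python
def hamming_loss(y_true, y_pred):
    """Fraction of label slots predicted wrongly, pooled over all
    samples and labels; inputs are lists of equal-length 0/1 vectors."""
    errors = sum(t != p for yt, yp in zip(y_true, y_pred)
                 for t, p in zip(yt, yp))
    return errors / (len(y_true) * len(y_true[0]))

def micro_f1(y_true, y_pred):
    """Micro-averaged F1: true/false positives are pooled across all
    labels before the F1 formula is applied."""
    tp = fp = fn = 0
    for yt, yp in zip(y_true, y_pred):
        for t, p in zip(yt, yp):
            if t == 1 and p == 1:
                tp += 1
            elif t == 0 and p == 1:
                fp += 1
            elif t == 1 and p == 0:
                fn += 1
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 1.0
```

Macro-F differs only in that F1 is computed per label and then averaged, which weights rare outcomes more heavily.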
Procedia PDF Downloads 173

34652 Effect of Water Absorption on the Fatigue Behavior of Glass/Polyester Composite
Authors: Djamel Djeghader, Bachir Redjel
Abstract:
Glass-fiber composite materials can be used to repair damaged elements subjected to repeated stresses in various environments. A cyclic bending characterization of a glass/polyester composite material was carried out with consideration of the period of immersion in water. These tests describe the behavior of the material and identify its fatigue characteristics using Wöhler curves for different immersion times: 0, 90, 180, and 270 days in water. The curves show a dispersion in the lifetimes and were modeled by straight lines whose intercepts are very similar and comparable to the static strength. The material's fatigue resistance deteriorates at a constant rate, and this rate increases with increasing immersion time in water. The endurance limit seems to be independent of the immersion time in water.
Keywords: fatigue, composite, glass, polyester, immersion, Wöhler
Procedia PDF Downloads 314

34651 Transient Heat Conduction in Nonuniform Hollow Cylinders with Time Dependent Boundary Condition at One Surface
Authors: Sen Yung Lee, Chih Cheng Huang, Te Wen Tu
Abstract:
A solution methodology without integral transformation is proposed to develop analytical solutions for transient heat conduction in nonuniform hollow cylinders with a time-dependent boundary condition at the outer surface. It is shown that if the thermal conductivity and the specific heat of the medium are arbitrary polynomial functions, closed-form solutions of the system can be developed. The influence of the physical properties on the temperature distribution of the system is studied. A numerical example is given to illustrate the efficiency and accuracy of the solution methodology.
Keywords: analytical solution, nonuniform hollow cylinder, time-dependent boundary condition, transient heat conduction
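For reference, the governing equation for radial transient conduction in such a cylinder, with radius-dependent conductivity and heat capacity, takes the standard form below (consistent with, but not quoted from, the abstract):

```latex
\rho\, c_p(r)\, \frac{\partial T}{\partial t}
  \;=\; \frac{1}{r}\,\frac{\partial}{\partial r}\!\left( r\, k(r)\, \frac{\partial T}{\partial r} \right),
  \qquad a < r < b,\; t > 0,
```

with the time-dependent condition $T(b, t) = f(t)$ prescribed at the outer surface $r = b$, and $k(r)$ and $c_p(r)$ the polynomial property functions considered in the paper.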
Procedia PDF Downloads 505

34650 Remaining Useful Life (RUL) Assessment Using Progressive Bearing Degradation Data and ANN Model
Authors: Amit R. Bhende, G. K. Awari
Abstract:
Remaining useful life (RUL) prediction is one of the key technologies for realizing prognostics and health management, which is widely applied in many industrial systems to ensure high system availability over their life cycles. The present work proposes a data-driven method of RUL prediction based on multiple health state assessment for rolling element bearings. Bearing degradation data at three different conditions, from run to failure, are used, and a separate RUL prediction model is built for each condition. Feed-forward back-propagation neural network models are developed for the prediction modeling.
Keywords: bearing degradation data, remaining useful life (RUL), back propagation, prognosis
Procedia PDF Downloads 436

34649 H.263 Based Video Transceiver for Wireless Camera System
Authors: Won-Ho Kim
Abstract:
In this paper, the design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard Wi-Fi transceiver, and the coverage area is up to 100 m. The standard H.263 video encoding technique is used for video compression, since a wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.
Keywords: wireless video transceiver, video surveillance camera, H.263 video encoding, digital signal processing
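The need for compression follows from simple arithmetic on the uncompressed bitrate. A small sketch; the 4:2:0 chroma subsampling (12 bits/pixel) and ~30 fps NTSC frame rate are assumptions, since the paper does not state its pixel format:

```python
def raw_bitrate_mbps(width, height, fps, bits_per_pixel):
    """Uncompressed video bitrate in Mbps, showing why H.263
    compression is required before Wi-Fi transmission."""
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_bitrate_mbps(720, 480, 30, 12)   # ~124 Mbps uncompressed
ratio = raw / 1.0                          # vs. the ~1 Mbps H.263 stream
```

Even a generous Wi-Fi link cannot carry ~124 Mbps of raw NTSC video, so a compression ratio above 100:1 is needed to fit the 1 Mbps budget.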
Procedia PDF Downloads 365

34648 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language
Authors: Ghazal Faraj, András Micsik
Abstract:
Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, defines well-formed shapes expressed as RDF graphs, named "shape graphs". These shape graphs validate other resource description framework (RDF) graphs, which are called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string-matching patterns, value types, and other constraints. Moreover, the SHACL framework supports high-level validation by expressing more complex conditions in languages such as the SPARQL Protocol and RDF Query Language (SPARQL). SHACL consists of two parts: SHACL Core and SHACL-SPARQL. SHACL Core includes the shapes that cover the most frequent constraint components, while SHACL-SPARQL is an extension that allows SHACL to express more complex, customized constraints. Validating the efficacy of dataset mapping is an essential component of data reconciliation mechanisms, as enhancing the linking of different datasets is an ongoing process. The conventional validation methods are a semantic reasoner and SPARQL queries: the former checks formalization errors and data type inconsistency, while the latter detects data contradictions. After executing SPARQL queries, the retrieved information needs to be checked manually by an expert; this methodology is time-consuming and inaccurate, as it does not test the mapping model comprehensively. Therefore, there is a serious need for a new methodology that covers all validation aspects for linking and mapping diverse datasets. Our goal is to develop a new approach that achieves optimal validation outcomes. The first step towards this goal is implementing SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both source and target ontologies was required.
Subsequently, the proper environment to run SHACL and its shape graphs was determined. As a case study, we applied SHACL to a CIDOC-CRM dataset after running the Pellet reasoner via the Protégé program. The applied validation falls into multiple categories: a) data type validation, which checks whether the source data is mapped to the correct data type, for instance, whether a birthdate is assigned to xsd:dateTime and linked to a Person entity via the crm:P82a_begin_of_the_begin property; and b) data integrity validation, which detects inconsistent data, for instance, by inspecting whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, and 2) selecting the most suitable techniques for the various categories of validation tasks. The next step is to establish a comprehensive validation model and generate SHACL shapes automatically.
Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping
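The two validation categories can be illustrated outside any SHACL engine with a toy check in plain code. In the actual workflow these conditions would be expressed as SHACL shapes; the record layout and field names below are hypothetical stand-ins for a CIDOC-CRM Person resource.

```python
from datetime import datetime

def validate_person(record):
    """Toy illustration of the two validation categories named above.
    `record` is a dict standing in for a CIDOC-CRM Person resource;
    the field names are hypothetical."""
    errors = []
    # a) data type validation: the birthdate must parse as an ISO date/time
    try:
        birth = datetime.fromisoformat(record["birthdate"])
    except (KeyError, ValueError):
        errors.append("birthdate missing or not a valid datetime")
        return errors
    # b) data integrity validation: birth must precede linked event dates
    for event_date in record.get("event_dates", []):
        if datetime.fromisoformat(event_date) < birth:
            errors.append(f"event at {event_date} precedes birthdate")
    return errors
```

A SHACL engine generalizes exactly this pattern: each shape declares the datatype or ordering constraint, and the validation report lists every node that violates it.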
Procedia PDF Downloads 253

34647 Optimization of Transmission Loss on a Series-Coupled Muffler by Taguchi Method
Authors: Jing-Fung Lin, Jer-Jia Sheu
Abstract:
In this study, an approach has been developed for the noise reduction of a muffler. The transmission loss (TL) of the muffler is maximized by using a double-chamber design with a baffle containing a hole inserted between the chambers. The Taguchi method is used to optimize the design for the acoustical performance of the muffler, and the TL performance is evaluated with COMSOL software. The best parameter combination attains a TL as high as 35.30 dB over a wide frequency range from 10 Hz to 1400 Hz. The influence sequence of the four parameters on TL is determined by range analysis. The effects of the length and expansion ratio of the first chamber on the TL performance of the optimal design are discussed, and the TL results from different designs are compared. Keywords: acoustics, baffle, chamber, muffler, Taguchi method, transmission loss
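To give a feel for how geometric parameters such as chamber length and expansion ratio drive TL, the sketch below uses the classic textbook formula for a single expansion chamber, TL = 10 log10(1 + 0.25 (m - 1/m)² sin²(kL)). This is a simplified stand-in, not the COMSOL model of the double-chamber muffler in the study, and the chosen dimensions are illustrative assumptions.

```python
import math

def transmission_loss(m, k, L):
    """Transmission loss (dB) of a single expansion chamber.

    m : expansion (area) ratio of chamber to pipe cross-section
    k : acoustic wavenumber, 2*pi*f/c (1/m)
    L : chamber length (m)
    """
    return 10.0 * math.log10(
        1.0 + 0.25 * (m - 1.0 / m) ** 2 * math.sin(k * L) ** 2)

c = 343.0        # speed of sound in air (m/s), assumed
m, L = 4.0, 0.3  # assumed expansion ratio and chamber length
for f in (100.0, 286.0, 572.0):  # Hz
    k = 2.0 * math.pi * f / c
    print(f"f = {f:5.0f} Hz  TL = {transmission_loss(m, k, L):5.2f} dB")
```

The formula shows why TL is periodic in frequency: it peaks when kL is an odd multiple of π/2 (chamber length equal to a quarter wavelength) and collapses to zero when kL is a multiple of π, which motivates the multi-chamber designs compared in the study.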
Procedia PDF Downloads 114
34646 A Rationale to Describe Ambident Reactivity
Authors: David Ryan, Martin Breugst, Turlough Downes, Peter A. Byrne, Gerard P. McGlacken
Abstract:
An ambident nucleophile is a nucleophile that possesses two or more distinct nucleophilic sites that are linked through resonance and are effectively “in competition” for reaction with an electrophile. Examples include enolates, pyridone anions, and nitrite anions, among many others. Reactions of ambident nucleophiles and electrophiles are extremely prevalent at all levels of organic synthesis. The principle of hard and soft acids and bases (the “HSAB principle”) is most commonly cited in the explanation of selectivities in such reactions. Although this rationale is pervasive in any discussion on ambident reactivity, the HSAB principle has received considerable criticism. As a result, the principle’s supplantation has become an area of active interest in recent years. This project focuses on developing a model for rationalizing ambident reactivity. Presented here is an approach that incorporates computational calculations and experimental kinetic data to construct Gibbs energy profile diagrams. The preferred site of alkylation of nitrite anion with a range of ‘hard’ and ‘soft’ alkylating agents was established by ¹H NMR spectroscopy. Pseudo-first-order rate constants were measured directly by ¹H NMR reaction monitoring, and the corresponding second-order constants and Gibbs energies of activation were derived. These, in combination with computationally derived standard Gibbs energies of reaction, were sufficient to construct Gibbs energy wells. By representing the ambident system as a series of overlapping Gibbs energy wells, a more intuitive picture of ambident reactivity emerges. Here, previously unexplained switches in reactivity in reactions involving closely related electrophiles are elucidated.Keywords: ambident, Gibbs, nucleophile, rates
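The step from a measured rate constant to a Gibbs energy of activation uses the Eyring equation, ΔG‡ = RT ln(kB·T / (h·k)). The sketch below is a generic illustration of that conversion; the rate constant is an assumed illustrative value, not one measured in the study.

```python
import math

# Physical constants (SI units)
R  = 8.314462618     # gas constant, J/(mol*K)
KB = 1.380649e-23    # Boltzmann constant, J/K
H  = 6.62607015e-34  # Planck constant, J*s

def gibbs_activation(k, T):
    """Gibbs energy of activation (J/mol) from a first-order rate
    constant k (1/s) at temperature T (K), via the Eyring equation."""
    return R * T * math.log(KB * T / (H * k))

k_obs = 1.0e-4  # illustrative pseudo-first-order rate constant, 1/s (assumed)
T = 298.15
dg = gibbs_activation(k_obs, T)
print(f"dG_act = {dg / 1000:.1f} kJ/mol")
```

Combining such barriers with computed standard Gibbs energies of reaction is what allows the overlapping Gibbs energy wells described above to be drawn on a common scale.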
Procedia PDF Downloads 84
34645 Process Data-Driven Representation of Abnormalities for Efficient Process Control
Authors: Hyun-Woo Cho
Abstract:
Unexpected operational events or abnormalities in industrial processes have a serious impact on the quality of the final product. In terms of statistical process control, fault detection and diagnosis is one of the essential tasks needed to run a process safely. In this work, a nonlinear representation of process measurement data is presented and evaluated using a simulated process. The effect of different representation methods on diagnosis performance is tested in terms of computational efficiency and data handling. The results show that the nonlinear representation technique produced more reliable diagnosis results and outperforms linear methods. The use of a data filtering step improved computational speed and diagnosis performance on the test data sets. The presented scheme differs from existing ones in that it extracts the fault pattern in the reduced space, not in the original process variable space; this helps reduce the sensitivity of empirical models to noise. Keywords: fault diagnosis, nonlinear technique, process data, reduced spaces
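The underlying statistical-process-control idea, flagging measurements that fall outside limits learned from fault-free operation, can be illustrated with a simple 3-sigma monitor on a single variable. This is a minimal sketch of the monitoring concept only, not the paper's nonlinear reduced-space method, and the data and threshold are illustrative assumptions.

```python
import statistics

def control_limits(normal_data, n_sigma=3.0):
    """Mean +/- n_sigma * std limits estimated from fault-free data."""
    mu = statistics.fmean(normal_data)
    sigma = statistics.stdev(normal_data)
    return mu - n_sigma * sigma, mu + n_sigma * sigma

def detect_faults(test_data, limits):
    """Indices of test samples that fall outside the control limits."""
    lo, hi = limits
    return [i for i, x in enumerate(test_data) if not lo <= x <= hi]

normal = [10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 10.3, 9.7]  # assumed fault-free data
limits = control_limits(normal)
test = [10.1, 9.9, 13.5, 10.0]  # third sample simulates an abnormality
print(detect_faults(test, limits))
```

The paper's contribution replaces the raw variable above with coordinates in a nonlinear reduced space, so the same limit-checking logic operates on features that are less sensitive to noise.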
Procedia PDF Downloads 247
34644 Plackett-Burman Design to Evaluate the Influence of Operating Parameters on Anaerobic Orthophosphate Release from Enhanced Biological Phosphorus Removal Sludge
Authors: Reza Salehi, Peter L. Dold, Yves Comeau
Abstract:
The aim of the present study was to investigate the effect of six operating parameters, pH (X1), temperature (X2), stirring speed (X3), chemical oxygen demand (COD) (X4), volatile suspended solids (VSS) (X5), and time (X6), on anaerobic orthophosphate release from enhanced biological phosphorus removal (EBPR) sludge. An 8-run Plackett-Burman design was applied, and the statistical analysis of the experimental data was performed using the Minitab 16.2.4 software package. The analysis of variance (ANOVA) results revealed that temperature, COD, VSS, and time had a significant effect, with p-values of less than 0.05, whereas pH and stirring speed were non-significant parameters that nonetheless influenced orthophosphate release from the EBPR sludge. The first-order multiple linear regression model relating orthophosphate release from the EBPR sludge (Y) to the operating parameters (X1-X6) was Y = 18.59 + 1.16X1 - 3.11X2 - 0.81X3 + 3.79X4 + 9.89X5 + 4.01X6. The model p-value and coefficient of determination (R²) were 0.026 and 99.87%, respectively, which indicates that the model is significant and that the predicted values of orthophosphate release correlate excellently with the observed values. Keywords: anaerobic, operating parameters, orthophosphate release, Plackett-Burman design
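The fitted first-order model can be used directly to predict release at any combination of coded factor levels (-1 to +1). The sketch below simply encodes the regression equation reported above; the example factor settings are illustrative assumptions.

```python
# Coefficients of the reported first-order model (coded units):
# Y = 18.59 + 1.16*X1 - 3.11*X2 - 0.81*X3 + 3.79*X4 + 9.89*X5 + 4.01*X6
INTERCEPT = 18.59
COEFFS = [1.16, -3.11, -0.81, 3.79, 9.89, 4.01]

def predict_release(x):
    """Predicted orthophosphate release for coded factor levels
    x = [X1, ..., X6], each typically in [-1, +1]."""
    return INTERCEPT + sum(c * xi for c, xi in zip(COEFFS, x))

print(predict_release([0, 0, 0, 0, 0, 0]))   # all factors at center level
print(predict_release([1, -1, 0, 1, 1, 1]))  # illustrative favorable settings
```

The signs of the coefficients show at a glance which direction each factor pushes the response: raising COD, VSS, and time (positive coefficients) increases the predicted release, while raising temperature decreases it.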
Procedia PDF Downloads 279
34643 Impact of Television on the Coverage of Lassa Fever Disease in Nigeria
Authors: H. Shola Adeosun, F. Ajoke Adebiyi
Abstract:
This study appraises the impact of television on the coverage of Lassa fever disease. The objectives of the study are to find out whether television is an effective tool for raising awareness about Lassa fever and how such coverage shapes the perception of members of the public. The research work was based on the theoretical foundations of agenda-setting and reinforcement theory. The survey research method was adopted to elicit data from residents of Obafemi Owode Local Government Area of Ogun State, with a questionnaire and oral interviews as the data-gathering tools and simple random sampling used to draw the sample. Of the 400 questionnaires distributed to respondents, 37 were incorrectly filled, and about 92.5% were returned at the stipulated time. Tables, percentages, and figures were used to analyse and interpret the data. The findings for the hypotheses formulated for this study revealed that diseases with higher media coverage, such as Lassa fever, were considered more serious and more representative of disease, and were estimated to have lower incidence than diseases found less frequently in the media. Moreover, 92% of the respondents agreed that television coverage of Lassa fever disease led to exaggerated perceptions of personal vulnerability. The study therefore concludes that relevant stakeholders need to ensure better community health education and improved housing conditions in southwestern Nigeria, with an emphasis on slum areas, and that Nigeria needs to focus on the immediate response while preparing for the future, because a society or community is all about the people who inhabit it; every effort must therefore be geared towards their survival. Keywords: impact, television, coverage, Lassa fever disease
Procedia PDF Downloads 212
34642 Using Optimal Control Method to Investigate the Stability and Transparency of a Nonlinear Teleoperation System with Time Varying Delay
Authors: Abasali Amini, Alireza Mirbagheri, Amir Homayoun Jafari
Abstract:
In this paper, a new structure for teleoperation systems with time-varying delay is modeled and proposed. A random time-varying delay of up to 150 ms is simulated in the teleoperation channel, both from master to slave and vice versa. The system's stability and transparency are investigated by comparing the results of a PID controller and an optimal controller on the master and slave subsystems separately. One controller is designed in the slave subsystem to reduce position errors between master and slave, and another is designed in the master subsystem to establish stability, transparency, and force tracking. The results show that the PID controller is adequate for position tracking, but its force response oscillates on contact with the environment, whereas the optimal controller establishes position tracking properly and also achieves appropriate force tracking. Keywords: optimal control, time varying delay, teleoperation systems, stability and transparency
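A minimal discrete PID loop with a delayed measurement channel illustrates the kind of master-slave position-tracking problem studied here. This sketch assumes a pure-integrator slave, a fixed 150 ms feedback delay rather than the paper's random time-varying delay, and arbitrary gains; it is an illustration of delayed-feedback PID control, not the paper's controllers.

```python
from collections import deque

def simulate(kp=2.0, ki=0.5, kd=0.1, dt=0.01, delay_s=0.15,
             setpoint=1.0, steps=3000):
    """Discrete PID position control of a pure-integrator slave
    (x' = u) with a constant measurement delay in the channel."""
    n_delay = int(round(delay_s / dt))
    buffer = deque([0.0] * n_delay, maxlen=n_delay)  # delayed measurements
    x, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(steps):
        x_delayed = buffer[0]          # position as seen n_delay steps late
        err = setpoint - x_delayed
        integral += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integral + kd * deriv
        x += u * dt                    # slave dynamics: pure integrator
        buffer.append(x)
    return x

print(f"final position: {simulate():.4f}")
```

Even this fixed delay erodes the phase margin at the loop crossover; a random time-varying delay degrades it further, which is why the paper turns to optimal control for stability and transparency guarantees.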
Procedia PDF Downloads 257
34641 Energy-Aware Scheduling in Real-Time Systems: An Analysis of Fair Share Scheduling and Priority-Driven Preemptive Scheduling
Authors: Su Xiaohan, Jin Chicheng, Liu Yijing, Burra Venkata Durga Kumar
Abstract:
Energy-aware scheduling in real-time systems aims to minimize energy consumption, but issues related to resource reservation and timing constraints remain challenging. This study analyzes two scheduling algorithms, Fair-Share Scheduling (FSS) and Priority-Driven Preemptive Scheduling (PDPS), for addressing these issues in energy-aware real-time scheduling. Based on research into both algorithms and how they handle the two problems, it is found that Fair-Share Scheduling ensures fair allocation of resources but needs improvement under imbalanced system loads, while Priority-Driven Preemptive Scheduling prioritizes tasks based on criticality to meet timing constraints through preemption, but relies heavily on task prioritization and may not be energy efficient. Therefore, improvements to both algorithms with energy-aware features are proposed. Future work should focus on developing hybrid scheduling techniques that minimize energy consumption through intelligent task prioritization, resource allocation, and meeting timing constraints. Keywords: energy-aware scheduling, fair-share scheduling, priority-driven preemptive scheduling, real-time systems, optimization, resource reservation, timing constraints
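The preemption behavior of PDPS can be sketched with a unit-time simulation: at every tick the ready task with the highest priority (lowest number) runs, suspending lower-priority work when something more urgent arrives. The task set and priorities below are illustrative assumptions, and the sketch omits the energy model entirely.

```python
import heapq

def pdps(tasks):
    """Priority-driven preemptive scheduling, unit-time simulation.

    tasks: list of (name, arrival, burst, priority); a lower priority
    number means more urgent. Returns names in completion order.
    """
    pending = sorted(tasks, key=lambda t: t[1])  # order by arrival time
    ready = []                                   # heap of (priority, name, remaining)
    done, t, i = [], 0, 0
    while i < len(pending) or ready:
        while i < len(pending) and pending[i][1] <= t:
            name, _, burst, prio = pending[i]
            heapq.heappush(ready, (prio, name, burst))
            i += 1
        if not ready:
            t = pending[i][1]                    # idle until next arrival
            continue
        prio, name, remaining = heapq.heappop(ready)
        t += 1                                   # run one time unit, then re-check
        if remaining - 1 > 0:
            heapq.heappush(ready, (prio, name, remaining - 1))
        else:
            done.append(name)
    return done

jobs = [("low", 0, 4, 2), ("high", 1, 2, 1)]     # "high" preempts "low" at t=1
print(pdps(jobs))
```

Re-checking the ready queue after every time unit is what makes the scheme preemptive, and it is also where an energy-aware variant would intervene, e.g. by scaling frequency for slack tasks.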
Procedia PDF Downloads 119
34640 Response Surface Methodology to Obtain Disopyramide Phosphate Loaded Controlled Release Ethyl Cellulose Microspheres
Authors: Krutika K. Sawant, Anil Solanki
Abstract:
The present study deals with the preparation and optimization of ethyl cellulose microspheres loaded with disopyramide phosphate using the solvent evaporation technique. A central composite design, consisting of a two-level full factorial design superimposed on a star design, was employed for optimizing the preparation of the microspheres. The drug:polymer ratio (X1) and stirrer speed (X2) were chosen as the independent variables, and the cumulative release of the drug at different times (2, 6, 10, 14, and 18 hr) was selected as the dependent variable. An optimum polynomial equation was generated for the prediction of the response variable at 10 hr. Based on the results of multiple linear regression analysis and F statistics, it was concluded that sustained action can be obtained when X1 and X2 are kept at high levels; the X1X2 interaction was found to be statistically significant. The drug release pattern fitted the Higuchi model well. The data of a selected batch were subjected to an optimization study using a Box-Behnken design, and an optimal formulation was fabricated. Good agreement was observed between the predicted and observed dissolution profiles of the optimal formulation. Keywords: disopyramide phosphate, ethyl cellulose, microspheres, controlled release, Box-Behnken design, factorial design
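The Higuchi model, Q(t) = k_H·sqrt(t), can be fitted to cumulative-release data by least squares through the origin, giving k_H = Σ Q·sqrt(t) / Σ t. The sampling times below match those listed in the abstract, but the release values are illustrative assumptions, not the study's measurements.

```python
import math

def fit_higuchi(times, release):
    """Least-squares Higuchi rate constant for Q = k_H * sqrt(t),
    fitted through the origin."""
    num = sum(q * math.sqrt(t) for t, q in zip(times, release))
    den = sum(times)  # equals sum of (sqrt(t))**2
    return num / den

times = [2, 6, 10, 14, 18]                # hr (sampling times from the study)
release = [20.1, 34.5, 45.2, 53.0, 60.8]  # % drug released, assumed values
k_h = fit_higuchi(times, release)
print(f"k_H = {k_h:.2f} % / hr^0.5")
```

A good Higuchi fit, release proportional to the square root of time, is the diffusion-controlled signature that the abstract reports for these microspheres.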
Procedia PDF Downloads 458
34639 Presenting a Model in the Analysis of Supply Chain Management Components by Using Statistical Distribution Functions
Authors: Ramin Rostamkhani, Thurasamy Ramayah
Abstract:
One of the most important topics for today's industrial organizations is the challenging issue of supply chain management, a field in which scientists and researchers have published numerous practical articles and models, especially in the last decade. This research considers the modeling of supply chain management component data using well-known statistical distribution functions; to the best of our knowledge, describing the behavior of supply chain data through the characteristics of statistical distribution functions has not been published before. In an analytical process, describing different aspects of these functions, including the probability density, cumulative distribution, reliability, and failure functions, can identify the suitable statistical distribution function for each component of supply chain management, which can then be applied to predict the future behavior of that component's data. Providing a model that fits the best statistical distribution function to each supply chain management component would be a major step forward in describing the behavior of supply chain elements in today's industrial organizations. The final step demonstrates the results of the proposed model by comparing the process capability indices before and after its implementation and verifies the approach through the relevant assessment. The introduced approach can save the time and cost required to achieve organizational goals and can increase added value in the organization. Keywords: analyzing, process capability indices, statistical distribution functions, supply chain management components
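The four functions named above can be written down explicitly for a candidate distribution. The sketch below uses a two-parameter Weibull, a common choice for lifetime-like data; the parameter values are illustrative assumptions, not fitted to any supply chain dataset.

```python
import math

def weibull_functions(t, beta, eta):
    """pdf, cdf, reliability, and hazard (failure rate) of a
    two-parameter Weibull distribution at time t > 0.

    beta : shape parameter, eta : scale parameter
    """
    z = (t / eta) ** beta
    cdf = 1.0 - math.exp(-z)       # cumulative distribution F(t)
    reliability = math.exp(-z)     # R(t) = 1 - F(t)
    pdf = (beta / eta) * (t / eta) ** (beta - 1) * reliability
    hazard = pdf / reliability     # failure (hazard) function f(t)/R(t)
    return pdf, cdf, reliability, hazard

pdf, cdf, rel, haz = weibull_functions(t=100.0, beta=1.5, eta=200.0)
print(f"F = {cdf:.4f}  R = {rel:.4f}  h = {haz:.6f}")
```

Once such a distribution is fitted to a component's data, the reliability and failure functions give exactly the forward-looking behavior predictions the proposed model calls for; with beta = 1 the Weibull reduces to the exponential with constant failure rate 1/eta.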
Procedia PDF Downloads 87