Search results for: tool validation
4774 Social Media Marketing Efforts and Hospital Brand Equity: An Empirical Investigation
Authors: Abrar R. Al-Hasan
Abstract:
Despite the widespread use of social media by consumers and marketers, empirical research investigating its economic value in the healthcare industry still lags. This study explores the impact of social media marketing efforts on a hospital's brand equity and, ultimately, consumer response. Using social media data from Twitter and Facebook, along with an online and offline survey methodology, the data are analyzed using logistic regression models. A random sample of 728 residents of Kuwait is used. The results show that social media marketing efforts (SMME), in terms of use and validation, lead to higher hospital brand equity and, in turn, patient loyalty and patient visits. The study highlights the impact of SMME on hospital brand equity and patient response. Healthcare organizations should guide their marketing efforts to better manage this new way of marketing and communicating with patients so as to enhance consumer loyalty and financial performance.
Keywords: brand equity, healthcare marketing, patient visit, social media, SMME
Procedia PDF Downloads 173
4773 Performance Investigation of UAV Attitude Control Based on Modified PI-D and Nonlinear Dynamic Inversion
Authors: Ebrahim Hassan Kapeel, Ahmed Mohsen Kamel, Hossan Hendy, Yehia Z. Elhalwagy
Abstract:
Interest in autopilot design has risen sharply as a result of recent advancements in unmanned aerial vehicles (UAVs). Because of the enormous number of applications UAVs can serve, the number of control theories applied to them has increased in recent years. Small fixed-wing UAVs suffer from high nonlinearity, sensitivity to disturbances, and coupling effects between their channels. In this work, a nonlinear dynamic inversion (NDI) control law is designed for a nonlinear small fixed-wing UAV model. NDI is preferable for varied operating conditions, as there is no need for a gain-scheduled controller; moreover, it is applicable at high angles of attack. To validate the designed flight controller, a nonlinear modified PI-D controller is also designed for the same model. A comparative study between the two controllers is carried out to evaluate the NDI performance. Simulation results and analysis illustrate the effectiveness of the NDI-based controller.
Keywords: UAV dynamic model, attitude control, nonlinear PID, dynamic inversion
Procedia PDF Downloads 110
4772 Correlation Results Based on Magnetic Susceptibility Measurements by In-Situ and Ex-Situ Measurements as Indicators of Environmental Changes Due to the Fertilizer Industry
Authors: Nurin Amalina Widityani, Adinda Syifa Azhari, Twin Aji Kusumagiani, Eleonora Agustine
Abstract:
Fertilizer industry activities contribute to environmental changes, which have become one of the pressing problems of this era of globalization. The criteria used to identify changes in the environment can be drawn from aspects of physics, chemistry, and biology. One aspect that can be assessed quickly and efficiently to describe environmental change is physical, namely the magnetic susceptibility (χ) value. The rock magnetism method can be used as a proxy indicator of environmental change, as seen from the magnetic susceptibility value; it is based on magnetic susceptibility studies that measure and classify the degree of pollutant elements causing changes in the environment. This research was conducted in the area around a fertilizer plant, with five coring points on each track, each cored to a depth of 15 cm. Magnetic susceptibility measurements were performed both in-situ and ex-situ. The in-situ measurements were carried out directly with the SM30 instrument by placing it on the soil surface at each measurement point to obtain the magnetic susceptibility value. The ex-situ measurements were performed in the laboratory with the Bartington MS2B susceptibility instrument on coring samples taken every 5 cm. The in-situ measurements show that the magnetic susceptibility at the surface varies, with the lowest value (-0.81) at the second and fifth points and the highest value (0.345) at the third point. The ex-situ measurements reveal the variation of magnetic susceptibility at each depth of the coring points. At a depth of 0-5 cm, the highest XLF = 494.8 (x10-8 m³/kg) is at the third point, while the lowest XLF = 187.1 (x10-8 m³/kg) is at the first.
At a depth of 6-10 cm, the highest XLF value is at the second point, 832.7 (x10-8 m³/kg), while the lowest is at the first point, 211 (x10-8 m³/kg). At a depth of 11-15 cm, the highest XLF = 857.7 (x10-8 m³/kg) is at the second point, whereas the lowest XLF = 83.3 (x10-8 m³/kg) is at the fifth point. Based on the in-situ and ex-situ measurements, the highest magnetic susceptibility values among the surface samples are at the third point.
Keywords: magnetic susceptibility, fertilizer plant, Bartington MS2B, SM30
Procedia PDF Downloads 342
4771 Machine Learning Driven Analysis of Kepler Objects of Interest to Identify Exoplanets
Authors: Akshat Kumar, Vidushi
Abstract:
This paper identifies 27 KOIs, 26 of which are currently classified as candidates and one as a false positive, that have a high probability of being confirmed. For this purpose, 11 machine learning algorithms were implemented on the cumulative Kepler dataset sourced from the NASA Exoplanet Archive. The best-performing models were HistGradientBoosting and XGBoost, with a test accuracy of 93.5%, and the lowest-performing model was Gaussian NB, with a test accuracy of 54%; to assess model performance, the F1 score, cross-validation score, and ROC curve were calculated. Based on the learned models, the significant characteristics of confirmed exoplanets were identified, with emphasis on each object's transit and stellar properties; these characteristics were koi_count, koi_prad, koi_period, koi_dor, koi_ror, and koi_smass, which were later used to filter out the potential KOIs. The paper also calculates the Earth Similarity Index, based on planetary radius and equilibrium temperature, for each identified KOI to aid in its classification.
Keywords: Kepler objects of interest, exoplanets, space exploration, machine learning, earth similarity index, transit photometry
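The radius-and-temperature Earth Similarity Index mentioned above can be sketched as follows. This is a hedged illustration using the commonly cited two-parameter ESI form, with assumed weight exponents of 0.57 (radius) and 5.58 (temperature); the paper's exact weights may differ.

```python
# Hedged sketch: a two-parameter Earth Similarity Index from planetary
# radius (in Earth radii) and equilibrium temperature (in K). The weight
# exponents are assumptions taken from the commonly cited ESI formulation.
EARTH_RADIUS = 1.0   # Earth radii
EARTH_TEQ = 255.0    # K, Earth's equilibrium temperature

def esi_term(x, x_earth, weight, n_terms):
    # each parameter contributes (1 - |(x - x0) / (x + x0)|)^(w / n)
    return (1.0 - abs((x - x_earth) / (x + x_earth))) ** (weight / n_terms)

def earth_similarity_index(radius, t_eq):
    return (esi_term(radius, EARTH_RADIUS, 0.57, 2)
            * esi_term(t_eq, EARTH_TEQ, 5.58, 2))

print(earth_similarity_index(1.0, 255.0))  # Earth itself scores 1.0
```

A KOI with values close to Earth's scores near 1, and the score decays toward 0 as radius or equilibrium temperature diverges.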
Procedia PDF Downloads 75
4770 A Distinct Reversed-Phase High-Performance Liquid Chromatography Method for Simultaneous Quantification of Evogliptin Tartrate and Metformin HCl in Pharmaceutical Dosage Forms
Authors: Rajeshkumar Kanubhai Patel, Neha Sudhirkumar Mochi
Abstract:
A simple and accurate stability-indicating reversed-phase high-performance liquid chromatography (RP-HPLC) method was developed and validated for the simultaneous quantitation of Evogliptin tartrate and Metformin HCl in pharmaceutical dosage forms, following ICH guidelines. Forced degradation was performed under various stress conditions, including acid, base, oxidation, thermal, and photodegradation. The method utilized an Eclipse C18 column (250 mm × 4.6 mm, 5 µm) with a mobile phase of 5 mM 1-hexane sulfonic acid sodium salt in water and 0.2% v/v TEA (45:55 %v/v), adjusted to pH 3.0 with OPA, at a flow rate of 1.0 mL/min. Detection at 254.4 nm using a PDA detector showed good resolution of the degradation products and both drugs. Linearity was observed within 1-5 µg/mL for Evogliptin tartrate and 100-500 µg/mL for Metformin HCl, with recoveries between 99% and 100% and precision within acceptable limits (%RSD < 2%). The method proved to be specific, precise, accurate, and robust for the routine analysis of these drugs.
Keywords: stability indicating RP-HPLC, evogliptin tartrate, metformin HCl, validation
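The linearity, recovery, and %RSD figures reported above come from standard calibration-curve statistics; a minimal sketch, with purely illustrative concentration/peak-area pairs in the Evogliptin range (not the study's measured data):

```python
# Hedged sketch of calibration linearity and % recovery checks of the
# kind used in HPLC method validation. Areas below are made-up numbers.
import numpy as np

conc = np.array([1.0, 2.0, 3.0, 4.0, 5.0])          # µg/mL standards
area = np.array([102.0, 201.0, 305.0, 398.0, 502.0]) # measured peak areas

# least-squares calibration line and coefficient of determination
slope, intercept = np.polyfit(conc, area, 1)
pred = slope * conc + intercept
r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

found = (area - intercept) / slope     # back-calculated concentrations
recovery = 100 * found / conc          # % recovery at each level
rsd = 100 * recovery.std(ddof=1) / recovery.mean()  # precision (%RSD)
print(round(r2, 4), recovery.round(1), round(rsd, 2))
```

Acceptance would then be judged against the usual criteria, e.g. recoveries near 100% and %RSD below 2%.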
Procedia PDF Downloads 24
4769 Ship Detection Requirements Analysis for Different Sea States: Validation on Real SAR Data
Authors: Jaime Martín-de-Nicolás, David Mata-Moya, Nerea del-Rey-Maestre, Pedro Gómez-del-Hoyo, María-Pilar Jarabo-Amores
Abstract:
Ship detection is nowadays an important issue in tasks related to sea traffic control, fishery management, and ship search and rescue. Although it has traditionally been carried out by patrol ships or aircraft, coverage limitations, weather conditions, and sea state can become a problem. Synthetic aperture radars can surpass these coverage limitations and work under any climatological condition. A fast CFAR ship detector based on a robust statistical modeling of sea clutter with respect to sea states in SAR images is used. In this paper, the minimum SNR required to obtain a given detection probability with a given false alarm rate for any sea state is determined. A Gaussian target model using real SAR data is considered. Results show that the required SNR does not depend heavily on the sea-state class considered. Provided there is some variation in the backscattering of targets in SAR imagery, the detection probability is limited, and a post-processing stage based on morphology would be suitable.
Keywords: SAR, generalized gamma distribution, detection curves, radar detection
Procedia PDF Downloads 453
4768 Enhancing Inservice Education Training Effectiveness Using a Mobile Based E-Learning Model
Authors: Richard Patrick Kabuye
Abstract:
This study focuses on enhancing in-service training programs by transforming the existing traditional approach of formal lectures/contact hours. This will be supported by a more versatile, robust, and remotely accessible means of mobile-based e-learning, as a support tool for the traditional means. A combination of various factors in education, together with the incorporation of an e-learning strategy, proves to be a key factor in effective in-service education. These factors need to be accounted for so as to maintain a credible coexistence of the programs with the prevailing social, economic, and political environments. Effective in-service education focuses on the immediate transformation of knowledge into practice over a sustained period, active participation of attendees, pre-training planning, in-training assessment, and post-training feedback analysis, which gives trainers knowledge of the applicability of the knowledge delivered. All of the above require a more robust approach to be implemented successfully. Incorporating mobile technology in e-learning will enable the above to be brought together more coherently, as it is evident that participants have to take time off their duties to attend these training programs. Making the programs mobile will save a lot of time, since participants would be able to follow certain modules while away from lecture rooms, get continuous program updates after completing the program, send feedback to instructors on knowledge gaps, and provide a conclusive evaluation of the entire program on a learn-as-you-work platform. This study will follow both qualitative and quantitative approaches in data collection, compounded by incorporating a mobile e-learning application built on Android.
Keywords: in-service, training, mobile, e-learning, model
Procedia PDF Downloads 219
4767 Comparison of Classical Computer Vision vs. Convolutional Neural Networks Approaches for Weed Mapping in Aerial Images
Authors: Paulo Cesar Pereira Junior, Alexandre Monteiro, Rafael da Luz Ribeiro, Antonio Carlos Sobieranski, Aldo von Wangenheim
Abstract:
In this paper, we present a comparison between convolutional neural networks and classical computer vision approaches for the specific precision-agriculture problem of weed mapping in aerial images of sugarcane fields. A systematic literature review was conducted to find which computer vision methods are being used for this specific problem. The most cited methods were implemented, as well as four models of convolutional neural networks. All implemented approaches were tested on the same dataset, and their results were quantitatively and qualitatively analyzed. The obtained results were compared against a ground truth created by a human expert for validation. The results indicate that the convolutional neural networks achieve better precision and generalize better than the classical models.
Keywords: convolutional neural networks, deep learning, digital image processing, precision agriculture, semantic segmentation, unmanned aerial vehicles
Procedia PDF Downloads 260
4766 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria
Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova
Abstract:
Ambient air pollution with fine particulate matter (PM10) is a systematic, permanent problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allows for their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, over a period of 5 years are used. Predictors in the models are seven meteorological variables and time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series, respectively. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast PM10 concentrations two days ahead of the last date in the modeling procedure and show very accurate results.
Keywords: cross-validation, decision tree, lagged variables, short-term forecasting
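A minimal sketch of the modeling setup described above, using scikit-learn's CART implementation on synthetic data; the variable names and series are illustrative placeholders, not the Pleven measurements:

```python
# Hedged sketch: a CART regression tree predicting PM10 from meteorological
# predictors plus 1- and 2-day lagged PM10 values, on synthetic data.
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 365
df = pd.DataFrame({
    "temp": rng.normal(10, 8, n),       # daily mean temperature
    "wind": rng.gamma(2.0, 1.5, n),     # daily mean wind speed
})
# synthetic PM10 driven by the meteorology, plus noise
df["pm10"] = 40 - 1.2 * df["temp"] - 3.0 * df["wind"] + rng.normal(0, 5, n)
# lagged predictors, delayed by 1 and 2 days
df["pm10_lag1"] = df["pm10"].shift(1)
df["pm10_lag2"] = df["pm10"].shift(2)
df = df.dropna()

X = df[["temp", "wind", "pm10_lag1", "pm10_lag2"]]
y = df["pm10"]
tree = DecisionTreeRegressor(max_depth=4, random_state=0).fit(X, y)
# relative influence of each predictor, as in the study's importance analysis
print(dict(zip(X.columns, tree.feature_importances_.round(2))))
```

The `feature_importances_` attribute is the tree-based analogue of the study's "degree of influence" of each predictor.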
Procedia PDF Downloads 194
4765 Scoring System for the Prognosis of Sepsis Patients in Intensive Care Units
Authors: Javier E. García-Gallo, Nelson J. Fonseca-Ruiz, John F. Duitama-Munoz
Abstract:
Sepsis is a syndrome of physiological and biochemical abnormalities induced by severe infection, and it carries high mortality and morbidity; therefore, the severity of a patient's condition must be assessed quickly. After admission to an intensive care unit (ICU), it is necessary to synthesize the large volume of information collected from the patient into a value that represents the severity of their condition. Traditional severity-of-illness scores seek to be applicable to all patient populations and usually assess in-hospital mortality. However, the use of machine learning techniques on data from a population sharing a common characteristic could lead to customized mortality prediction scores with better performance. This study presents the development of a score for one-year mortality prediction of patients admitted to an ICU with a sepsis diagnosis. 5650 ICU admissions extracted from the MIMIC-III database were evaluated, divided into two groups: 70% to develop the score and 30% to validate it. Comorbidities, demographics, and clinical information from the first 24 hours after ICU admission were used to develop the mortality prediction score. LASSO (least absolute shrinkage and selection operator) and SGB (stochastic gradient boosting) variable-importance methodologies were used to select the set of variables that make up the score. Each of these variables was dichotomized at a cut-off point that divides the population into two groups with different mean mortalities; if the patient is in the group with the higher mortality, a one is assigned to the particular variable, otherwise a zero. These binary variables were used in a logistic regression (LR) model, and its coefficients were rounded to the nearest integer. The resulting integers are the point values that make up the score when multiplied by the binary variables and summed.
The one-year mortality probability was estimated using the score as the only variable in an LR model. The predictive power of the score was evaluated using the 1695 admissions of the validation subset, obtaining an area under the receiver operating characteristic curve of 0.7528, which outperforms the results obtained with the Sequential Organ Failure Assessment (SOFA), Oxford Acute Severity of Illness Score (OASIS), and Simplified Acute Physiology Score II (SAPS II) on the same validation subset. Observed and predicted mortality rates within the deciles of estimated probabilities were compared graphically and found to be similar, indicating that the risk estimate obtained with the score is close to the observed mortality; the number of events (deaths) indeed increases from the decile with the lowest probabilities to the decile with the highest. Sepsis is a syndrome that carries a high mortality, 43.3% for the patients included in this study; therefore, tools that help clinicians quickly and accurately identify a worse prognosis are needed. This work demonstrates the importance of customizing mortality prediction scores, since the developed score provides better performance than traditional scoring systems.
Keywords: intensive care, logistic regression model, mortality prediction, sepsis, severity of illness, stochastic gradient boosting
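The dichotomize-then-round score-building recipe described above can be sketched as follows on synthetic data; the variables, cut-offs, and outcome model are illustrative placeholders, not the MIMIC-III cohort.

```python
# Hedged sketch of the score construction: dichotomize each selected
# variable at a cut-off, fit a logistic regression on the binary
# indicators, and round the coefficients to integer point values.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 1000
age = rng.normal(65, 12, n)          # placeholder clinical variables
lactate = rng.gamma(2.0, 1.2, n)
# synthetic one-year mortality influenced by both variables
logit = -4 + 0.04 * age + 0.5 * lactate
died = rng.random(n) < 1 / (1 + np.exp(-logit))

# Step 1: dichotomize at cut-offs separating higher/lower mortality groups
X = np.column_stack([(age > 70).astype(int), (lactate > 2.5).astype(int)])

# Step 2: logistic regression on the binary indicators
lr = LogisticRegression().fit(X, died)

# Step 3: round coefficients to the nearest integer -> point values
points = np.rint(lr.coef_[0]).astype(int)
score = X @ points   # a patient's score is the sum of earned points
print(points, score[:5])
```

In the study, the resulting integer score is then used as the single covariate of a second LR model to estimate each patient's one-year mortality probability.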
Procedia PDF Downloads 222
4764 Design and Implementation of a Geodatabase and WebGIS
Authors: Sajid Ali, Dietrich Schröder
Abstract:
The merging of the internet and the Web has created many disciplines, and Web GIS is one of these disciplines, dealing effectively and proficiently with geospatial data. Web GIS technologies have made it easy to access and share geospatial data over the internet. However, the European Caribbean Association (Europäische Karibische Gesellschaft, EKG) lacks a single platform for easy, multi-user access to its data, needed to assist its members and the wider research community. The technique presented in this paper covers the design of a geodatabase using PostgreSQL/PostGIS as an object-relational database management system (ORDBMS) for competent dissemination and management of spatial data, and a Web GIS using OpenGeo Suite for fast sharing and distribution of the data over the internet. The characteristics of the required geodatabase design have been studied, and a specific methodology is given for designing the Web GIS. Finally, this web-based geodatabase has been validated with two desktop GIS packages and a web map application, and it is discussed how the contribution provides all the desired modules to expedite further research in the area as per the requirements.
Keywords: desktop GIS software, European Caribbean Association, geodatabase, OpenGeo Suite, PostgreSQL/PostGIS, WebGIS, web map application
Procedia PDF Downloads 341
4763 The Application of a Neural Network in the Reworking of Accu-Chek to Wrist Bands to Monitor Blood Glucose in the Human Body
Authors: J. K Adedeji, O. H Olowomofe, C. O Alo, S.T Ijatuyi
Abstract:
The issue of high blood sugar, which may end up as diabetes mellitus, is becoming a rampant cardiovascular disorder in our community. In recent times, a lack of awareness among most people has made this disease a silent killer. The situation calls for urgency, hence the need to design a device that serves as a monitoring tool, such as a wristwatch, to alert those living with high blood glucose to the danger ahead of time, as well as to introduce a mechanism for checks and balances. The neural network assumes an 8-15-10 configuration: eight neurons at the input stage including a bias, 15 neurons in the hidden layer at the processing stage, and 10 neurons at the output stage indicating likely symptom cases. The inputs are formed using the exclusive OR (XOR), with the expectation of getting an XOR output as the threshold value for diabetic symptom cases. The neural algorithm is coded in Java and run for 1000 epochs to bring the errors to the barest minimum. The internal circuitry of the device comprises compatible hardware matching the nature of each of the input neurons. Light-emitting diodes (LEDs) of red, green, and yellow are used as the output of the neural network to show pattern recognition for severe cases, pre-hypertensive cases, and normal cases without traces of diabetes mellitus. The research concluded that a neural network is a more efficient Accu-Chek design tool for the proper monitoring of high glucose levels than conventional blood-test methods.
Keywords: Accu-Chek, diabetes, neural network, pattern recognition
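A forward pass through the 8-15-10 architecture described above can be sketched as follows (in Python rather than the paper's Java, and with random placeholder weights standing in for the trained ones):

```python
# Hedged sketch of one forward pass through an 8-15-10 feed-forward
# network: 8 inputs (incl. bias), 15 hidden neurons, 10 output neurons.
# Weights are random placeholders; the paper trains them over 1000 epochs.
import numpy as np

rng = np.random.default_rng(42)
W1 = rng.normal(0, 0.5, (15, 8))   # input -> hidden weights
W2 = rng.normal(0, 0.5, (10, 15))  # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    """x is the 8-element input vector (bias included as one element)."""
    hidden = sigmoid(W1 @ x)
    return sigmoid(W2 @ hidden)    # 10 outputs, one per symptom class

out = forward(rng.random(8))
print(out.shape)  # each output is a value in (0, 1)
```

In the device, the 10 outputs would be thresholded and mapped to the red/yellow/green LED indications.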
Procedia PDF Downloads 147
4762 'Low Electronic Noise' Detector Technology in Computed Tomography
Authors: A. Ikhlef
Abstract:
Image noise in computed tomography is caused mainly by statistical noise, system (electronic) noise, and reconstruction algorithm filters. In the last few years, low-dose x-ray imaging has become more and more desired and is seen as a technically differentiating technology among CT manufacturers. To achieve this goal, several technologies and techniques are being investigated, including both hardware (integrated electronics and photon counting) and software (artificial intelligence and machine learning) based solutions. From a hardware point of view, electronic noise could indeed be a potential driver for low- and ultra-low-dose imaging. We demonstrated that the reduction or elimination of this term could lead to a reduction of dose without affecting image quality. In this study, we will also show that we can achieve this goal using conventional electronics (a low-cost and affordable technology), designed carefully and optimized for maximum detective quantum efficiency. We conducted the tests using large imaging objects such as 30 cm water and 43 cm polyethylene phantoms. We compared the image quality with conventional imaging protocols at radiation as low as 10 mAs (<< 1 mGy). Clinical validation of these results has been performed as well.
Keywords: computed tomography, electronic noise, scintillation detector, x-ray detector
Procedia PDF Downloads 126
4761 Experimental Study of Impregnated Diamond Bit Wear During Sharpening
Authors: Rui Huang, Thomas Richard, Masood Mostofi
Abstract:
The lifetime of impregnated diamond bits and their drilling efficiency are in part governed by the bit wear conditions: not only the extent of the diamonds' wear but also their exposure or protrusion out of the bonding matrix. As much as individual diamonds wear, the bonding matrix also wears, through two-body abrasion (direct matrix-rock contact) and three-body erosion (cuttings trapped in the space between rock and matrix). Although there is some work dedicated to the study of diamond bit wear, there is still a lack of understanding of how matrix erosion and diamond exposure relate to the bit's drilling response and efficiency, and there is no literature on the process that governs bit sharpening, a procedure commonly implemented by drillers when the extent of diamond polishing yields an extremely low rate of penetration. The aim of this research is (i) to derive a correlation between the wear state of the bit and the drilling performance and (ii) to gain a better understanding of the process associated with tool sharpening. The research effort combines specific drilling experiments and precise mapping of the tool cutting face (impregnated diamond bits and segments). Bit wear is produced by drilling through a rock sample at a fixed rate of penetration for a given period of time. Before and after each wear test, the bit's drilling response, and thus its efficiency, is mapped out using a tailored experimental protocol. After each drilling test, the bit or segment cutting face is scanned with an optical microscope. The test results show that, under a fixed rate of penetration, diamond exposure increases with drilling distance but at a decreasing rate, up to a threshold exposure that corresponds to the optimum drilling condition for this feed rate. The data further show that the threshold exposure scales with the rate of penetration, up to a point where exposure reaches a maximum beyond which no more matrix can be eroded under normal drilling conditions.
The second phase of this research focuses on the wear process referred to as bit sharpening. Drillers rely on different approaches (increasing the feed rate or decreasing the flow rate) with the aim of tearing worn diamonds away from the bit matrix, wearing out some of the matrix, and thus exposing fresh sharp diamonds and recovering a higher rate of penetration. Although it is a common procedure, there is no rigorous methodology to sharpen the bit while avoiding excessive wear or bit damage. This paper aims to gain insight into the mechanisms that accompany bit sharpening by carefully tracking diamond fracturing, matrix wear, and erosion, and how they relate to the drilling parameters recorded while sharpening the tool. The results show that there exist optimal conditions (operating parameters and duration of the procedure) for sharpening that minimize overall bit wear, and that the extent of bit sharpening can be monitored in real time.
Keywords: bit sharpening, diamond exposure, drilling response, impregnated diamond bit, matrix erosion, wear rate
Procedia PDF Downloads 99
4760 3D Text Toys: Creative Approach to Experiential and Immersive Learning for World Literacy
Authors: Azyz Sharafy
Abstract:
3D Text Toys is an innovative and creative approach that utilizes 3D text objects to enhance creativity, literacy, and basic learning in an enjoyable and gamified manner. By using 3D Text Toys, children can develop their creativity, visually learn words and texts, and apply their artistic talents within their creative abilities. This process incorporates haptic engagement with 2D and 3D texts, word building, and mechanical construction of everyday objects, thereby facilitating better word and text retention. The concept involves constructing visual objects made entirely out of 3D text/words, where each component of the object represents a word or text element. For instance, a bird can be recreated using words or text shaped like its wings, beak, legs, head, and body, resulting in a 3D representation of the bird purely composed of text. This can serve as an art piece or a learning tool in the form of a 3D text toy. These 3D text objects or toys can be crafted using natural materials such as leaves, twigs, strings, or ropes, or they can be made from various physical materials using traditional crafting tools. Digital versions of these objects can be created using 2D or 3D software on devices like phones, laptops, iPads, or computers. To transform digital designs into physical objects, computerized machines such as CNC routers, laser cutters, and 3D printers can be utilized. Once the parts are printed or cut out, students can assemble the 3D texts by gluing them together, resulting in natural or everyday 3D text objects. These objects can be painted to create artistic pieces or text toys, and the addition of wheels can transform them into moving toys. One of the significant advantages of this visual and creative object-based learning process is that students not only learn words but also derive enjoyment from the process of creating, painting, and playing with these objects. The ownership and creation process further enhances comprehension and word retention. 
Moreover, for individuals with learning disabilities such as dyslexia, ADD (attention deficit disorder), or other learning difficulties, the visual and haptic approach of 3D Text Toys can serve as an additional creative and personalized learning aid. The application of 3D Text Toys extends both to the English language and to any other written language; the adaptation and creative application may vary depending on the country, space, and native written language. Furthermore, the implementation of this visual and haptic learning tool can be tailored to teach foreign languages based on age level and comprehension requirements. In summary, this creative, haptic, and visual approach has the potential to serve as a global literacy tool.
Keywords: 3D text toys, creative, artistic, visual learning for world literacy
Procedia PDF Downloads 64
4759 Investigation of Martensitic Transformation Zone at the Crack Tip of NiTi under Mode-I Loading Using Microscopic Image Correlation
Authors: Nima Shafaghi, Gunay Anlaş, C. Can Aydiner
Abstract:
A realistic understanding of the martensitic phase transition under complex stress states is key for accurately describing the mechanical behavior of shape memory alloys (SMAs). Particularly regarding the sharply changing stress fields at the tip of a crack, the size, nature, and shape of the transformed zones are of great interest. There is significant variation among analytical models in their predictions of the size and shape of the transformation zone. As the fully transformed region remains inside a very small boundary at the tip of the crack, experimental validation requires microscopic resolution. Here, the crack-tip vicinity of a NiTi compact tension specimen has been monitored in situ with microscopic image correlation at 20x magnification. With nominal 15-micrometer grains and an optical resolution of 0.2 micrometers per pixel, the strains at the crack tip are mapped with intra-grain detail. The transformation regions are then deduced using an equivalent strain formulation.
Keywords: digital image correlation, fracture, martensitic phase transition, mode I, NiTi, transformation zone
Procedia PDF Downloads 353
4758 Functional Instruction Set Simulator of a Neural Network IP with Native Brain Float-16 Generator
Authors: Debajyoti Mukherjee, Arathy B. S., Arpita Sahu, Saranga P. Pogula
Abstract:
A functional model that mimics the functional correctness of a neural network compute accelerator IP is crucial for design validation. Neural network workloads are based on the Brain Floating Point (BF-16) data type. The major challenge we faced was the incompatibility of GCC compilers with the BF-16 datatype, which we addressed with a native BF-16 generator integrated into our functional model. Moreover, working with big GEMM (general matrix multiplication) or SpMM (sparse matrix multiplication) workloads and debugging failures related to data integrity is highly painstaking. In this paper, we address the quality challenge of such a complex neural network accelerator design by proposing a functional-model-based scoreboard, or software model, using SystemC. The proposed functional model executes the assembly code based on the ISA of the processor IP, decodes all instructions, and executes them as expected of the DUT. This model provides considerable visibility and debug capability into the DUT by exposing the micro-steps of execution.
Keywords: ISA, neural network, Brain Float-16, DUT
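The native BF-16 generator idea rests on the fact that BF-16 keeps the top 16 bits of an IEEE-754 float32, so a generator can be built from bit truncation with round-to-nearest-even. The following is a hedged illustration of that conversion in Python, not the authors' SystemC implementation:

```python
# Hedged sketch: converting between float32 and the 16-bit BF-16 pattern.
# BF-16 is the upper half of a float32 (1 sign, 8 exponent, 7 mantissa bits).
import struct

def float_to_bf16_bits(x: float) -> int:
    """Return the 16-bit BF-16 pattern for x, round-to-nearest-even."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    # add 0x7FFF, plus 1 if the bit we keep last is set (ties-to-even)
    rounding = 0x7FFF + ((bits >> 16) & 1)
    return ((bits + rounding) >> 16) & 0xFFFF

def bf16_bits_to_float(b: int) -> float:
    """Widen a BF-16 pattern back to float32 by zero-filling the low bits."""
    (x,) = struct.unpack("<f", struct.pack("<I", b << 16))
    return x

b = float_to_bf16_bits(1.0)
print(hex(b), bf16_bits_to_float(b))  # 1.0 round-trips exactly
```

Values exactly representable in BF-16 (such as 1.0 and 1.5) round-trip unchanged; other float32 values are rounded to the nearest BF-16 neighbour.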
Procedia PDF Downloads 94
4757 Statistical Quality Control on Assignable Causes of Variation on Cement Production in Ashaka Cement PLC Gombe State
Authors: Hamisu Idi
Abstract:
The present study focuses on the impact of assignable causes of variation on the quality of cement production. Exploratory research was done on a monthly basis, with data obtained from a secondary source, i.e., the records kept by an automated recompilation machine. The machine keeps all records of the mills' downtime, which the process manager checks for validation, referring any fault to the department responsible for maintenance or measurement so as to prevent future occurrences. The findings indicated that the products of Ashaka Cement Plc were of acceptable quality, since all the production processes were found to be in control (within preset specifications), with the exception of the natural causes of variation, which are normal in the production process and will not affect the outcome of the product; they are reduced to the barest minimum, since they cannot be totally eliminated. It is hoped that the findings of this study will be of great assistance to the management of the Ashaka cement factory, and to the process manager in particular, at various levels in the monitoring and implementation of statistical process control. This study therefore contributes to knowledge in this regard, and it is hoped that it will open more research in that direction.
Keywords: cement, quality, variation, assignable cause, common cause
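The in-control judgement described above follows standard Shewhart control-chart logic: points outside the mean ± 3-sigma limits flag assignable causes, while variation inside the limits is treated as common (natural) cause. A minimal sketch on synthetic data, not the actual mill records:

```python
# Hedged sketch of a Shewhart-style individuals control chart check.
# The sample values are illustrative, not Ashaka Cement measurements.
import numpy as np

rng = np.random.default_rng(7)
samples = rng.normal(50.0, 2.0, 30)   # e.g. monthly quality measurements

center = samples.mean()
sigma = samples.std(ddof=1)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # control limits

# points outside the limits would indicate assignable causes of variation
out_of_control = [(i, x) for i, x in enumerate(samples)
                  if not lcl <= x <= ucl]
print(len(out_of_control))
```

A process with only common-cause variation, as reported for the plant, would yield an empty (or near-empty) out-of-control list.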
Procedia PDF Downloads 261
4756 Optimized Preprocessing for Accurate and Efficient Bioassay Prediction with Machine Learning Algorithms
Authors: Jeff Clarine, Chang-Shyh Peng, Daisy Sang
Abstract:
Bioassay is the measurement of the potency of a chemical substance by its effect on living animal or plant tissue. Bioassay data and chemical structures from pharmacokinetic and drug metabolism screening are mined from, and housed in, multiple databases, and bioassay predictions are calculated from them to determine further advancement. This paper proposes a four-step preprocessing of datasets for improving bioassay predictions. The first step is instance selection, in which the dataset is partitioned into training, testing, and validation sets. The second step is discretization, which partitions the data in consideration of accuracy vs. precision. The third step is normalization, where data are scaled between 0 and 1 for subsequent machine learning processing. The fourth step is feature selection, where key chemical properties and attributes are generated. The streamlined results are then analyzed for the prediction of effectiveness by various machine learning tools, including Pipeline Pilot, R, Weka, and Excel. Experiments and evaluations reveal the effectiveness of various combinations of preprocessing steps and machine learning algorithms in producing more consistent and accurate predictions.Keywords: bioassay, machine learning, preprocessing, virtual screen
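Two of the four preprocessing steps lend themselves to a compact sketch. Assuming nothing about the authors' actual pipeline, min-max normalization to [0, 1] (step three) and equal-width discretization into bin labels (step two) could look like the following; the names are ours:

```python
def min_max_normalize(values):
    """Scale a list of numbers linearly into [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # constant feature: map everything to 0
    return [(v - lo) / (hi - lo) for v in values]

def discretize(values, n_bins):
    """Equal-width binning of normalized values into integer labels 0..n_bins-1."""
    return [min(int(v * n_bins), n_bins - 1) for v in min_max_normalize(values)]
```

Normalization keeps features on a common scale for the learners, while discretization trades precision for robustness, the accuracy-vs-precision consideration the abstract mentions.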
Procedia PDF Downloads 274
4755 A Visual Analytics Tool for the Structural Health Monitoring of an Aircraft Panel
Authors: F. M. Pisano, M. Ciminello
Abstract:
Aerospace, mechanical, and civil engineering infrastructures can take advantage of damage detection and identification strategies in terms of maintenance cost reduction and operational life improvement, as well as for safety purposes. The challenge is to detect so-called “barely visible impact damage” (BVID), due to low/medium energy impacts, that can progressively compromise the structure's integrity. The occurrence of any local change in material properties that can degrade the structure's performance is to be monitored using so-called Structural Health Monitoring (SHM) systems, which are in charge of comparing the structure's states before and after damage occurs. SHM seeks any "anomalous" response collected by means of sensor networks, which is then analyzed using appropriate algorithms. Independently of the specific analysis approach adopted for structural damage detection and localization, textual reports, tables, and graphs describing possible outlier coordinates and damage severity are usually provided as artifacts to be elaborated for information extraction about the current health conditions of the structure under investigation. Visual Analytics can support the processing of monitored measurements by offering data navigation and exploration tools that leverage the native human capability of understanding images faster than texts and tables. Herein, the enrichment of an SHM system by the integration of a Visual Analytics component is investigated. Analytical dashboards have been created by combining worksheets, so that a useful Visual Analytics tool is provided to structural analysts for exploring the structural health conditions examined by a Principal Component Analysis based algorithm.Keywords: interactive dashboards, optical fibers, structural health monitoring, visual analytics
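A Principal Component Analysis based outlier search of the kind mentioned above can be illustrated for two sensor channels: samples are scored by their residual distance from the first principal axis, and unusually large residuals flag anomalous responses. This is a self-contained sketch under our own assumptions (two features, closed-form 2x2 eigenproblem), not the authors' algorithm.

```python
def pca_outlier_scores(data):
    """Residual distance of each 2-D sample from the first principal axis."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / n
    syy = sum((y - my) ** 2 for _, y in data) / n
    sxy = sum((x - mx) * (y - my) for x, y in data) / n
    # Largest eigenvalue of the 2x2 covariance matrix (closed form).
    lam = 0.5 * (sxx + syy + ((sxx - syy) ** 2 + 4 * sxy ** 2) ** 0.5)
    # Corresponding eigenvector; fall back to an axis when covariance is diagonal.
    vx, vy = (sxy, lam - sxx) if abs(sxy) > 1e-12 else (
        (1.0, 0.0) if sxx >= syy else (0.0, 1.0))
    norm = (vx * vx + vy * vy) ** 0.5
    vx, vy = vx / norm, vy / norm
    # Score = magnitude of the component orthogonal to the principal axis.
    return [abs(-vy * (x - mx) + vx * (y - my)) for x, y in data]
```

Samples lying along the dominant trend score near zero; a response that departs from it, as a local stiffness change might cause, stands out as the maximum score.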
Procedia PDF Downloads 124
4754 Digital Platform of Crops for Smart Agriculture
Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye
Abstract:
In agriculture, estimating crop yields is key to improving productivity and to decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN, and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application, and mobile application) is operational and validates empirical knowledge of agro-climatic parameters, in addition to providing proactive decision-making support. In the experimental results obtained on the agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for these data. The proposed applications demonstrate that the proposed approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.Keywords: prediction, machine learning, artificial intelligence, digital agriculture
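The cross-validation used to compare the algorithms can be sketched without any ML library: the data are split into k folds, each fold serves once as the test set, and the scores are averaged. Function names are hypothetical, and any fit/predict callable can stand in for CART, KNN, or SVM.

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation."""
    folds = [list(range(i, n, k)) for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

def cross_val_score(model_fit_predict, X, y, k=5):
    """Mean accuracy of a fit/predict callable over k folds."""
    scores = []
    for train, test in k_fold_indices(len(X), k):
        preds = model_fit_predict([X[i] for i in train], [y[i] for i in train],
                                  [X[i] for i in test])
        correct = sum(p == y[i] for p, i in zip(preds, test))
        scores.append(correct / len(test))
    return sum(scores) / k
```

Because every sample is tested exactly once on a model that never saw it during training, the averaged score is a fairer basis for ranking algorithms than a single train/test split.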
Procedia PDF Downloads 80
4753 Technology Maps in Energy Applications Based on Patent Trends: A Case Study
Authors: Juan David Sepulveda
Abstract:
This article reflects the current stage of progress of the project “Determining technological trends in energy generation”. At first, the project was oriented towards finding those trends by employing tools that the scientometrics community has proved and accepted as effective for obtaining reliable results. Because a documented methodological guide for this purpose could not be found, the decision was made to reorient the scope and aim of the project, and to propose and implement a novel guide built from the elements and techniques found in the available literature. This article begins by explaining the elements and considerations taken into account when implementing and applying this methodology, and the tools that led to the implementation of a software application for patent revision. Univariate analysis helped recognize the technological leaders in the field of energy and steered the way for a multivariate analysis of this sample, which allowed for a graphical description of mature technologies as well as the detection of emerging technologies. The article ends with a validation of the methodology as applied to the case of fuel cells.Keywords: energy, technology mapping, patents, univariate analysis
Procedia PDF Downloads 476
4752 Evaluation of the Ability of COVID-19 Infected Sera to Induce Netosis Using an Ex-Vivo NETosis Monitoring Tool
Authors: Constant Gillot, Pauline Michaux, Julien Favresse, Jean-Michel Dogné, Jonathan Douxfils
Abstract:
Introduction: NETosis has emerged as a crucial yet paradoxical factor in severe COVID-19 cases. While neutrophil extracellular traps (NETs) help contain and eliminate viral particles, excessive NET formation can lead to hyperinflammation, exacerbating tissue damage and acute respiratory distress syndrome (ARDS). Aims: This study evaluates the relationship between COVID-19-infected sera and NETosis using an ex-vivo model. Methods: Sera from 8 post-admission COVID-19 patients who had received corticoid therapy were used to induce NETosis in neutrophils from a healthy donor. NET formation was tracked using fluorescent markers for DNA and neutrophil elastase (NE) every 2 minutes for 8 hours. The results were expressed as the percentage of DNA/NE released over time. Key metrics were calculated, including T50 (time to 50% release) and AUC (area under the curve, representing total NETosis potential). A 27-cytokine screening kit was used to assess the cytokine composition of the sera. Results: COVID-19 sera induced NETosis according to their cytokine profile. The AUC of NE and DNA release decreased with time following corticoid therapy, showing a significant reduction in 6 of the 8 patients (p<0.05). T50 also decreased in parallel with AUC for both markers. Cytokine concentrations decreased with time after therapy administration, and the concentrations of 14 cytokines correlated with NE release. Conclusion: This ex-vivo model successfully demonstrated the induction of NETosis by COVID-19 sera using two markers. A clear decrease in NETosis potential was observed over time with glucocorticoid therapy. This model can be a valuable tool for monitoring NETosis and investigating potential NETosis inducers and inhibitors.Keywords: NETosis, COVID-19, cytokine storm, biomarkers
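The two key metrics are straightforward to compute from the sampled release curve: AUC by the trapezoidal rule and T50 by linear interpolation between the samples that bracket 50%. The sketch below assumes release is expressed in percent; function names are ours, not the authors'.

```python
def auc(times, values):
    """Trapezoidal area under a sampled release curve."""
    return sum((t1 - t0) * (v0 + v1) / 2
               for (t0, v0), (t1, v1) in zip(zip(times, values),
                                             zip(times[1:], values[1:])))

def t50(times, values, target=50.0):
    """First time the curve crosses `target`, by linear interpolation; None if never."""
    for (t0, v0), (t1, v1) in zip(zip(times, values), zip(times[1:], values[1:])):
        if v0 < target <= v1:
            return t0 + (target - v0) * (t1 - t0) / (v1 - v0)
    return None
```

A shrinking AUC indicates less total release over the observation window, while a shifting T50 captures how quickly the release unfolds, the two complementary views the study reports.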
Procedia PDF Downloads 20
4751 Engineering Method to Measure the Impact Sound Improvement with Floor Coverings
Authors: Katarzyna Baruch, Agata Szelag, Jaroslaw Rubacha, Bartlomiej Chojnacki, Tadeusz Kamisinski
Abstract:
The methodology for measuring the reduction of transmitted impact sound by floor coverings placed on a massive floor is described in ISO 10140-3:2010. To carry out such tests, a standardised reverberation room, separated by a standard floor from a second measuring room, is required. The need for a special laboratory results in the high cost and low accessibility of this measurement. The authors propose their own engineering method to measure the impact sound improvement achieved with floor coverings, which requires neither standard rooms nor a standard floor. This paper describes the measurement procedure of the proposed engineering method and the verification tests that were performed. Validation of the proposed method was based on an analytical model, a Statistical Energy Analysis (SEA) model, and empirical measurements. The results obtained were compared with the corresponding ones from ISO 10140-3:2010 measurements. The study confirmed the usefulness of the engineering method.Keywords: building acoustic, impact noise, impact sound insulation, impact sound transmission, reduction of impact sound
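The quantity of interest, the reduction of impact sound level, is the per-band difference between the levels measured without and with the covering, and band levels combine energetically rather than arithmetically. A minimal sketch with illustrative names (not the authors' procedure or the full ISO rating method):

```python
import math

def band_improvement(l_bare, l_covered):
    """Per-band reduction dL_i = L_bare,i - L_covered,i, in dB."""
    return [b - c for b, c in zip(l_bare, l_covered)]

def overall_level(levels_db):
    """Energetic sum of band levels: 10*log10(sum of 10^(L/10))."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))
```

The energetic sum is why two equal 60 dB bands combine to about 63 dB, not 120 dB, a point that matters when collapsing band data to a single number.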
Procedia PDF Downloads 324
4750 Experimental Study of the Behavior of Elongated Non-spherical Particles in Wall-Bounded Turbulent Flows
Authors: Manuel Alejandro Taborda Ceballos, Martin Sommerfeld
Abstract:
Transport phenomena and the dispersion of non-spherical particles in turbulent flows are found everywhere in industrial applications and processes. Powder handling, pollution control, pneumatic transport, and particle separation are just some examples where the particles encountered are not only spherical. These types of multiphase flows are wall-bounded and mostly highly turbulent. The particles found in these processes are rarely spherical but may have various shapes (e.g., fibers and rods). Although research related to the behavior of regular non-spherical particles in turbulent flows has been carried out for many years, it is still necessary to refine models, especially near walls, where fiber-wall interaction completely changes the behavior. Imaging-based experimental studies on dispersed particle-laden flows have been applied for many decades for detailed experimental analysis. These techniques have the advantage that they provide field information in two or three dimensions, but they have a lower temporal resolution than point-wise techniques such as PDA (phase-Doppler anemometry) and derivations therefrom. The imaging techniques applied in dispersed two-phase flows are extensions of classical PIV (particle image velocimetry) and PTV (particle tracking velocimetry), and the main emphasis has been the simultaneous measurement of the velocity fields of both phases. In a similar way, such data should also provide adequate information for validating the proposed models. Available experimental studies on the behavior of non-spherical particles are uncommon and mostly based on planar light-sheet measurements. For elongated non-spherical particles in particular, however, three-dimensional measurements are needed to fully describe their motion and to provide sufficient information for the validation of numerical computations. 
To provide detailed experimental results allowing the validation of numerical calculations of non-spherical particle dispersion in turbulent flows, a water channel test facility was built around a horizontal closed water channel. Into this horizontal main flow, a small cross-jet laden with fiber-like particles was injected, driven solely by gravity. The dispersion of the fibers was measured by applying imaging techniques based on an LED array for backlighting and high-speed cameras. For obtaining the fluid velocity fields, almost neutrally buoyant tracer particles were used. The discrimination between tracer and fibers was done based on image size, which was also the basis for determining fiber orientation with respect to the inertial coordinate system. The synchronous measurement of fluid velocity and fiber properties also allows the collection of statistics of fiber orientation, the velocity fields of tracer and fibers, the angular velocity of the fibers, and the orientation between the fiber and the instantaneous relative velocity. Consequently, an experimental study of the behavior of elongated non-spherical particles in wall-bounded turbulent flows was achieved, and a comprehensive analysis was developed, especially for the near-wall region, where hydrodynamic wall-interaction effects (e.g., collision or lubrication) and abrupt changes of particle rotational velocity occur. This allows the behavior of non-spherical particles to be predicted numerically afterwards within the frame of the Euler/Lagrange approach, where the particles are treated as “point-particles”.Keywords: crossflow, non-spherical particles, particle tracking velocimetry, PIV
Procedia PDF Downloads 86
4749 Self-Care and Emotional Wellbeing of Nurses Using Playback Theatre and Expressive Arts
Authors: Radhika Jain
Abstract:
The nursing community in India faces unique challenges, ranging from a lack of adequate career progression, the low social status attached to the profession, and a poor nurse-to-patient ratio leading to heavy workload, stress, and burnout, to a lack of general recognition and the responsibility of often having to deal with the ire of patients and their families. This study explores how a combination of Playback Theatre and Expressive Arts could be used as a very powerful tool to understand these concerns and, consequently, as a self-care tool to bring about a sense of well-being and emotional awareness for the nurses. For the purpose of this study, Playback Theatre was used as an entry tool to understand the nurses' thoughts, feelings, and concerns. Playback Theatre is a unique improvisational form of theatre developed by Jonathan Fox and Jo Salas in 1975, in which audience members share stories from their own lives and the performers play them back through a range of improv techniques such as metaphor, poetry, music, and movement. Playback Theatre helped first in warming the nurses up to the idea of sharing, and then gave them the confidence of a safe space in which to collectively go deeper into their emotional experiences. As the next step, structured sessions of Expressive Arts were conducted with the same set of nurses, for them to work on the issues and concerns they had shared during the Playback performance. These sessions enabled longer engagements, as many of the concerns expressed were related to perceptions and beliefs ingrained over a period of time, which need longer engagement to be worked on in detail. Expressive arts therapy combines psychology and the creative process to promote emotional growth and healing. The study was conducted at two places: one a geriatric centre and the other a palliative care centre. 
The study revealed that concerns and challenges are not identical across the nursing community or across similar types of healthcare organizations, but are specific to each organization or centre, as the circumstances and set-up at each place differ. At the geriatric centre, stress and burnout emerged as the main concerns, while at the palliative care centre, the main concern was the difficulty the nurses faced in expressing emotions and communicating their feelings. The objective analysis of the results indicated how longer-term engagements using Expressive Arts as the modality helped the nurses become more aware of their emotions and develop tools of self-care, while also tapping into their emotions to express and experience them. The process of eliciting the main concerns from the nurses through a Playback Theatre performance, followed by subsequent sessions of expressive arts, helped change the way the nurses approached their job and reduced the level of overwhelm they felt.Keywords: palliative care, nurses, self-care, expressive arts, playback theatre
Procedia PDF Downloads 120
4748 Enhancing Organizational Performance through Adaptive Learning: A Case Study of ASML
Authors: Ramin Shadani
Abstract:
This study introduces adaptive performance as a key dimension of organizational performance and explores the relationship between the dimensions of a learning organization and adaptive performance. A survey was conducted using the Dimensions of the Learning Organization Questionnaire (DLOQ), followed by factor analysis and structural equation modeling, in order to investigate the dynamics between learning organization practices and adaptive performance. The results confirm that adaptive performance is indeed an important dimension of organizational performance. The study also shows that perceived knowledge and adaptive performance mediate the positive relationship between the practices of a learning organization and perceived financial performance. We extend existing DLOQ research by demonstrating that adaptive performance, as a nonfinancial organizational learning outcome, has a significant impact on financial performance. Our study also provides additional validation of the DLOQ's performance measures. Indeed, organizations need to look at how learning and development activities can provide better overall improvement in performance, especially in enhancing adaptive capability. The study provides the requisite empirical support that learning and development activities within organizations yield much-improved intangible performance outcomes, especially through adaptive performance.Keywords: adaptive performance, continuous learning, financial performance, leadership style, organizational learning, organizational performance
Procedia PDF Downloads 30
4747 Performance Analysis on the Smoke Management System of the Weiwuying Center for the Arts Using Hot Smoke Tests
Authors: K. H. Yang, T. C. Yeh, P. S. Lu, F. C. Yang, T. Y. Wu, W. J. Sung
Abstract:
In this study, a series of full-scale hot smoke tests was conducted to validate the performance of the smoke management system in the WWY center for the arts before its grand opening. A total of 19 scenarios was established and tested, with fire sizes ranging from 2 MW to 10 MW. The measured ASET data provided by the smoke management system experiments were compared with the computer-simulated RSET values for egress obtained during the design phase. The experimental results indicated that this system could successfully provide a safety margin of 200% and ensure safe evacuation in case of fire in the WWY project, including worst-case and fail-safe scenarios. The methodology developed and the results obtained in this project can provide a useful reference for future applications, such as large-scale indoor sports domes and arenas, stadiums, shopping malls, airport terminals, and stations or tunnels for railway and subway systems.Keywords: building hot smoke tests, performance-based smoke management system designs, full-scale experimental validation, tenable condition criteria
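Reading the reported 200% safety margin as ASET being at least twice RSET (our assumption; the paper may define the margin differently), the per-scenario pass/fail check reduces to a one-line comparison:

```python
def safety_margin(aset, rset):
    """Safety margin of ASET (available time) over RSET (required time), in percent of RSET."""
    return 100.0 * aset / rset

def evacuation_safe(aset, rset, required_margin=200.0):
    """True when the measured ASET meets the required margin over the simulated RSET."""
    return safety_margin(aset, rset) >= required_margin
```

For example, a scenario with 600 s of tenable conditions against a 300 s simulated egress time yields exactly a 200% margin and passes; 500 s against the same RSET would not.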
Procedia PDF Downloads 445
4746 A Statistical Approach to Rationalise the Number of Working Load Test for Quality Control of Pile Installation in Singapore Jurong Formation
Authors: Nuo Xu, Kok Hun Goh, Jeyatharan Kumarasamy
Abstract:
Pile load testing is significant during foundation construction due to its traditional role of design validation and routine quality control of the piling works. In order to verify whether piles can take loading at specified settlements, piles have to undergo a working load test, where the test load should normally be up to 150% of the working load of the pile. The selection or sampling of piles for the working load test is done subject to the number specified in the Singapore National Annex to Eurocode 7, SS EN 1997-1:2010. This paper presents an innovative way to rationalise the number of pile load tests by adopting a statistical analysis approach and looking at the coefficient of variation of the pile elastic modulus, using a case study at the Singapore Tuas depot. The results are very promising and have shown that it is possible to reduce the number of working load tests without influencing the reliability of, and confidence in, the pile quality. Moving forward, it is suggested that more load test data from other geological formations be examined and compared with the findings of this paper.Keywords: elastic modulus of pile under soil interaction, jurong formation, kentledge test, pile load test
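The statistic at the heart of the proposed rationalisation, the coefficient of variation of the pile elastic modulus, is the sample standard deviation divided by the mean; a low value across tested piles suggests a uniform population and supports testing fewer piles. A sketch with hypothetical names (the acceptance threshold is illustrative, not taken from the paper):

```python
def coefficient_of_variation(values):
    """Sample coefficient of variation: standard deviation divided by the mean."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / (n - 1)
    return (variance ** 0.5) / mean

def supports_reduced_testing(moduli, threshold=0.10):
    """True when the pile moduli are uniform enough (CoV below an illustrative threshold)."""
    return coefficient_of_variation(moduli) < threshold
```

Because the CoV is dimensionless, it allows installations with different absolute stiffness, for example across geological formations, to be compared on the same footing.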
Procedia PDF Downloads 384
4745 Performance Investigation of Unmanned Aerial Vehicles Attitude Control Based on Modified PI-D and Nonlinear Dynamic Inversion
Authors: Ebrahim H. Kapeel, Ahmed M. Kamel, Hossam Hendy, Yehia Z. Elhalwagy
Abstract:
Interest in autopilot design has risen intensely as a result of recent advancements in Unmanned Aerial Vehicles (UAVs). Due to the enormous number of applications that UAVs can achieve, the number of control theories applied to them has increased in recent years. These small fixed-wing UAVs suffer from high non-linearity, sensitivity to disturbances, and coupling effects between their channels. In this work, a nonlinear dynamic inversion (NDI) control law is designed for a nonlinear small fixed-wing UAV model. NDI is preferable for varied operating conditions, since there is no need for a scheduling controller; moreover, it is applicable at high angles of attack. For validation of the designed flight controller, a nonlinear Modified PI-D controller is implemented with our model, and a comparative study between both controllers is carried out to evaluate the NDI performance. Simulation results and analysis are presented to illustrate the effectiveness of the designed controller based on NDI.Keywords: attitude control, nonlinear PID, dynamic inversion
Procedia PDF Downloads 111