Search results for: skin error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2872

1012 Maximum Deformation Estimation for Reinforced Concrete Buildings Using Equivalent Linearization Method

Authors: Chien-Kuo Chiu

Abstract:

In displacement-based seismic design and evaluation, the equivalent linearization method is one of the approximation methods used to estimate the maximum inelastic displacement response of a system. In this study, the accuracy of two equivalent linearization methods is investigated. The investigation covers three soil conditions in Taiwan (Taipei Basin 1, 2, and 3) and five building heights (H_r = 10, 20, 30, 40, and 50 m). The first method is the Taiwan equivalent linearization method (TELM), which was proposed based on the Japanese equivalent linear method with the modification factor α_T = 0.85. The second method is proposed on the basis of the Lin and Miranda study, with some modifications for Taiwan soil conditions. This study shows that the Taiwanese equivalent linearization method gives better estimates than the modified Lin and Miranda method (MLM). The error indices for the Taiwanese equivalent linearization method are 16%, 13%, and 12% for Taipei Basin 1, 2, and 3, respectively. Furthermore, a ductility demand spectrum of a single-degree-of-freedom (SDOF) system is presented as a guide for engineers to estimate the ductility demand of a structure.
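The abstract does not define its error index; a common choice, assumed here, is the mean absolute relative error (in %) between the estimated and exact maximum displacement responses. A minimal sketch under that assumption:

```python
def error_index(estimated, exact):
    """Mean absolute relative error (%) between estimated and exact
    maximum displacement responses (assumed definition)."""
    if len(estimated) != len(exact):
        raise ValueError("series must have equal length")
    rel_errors = [abs(e - x) / abs(x) for e, x in zip(estimated, exact)]
    return 100.0 * sum(rel_errors) / len(rel_errors)

# Illustrative numbers only: equivalent-linearization estimates vs.
# exact inelastic time-history results.
est = [0.12, 0.25, 0.40]
exact = [0.10, 0.24, 0.44]
print(round(error_index(est, exact), 1))
```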

Keywords: displacement-based design, ductility demand spectrum, equivalent linearization method, RC buildings, single-degree-of-freedom

Procedia PDF Downloads 146
1011 Optimal Sensing Technique for Estimating Stress Distribution of 2-D Steel Frame Structure Using Genetic Algorithm

Authors: Jun Su Park, Byung Kwan Oh, Jin Woo Hwang, Yousok Kim, Hyo Seon Park

Abstract:

For structural safety, the maximum stress calculated from the stress distribution of a structure is widely used. The stress distribution can be estimated from the deformed shape of the structure obtained from measurement. Although the estimate of stress is strongly affected by the location and number of sensing points, most studies have conducted stress estimation without a reasonable basis for the sensing plan, such as the location and number of sensors. In this paper, an optimal sensing technique for estimating the stress distribution is proposed. The technique determines the optimal location and number of sensing points for a 2-D frame structure by using a genetic algorithm to minimize the error between the stress distribution of the analytical model and the estimate obtained by cubic smoothing splines. To verify the proposed method, the optimal sensor measurement technique is applied in simulation tests on a 2-D steel frame structure under various loading scenarios. Through those tests, the optimal sensing plan for the structure is suggested and verified.
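The paper's genetic algorithm is not specified in the abstract; the sketch below shows the general shape of the idea with heavy simplifications: a fixed number of sensor locations is chosen among candidate nodes, and linear interpolation stands in for the cubic smoothing splines. Node layout, deflection shape, and GA parameters are all illustrative assumptions.

```python
import random

random.seed(0)

# Candidate sensing nodes along a member (positions in m) and an
# "analytical" deflection at each node (illustrative parabolic shape).
NODES = [i * 0.5 for i in range(21)]                 # 0.0 .. 10.0 m
DEFLECTION = [0.001 * x * (10 - x) for x in NODES]

def fitness(sensors):
    """Squared error between the analytical deflection and the shape
    rebuilt from the chosen sensors (linear interpolation stands in
    for the paper's cubic smoothing splines)."""
    pts = sorted(sensors)
    err = 0.0
    for i, x in enumerate(NODES):
        lo = max([p for p in pts if p <= x], default=pts[0])
        hi = min([p for p in pts if p >= x], default=pts[-1])
        if lo == hi:
            est = DEFLECTION[NODES.index(lo)]
        else:
            t = (x - lo) / (hi - lo)
            est = (1 - t) * DEFLECTION[NODES.index(lo)] + t * DEFLECTION[NODES.index(hi)]
        err += (est - DEFLECTION[i]) ** 2
    return err

def genetic_search(k=5, pop_size=30, generations=40):
    """Tiny GA: keep the fittest half, breed children by mixing parents."""
    pop = [random.sample(NODES, k) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = list(set(random.sample(a + b, k)))   # crossover
            while len(child) < k:                        # repair / mutation
                child.append(random.choice(NODES))
            children.append(child[:k])
        pop = survivors + children
    return min(pop, key=fitness)

best = genetic_search()
print(sorted(best), fitness(best))
```

The fitness function is where a real implementation would plug in the spline-based stress estimate and the structural model.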

Keywords: genetic algorithm, optimal sensing, optimizing sensor placements, steel frame structure

Procedia PDF Downloads 514
1010 A Multigrid Approach for Three-Dimensional Inverse Heat Conduction Problems

Authors: Jianhua Zhou, Yuwen Zhang

Abstract:

A two-step multigrid approach is proposed to solve the inverse heat conduction problem in a 3-D object under laser irradiation. In the first step, the location of the laser center is estimated using a coarse, uniform grid system. In the second step, the front-surface temperature is recovered with good accuracy using a multiple grid system in which a fine mesh is used at the laser spot center to capture the drastic temperature rise in this region, while a coarse mesh is employed in the peripheral region to reduce the total number of sensors required. The effectiveness of the two-step approach and the multiple grid system is demonstrated by the illustrative inverse solutions. If the measurement data for the temperature and heat flux on the back surface do not contain random error, the proposed multigrid approach yields more accurate inverse solutions. When the back-surface measurement data contain random noise, accurate inverse solutions cannot be obtained even if both temperature and heat flux are measured on the back surface.

Keywords: conduction, inverse problems, conjugate gradient method, laser

Procedia PDF Downloads 349
1009 Implementation of Data Science in Field of Homologation

Authors: Shubham Bhonde, Nekzad Doctor, Shashwat Gawande

Abstract:

For the use and import of keys and ID transmitters, as well as body control modules with radio transmission, homologation is required in many countries. The final deliverables of product homologation are certificates. Across the homologation landscape, there are approximately 200 certificates per product, most of them in local languages. It is challenging to manually investigate each certificate and extract relevant data, such as the expiry date and approval date. Accuracy is essential, as an error may lead to a missed re-homologation of a certificate, resulting in non-compliance. There is therefore scope for automating the reading of certificate data in the field of homologation, and we use deep learning as the tool for this automation. We first trained a model, once, by providing the basic data of all countries, feeding in PDF and JPG files through an ETL process; the trained model then yields increasingly accurate results. As an outcome, the expiry date and approval date of a certificate are obtained with a single click. This will eventually help to implement automation features on a broader level in the database where certificates are stored, reducing human error to an almost negligible level.
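The deep learning pipeline itself is not reproducible from the abstract, but the extraction target can be illustrated. The sketch below pulls labelled dates out of already-OCR'd certificate text with regular expressions; the field labels and date formats are hypothetical, and the variability of real certificates across countries and languages is exactly why the paper trains a model rather than writing rules like these.

```python
import re
from datetime import datetime

# Hypothetical field labels and formats; real certificates vary widely.
DATE_PATTERN = re.compile(
    r"(?P<label>approval date|expiry date)\s*[:\-]\s*"
    r"(?P<date>\d{2}[./-]\d{2}[./-]\d{4})",
    re.IGNORECASE,
)

def extract_dates(certificate_text):
    """Return {label: datetime} for every labelled date found."""
    found = {}
    for m in DATE_PATTERN.finditer(certificate_text):
        raw = re.sub(r"[./]", "-", m.group("date"))   # normalize separators
        found[m.group("label").lower()] = datetime.strptime(raw, "%d-%m-%Y")
    return found

sample = ("Type Approval Certificate\n"
          "Approval Date: 05.03.2019\n"
          "Expiry Date: 05-03-2024\n")
dates = extract_dates(sample)
print(dates["expiry date"].year)
```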

Keywords: homologation, re-homologation, data science, deep learning, machine learning, ETL (extract, transform, load)

Procedia PDF Downloads 144
1008 Acne Vulgaris Association with Smoking and Body Mass Index in Jordanian Young Adults

Authors: Almutazballlah Bassam Qablan, Jihan M. Muhaidat, Bana Abu Rajab

Abstract:

Background: Acne vulgaris is one of the most common skin conditions encountered by dermatologists. It is a chronic inflammation affecting the pilosebaceous unit. Although acne vulgaris is not fatal, it can lead to permanent scarring and disfigurement, and even without scarring it has a considerable effect on patients, causing negative health outcomes. Acne vulgaris patients experience the psychological and emotional ramifications seen in those with chronic health problems; they feel depressed, angry, anxious, and confused. Although acne is a common disease, many myths about its origins and triggering factors are still circulated, and these myths can make patients feel guilty, as if they were somehow responsible for their acne. In this case-control study, we examine the relationship between two modifiable risk factors, BMI and smoking, and acne vulgaris. Methods: A case-control study was conducted at King Abdullah University Hospital in Ramtha, Jordan, in 2019/2020. A total of 325 participants between 14 and 33 years of age were interviewed by the authors, comprising 163 acne vulgaris cases and 162 controls without acne vulgaris. Anthropometric measures and smoking status were the independent variables used to assess acne. Univariate and multivariate analyses were used to compare the characteristics of participants with and without acne. The collected data were analyzed using the Statistical Package for the Social Sciences (SPSS). Results: Cigarette smoking was significantly associated with the controls; odds ratio 0.4 (95% CI: 0.2–0.9), p-value = 0.018. BMI and waterpipe smoking were statistically insignificant for acne in the multivariate analysis. Conclusion: We found that cigarette smoking appeared protective against acne. There was no statistically significant relation between BMI or waterpipe smoking and the development of acne vulgaris.
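The reported odds ratio of 0.4 (95% CI 0.2–0.9) follows from a standard 2×2 table calculation with a log-scale Wald interval. The cell counts below are illustrative only (the abstract does not give them), chosen to land near the reported figures:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
       a = exposed cases, b = exposed controls,
       c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts: smokers with acne, smokers without acne,
# non-smokers with acne, non-smokers without acne.
or_, lo, hi = odds_ratio_ci(15, 30, 148, 132)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```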

Keywords: acne, adolescents, BMI, smoking, case-control, risk factors

Procedia PDF Downloads 76
1007 University of Sciences and Technology of Oran Mohamed Boudiaf (USTO-MB)

Authors: Patricia Mikchaela D. L. Feliciano, Ciela Kadeshka A. Fuentes, Bea Trixia B. Gales, Ethel Princess A. Gepulango, Martin R. Hernandez, Elina Andrea S. Lantion, Jhoe Cynder P. Legaspi, Peter F. Quilala, Gina C. Castro

Abstract:

Propolis is a resin-like material used by bees to fill large gaps in the beehive. It has been found to possess an anti-inflammatory property that stimulates hair growth in rats by inducing hair keratinocyte proliferation, causing water retention and preventing damage caused by heat, ultraviolet rays, and microorganisms, without abnormalities in the hair follicles. The present study aimed to formulate 10% and 30% propolis hair creams for enhancing hair properties. The raw propolis sample was tested for heavy metals using atomic absorption spectroscopy; zinc and chromium were found to be present. The propolis was extracted in a percolator using 70% ethanol and concentrated under vacuum in a rotary evaporator. The propolis extract was analyzed for total flavonoid content, and its compatibility with excipients was evaluated using differential scanning calorimetry (DSC). No significant changes in the organoleptic properties, pH, or viscosity of the formulated creams were noted after four weeks of storage at 2-8°C, 30°C, and 40°C. The formulated creams were found to be non-irritating based on the modified Draize rabbit test. In vivo efficacy was evaluated based on the thickness and tensile strength of hair grown on previously shaved rat skin. Results show that the formulated 30% propolis-based cream had greater hair-enhancing properties than the 10% propolis cream, which had an effect comparable to minoxidil.

Keywords: atomic absorption spectroscopy, differential scanning calorimetry (DSC), modified draize rabbit test, propolis

Procedia PDF Downloads 316
1006 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation

Authors: Mohammad Anwar, Shah Waliullah

Abstract:

This study investigated panel data regression models, using both Bayesian and classical methods to study the impact of institutions on economic growth with data from 1990-2014, especially for developing countries. Under both the classical and the Bayesian methodology, two panel data models were estimated: the common effects model and the fixed effects model. For the Bayesian approach, prior information is used, with a normal-gamma prior for the panel data models. The analysis was carried out with the WinBUGS14 software. The estimated results showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it has the lowest standard error compared to the other models.

Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model

Procedia PDF Downloads 58
1005 Outcome Analysis of Various Management Strategies for Ileal Perforation

Authors: Ashvamedh, Chandra Bhushan Singh, Anil Kumar Sarda

Abstract:

Introduction: Ileal perforation is a common cause of peritonitis in developing countries. Surgery is the ideal treatment, as it eliminates soilage of the peritoneal cavity in an effort to lessen the toxaemia and enhance the recovery of the patient. However, there is no standardized operative procedure accepted as most effective for management. Materials and methods: The study was conducted on 66 patients with perforation peritonitis from November 2013 to February 2015 at Lok Nayak Hospital. Data for each patient were recorded on a pre-determined proforma. The methods used for repair were primary repair, resection anastomosis (RA), and ileostomy. Results: A male preponderance was noticed among the patients, with the majority in their third decade. Of all perforations, 40.9% were tubercular and 34.8% were typhoid. Among the operated cases, 27.3% underwent primary repair, 45.5% underwent RA, and 27.3% underwent ileostomy. The average operative time for RA and ileostomy was longer than for primary repair. The type of repair bore no significant relation to the size or number of perforations, but post-operative complications were statistically significant with respect to distance from the ileocecal valve (p = .005) and edema of the bowel wall (p = .002). Wound infection, dehiscence, and intra-abdominal collections were the observed complications, with no significant relation to the type of repair. Ileostomy has complications of its own: peristomal skin excoriation was seen in 83.3% and electrolyte imbalance in 33.3%, and the duration to closure averaged 188 days (median 150 days, range 85-400 days). Conclusion: Primary closure is preferable in patients with single, small perforations. RA is advocated in patients with multiple or large perforations or a perforation proximal to a stricture. Ileostomy should not be considered a primary definitive procedure and should be reserved for moribund patients as a lifesaving measure; it carries more morbidity and requires a second surgery for closure, increasing the cost of treatment as well.

Keywords: ileal perforation, ileostomy, perforation peritonitis, typhoid perforation management

Procedia PDF Downloads 231
1004 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for "low" and "high" are precisely derived using a max-min operation on Bayes' error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics, and represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the buy-low, sell-high algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
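The derivation of the thresholds from Bayes' error is not reproducible from the abstract, but the resulting trading rule has the familiar form "buy below a low threshold, sell above a high threshold." A toy sketch with fixed, hand-picked thresholds and made-up prices (in the paper, the thresholds come from the max-min operation):

```python
def buy_low_sell_high(prices, low, high, cash=100.0):
    """Toy threshold rule: invest all cash when the price drops below
    `low`, liquidate the whole position when it rises above `high`."""
    shares = 0.0
    for p in prices:
        if p < low and cash > 0:
            shares, cash = cash / p, 0.0
        elif p > high and shares > 0:
            cash, shares = shares * p, 0.0
    return cash + shares * prices[-1]   # mark remaining position to market

prices = [10, 8, 9, 12, 7, 11, 13]
print(buy_low_sell_high(prices, low=9, high=12))   # buys at 8, sells at 13
```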

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 52
1003 A Rare Case of Metastatic Basal Cell Carcinoma

Authors: Nitesh Kumar, Eoin Twohig, Jasparl Cheema, Sadiq Mawji, Yousif Al Najjar

Abstract:

Basal cell carcinoma (BCC) is the commonest cutaneous malignancy affecting humans. Despite this, distant spread is exceptionally rare: metastatic BCC (mBCC) is estimated to occur in 0.0028 - 0.5% of cases. We aim to illustrate, with the aid of histological slides, a case of mBCC occurring in a fit and well 67-year-old. The initial diagnosis of desmoplastic BCC was made in 2006 from a scalp biopsy, with the lesion then being excised. Re-excision of a local recurrence was undertaken the following year. In 2014 the patient presented with an ipsilateral level 2a mass, and fine needle aspiration raised the suspicion of metastatic carcinoma. The patient had excision of two nodes from the left neck alongside pharyngeal tonsillectomy and tongue base biopsies. Histologically, the nodes closely resembled the immunophenotype of the initial scalp lesion. The patient subsequently had a modified radical neck dissection, and residual mBCC was excised from the left sternocleidomastoid muscle. In 2023 the patient developed haematuria; on further investigation, bilateral lung lesions were noted on CT, with subsequent biopsy confirming mBCC. Spinal and renal lesions have also been found. Histopathology showed clear resemblance of the lung metastases to both those in the neck and the primary scalp BCC, with no squamous differentiation seen. The time span from primary to occurrence of lung metastasis (18 years) affirms the indolent and slow-growing nature of BCC. This case fulfils the Lattes and Kessler diagnostic criteria. High-risk cases are described as those with advanced local presentation, a primary tumour on the head and neck, and locally recurrent lesions.

Keywords: BCC, metastasis, rare, skin cancer

Procedia PDF Downloads 36
1002 Image Features Comparison-Based Position Estimation Method Using a Camera Sensor

Authors: Jinseon Song, Yongwan Park

Abstract:

In this paper, we propose a method that estimates a user's position from a database built with a single camera. Previous positioning methods calculate distance from the arrival time of signals, as in GPS (Global Positioning System) and RF (Radio Frequency) systems. However, these methods have a weakness: a large error range caused by signal interference. As a solution, we estimate position with a camera sensor. A single camera, however, struggles to obtain relative position data, and a stereo camera struggles to provide real-time position data because of the large volume of image data. First, we build an image database, using a single camera, of the space in which the positioning service is to be provided. Next, we judge similarity through image matching between the database images and the image transmitted by the user. Finally, we determine the user's position from the position of the most similar database image. To verify the proposed method, we experimented in real indoor and outdoor environments. The proposed method has a wide positioning range and can determine not only the user's position but also the facing direction.
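The matching step can be sketched as a nearest-neighbour lookup over a database of feature vectors, each tagged with the position (and direction) where the image was captured. For illustration, plain vector distance stands in for SURF descriptor matching, and the database entries are hypothetical:

```python
def similarity(f1, f2):
    """Negative squared distance between two feature vectors
    (a stand-in for SURF descriptor matching)."""
    return -sum((a - b) ** 2 for a, b in zip(f1, f2))

def locate(query_features, database):
    """Return the position tagged on the most similar database image."""
    best = max(database, key=lambda e: similarity(query_features, e["features"]))
    return best["position"]

# Hypothetical database: one feature vector per stored image plus the
# position and facing direction at capture time.
db = [
    {"features": [0.9, 0.1, 0.3], "position": ("lobby", "north")},
    {"features": [0.2, 0.8, 0.5], "position": ("corridor", "east")},
    {"features": [0.4, 0.4, 0.9], "position": ("entrance", "south")},
]
print(locate([0.25, 0.75, 0.55], db))
```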

Keywords: positioning, distance, camera, features, SURF (Speeded-Up Robust Features), database, estimation

Procedia PDF Downloads 328
1001 Design of a Low Cost Programmable LED Lighting System

Authors: S. Abeysekera, M. Bazghaleh, M. P. L. Ooi, Y. C. Kuang, V. Kalavally

Abstract:

Smart LED-based lighting systems have significant advantages over traditional lighting systems due to their capability of producing tunable light spectrums on demand. The main challenge in the design of smart lighting systems is to produce sufficient luminous flux and a uniformly accurate output spectrum over a sufficiently broad area. This paper outlines the design principles of a programmable LED lighting system that achieves these two aims, presenting a seven-channel design using low-cost discrete LEDs. Optimization algorithms are used to calculate the required number of LEDs, the LED arrangements, and the optimum LED separation distance. The results show the illumination uniformity for each channel, and that the maximum color error is below 0.0808 on the CIE 1976 chromaticity scale. In conclusion, this paper considered the simulation and design of a seven-channel programmable lighting system using low-cost discrete LEDs to produce sufficient luminous flux and a uniformly accurate output spectrum over a sufficiently broad area.
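The color error on the CIE 1976 chromaticity scale is, under the usual interpretation assumed here, the Euclidean distance Δu'v' between target and produced chromaticity in the (u', v') plane. The chromaticity values below are illustrative:

```python
import math

def cie1976_color_error(uv_target, uv_actual):
    """Euclidean distance in the CIE 1976 (u', v') chromaticity plane,
    i.e. the usual Delta-u'v' color error metric."""
    du = uv_actual[0] - uv_target[0]
    dv = uv_actual[1] - uv_target[1]
    return math.hypot(du, dv)

# Illustrative chromaticities for a target and a produced spectrum.
target = (0.1978, 0.4683)   # roughly the D65 white point in (u', v')
actual = (0.2050, 0.4640)
err = cie1976_color_error(target, actual)
print(round(err, 4))
```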

Keywords: light spectrum control, LEDs, smart lighting, programmable LED lighting system

Procedia PDF Downloads 169
1000 Hydro-Gravimetric ANN Model for Prediction of Groundwater Level

Authors: Jayanta Kumar Ghosh, Swastik Sunil Goriwale, Himangshu Sarkar

Abstract:

Groundwater is one of the most valuable natural resources that society consumes for its domestic, industrial, and agricultural water supply. Bulk and indiscriminate consumption affects the groundwater resource, and the groundwater recharge rate is often found to be much lower than demand. Thus, to maintain water and food security, monitoring and management of groundwater storage are necessary. However, it is challenging to estimate groundwater storage (GWS) using existing hydrological models. To overcome these difficulties, machine learning (ML) models are being introduced for the evaluation of groundwater level (GWL). The objective of this research work is therefore to develop an ML-based model for the prediction of GWL. This objective has been realized through the development of an artificial neural network (ANN) model based on hydro-gravimetry. The model has been developed using training samples from field observations spread over 8 months and has been tested for the prediction of GWL in an observation well. The root mean square error (RMSE) for the test samples has been found to be 0.390 meters. Thus, it can be concluded that the hydro-gravimetric ANN model can be used for the prediction of GWL. However, to improve the accuracy, more hydro-gravimetric parameters may be considered and tested in the future.
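The ANN itself is not reproducible from the abstract, but the RMSE metric behind the reported 0.390 m figure is standard. A minimal sketch with illustrative well readings:

```python
import math

def rmse(predicted, observed):
    """Root mean square error between predicted and observed
    groundwater levels (meters)."""
    n = len(observed)
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)

# Illustrative readings (m below datum) vs. hypothetical ANN predictions.
observed  = [12.1, 12.4, 12.9, 13.0, 12.6]
predicted = [12.0, 12.8, 12.5, 13.2, 12.7]
print(round(rmse(predicted, observed), 3))
```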

Keywords: machine learning, hydro-gravimetry, ground water level, predictive model

Procedia PDF Downloads 107
999 Modelling the Long Run of Aggregate Import Demand in Libya

Authors: Said Yousif Khairi

Abstract:

As a developing economy, Libya depends on imports of capital, raw materials, and manufactured goods for sustainable economic growth. In 2006, Libya imported LD 8 billion (US$ 6.25 billion), composed mainly of machinery and transport equipment (49.3%), raw materials (18%), and food products and live animals (13%). This represented about 10% of GDP. Thus, it is pertinent to investigate the factors affecting the amount of Libyan imports. An econometric model representing the aggregate import demand for Libya was developed and estimated using the bounds test procedure, which is based on an unrestricted error correction model (UECM). The data employed for the estimation cover 1970–2010. The results of the bounds test revealed that the volume of imports and its determinants, namely real income, the consumer price index, and the exchange rate, are cointegrated. The findings indicate that in the short run the demand for imports is inelastic with respect to income and the price level, while the exchange rate variable is statistically significant. In the long run, the income elasticity is elastic while the price elasticity and the exchange rate remain inelastic. This indicates that imports are important elements of Libyan economic growth in the long run.

Keywords: import demand, UECM, bounds test, Libya

Procedia PDF Downloads 342
998 Experimental and Numerical Investigation on Delaminated Composite Plate

Authors: Sreekanth T. G., Kishorekumar S., Sowndhariya Kumar J., Karthick R., Shanmugasuriyan S.

Abstract:

Composites are increasingly being used in industries due to their unique properties, such as high specific stiffness and specific strength, higher fatigue and wear resistance, and higher damage tolerance capability. Composites are prone to failures or damages that are difficult to identify, locate, and characterize due to their complex design features and complicated loading conditions. The lack of understanding of the damage mechanisms of composites leads to uncertainties in structural integrity and durability. Delamination is one of the most critical failure mechanisms in laminated composites because it progressively degrades the mechanical performance of fiber-reinforced polymer composite structures over time. The identification and severity characterization of delamination in engineering fields such as the aviation industry is critical for both safety and economic reasons. The presence of delamination alters the vibration properties of composites, such as natural frequencies and mode shapes. In this study, numerical and experimental analyses were performed on delaminated and non-delaminated glass fiber reinforced polymer (GFRP) plates, the results were compared, and the error percentage was determined.

Keywords: composites, delamination, natural frequency, mode shapes

Procedia PDF Downloads 90
997 Survival Analysis Based Delivery Time Estimates for Display FAB

Authors: Paul Han, Jun-Geol Baek

Abstract:

In the flat panel display industry, the scheduling and dispatching system used to meet production target quantities and deadlines is the major production management system; it controls the production order of each facility and the distribution of WIP (work in process). In the dispatching system, delivery time is a key factor for determining when a lot can be supplied to a facility. In this paper, we use survival analysis methods to identify the main factors affecting delivery time and to build a forecasting model for it. Among survival analysis techniques, the Cox proportional hazards model is used to select important explanatory variables. To build the prediction model, the accelerated failure time (AFT) model was used. Performance comparisons were conducted against two other models: a technical statistics model based on transfer history, and a linear regression model using the same explanatory variables as the AFT model. Under the mean square error (MSE) criterion, the AFT model's error decreased by 33.8% compared to the existing prediction model and by 5.3% compared to the linear regression model. This survival analysis approach is applicable to implementing a delivery time estimator in display manufacturing and can contribute to improving the productivity and reliability of the production management system.
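An AFT model assumes the log of the delivery time is linear in the covariates. The sketch below fits only that deterministic log-linear core by least squares, with a single hypothetical covariate (WIP queue length) and made-up data; a real AFT fit would also specify the error distribution (e.g. Weibull) and use maximum likelihood with censoring.

```python
import math

def fit_loglinear_aft(x, t):
    """Least-squares fit of log(T) = b0 + b1*x, the deterministic core
    of an accelerated failure time model (error distribution omitted)."""
    y = [math.log(ti) for ti in t]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = my - b1 * mx
    return b0, b1

def predict_delivery_time(b0, b1, x):
    return math.exp(b0 + b1 * x)

# Hypothetical covariate: WIP queue length in front of the facility.
queue = [1, 2, 3, 4, 5]
times = [10.0, 12.2, 14.9, 18.2, 22.2]   # hours, roughly exponential in queue
b0, b1 = fit_loglinear_aft(queue, times)
print(round(predict_delivery_time(b0, b1, 6), 1))
```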

Keywords: delivery time, survival analysis, Cox PH model, accelerated failure time model

Procedia PDF Downloads 518
996 M-Machine Assembly Scheduling Problem to Minimize Total Tardiness with Non-Zero Setup Times

Authors: Harun Aydilek, Asiye Aydilek, Ali Allahverdi

Abstract:

Our objective is to minimize the total tardiness in an m-machine two-stage assembly flowshop scheduling problem. Total tardiness is an important performance measure because the fulfillment of customers' due dates has to be taken into account while making scheduling decisions. In the literature, the problem is considered with zero setup times, which may not be realistic or appropriate for some scheduling environments. Considering setup times separately from processing times increases machine utilization by decreasing idle time and reduces total tardiness. We propose two new algorithms and adapt four existing algorithms from the literature, which are different versions of simulated annealing and genetic algorithms. Moreover, a dominance relation is developed based on the mathematical formulation of the problem and incorporated in our proposed algorithms. Computational experiments are conducted to investigate the performance of the newly proposed algorithms. We find that one of the proposed algorithms performs significantly better than the others: its error is at least 50% lower than those of the other algorithms. The newly proposed algorithm is also efficient for the case of zero setup times and performs better than the best existing algorithm in the literature.
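The objective function being minimized can be stated directly: the sum of each job's tardiness, max(0, C_j - d_j), over all jobs. A minimal sketch with illustrative completion times and due dates:

```python
def total_tardiness(completion_times, due_dates):
    """Sum of max(0, C_j - d_j) over all jobs: the objective minimized
    in the two-stage assembly flowshop scheduling problem."""
    return sum(max(0, c - d) for c, d in zip(completion_times, due_dates))

# Illustrative completion times and due dates for 4 jobs:
# job 2 finishes 1 unit late, job 4 finishes 2 units late.
completions = [5, 9, 14, 20]
dues        = [6, 8, 15, 18]
print(total_tardiness(completions, dues))
```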

Keywords: algorithm, assembly flowshop, scheduling, simulation, total tardiness

Procedia PDF Downloads 307
995 A Stochastic Volatility Model for Optimal Market-Making

Authors: Zubier Arfan, Paul Johnson

Abstract:

The electronification of financial markets and the rise of algorithmic trading have sparked a lot of interest from the mathematical community, particularly in the market-making problem. The research presented in this short paper solves the classic stochastic control problem in order to derive the strategy for a market-maker, and shows how to calibrate and simulate the strategy with real limit order book data for back-testing. The ambiguity of limit-order priority in back-testing is dealt with by considering optimistic and pessimistic priority scenarios. The classic model, although it does outperform a naive strategy, assumes constant volatility and is therefore not best suited to the LOB data. The Heston model is introduced to describe the price and variance process of the asset. The trader's constant absolute risk aversion utility function is optimised by numerically solving a 3-dimensional Hamilton-Jacobi-Bellman partial differential equation to find the optimal limit order quotes. The results show that the stochastic volatility market-making model is more suitable for a risk-averse trader and is also less sensitive to calibration error than the constant volatility model.

Keywords: market-making, market microstructure, stochastic volatility, quantitative trading

Procedia PDF Downloads 130
994 The Effects of “Never Pressure Injury” on the Incidence of Pressure Injuries in Critically Ill Patients

Authors: Nuchjaree Kidjawan, Orapan Thosingha, Pawinee Vaipatama, Prakrankiat Youngkong, Sirinapha Malangputhong, Kitti Thamrongaphichartkul, Phatcharaporn Phetcharat

Abstract:

The Never Pressure Injury (NPI) mattress uses sensor technology processed by an AI system. Its main features are an individual interface pressure sensor system in contact with the mattress and a position management system in which the sensor detects a predetermined pressure with automatic pressure reduction and distribution. The role of the NPI is to monitor, identify risk, and manage the interface pressure automatically when the predetermined pressure is detected. This study aims to evaluate the effects of the NPI innovative mattress on the incidence of pressure injuries in critically ill patients. An observational case-control study was employed to compare the incidence of pressure injury between the case and control groups. The control group comprised 80 critically ill patients admitted to a critical care unit of Phyathai 3 Hospital, receiving standard care with a memory foam mattress according to intensive care unit guidelines. The case group comprised 80 critically ill patients receiving standard care with the NPI innovation mattress. Patients who were over 20 years old, scored less than 18 on the Risk Assessment Pressure Ulcer Scale - ICU, and stayed in the ICU for more than 24 hours were selected for the study. The patients' skin was assessed for the occurrence of pressure injury once a day for five consecutive days or until discharge from the ICU. The sample comprised 160 patients with ages ranging from 30 to 102 years (mean = 70.1) and body mass index ranging from 13.69 to 49.01 (mean = 24.63). The case and control groups did not differ in sex, age, body mass index, pressure ulcer risk scores, or length of ICU stay. Twenty-two patients (27.5%) in the control group had pressure injuries, while no pressure injury was found in the case group.

Keywords: pressure injury, never pressure injury, innovation mattress, critically ill patients, prevent pressure injury

Procedia PDF Downloads 95
993 Tracking Filtering Algorithm Based on ConvLSTM

Authors: Ailing Yang, Penghan Song, Aihua Cai

Abstract:

The nonlinear maneuvering target tracking problem is mainly a state estimation problem when the target motion model is uncertain. Traditional solutions include Kalman filtering based on the Bayesian filtering framework and extended Kalman filtering. However, these methods need prior knowledge, such as a kinematics model and the state system distribution, and their performance is poor for state estimation of complex dynamic systems without such priors. Therefore, in view of the problems of traditional algorithms, a convolutional LSTM target state estimation algorithm based on self-attention memory (SAConvLSTM-SE) is proposed to learn the historical motion state of the target and the error distribution of the measurements at the current time. The measured track point data of airborne radar are processed into data sets. After supervised training, the data-driven deep neural network based on SAConvLSTM can directly output the target state at the next moment. Through experiments on two different maneuvering targets, we find that the network has stronger robustness and better tracking accuracy than existing tracking methods.

Keywords: maneuvering target, state estimation, Kalman filter, LSTM, self-attention

Procedia PDF Downloads 133
992 Improving Traditional Methods of Handling Fish from Integrated Pond Culture Systems in Monai Village, New Bussa, Nigeria

Authors: Olokor O. Julius, Ngwu E. Onyebuchi, Ajani K. Emmanuel, Omitoyin O. Bamidele, Olokor O. Linda, Akomas Stella

Abstract:

The study assessed the quality changes of Clarias gariepinus obtained from integrated culture systems (rice, poultry and fish) displayed at an average daily temperature of 31-33 °C on the traditional market table used by local fish farmers to sell fish harvested from their ponds, and on an improved table designed for this study. Unlike the conventional table, the improved table was screened against flies and indiscriminate touching by customers. The fish were displayed on both tables for 9 hours, and quality attributes were monitored hourly by trained panelists. For C. gariepinus, the gills and intestine recorded faster deterioration, starting from the fourth and fifth hours respectively, while deterioration of fish on the improved table was delayed by one hour. Scores for skin brightness and texture did not indicate quality deterioration throughout the display period. However, at the end of the storage time, samples on the improved table recorded 1.5 x 10⁴ cfu/g, while samples on the unscreened table recorded 3.7 x 10⁷ cfu/g. The study shows how simple modifications of a traditional practice can extend the keeping quality of farmed fish and reduce health hazards in local communities that have no electricity for preserving fish in any form, despite a boom in aquaculture. The Monai community has a fish farm estate of over 200 smallholder farmers with an annual output capacity of over $10 million. The simple improvement made to farmers' practice in this study is intended to ensure community hygiene and boost the income of peasant fish farmers by improving the market quality of their products.

Keywords: fish spoilage, improved handling, income generation, retail table

Procedia PDF Downloads 427
991 Real Time Implementation of Efficient DFIG-Variable Speed Wind Turbine Control

Authors: Fayssal Amrane, Azeddine Chaiba, Bruno Francois

Abstract:

In this paper, a design and experimental study based on Direct Power Control (DPC) of a DFIG is proposed for stand-alone mode in a Variable Speed Wind Energy Conversion System (VS-WECS). The proposed IDPC method uses robust IP (Integral-Proportional) controllers to control the Rotor Side Converter (RSC) by means of the d-q axis components of the rotor current (Ird* and Irq*) of the Doubly Fed Induction Generator (DFIG) through an AC-DC-AC converter. The implementation is realized using a dSPACE DS1103 card under sub- and super-synchronous operations (i.e., below and above the synchronous speed of 1500 rpm). Finally, experimental results demonstrate that the proposed control using IP controllers provides improved dynamic responses, and that decoupled control of the wind-turbine-driven DFIG achieves high performance (good reference tracking, short response time and low power error) despite sudden variations of the wind speed and of the rotor reference currents.
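The distinguishing feature of an IP controller, as opposed to a conventional PI controller, is that the proportional action applies to the measured output rather than to the error, so a reference step does not pass directly through the proportional term. The sketch below simulates a discrete IP loop on a first-order plant standing in for one rotor-current axis; the plant model and gain values are illustrative assumptions, not the paper's tuning.

```python
# Sketch of an IP (Integral-Proportional) loop: integral action on the error,
# proportional action on the output only. The first-order plant
#   tau * dy/dt = -y + gain * u
# is a stand-in for one rotor-current axis; all numbers are illustrative.

def simulate_ip_loop(ref, steps, dt=1e-3, ki=400.0, kp=2.0, tau=0.01, gain=1.0):
    """Track a constant reference with an IP controller; returns the output history."""
    y, integ = 0.0, 0.0
    history = []
    for _ in range(steps):
        integ += ki * (ref - y) * dt      # integral of the error
        u = integ - kp * y                # proportional term acts on y, not on the error
        y += dt * (-y + gain * u) / tau   # forward-Euler step of the plant
        history.append(y)
    return history

out = simulate_ip_loop(ref=1.0, steps=2000)
# The output settles on the reference with no steady-state error, which is the
# "good reference tracking" property the abstract attributes to the IP design.
```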

Keywords: direct power control (DPC), doubly fed induction generator (DFIG), wind energy conversion system (WECS), experimental study

Procedia PDF Downloads 114
990 Enhancing a Recidivism Prediction Tool with Machine Learning: Effectiveness and Algorithmic Fairness

Authors: Marzieh Karimihaghighi, Carlos Castillo

Abstract:

This work studies how Machine Learning (ML) may be used to increase the effectiveness of a criminal recidivism risk assessment tool, RisCanvi. The two key dimensions of this analysis are predictive accuracy and algorithmic fairness. The ML-based prediction models obtained in this study are more accurate at predicting criminal recidivism than the manually created formula used in RisCanvi, achieving AUCs of 0.76 and 0.73 in predicting violent and general recidivism, respectively. However, the improvements are small, and it is noticed that algorithmic discrimination can easily be introduced between groups such as nationals vs. foreigners, or young vs. old. It is described how effectiveness and algorithmic fairness objectives can be balanced by applying a method in which a single error disparity, in terms of generalized false positive rate, is minimized while calibration is maintained across groups. The obtained results show that this bias mitigation procedure can substantially reduce generalized false positive rate disparities across multiple groups. Based on these results, it is proposed that ML-based criminal recidivism risk prediction should not be introduced without applying algorithmic bias mitigation procedures.
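The fairness metric at the center of this abstract can be made concrete: for a probabilistic classifier, the generalized false positive rate (GFPR) of a group is the mean predicted risk score among its members who did not recidivate, and the disparity is the gap between groups. The sketch below computes both on made-up scores and group labels, which are illustrative rather than RisCanvi data.

```python
# Generalized false positive rate (GFPR) and its disparity across groups.
# Scores, labels (1 = recidivated, 0 = did not), and groups are illustrative.

def generalized_fpr(scores, labels):
    """Mean predicted score over true negatives (label == 0)."""
    negatives = [s for s, y in zip(scores, labels) if y == 0]
    return sum(negatives) / len(negatives)

def gfpr_disparity(scores, labels, groups):
    """Largest GFPR gap across groups, plus the per-group rates."""
    rates = {}
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        rates[g] = generalized_fpr([scores[i] for i in idx],
                                   [labels[i] for i in idx])
    vals = list(rates.values())
    return max(vals) - min(vals), rates

scores = [0.9, 0.2, 0.4, 0.8, 0.1, 0.3]
labels = [1, 0, 0, 1, 0, 0]
groups = ["a", "a", "a", "b", "b", "b"]
gap, per_group = gfpr_disparity(scores, labels, groups)
# The bias mitigation procedure described above would adjust the model so that
# this gap shrinks while per-group calibration is preserved.
```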

Keywords: algorithmic fairness, criminal risk assessment, equalized odds, recidivism

Procedia PDF Downloads 133
989 Walmart Sales Forecasting Using Machine Learning in Python

Authors: Niyati Sharma, Om Anand, Sanjeev Kumar Prasad

Abstract:

Estimating future sales is one of the essential elements of tactical planning for any organization. Walmart sales forecasting is a good problem for a beginner to work on, since Walmart provides one of the largest retail data sets; Walmart has also used this sales-estimation problem for hiring purposes. We analyze how the internal and external factors affecting one of the largest companies in the US can be used to project its weekly sales into the future. Demand forecasting is the estimation of the future demand for products or services on the basis of present and past data and the different stages of the market. Since every organization faces an unknown future, future demand cannot be known directly; hence, by exploring historical and current market statistics, we forecast the upcoming demand for individual goods, which is increasingly challenging, so that the required products can be produced in advance according to market demand. We use several machine learning models to test accuracy and then train on the whole data set. Fitting the training data with linear regression gives an accuracy of 8.88%, while the extra trees regression model gives the best accuracy, 97.15%.
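A minimal sketch of the fit-and-evaluate step, assuming mean absolute error (one of the listed keywords) as the error measure: a linear trend is fitted to a training window and scored on a held-out window. The weekly sales figures are made up for illustration; the actual study uses the Walmart data set and richer models such as extra trees.

```python
# Fit a simple linear trend y = a + b * week to weekly sales and score the
# forecast with mean absolute error (MAE). Sales figures are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def mean_absolute_error(y_true, y_pred):
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

weeks = list(range(8))
sales = [100, 104, 99, 110, 115, 112, 120, 125]   # hypothetical weekly sales
a, b = fit_line(weeks[:6], sales[:6])             # train on the first 6 weeks
preds = [a + b * w for w in weeks[6:]]            # forecast the last 2 weeks
mae = mean_absolute_error(sales[6:], preds)
```

In the study itself this evaluation loop would be repeated for each candidate model (linear regression, extra trees, etc.) to compare their accuracy.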

Keywords: random forest algorithm, linear regression algorithm, extra trees classifier, mean absolute error

Procedia PDF Downloads 124
988 Uniqueness of Fingerprint Biometrics to Human Dynasty: A Review

Authors: Siddharatha Sharma

Abstract:

With the advent of technology and machines, biometrics is taking an important place in society for secure living. Security issues are a major concern in today's world and continue to grow in intensity and complexity. Biometric recognition, which involves precise measurement of the characteristics of living beings, is not a new method. Fingerprints have been used for many years by law enforcement and forensic agencies to identify culprits and apprehend them. Biometrics is based on four basic principles: (i) uniqueness, (ii) accuracy, (iii) permanency and (iv) peculiarity. In today's world, fingerprints are the most popular and unique biometric, claiming a social benefit in government-sponsored programs; a remarkable example is UIDAI (Unique Identification Authority of India) in India. The matching accuracy of fingerprint biometrics is very high; it has been observed empirically that even identical twins do not have similar prints. With the passage of time, there has been immense progress in sensing techniques, computational speed, operating environments and storage capabilities, and the technology has become more convenient for users. Only a small fraction of the population may be unsuitable for automatic identification because of genetic factors, aging, or environmental or occupational reasons, for example workers whose hands have cuts and bruises that keep their fingerprints changing. Fingerprints are limited to human beings because of the presence of volar skin with corrugated ridges, which is unique to this species. Fingerprint biometrics has proved to be a high-level authentication system for identifying human beings, though it has limitations; for example, authentication becomes difficult and may be inefficient and ineffective if the ridges of the finger(s) or palm are moist.
This paper focuses on the uniqueness of fingerprints to human beings in comparison with other living beings and reviews the advancement of emerging technologies and their limitations.

Keywords: fingerprinting, biometrics, human beings, authentication

Procedia PDF Downloads 304
987 Machine Learning Approach for Mutation Testing

Authors: Michael Stewart

Abstract:

Mutation testing is a type of software testing proposed in the 1970s in which program statements are deliberately changed to introduce simple errors, so that test cases can be validated by checking whether they detect those errors. Test cases are executed against the mutant code to determine whether any of them fails, detecting the error and providing evidence that the test suite is effective. One major issue with this type of testing is that generating and testing all possible mutations for complex programs is computationally intensive. This paper applied reinforcement learning and parallel processing, within the context of mutation testing, to the selection of mutation operators and test cases, reducing the computational cost of testing and improving test suite effectiveness. Experiments were conducted using sample programs to determine how well the reinforcement learning-based algorithm performed with one live mutation, multiple live mutations and no live mutations. The experiments, measured by mutation score, were used to update the algorithm and improve the accuracy of its predictions. Performance was then evaluated on multi-processor computers. With reinforcement learning, the number of mutation operators utilized was reduced by 50-100%.

Keywords: automated-testing, machine learning, mutation testing, parallel processing, reinforcement learning, software engineering, software testing

Procedia PDF Downloads 175
986 Next-Generation Lunar and Martian Laser Retro-Reflectors

Authors: Simone Dell'Agnello

Abstract:

There are laser retroreflectors on the Moon but none on Mars. Here we describe the design, construction, qualification and imminent deployment of next-generation, optimized laser retroreflectors on the Moon and on Mars (where they will be the first ones). These instruments are positioned by time-of-flight measurements of short laser pulses, the so-called 'laser ranging' technique. Data analysis is carried out with PEP, the Planetary Ephemeris Program of the CfA (Center for Astrophysics). Since 1969, Lunar Laser Ranging (LLR) to the Apollo/Lunokhod corner cube retroreflector (CCR) arrays has supplied accurate tests of General Relativity (GR) and of new gravitational physics: possible changes of the gravitational constant (Gdot/G), the weak and strong equivalence principles, gravitational self-energy (the Parametrized Post-Newtonian parameter beta), geodetic precession and the inverse-square force law; it can also constrain gravitomagnetism. Some of these measurements have also allowed tests of extensions of GR, including spacetime torsion and non-minimally coupled gravity. LLR has also provided significant information on the composition of the deep interior of the Moon; in fact, LLR first provided evidence of the existence of a fluid component of the deep lunar interior. In 1969, the CCR arrays contributed a negligible fraction of the LLR error budget. Since laser-station ranging accuracy has improved by more than a factor of 100, the current arrays now dominate the error budget, because lunar librations act on their multi-CCR geometry. We developed a next-generation, single, large CCR, MoonLIGHT (Moon Laser Instrumentation for General relativity high-accuracy test), which is unaffected by librations and supports an improvement of the space segment of the LLR accuracy by up to a factor of 100. INFN also developed INRRI (INstrument for landing-Roving laser Retro-reflector Investigations), a microreflector to be laser-ranged by orbiters.
Their performance is characterized at the SCF_Lab (Satellite/lunar laser ranging Characterization Facilities Lab, INFN-LNF, Frascati, Italy) for deployment on the lunar surface or in cislunar space. They will be used to accurately position landers, rovers, hoppers and orbiters of Google Lunar X Prize and space agency missions, thanks to LLR observations from stations of the International Laser Ranging Service in the USA, France and Italy. INRRI was launched in 2016 on the ESA mission ExoMars (Exobiology on Mars) EDM (Entry, descent and landing Demonstration Module), deployed on the Schiaparelli lander, and is proposed for the ExoMars 2020 Rover. Based on an agreement between NASA and ASI (Agenzia Spaziale Italiana), another microreflector, LaRRI (Laser Retro-Reflector for InSight), was delivered to JPL (Jet Propulsion Laboratory) and integrated on NASA's InSight Mars lander in August 2017 (launch scheduled for May 2018). Another microreflector, LaRA (Laser Retro-reflector Array), will be delivered to JPL for deployment on the NASA Mars 2020 Rover. The first lunar landing opportunities will run from early 2018 (with TeamIndus) to late 2018 with commercial missions, followed by opportunities with space agency missions, including the proposed deployment of MoonLIGHT and INRRI on NASA's Resource Prospector and its evolutions. In conclusion, we will significantly extend the CCR Lunar Geophysical Network and populate the Mars Geophysical Network. These networks will enable very significantly improved tests of GR.
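The laser-ranging technique underlying all of these instruments reduces to a round-trip time-of-flight measurement: the one-way distance is the speed of light times half the round-trip time. A minimal sketch, with an illustrative pulse time rather than mission data:

```python
# Laser ranging in one line of physics: distance = c * t_round_trip / 2.

C = 299_792_458.0                    # speed of light in vacuum, m/s (exact)

def range_from_tof(round_trip_seconds):
    """One-way distance from a round-trip time-of-flight measurement."""
    return C * round_trip_seconds / 2.0

# ~2.56 s round trip corresponds to roughly the mean Earth-Moon distance.
distance_m = range_from_tof(2.56)
```

The sub-centimeter accuracies quoted for LLR correspond to timing the returned pulse to a few tens of picoseconds, which is why the geometry of the reflector (single large CCR vs. libration-affected multi-CCR array) matters so much to the error budget.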

Keywords: general relativity, laser retroreflectors, lunar laser ranging, Mars geodesy

Procedia PDF Downloads 249
985 Modifications in Design of Lap Joint of Fiber Metal Laminates

Authors: Shaher Bano, Samia Fida, Asif Israr

Abstract:

The continuous development and exploitation of materials and designs have turned the attention of the world toward the use of robust composite materials known as fiber metal laminates (FMLs) in many high-performance applications. The hybrid structure of fiber metal laminates makes them a material of choice for applications such as aircraft skin panels, fuselage floorings, door panels and other load-bearing applications. The synergistic combination of the properties of metals and fiber-reinforced laminates is responsible for their high damage tolerance, as the metal element provides better fatigue and impact properties, while high stiffness and better corrosion properties are inherited from the fiber-reinforced matrix system. They are mostly used as layered structures in different joint configurations such as lap and butt joints. The FML layers are usually bonded to each other using either mechanical fasteners or adhesive bonds. This research work is focused on the modification of an adhesively bonded joint: a single lap joint of carbon-fiber-based CARALL FML has been modified to increase interlaminar shear strength and avoid delamination. For this purpose, different joint modification techniques, such as the introduction of spews and a shoulder to modify the bond shape and the use of nanofillers such as carbon nanotubes as reinforcement in the adhesive material, have been utilized to improve the shear strength of the lap joint of the adhesively bonded FML layers. Both simulation and experimental results showed that the lap joint with the spew-and-shoulder configuration has better properties, owing to stress distribution over a larger area at the corner of the joint. The introduction of carbon nanotubes also showed a positive effect on shear stress and joint strength, as they act as reinforcement in the adhesive bond material.

Keywords: adhesive joint, Carbon Reinforced Aluminium Laminate (CARALL), fiber metal laminates, spews

Procedia PDF Downloads 284
984 Interlingual Interference in Students’ Writing

Authors: Zakaria Khatraoui

Abstract:

Interlanguage has come to occupy a central place in the field. Whether academically driven or pedagogically oriented, interlanguage research is more important than ever: it probes theoretical and linguistic issues and moves flexibly from idea to practice, offering a bridge between theory and educational practice. The present research provides a developed theoretical framework that is sustained by empirical teaching practices, alongside a careful treatment of its narrowly defined implementation. The focus of this interlingual study is on syntactic errors in students' written performance. To this end, the paper adopts a qualitative methodology with a set of focal methodological choices supported by a solid design. The central finding to be examined is the creative nature of syntactic errors, evidenced by the clear dominance of cognitively driven intralingual errors over linguistically interlingual ones. Subsequently, the paper highlights transferable implications of both theoretical and pedagogical value. In particular, the results are relevant to the scholarly community in several dimensions and recommend actions of educational value.

Keywords: interlanguage, interference, error, writing

Procedia PDF Downloads 46
983 Structural Equation Modeling Semiparametric Truncated Spline Using Simulation Data

Authors: Adji Achmad Rinaldo Fernandes

Abstract:

SEM analysis is a complex multivariate analysis because it involves a number of exogenous and endogenous variables that are interconnected to form a model. The measurement model is divided into two types, namely the reflective model and the formative model. Before carrying out further tests on SEM, certain assumptions must be met, namely the linearity assumption, to determine the form of the relationship. There are three modeling approaches to path analysis: parametric, nonparametric and semiparametric. The aim of this research is to develop semiparametric SEM and obtain the best model. The data used in the research are secondary data, which serve as the basis for generating simulation data. Simulation data were generated with sample sizes of 100, 300, and 500. In the semiparametric SEM analysis, the forms of the relationship studied, namely linear and quadratic, were specified, with one and two knot points and various levels of error variance (EV = 0.5, 1, 5). Three levels of closeness of relationship were used in the measurement model: low (0.1-0.3), medium (0.4-0.6) and high (0.7-0.9). The best model was obtained for the linear form of the X1-Y1 relationship. In the measurement model, a characteristic of the reflective model emerged, namely that the higher the closeness of the relationship, the better the model obtained. The originality of this research is the development of semiparametric SEM, which has not been widely studied by researchers.
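The truncated spline underlying the semiparametric component can be made concrete with a linear truncated power basis: in addition to the intercept and linear term, each knot k contributes a hinge term (x - k)_+, allowing the fitted relationship to change slope at the knot. The knot locations and coefficients below are illustrative, not estimates from the study.

```python
# Linear truncated power basis for a spline with two knots (as in the study
# design, which considers one and two knot points). Values are illustrative.

def truncated_basis(x, knots):
    """Design row [1, x, (x - k1)_+, (x - k2)_+, ...] for a linear spline."""
    return [1.0, x] + [max(0.0, x - k) for k in knots]

def spline_value(x, coeffs, knots):
    """Evaluate the spline given its basis coefficients."""
    return sum(c * b for c, b in zip(coeffs, truncated_basis(x, knots)))

knots = [0.5, 1.5]                       # two knot points
coeffs = [0.0, 1.0, 2.0, -1.5]           # illustrative coefficients

# Slope is 1 before the first knot, 1 + 2 = 3 between the knots,
# and 1 + 2 - 1.5 = 1.5 after the second knot.
y_left = spline_value(0.25, coeffs, knots)   # below both knots
y_mid = spline_value(1.0, coeffs, knots)     # between the knots
```

In the semiparametric SEM, rows like `truncated_basis(x, knots)` replace the single linear term of a parametric path, and the coefficients are estimated from the data.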

Keywords: semiparametric SEM, measurement model, structural model, reflective model, formative model

Procedia PDF Downloads 13