Search results for: accuracy improvement
5721 Non-Linear Causality Inference Using BAMLSS and Bi-CAM in Finance
Authors: Flora Babongo, Valerie Chavez
Abstract:
Inferring causality from observational data is one of the fundamental problems, especially in quantitative finance. So far, most papers analyze additive noise models that assume linearity, nonlinearity, or Gaussian noise. We fill this gap by providing a nonlinear and non-Gaussian causal multiplicative noise model that aims to distinguish the cause from the effect using a two-step method based on Bayesian additive models for location, scale and shape (BAMLSS) and on causal additive models (CAM). We tested our method on simulated and real data and reached an average accuracy of 0.86. As real data, we considered the causality between financial indices, such as the S&P 500, Nasdaq, CAC 40 and Nikkei, and companies' log-returns. Our results can be useful for inferring causality when the data are heteroskedastic or non-injective.
Keywords: causal inference, DAGs, BAMLSS, financial index
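As a rough illustration of the underlying idea of bivariate cause-effect testing, the sketch below fits a nonparametric regression in both directions and prefers the direction whose residuals look more independent of the input. This is a generic additive-noise direction test, not the authors' BAMLSS/Bi-CAM pipeline; the data, the regressor, and the Spearman-based independence proxy are illustrative assumptions.

```python
# Minimal sketch of a residual-independence direction test (illustrative only;
# not the authors' BAMLSS/Bi-CAM method). Under a noise-model assumption, the
# causal direction is the one whose regression residuals are closer to being
# independent of the input. A multiplicative noise model y = f(x) * e can be
# reduced to this additive form by taking logarithms.
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import GradientBoostingRegressor

def dependence_score(x, y):
    """Fit y = f(x) nonparametrically and measure residual-input dependence."""
    model = GradientBoostingRegressor().fit(x.reshape(-1, 1), y)
    residuals = y - model.predict(x.reshape(-1, 1))
    rho, _ = spearmanr(np.abs(residuals), x)   # crude independence proxy
    return abs(rho)

def infer_direction(x, y):
    """Return 'x->y' if residuals of y|x look more independent than those of x|y."""
    return "x->y" if dependence_score(x, y) < dependence_score(y, x) else "y->x"

# Toy example with a known ground truth x -> y
rng = np.random.default_rng(0)
x = rng.uniform(0.5, 2.5, size=2000)
y = x ** 2 + rng.normal(scale=0.1, size=2000)
print(infer_direction(x, y))   # expected: 'x->y'
```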
Procedia PDF Downloads 153
5720 Handloom Weaving Quality and Fashion Development Process for Traditional Costumes in the Contemporary Global Fashion Market in Ethiopia
Authors: Adiyam Amare
Abstract:
This research explores the handloom weaving quality and fashion development process for traditional Ethiopian costumes, particularly focusing on the challenges and opportunities within the contemporary global fashion market. Through a qualitative approach, including interviews and direct observations, the study identifies key factors affecting the handloom industry, such as quality improvement, market integration, and cultural preservation. The findings suggest that enhancing production quality, modernizing techniques, and fostering global market participation can significantly improve the competitiveness of Ethiopian traditional garments in the global fashion industry.
Keywords: fashion, culture, design, textile
Procedia PDF Downloads 27
5719 Volatility Model with Markov Regime Switching to Forecast Baht/USD
Authors: Nop Sopipan
Abstract:
In this paper, we forecast the volatility of the Baht/USD exchange rate using Markov Regime Switching GARCH (MRS-GARCH) models. These models allow volatility to have different dynamics according to unobserved regime variables. The main purpose of this paper is to find out whether MRS-GARCH models are an improvement on GARCH-type models in terms of modeling and forecasting Baht/USD volatility. The MRS-GARCH models perform best for Baht/USD volatility in the short term, but the GARCH model performs best over the long term.
Keywords: volatility, Markov Regime Switching, forecasting, Baht/USD
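MRS-GARCH itself requires a dedicated regime-switching implementation, but the single-regime GARCH(1,1) benchmark that such studies compare against can be fitted and used for volatility forecasting with the `arch` package. The sketch below is a hedged illustration of that benchmark; the return series is a synthetic placeholder, not Baht/USD data.

```python
# Hedged sketch: fit a single-regime GARCH(1,1) benchmark and produce a
# volatility forecast. The MRS-GARCH comparison model needs a dedicated
# regime-switching implementation and is not shown here. `returns` is assumed
# to be a pandas Series of (percent) log-returns.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(1)
returns = pd.Series(rng.normal(scale=0.6, size=1500))   # placeholder data

model = arch_model(returns, mean="Constant", vol="GARCH", p=1, q=1, dist="normal")
result = model.fit(disp="off")

forecast = result.forecast(horizon=10)                   # 10-step-ahead forecast
print(result.params)
print(np.sqrt(forecast.variance.iloc[-1]))               # forecast volatility path
```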
Procedia PDF Downloads 305
5718 An Analytical Method for Bending Rectangular Plates with All Edges Clamped Supported
Authors: Yang Zhong, Heng Liu
Abstract:
The decoupling method and the modified Navier method are combined for accurate bending analysis of rectangular thick plates with all edges clamped. The basic governing equations for Mindlin plates are first decoupled into independent partial differential equations which can be solved separately. Using the modified Navier method, the analytic solution of a rectangular thick plate with all edges clamped is then derived. The solution method used in this paper avoids the complicated derivation of coefficients and obtains the solution to the problem directly. Finally, numerical comparisons show the correctness and accuracy of the results.
Keywords: Mindlin plates, decoupling method, modified Navier method, bending rectangular plates
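For context, the Navier approach expands the deflection and the load in double sine series. A textbook sketch of the classical (simply supported, Kirchhoff) form on which the paper's decoupling and modified Navier method builds is shown below, with a and b the plate edge lengths and D the flexural rigidity; the paper's modified version adapts this ansatz to Mindlin plates with clamped edges.

```latex
% Classical Navier double sine series for a simply supported Kirchhoff plate
% (standard textbook form, shown for context only)
w(x,y) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} W_{mn}\,
         \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b}, \qquad
q(x,y) = \sum_{m=1}^{\infty}\sum_{n=1}^{\infty} Q_{mn}\,
         \sin\frac{m\pi x}{a}\,\sin\frac{n\pi y}{b},
\qquad
W_{mn} = \frac{Q_{mn}}{\pi^{4} D\left[\left(\tfrac{m}{a}\right)^{2}
         + \left(\tfrac{n}{b}\right)^{2}\right]^{2}}.
```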
Procedia PDF Downloads 605
5717 Energy Efficiency Measures in Canada’s Iron and Steel Industry
Authors: A. Talaei, M. Ahiduzzaman, A. Kumar
Abstract:
In Canada, an increase in the production of iron and steel is anticipated to satisfy the increasing demand for iron and steel in the oil sands and automobile industries. It is predicted that GHG emissions from the iron and steel sector will show a continuous increase until 2030 and, with emissions of 20 million tonnes of carbon dioxide equivalent, the sector will account for more than 2% of total national GHG emissions, or 12% of industrial emissions (i.e. a 25% increase from 2010 levels). Therefore, there is an urgent need to improve the energy intensity and to implement energy efficiency measures in the industry to reduce the GHG footprint. This paper analyzes the current energy consumption in the Canadian iron and steel industry and identifies energy efficiency opportunities to improve the energy intensity and mitigate greenhouse gas emissions from this industry. In order to do this, a demand tree is developed representing the different iron and steel production routes and the technologies within each route. The main energy consumer within the industry is found to be fired heaters, accounting for 81% of overall energy consumption, followed by motor systems and steam generation, each accounting for 7% of total energy consumption. Eighteen different energy efficiency measures are identified which will help the efficiency improvement in various subsectors of the industry. In the sintering process, heat recovery from coolers provides a high potential for energy saving and can be integrated in both new and existing plants. Coke dry quenching (CDQ) has the same advantages. Within the blast furnace iron-making process, injection of large amounts of coal in the furnace appears to be more effective than any other option in this category. In addition, because coal-powered electricity is being phased out in Ontario (where the majority of iron and steel plants are located), there will be surplus coal that could be used in iron and steel plants. In the steel-making processes, the recovery of Basic Oxygen Furnace (BOF) gas and scrap preheating provide considerable potential for energy savings in BOF and Electric Arc Furnace (EAF) steel-making processes, respectively. However, despite the energy savings potential, BOF gas recovery is not applicable in existing plants using steam recovery processes. Given that the share of EAF in steel production is expected to increase, the application potential of the technology will be limited. On the other hand, the long lifetime of the technology and the expected capacity increase of EAF make scrap preheating a justified energy saving option. This paper presents the results of the assessment of the above-mentioned options in terms of their costs and GHG mitigation potential.
Keywords: iron and steel sectors, energy efficiency improvement, blast furnace iron-making process, GHG mitigation
Procedia PDF Downloads 400
5716 Defect Localization and Interaction on Surfaces with Projection Mapping and Gesture Recognition
Authors: Qiang Wang, Hongyang Yu, MingRong Lai, Miao Luo
Abstract:
This paper presents a method for accurately localizing and interacting with known surface defects by overlaying patterns onto real-world surfaces using a projection system. Given the world coordinates of the defects, we project corresponding patterns onto the surfaces, providing an intuitive visualization of the specific defect locations. To enable users to interact with and retrieve more information about individual defects, we implement a gesture recognition system based on a pruned and optimized version of YOLOv6. This lightweight model achieves an accuracy of 82.8% and is suitable for deployment on low-performance devices. Our approach demonstrates the potential for enhancing defect identification, inspection processes, and user interaction in various applications.
Keywords: defect localization, projection mapping, gesture recognition, YOLOv6
Procedia PDF Downloads 91
5715 Non-Destructive Visual-Statistical Approach to Detect Leaks in Water Mains
Authors: Alaa Al Hawari, Mohammad Khader, Tarek Zayed, Osama Moselhi
Abstract:
In this paper, an effective non-destructive, non-invasive approach for leak detection was proposed. The process relies on analyzing thermal images collected by an IR viewer device that captures thermograms. In this study, a statistical analysis of the collected thermal images of the ground surface along the expected leak location, followed by a visual inspection of the thermograms, was performed in order to locate the leak. In order to verify the applicability of the proposed approach, the predicted leak location from the developed approach was compared with the real leak location. The results showed that the expected leak location was successfully identified with an accuracy of more than 95%.
Keywords: thermography, leakage, water pipelines, thermograms
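A minimal sketch of the kind of per-pixel statistical screening described above is given below: anomalous ground-surface temperatures along the suspected pipe route are flagged before visual inspection. The array names, the z-score threshold, and the synthetic scene are assumptions, not the authors' exact procedure.

```python
# Hedged sketch: flag statistically anomalous pixels in a ground-surface
# thermogram as candidate leak locations (z-score thresholding) prior to
# visual inspection. `thermogram` is assumed to be a 2-D array of surface
# temperatures along the expected pipe route; the 3-sigma cut is illustrative.
import numpy as np

def leak_candidates(thermogram: np.ndarray, z_cut: float = 3.0) -> np.ndarray:
    """Boolean mask of pixels deviating strongly from the scene mean."""
    mean, std = thermogram.mean(), thermogram.std()
    z = (thermogram - mean) / std
    return np.abs(z) > z_cut

# Toy example: a warm anomaly embedded in an otherwise uniform surface
rng = np.random.default_rng(0)
scene = rng.normal(loc=20.0, scale=0.2, size=(120, 160))
scene[55:65, 70:85] += 2.5                      # simulated leak hot spot
mask = leak_candidates(scene)
rows, cols = np.nonzero(mask)
print(f"{mask.sum()} candidate pixels, centred near ({rows.mean():.0f}, {cols.mean():.0f})")
```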
Procedia PDF Downloads 356
5714 Numerical Simulation and Laboratory Tests for Rebar Detection in Reinforced Concrete Structures using Ground Penetrating Radar
Authors: Maha Al-Soudani, Gilles Klysz, Jean-Paul Balayssac
Abstract:
The aim of this paper is to use Ground Penetrating Radar (GPR) as a non-destructive testing (NDT) method and to increase its accuracy in recognizing the geometry of reinforced concrete structures and, in particular, the position of steel bars. This information will help managers to assess the state of their structures, on the one hand with respect to safety constraints and, on the other, to quantify the need for maintenance and repair. Several configurations of acquisition and processing of the simulated signal were tested to propose and develop an appropriate imaging algorithm in the propagation medium to locate the rebar accurately. The imaging algorithm was then validated experimentally on real reinforced concrete structures. The results indicate that this algorithm is capable of estimating the reinforcing steel bar position to within 0-1 mm.
Keywords: GPR, NDT, reinforced concrete structures, rebar location
Procedia PDF Downloads 506
5713 Two Spherical Three Degrees of Freedom Parallel Robots 3-RCC and 3-RRS Static Analysis
Authors: Alireza Abbasi Moshaii, Shaghayegh Nasiri, Mehdi Tale Masouleh
Abstract:
The main purpose of this study is the static analysis of two three-degree-of-freedom parallel mechanisms: 3-RCC and 3-RRS. The geometry of these mechanisms is described, and static equilibrium equations are derived for the whole chains. For these mechanisms, since the number of equations equals the number of unknowns, the solution proceeds in the same way as for the 3-RCC mechanism. Mathematical software is used to solve the equations. In order to verify the results obtained from solving the equations of the mechanisms, their CAD models have been simulated and their statics analysed in ADAMS software. Because the geometry of the mechanisms is symmetrical, the force and external torque acting on the end-effector have been considered asymmetric in order to prove the generality of the solution method. Finally, the results of both software packages, for both mechanisms, are extracted and compared as graphs. The good agreement between the results indicates the accuracy of the analysis.
Keywords: robotic, static analysis, 3-RCC, 3-RRS
Procedia PDF Downloads 389
5712 The Use of Methods and Techniques of Drama Education with Kindergarten Teachers
Authors: Vladimira Hornackova, Jana Kottasova, Zuzana Vanova, Anna Jungrova
Abstract:
The present study deals with drama education in preschool education. The research conducted in this field is a qualitative comparative survey with the aim of finding out how methods and techniques of drama education are used in preschool education by university-educated and secondary-school-educated preschool teachers. The research uses content analysis and an unstandardized questionnaire for preschool teachers, and the obtained data are processed with the help of descriptive methods and correlations. The results allow a comparison of aspects applied through drama in preschool education. The research brings impulses for the improvement of education in kindergartens and inspiration for university study programs of drama education in the professional training of preschool teachers.
Keywords: drama education, preschool education, preschool teacher, research
Procedia PDF Downloads 367
5711 The Role of Glyceryl Trinitrate (GTN) in 99mTc-HIDA with Morphine Provocation Scan for the Investigation of Type III Sphincter of Oddi Dysfunction (SOD)
Authors: Ibrahim M Hassan, Lorna Que, Michael Rutland
Abstract:
Type I SOD is usually diagnosed by anatomical imaging such as ultrasound, CT and MRCP. However, types II and III SOD yield negative results despite the presence of significant symptoms. In particular, type III is difficult to diagnose due to the absence of significant biochemical or anatomical abnormalities. Nuclear medicine can aid in this diagnostic dilemma by demonstrating functional changes in bile flow. Low-dose morphine (0.04 mg/kg) stimulates the tone of the sphincter of Oddi (SO), and its usefulness in diagnosing SOD has been shown by the delay in bile flow it causes when compared to a non-morphine-provoked baseline scan. This work expands on that process by using sublingual GTN at 60 minutes post tracer and morphine injection to relax the SO and induce an improvement in bile outflow, and in some cases show immediate relief of morphine-induced abdominal pain. The criteria for a positive SOD diagnosis are as follows: during the first hour of morphine provocation, (1) delayed intrahepatic biliary duct tracer accumulation; plus (2) delayed appearance but persistent retention of activity in the common bile duct; and (3) delayed bile flow into the duodenum. In addition, patients who required GTN within the first hour to relieve abdominal pain were regarded as highly supportive of the diagnosis. A retrospective analysis was performed of 85 patients (pts) (78F and 6M) referred for suspected SOD (type III) who had been intensively investigated because of recurrent right upper quadrant or abdominal pain post cholecystectomy. A 99mTc-HIDA scan with morphine provocation was performed, followed by GTN at 60 minutes post tracer injection, and a further thirty minutes of dynamic imaging were acquired. 30 pts were negative. 55 pts were regarded as positive for SOD, and 38/55 (60%) of these patients with an abnormal result were further evaluated with a baseline 99mTc-HIDA scan. As expected, all 38 pts showed better bile flow characteristics than during the morphine provocation. 20/55 (36%) patients were treated by ERCP sphincterotomy and the rest were managed conservatively by medical therapy. In all cases regarded as positive for SOD, the sublingual GTN at 60 minutes showed immediate improvement in bile flow. 11/55 (20%) who developed severe post-morphine abdominal pain were relieved by GTN almost instantaneously. We propose that GTN is a useful agent in the diagnosis of SOD when performing a 99mTc-HIDA scan, and that a satisfactory response to sublingual GTN could offer additional information in patients who have severe pain at the time of the procedure or when presenting to the emergency unit because of biliary pain, and also in determining whether a trial of medical therapy may be used before considering surgery.
Keywords: GTN, HIDA, morphine, SOD
Procedia PDF Downloads 307
5710 Check Factors Contributing to the Increase or Decrease in Labor Productivity in Employees Applied Science Center Municipal Andimeshk
Authors: Hossein Boromandfar, Ahmad Ghalavandi
Abstract:
This paper examines the importance of human resources as a strategic resource and the factors that lead to increased labor productivity at the Applied Science Center of Andimeshk Municipality. First, the concepts and definitions of productivity and the factors affecting it are presented, and then recommendations for improving the productivity of the center are determined. Increasing labor productivity is valuable because competent human resources are the infrastructure on which development and promotion are built. The use of qualified employees in the university, with a focus on specific objectives, can be effective in its promotion.
Keywords: productivity, manage, human resources, center for applied science
Procedia PDF Downloads 421
5709 Reducing Unnecessary CT Aorta Scans in the Emergency Department
Authors: Ibrahim Abouelkhir
Abstract:
Background: Prior to this project, the number of CT aorta requests from our Emergency Department (ED) was reported by the radiology department to be high, with a low positive event rate: only 1-2% of CT aortas performed were positive for acute aortic syndrome. This trend raised concerns about the time required to process and report these scans, potentially impacting the timely reporting of other high-priority imaging, such as trauma-related scans. Other harms identified were unnecessary radiation, patients spending longer in the ED and contributing to overcrowding, and, most importantly, patients not getting the right care the first time. The radiology department also raised the problem of reporting bias, because they expected our CT aortas to be normal. Aim: The main aim of this project was to reduce the number of unnecessary CT aortas requested, which would be shown by 1. the number of CT aortas requested and 2. the positive event rate. Methodology: This was a quality improvement project carried out in the ED at Frimley Park Hospital, UK. Starting from 1st January 2024, we recorded the number of days required to reach 35 CT aorta requests. We looked at all patients presenting to the ED over the age of 16 for whom a CT aorta was requested by the ED team. We looked at how many of these scans were positive for acute aortic syndrome. The intervention was a change in practice: all CT aortas should be approved by an ED consultant or ST4+ registrar (5th April 2024). We then reviewed the number of days it took to reach a total of 35 CT aorta requests following the intervention and again reviewed how many were positive. Results: Prior to the intervention, 35 CT aorta scans were performed over a 20-day period. Following the implementation of the ED senior doctor vetting process, the same number of CT aorta scan requests was observed over 50 days - more than twice the pre-intervention period. This indicates a significant reduction in the rate of CT aorta scans being requested. During the pre-intervention phase, there were two positive cases of acute aortic syndrome. In the post-intervention period, there were zero. Conclusion: The mandatory review of CT aorta scan requests by an ED consultant effectively reduced the number of scans requested. However, this intervention did not lead to an increase in positive scan results. We noted that post-intervention, approximately 50% of scans had been approved by registrar-grade doctors and only 50% had been approved by ED consultants, and the majority were not in-person reviews. We wonder if restricting the approval to consultant grade only might improve the results; furthermore, in-person reviews should be the gold standard.
Keywords: quality improvement project, CT aorta scans, emergency department, radiology department, aortic dissection, scan request vetting, clinical outcomes, imaging efficiency
Procedia PDF Downloads 16
5708 A Descriptive Approach towards the Understanding of the Central American Coffee Business Demography Phenomena
Authors: Jesus David Argueta Moreno, Justa Rufina Martel, Edith Gabriela Carrasco
Abstract:
The search of small, medium, and large Central American coffee corporations for excellence, sustainability, and continuous improvement triggers, on a still unknown scale, local expansion, crusading, and franchising strategies towards more suitable commercial opportunities, where the dynamics of Central American business displacement can be explained through market permeability traits. Considering the above, the present study aims to evaluate the franchising potential offered by the Central American coffee business scenario, in order to explain the dynamics of the business demography phenomenon and its relevance to the Central American competitiveness landscape.
Keywords: competitiveness, franchising, business demography, Central American Coffee
Procedia PDF Downloads 613
5707 Developing the Methods for the Study of Static and Dynamic Balance
Authors: K. Abuzayan, H. Alabed, J. Ezarrugh, M. Agila
Abstract:
Static and dynamic balance are essential in daily and sports life. Many factors have been identified as influencing static balance control. Therefore, the aim of this study was to apply the extrapolated centre of mass (XCoM) method and other relevant variables (CoP, CoM, Fh, KE, P, Q, and AI) to investigate sport-related activities such as hopping and jumping. Many studies have presented CoP data without mentioning its accuracy, so several experiments were done to establish the agreement between the CoP and the projected CoM in a static condition. Five healthy males (mean ± SD: age 24.6 ± 4.5 years, height 177 ± 6.3 cm, body mass 72.8 ± 6.6 kg) participated in this study. The implementation of the XCoM method was found to be practical for evaluating both static and dynamic balance. The general findings were that the CoP, the CoM, the XCoM, Fh, and Q were more informative than the other variables (e.g. KE, P, and AI) during static and dynamic balance. The XCoM method was found to be applicable to dynamic balance as well as static balance.
Keywords: centre of mass, static balance, dynamic balance, extrapolated centre of mass
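For reference, the extrapolated centre of mass is conventionally defined as the CoM position plus its velocity scaled by the inverted-pendulum eigenfrequency. The sketch below follows that standard formulation; the signal names, sampling rate, and leg length are assumptions rather than the study's recorded data.

```python
# Hedged sketch of the extrapolated centre of mass (XCoM), following the
# standard inverted-pendulum formulation: XCoM = CoM + v_CoM / omega0, with
# omega0 = sqrt(g / l). `com` is an assumed time series of horizontal CoM
# position (m) sampled at `fs` Hz, and `leg_length` is the pendulum length (m).
import numpy as np

def extrapolated_com(com: np.ndarray, fs: float, leg_length: float) -> np.ndarray:
    g = 9.81
    omega0 = np.sqrt(g / leg_length)
    v_com = np.gradient(com, 1.0 / fs)        # CoM velocity by finite differences
    return com + v_com / omega0

# Toy usage: a slow anterior-posterior sway recorded at 100 Hz
t = np.linspace(0.0, 10.0, 1000)
com = 0.02 * np.sin(2 * np.pi * 0.4 * t)      # 2 cm sway amplitude
xcom = extrapolated_com(com, fs=100.0, leg_length=0.95)
print(xcom[:5])
```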
Procedia PDF Downloads 422
5706 Using Neural Networks for Click Prediction of Sponsored Search
Authors: Afroze Ibrahim Baqapuri, Ilya Trofimov
Abstract:
Sponsored search is a multi-billion dollar industry and a major source of revenue for search engines (SE). Click-through-rate (CTR) estimation plays a crucial role in ad selection and greatly affects SE revenue, advertiser traffic and user experience. We propose a novel architecture for solving the CTR prediction problem by combining artificial neural networks (ANN) with decision trees. First, we compare the ANN with other popular machine learning models used for this task. We then combine the ANN with MatrixNet (a proprietary implementation of boosted trees) and evaluate the performance of the system as a whole. The results show that our approach provides a significant improvement over existing models.
Keywords: neural networks, sponsored search, web advertisement, click prediction, click-through rate
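MatrixNet is proprietary, so the following hedged sketch illustrates only the general combination idea: the neural network's predicted click probability is appended as an extra feature for a gradient-boosted tree classifier, using scikit-learn stand-ins and synthetic placeholder data rather than sponsored-search logs.

```python
# Hedged sketch of combining an ANN with boosted trees for CTR prediction:
# the ANN's click probability is stacked as an additional feature for a
# gradient-boosting classifier (a stand-in for the proprietary MatrixNet).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=5000, n_features=20, weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

ann = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
ann.fit(X_tr, y_tr)

# Stack the ANN click probability as an extra input feature for the trees
X_tr_aug = np.column_stack([X_tr, ann.predict_proba(X_tr)[:, 1]])
X_te_aug = np.column_stack([X_te, ann.predict_proba(X_te)[:, 1]])

gbt = GradientBoostingClassifier(random_state=0).fit(X_tr_aug, y_tr)
print("ANN alone AUC:", roc_auc_score(y_te, ann.predict_proba(X_te)[:, 1]))
print("ANN + trees AUC:", roc_auc_score(y_te, gbt.predict_proba(X_te_aug)[:, 1]))
```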
Procedia PDF Downloads 574
5705 Reliable and Energy-Aware Data Forwarding under Sink-Hole Attack in Wireless Sensor Networks
Authors: Ebrahim Alrashed
Abstract:
Wireless sensor networks are vulnerable to attacks from adversaries attempting to disrupt their operations. Sink-hole attacks are a type of attack in which an adversary node drops data forwarded through it, hence affecting the reliability and accuracy of the network. Since sensor nodes have limited battery power, it is essential that any solution to the sink-hole attack problem be very energy-aware. In this paper, we present a reliable and energy-efficient scheme to forward data from source nodes to the base station while under sink-hole attack. The scheme also detects sink-hole attack nodes and avoids paths that include them.
Keywords: energy-aware routing, reliability, sink-hole attack, WSN
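The sketch below illustrates only the avoidance step, not the paper's detection scheme: once suspected sink-hole nodes are identified, forwarding paths are chosen over an energy-weighted graph that excludes them. The topology, node IDs, and edge weights are hypothetical.

```python
# Hedged sketch: choose an energy-aware forwarding path to the base station
# while excluding nodes flagged as sink-hole attackers. Edge weights stand in
# for per-hop energy cost; the topology and the suspect set are hypothetical.
import networkx as nx

def safe_route(graph: nx.Graph, source, base_station, suspects):
    """Shortest energy-weighted path that avoids suspected sink-hole nodes
    (the endpoints themselves are never removed)."""
    g = graph.copy()
    g.remove_nodes_from(set(suspects) - {source, base_station})
    return nx.shortest_path(g, source, base_station, weight="energy")

G = nx.Graph()
G.add_weighted_edges_from(
    [("s", "a", 1.0), ("a", "m", 0.5), ("m", "bs", 0.5),   # cheap path via malicious m
     ("a", "b", 1.2), ("b", "bs", 1.1), ("s", "c", 2.0), ("c", "bs", 2.0)],
    weight="energy",
)
print(safe_route(G, "s", "bs", suspects={"m"}))   # expected: ['s', 'a', 'b', 'bs']
```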
Procedia PDF Downloads 398
5704 Identifying Protein-Coding and Non-Coding Regions in Transcriptomes
Authors: Angela U. Makolo
Abstract:
Protein-coding and Non-coding regions determine the biology of a sequenced transcriptome. Research advances have shown that Non-coding regions are important in disease progression and clinical diagnosis. Existing bioinformatics tools have been targeted towards Protein-coding regions alone. Therefore, there are challenges associated with gaining biological insights from transcriptome sequence data. These tools are also limited to computationally intensive sequence alignment, which is inadequate and less accurate for identifying both Protein-coding and Non-coding regions. Alignment-free techniques can overcome this limitation. Therefore, this study was designed to develop an efficient sequence alignment-free model for identifying both Protein-coding and Non-coding regions in sequenced transcriptomes. Feature grouping and randomization procedures were applied to the input transcriptomes (37,503 data points). Successive iterations were carried out to compute the gradient vector that converged the developed Protein-coding and Non-coding Region Identifier (PNRI) model to the approximate coefficient vector. The logistic regression algorithm was used with a sigmoid activation function. A parameter vector was estimated for every sample in the 37,503 data points in a bid to reduce the generalization error and cost. Maximum Likelihood Estimation (MLE) was used for parameter estimation by taking the log-likelihood of six features and combining them into a summation function. Dynamic thresholding was used to classify the Protein-coding and Non-coding regions, and the Receiver Operating Characteristic (ROC) curve was determined. The generalization performance of PNRI was determined in terms of F1 score, accuracy, sensitivity, and specificity. The average generalization performance of PNRI was determined using a benchmark of multi-species organisms. The generalization error for identifying Protein-coding and Non-coding regions decreased from 0.514 to 0.508 and to 0.378, respectively, after three iterations. The cost (difference between the predicted and the actual outcome) also decreased from 1.446 to 0.842 and to 0.718 for the first, second and third iterations, respectively. The iterations terminated at the 390th epoch, having an error of 0.036 and a cost of 0.316. The computed elements of the parameter vector that maximized the objective function were 0.043, 0.519, 0.715, 0.878, 1.157, and 2.575. The PNRI gave an ROC value of 0.97, indicating an improved predictive ability. The PNRI identified both Protein-coding and Non-coding regions with an F1 score of 0.970, accuracy of 0.969, sensitivity of 0.966, and specificity of 0.973. Using 13 non-human multi-species model organisms, the average generalization performance of the traditional method was 74.4%, while that of the developed model was 85.2%, thereby making the developed model better at identifying Protein-coding and Non-coding regions in transcriptomes. The developed Protein-coding and Non-coding region identifier model efficiently identified the Protein-coding and Non-coding transcriptomic regions. It could be used in genome annotation and in the analysis of transcriptomes.
Keywords: sequence alignment-free model, dynamic thresholding classification, input randomization, genome annotation
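The following hedged sketch reproduces only the core learning step described above: logistic regression with a sigmoid activation trained by gradient ascent on the log-likelihood over six features, followed by a classification threshold chosen from the ROC curve. The feature matrix and labels are synthetic placeholders, not the study's 37,503 transcriptome data points.

```python
# Hedged sketch of a PNRI-style core: sigmoid logistic regression trained by
# gradient ascent on the log-likelihood over six features, with a dynamic
# classification threshold selected from the ROC curve. Data are synthetic.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))                        # six features per sample
true_w = np.array([0.0, 0.5, 0.7, 0.9, 1.2, 2.6])     # illustrative ground truth
y = (rng.random(2000) < 1 / (1 + np.exp(-(X @ true_w)))).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, lr = np.zeros(6), 0.1
for epoch in range(400):                              # maximise the log-likelihood
    p = sigmoid(X @ w)
    gradient = X.T @ (y - p) / len(y)
    w += lr * gradient

scores = sigmoid(X @ w)
fpr, tpr, thresholds = roc_curve(y, scores)
best_threshold = thresholds[np.argmax(tpr - fpr)]     # simple dynamic threshold
print("AUC:", round(roc_auc_score(y, scores), 3), "threshold:", round(best_threshold, 3))
```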
Procedia PDF Downloads 71
5703 Effects of Reversible Watermarking on Iris Recognition Performance
Authors: Andrew Lock, Alastair Allen
Abstract:
Fragile watermarking has been proposed as a means of adding additional security or functionality to biometric systems, particularly for authentication and tamper detection. In this paper we describe an experimental study on the effect of watermarking iris images with a particular class of fragile algorithms, reversible algorithms, on the ability to correctly perform iris recognition. We investigate two scenarios: matching watermarked images to unmodified images, and matching watermarked images to watermarked images. We show that different watermarking schemes give very different results for a given capacity, highlighting the importance of investigation. At high embedding rates most algorithms cause a significant reduction in recognition performance. However, in many cases, for low embedding rates, recognition accuracy is improved by the watermarking process.
Keywords: biometrics, iris recognition, reversible watermarking, vision engineering
Procedia PDF Downloads 460
5702 Ionic Liquids-Polymer Nanoparticle Systems as Breakthrough Tools to Improve the Leprosy Treatment
Authors: A. Julio, R. Caparica, S. Costa Lima, S. Reis, J. G. Costa, P. Fonte, T. Santos De Almeida
Abstract:
Mycobacterium leprae causes a chronic and infectious disease called leprosy, whose most common symptoms are peripheral neuropathy and deformation of several parts of the body. The pharmacological treatment of leprosy is a combined therapy with three different drugs: rifampicin, clofazimine, and dapsone. However, clofazimine and dapsone have poor solubility in water and also low bioavailability. Thus, it is crucial to develop strategies to overcome such drawbacks. The use of ionic liquids (ILs) may be a strategy to overcome the low solubility, since they have been used as solubility promoters. ILs are salts, liquid below 100 ºC or even at room temperature, that may be placed in water, oils or hydroalcoholic solutions. Another approach may be the encapsulation of drugs into polymeric nanoparticles, which improves their bioavailability. In this study, two different classes of ILs were used as solubility enhancers of the poorly soluble antileprotic drugs: the imidazole-based and the choline-based ionic liquids. Thus, after the solubility studies, IL-PLGA nanoparticle hybrid systems were developed to deliver such drugs. First of all, the solubility studies of clofazimine and dapsone were performed in water and in water:IL mixtures, at IL concentrations where cell viability is maintained, at room temperature for 72 hours. For both drugs, an improvement in drug solubility was observed, and [Cho][Phe] showed to be the best solubility enhancer, especially for clofazimine, where a 10-fold improvement was observed. Later, nanoparticles with a polymeric matrix of poly(lactic-co-glycolic acid) (PLGA) 75:25 were produced by a modified solvent-evaporation W/O/W double emulsion technique in the presence of [Cho][Phe]. Thus, the inner phase was an aqueous solution of 0.2 % (v/v) of the above IL with each drug at its maximum solubility determined in the previous study. After the production, the hybrid nanosystem was physicochemically characterized. The produced nanoparticles had a diameter of around 580 nm and 640 nm for clofazimine and dapsone, respectively. The polydispersity index was in agreement with the recommended value of this parameter for drug delivery systems (around 0.3). The developed hybrid nanosystems demonstrated promising association efficiency (AE) values for both drugs, given their low solubility (64.0 ± 4.0 % for clofazimine and 58.6 ± 10.0 % for dapsone), which suggests the capacity of these delivery systems to enhance the bioavailability and loading of clofazimine and dapsone. Overall, the achievements of the study may signify an improvement of the patients' quality of life, since they may mean a change in the therapeutic scheme, not requiring such high doses of drug to obtain a therapeutic effect. The authors would like to thank Fundação para a Ciência e a Tecnologia, Portugal (FCT/MCTES (PIDDAC), UID/DTP/04567/2016-CBIOS/PRUID/BI2/2018).
Keywords: ionic liquids, ionic liquids-PLGA nanoparticles hybrid systems, leprosy treatment, solubility
Procedia PDF Downloads 154
5701 Reducing Inequalities for the Uptake of Long-Term Reversible Contraceptive Methods through Special Family Planning Camps: A High Impact Service Delivery Model of Family Planning Practices
Authors: Ghulam Mustafa Halepota, Zaib Dahar
Abstract:
Background: Acceptance of FP services is low, particularly in hard-to-reach areas where geographic, economic, or social barriers limit service uptake. Moreover, limited resources appeared to be a reflection of dismal contraceptive use in Pakistan. The People's Primary Health Care Initiative (PPHI) is a public-private partnership program of the Government of Sindh which aims to improve maternal and child health through accessible family planning services in far-flung areas. In 2015, PPHI launched special family planning camps to achieve a rapid improvement in CPR. On a quarterly basis, these camps focus on Long Acting Reversible Contraceptives (LARC). These camps are arranged at 250 BHU Plus (24/7 MCHCs). The organization manages 1140 primary health care facilities all over Sindh province and focuses on maternal, newborn and child health, which includes antenatal care, labor/delivery, postnatal care, family planning, immunization, nutrition, BEmONC, CEmONC, diagnostic laboratories, and ambulance services. Under the FPRH program, the organization launched special family planning camps in far-flung areas to achieve a rapid improvement in CPR, committed to the FP 2020 goal. Objective: To assess the performance of special FP camps for the improvement of long-acting reversible contraception in hard-to-reach areas. Methodology: Outreach camps are organized on a quarterly basis in 250 BHUs and maternal and child health centers (available 24/7). Using an observational study design, the study reports 2 years of data from special FP camps conducted in 23 various districts of Sindh during April 2015-April 2017. These special camps offered a range of modern contraceptive methods including IUCDs, implants, condoms, pills, and injections. Moreover, 125 male medical officers were trained across Sindh in LARC and 554 females were trained in implant and IUCD insertions. The MSI Impact calculator was used to determine the health and demographic impact of services. Results: This intervention has brought exceptional results, and the response has been overwhelming over time. A total of 2048 special camps were carried out during 2015 till April 2017. 231796 MWRAs visited the camps; 91% opted for modern FP methods, of which 45% opted for implants and 6% selected IUCDs from LARC, while from short-term methods 17% opted for injectables, 18% chose pills, and 12% used condoms. This intervention created a high contraceptive impact in rural Sindh: an estimated 125048 FP users have been created, of which 111846 are LARC users and 13498 are SARC users. Through this intervention, an estimated 55774 unintended pregnancies, 36299 live births, 9394, 80 maternal deaths, 926 and 6077 unsafe abortions have been averted. Moreover, the intervention created an economic impact and saved 2,409,563 in direct health expenditure on each woman of reproductive age. Conclusion: Special FP camps, alongside routine services, are an effective and acceptable model for increasing the provision of long-acting and permanent methods in hard-to-reach areas. This innovative approach by PPHI-Sindh has also been adopted in other provinces of Pakistan.
Keywords: inequalities, special camps, family planning services, hard to reach areas
Procedia PDF Downloads 187
5700 New Iterative Algorithm for Improving Depth Resolution in Ionic Analysis: Effect of Iterations Number
Authors: N. Dahraoui, M. Boulakroune, D. Benatia
Abstract:
In this paper, the improvement of depth resolution by deconvolution in Secondary Ion Mass Spectrometry (SIMS) analysis is considered. Indeed, we have developed a new Tikhonov-Miller deconvolution algorithm in which an a priori model of the solution is included. This is a denoised and pre-deconvolved signal obtained, firstly, by the application of a wavelet shrinkage algorithm and, secondly, by the introduction of the obtained denoised signal into an iterative deconvolution algorithm. In particular, we have focused on the effect of the number of iterations on the evolution of the deconvolved signals. The SIMS profiles are multilayers of boron in a silicon matrix.
Keywords: DRF, in-depth resolution, multiresolution deconvolution, SIMS, wavelet shrinkage
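A hedged sketch of the two-stage idea follows: wavelet shrinkage denoising of a SIMS depth profile, then a regularised (Tikhonov-type) deconvolution with the depth resolution function (DRF). The Gaussian DRF, the synthetic boron multilayer profile, and all parameter values are assumptions, not the paper's iterative Tikhonov-Miller algorithm.

```python
# Hedged sketch of the two-stage scheme: (1) wavelet shrinkage denoising of a
# SIMS depth profile, (2) Tikhonov-regularised deconvolution with an assumed
# Gaussian depth resolution function (DRF). All values are illustrative.
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745            # noise estimate
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def tikhonov_deconvolve(signal, drf, lam=1e-2):
    """Frequency-domain Tikhonov deconvolution: X = H* Y / (|H|^2 + lambda)."""
    kernel = np.zeros(len(signal))
    kernel[: len(drf)] = drf
    kernel = np.roll(kernel, -(len(drf) // 2))                 # centre kernel at index 0
    H, Y = np.fft.fft(kernel), np.fft.fft(signal)
    X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)
    return np.real(np.fft.ifft(X))

depth = np.arange(1024)
true_profile = np.where((depth % 200) < 40, 1.0, 0.05)         # boron multilayers
drf = np.exp(-0.5 * (np.arange(-50, 51) / 8.0) ** 2)
drf /= drf.sum()
measured = np.convolve(true_profile, drf, mode="same")
measured += np.random.default_rng(0).normal(scale=0.01, size=measured.size)

restored = tikhonov_deconvolve(wavelet_denoise(measured), drf)
print(restored[:5])
```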
Procedia PDF Downloads 419
5699 Unified Structured Process for Health Analytics
Authors: Supunmali Ahangama, Danny Chiang Choon Poo
Abstract:
Health analytics (HA) is used in healthcare systems for effective decision-making, management, and planning of healthcare and related activities. However, user resistance, the unique nature of medical data content and structure (including heterogeneous and unstructured data), and impromptu HA projects have held up progress in HA applications. Notably, the accuracy of outcomes depends on the skills and the domain knowledge of the data analyst working on the healthcare data. The success of HA depends on having a sound process model, effective project management and the availability of supporting tools. Thus, to overcome these challenges through an effective process model, we propose an HA process model with features from the rational unified process (RUP) model and agile methodology.
Keywords: agile methodology, health analytics, unified process model, UML
Procedia PDF Downloads 508
5698 A Systematic Review Investigating the Use of EEG Measures in Neuromarketing
Authors: A. M. Byrne, E. Bonfiglio, C. Rigby, N. Edelstyn
Abstract:
Introduction: Neuromarketing employs numerous methodologies when investigating products and advertisement effectiveness. Electroencephalography (EEG), a non-invasive measure of electrical activity from the brain, is commonly used in neuromarketing. EEG data can be considered using time-frequency (TF) analysis, where changes in the frequency of brainwaves are calculated to infer participants' mental states, or event-related potential (ERP) analysis, where changes in amplitude are observed in direct response to a stimulus. This presentation discusses the findings of a systematic review of EEG measures in neuromarketing. A systematic review summarises evidence on a research question, using explicit measures to identify, select, and critically appraise relevant research papers. This systematic review identifies which EEG measures are the most robust predictors of customer preference and purchase intention. Methods: Search terms identified 174 papers that used EEG in combination with marketing-related stimuli. Publications were excluded if they were written in a language other than English or were not published as journal articles (e.g., book chapters). The review investigated which TF effect (e.g., theta-band power) and ERP component (e.g., N400) most consistently reflected preference and purchase intention. Machine-learning prediction was also investigated, along with the use of EEG combined with physiological measures such as eye-tracking. Results: Frontal alpha asymmetry was the most reliable TF signal, where an increase in activity over the left side of the frontal lobe indexed a positive response to marketing stimuli, while an increase in activity over the right side indexed a negative response. The late positive potential, a positive amplitude increase around 600 ms after stimulus presentation, was the most reliable ERP component, reflecting the conscious emotional evaluation of marketing stimuli. However, each measure showed mixed results when related to preference and purchase behaviour. Predictive accuracy was greatly improved through machine-learning algorithms such as deep neural networks, especially when combined with eye-tracking or facial expression analyses. Discussion: This systematic review provides a novel catalogue of the most effective uses of each EEG measure commonly used in neuromarketing. Exciting findings to emerge are the identification of frontal alpha asymmetry and the late positive potential as markers of preferential responses to marketing stimuli. Machine-learning algorithms achieved predictive accuracies as high as 97%, and future research should therefore focus on machine-learning prediction when using EEG measures in neuromarketing.
Keywords: EEG, ERP, neuromarketing, machine-learning, systematic review, time-frequency
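As a hedged illustration of the most reliable time-frequency marker identified by the review, frontal alpha asymmetry can be computed as the difference in log alpha-band (8-13 Hz) power between right and left frontal electrodes. The electrode pair (F4/F3), sampling rate, and synthetic signals below are assumptions for illustration only.

```python
# Hedged sketch: frontal alpha asymmetry (FAA) = ln(right alpha power) -
# ln(left alpha power); since alpha power is inversely related to cortical
# activity, positive FAA indexes relatively greater left-frontal activity
# (typically interpreted as an approach/positive response to a stimulus).
import numpy as np
from scipy.signal import welch

def alpha_power(eeg: np.ndarray, fs: float, band=(8.0, 13.0)) -> float:
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    df = freqs[1] - freqs[0]
    return float(psd[mask].sum() * df)                  # band power estimate

def frontal_alpha_asymmetry(left_f3, right_f4, fs: float) -> float:
    return np.log(alpha_power(right_f4, fs)) - np.log(alpha_power(left_f3, fs))

# Toy usage with synthetic 10 Hz alpha of different amplitudes on F3 and F4
fs = 250.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(0)
f3 = 4.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)
f4 = 6.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(scale=1.0, size=t.size)
print("FAA:", round(frontal_alpha_asymmetry(f3, f4, fs), 3))   # > 0 here
```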
Procedia PDF Downloads 116
5697 Prediction of Coronary Artery Stenosis Severity Based on Machine Learning Algorithms
Authors: Yu-Jia Jian, Emily Chia-Yu Su, Hui-Ling Hsu, Jian-Jhih Chen
Abstract:
The coronary arteries are the major suppliers of myocardial blood flow. When fat and cholesterol are deposited in the coronary arterial wall, narrowing and stenosis of the artery occur, which may lead to myocardial ischemia and eventually infarction. According to the World Health Organization (WHO), an estimated 7.4 million people died of coronary heart disease in 2015. According to statistics from the Ministry of Health and Welfare in Taiwan, heart disease (except for hypertensive diseases) ranked second among the top 10 causes of death from 2013 to 2016, and it still shows a growing trend. According to the American Heart Association (AHA), the risk factors for coronary heart disease include: age (> 65 years), sex (men to women with a 2:1 ratio), obesity, diabetes, hypertension, hyperlipidemia, smoking, family history, lack of exercise and more. We collected a dataset of 421 patients from a hospital located in northern Taiwan who received coronary computed tomography (CT) angiography. There were 300 males (71.26%) and 121 females (28.74%), with ages ranging from 24 to 92 years and a mean age of 56.3 years. Prior to coronary CT angiography, basic data of the patients, including age, gender, obesity index (BMI), diastolic blood pressure, systolic blood pressure, diabetes, hypertension, hyperlipidemia, smoking, family history of coronary heart disease and exercise habits, were collected and used as input variables. The output variable of the prediction module is the degree of coronary artery stenosis. In this study, the dataset was randomly divided into 80% as the training set and 20% as the test set. Four machine learning algorithms, including logistic regression, stepwise regression, neural network and decision tree, were incorporated to generate prediction results. We used area under the curve (AUC) and accuracy (Acc.) to compare the four models; the best model was the neural network, followed by stepwise logistic regression, decision tree, and logistic regression, with 0.68 / 79%, 0.68 / 74%, 0.65 / 78%, and 0.65 / 74%, respectively. The sensitivity of the neural network was 27.3% and its specificity 90.8%; stepwise logistic regression had a sensitivity of 18.2% and a specificity of 92.3%; the decision tree had a sensitivity of 13.6% and a specificity of 100%; logistic regression had a sensitivity of 27.3% and a specificity of 89.2%. Based on the results of this study, we hope to improve the accuracy in the future by refining the model parameters or other methods, and to solve the problem of low sensitivity by adjusting the imbalanced proportion of positive and negative data.
Keywords: decision support, computed tomography, coronary artery, machine learning
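A hedged sketch of the evaluation protocol described above (80/20 split, four classifiers compared by AUC and accuracy) is shown below using scikit-learn stand-ins. The dataset is a synthetic placeholder for the clinical risk factors, and stepwise regression is approximated by sequential feature selection plus logistic regression, which is not identical to the study's procedure.

```python
# Hedged sketch: 80/20 train/test split and four models compared by AUC and
# accuracy. Data are synthetic placeholders; "stepwise-style logistic" is only
# an approximation of stepwise regression.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=421, n_features=11, weights=[0.8], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=42)

models = {
    "logistic regression": make_pipeline(StandardScaler(), LogisticRegression()),
    "stepwise-style logistic": make_pipeline(
        StandardScaler(),
        SequentialFeatureSelector(LogisticRegression(), n_features_to_select=5),
        LogisticRegression(),
    ),
    "neural network": make_pipeline(StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
    "decision tree": DecisionTreeClassifier(max_depth=4, random_state=42),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    prob = model.predict_proba(X_te)[:, 1]
    print(f"{name}: AUC={roc_auc_score(y_te, prob):.2f}, "
          f"Acc={accuracy_score(y_te, model.predict(X_te)):.2f}")
```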
Procedia PDF Downloads 230
5696 The Use of Computer-Aided Design in Small Contractors in a Local Area of Korea
Authors: Myunghoun Jang
Abstract:
A survey of small-size contractors in Jeju was conducted to investigate college graduates' computer-aided design (CAD) competence. Most small-size contractors use CAD software to review and update drawings submitted by an architect. This research analyzed the architectural engineering curricula of several national universities. The CAD classes have 4 or 6 hours per week and primarily use AutoCAD. This paper proposes that a CAD class needs 6 hours per week, that 2D drawing should be the main theme in the curriculum, and that exercises to make 3D models should also be included in the CAD class. An improved method to evaluate the reports and exercise results, for example Internet cafés and real-time feedback using smartphones, is also necessary.
Keywords: CAD (Computer Aided Design), CAD education, education improvement, small-size contractor
Procedia PDF Downloads 269
5695 Towards Dynamic Estimation of Residential Building Energy Consumption in Germany: Leveraging Machine Learning and Public Data from England and Wales
Authors: Philipp Sommer, Amgad Agoub
Abstract:
The construction sector significantly impacts global CO₂ emissions, particularly through the energy usage of residential buildings. To address this, various governments, including Germany's, are focusing on reducing emissions via sustainable refurbishment initiatives. This study examines the application of machine learning (ML) to estimate energy demands dynamically in residential buildings and enhance the potential for large-scale sustainable refurbishment. A major challenge in Germany is the lack of extensive publicly labeled datasets for energy performance, as energy performance certificates, which provide critical data on building-specific energy requirements and consumption, are not available for all buildings or require on-site inspections. Conversely, England and other countries in the European Union (EU) have rich public datasets, providing a viable alternative for analysis. This research adapts insights from these English datasets to the German context by developing a comprehensive data schema and calibration dataset capable of predicting building energy demand effectively. The study proposes a minimal feature set, determined through feature importance analysis, to optimize the ML model. Findings indicate that ML significantly improves the scalability and accuracy of energy demand forecasts, supporting more effective emissions reduction strategies in the construction industry. Integrating energy performance certificates into municipal heat planning in Germany highlights the transformative impact of data-driven approaches on environmental sustainability. The goal is to identify and utilize key features from open data sources that significantly influence energy demand, creating an efficient forecasting model. Using Extreme Gradient Boosting (XGB) and data from energy performance certificates, effective features such as building type, year of construction, living space, insulation level, and building materials were incorporated. These were supplemented by data derived from descriptions of roofs, walls, windows, and floors, integrated into three datasets. The emphasis was on features accessible via remote sensing, which, along with other correlated characteristics, greatly improved the model's accuracy. The model was further validated using SHapley Additive exPlanations (SHAP) values and aggregated feature importance, which quantified the effects of individual features on the predictions. The refined model using remote sensing data showed a coefficient of determination (R²) of 0.64 and a mean absolute error (MAE) of 4.12, indicating predictions based on efficiency class 1-100 (G-A) may deviate by 4.12 points. This R² increased to 0.84 with the inclusion of more samples, with wall type emerging as the most predictive feature. After optimizing and incorporating related features like estimated primary energy consumption, the R² score for the training and test set reached 0.94, demonstrating good generalization. The study concludes that ML models significantly improve prediction accuracy over traditional methods, illustrating the potential of ML in enhancing energy efficiency analysis and planning. This supports better decision-making for energy optimization and highlights the benefits of developing and refining data schemas using open data to bolster sustainability in the building sector. 
The study underscores the importance of supporting open data initiatives to collect similar features and support the creation of comparable models in Germany, enhancing the outlook for environmental sustainability.
Keywords: machine learning, remote sensing, residential building, energy performance certificates, data-driven, heat planning
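A hedged sketch of the modelling approach described above follows: an XGBoost regressor trained on building features to predict an efficiency score, with SHAP values quantifying per-feature contributions. The feature names, the synthetic target, and all hyperparameters are assumptions standing in for the EPC-derived dataset.

```python
# Hedged sketch: gradient-boosted regression (XGBoost) on building features to
# predict an energy-efficiency score, with SHAP used for feature attribution.
# Feature names and data are placeholders, not the EPC dataset.
import numpy as np
import pandas as pd
import shap
import xgboost as xgb
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = pd.DataFrame({
    "year_of_construction": rng.integers(1900, 2020, n),
    "living_space_m2": rng.uniform(40, 250, n),
    "wall_type": rng.integers(0, 5, n),           # encoded wall construction
    "roof_insulation": rng.integers(0, 3, n),
    "window_glazing": rng.integers(1, 4, n),
})
# Placeholder target: efficiency score on a 1-100 scale
y = (0.3 * (X["year_of_construction"] - 1900) / 1.2
     + 10 * X["roof_insulation"] + 5 * X["window_glazing"]
     + rng.normal(scale=5, size=n)).clip(1, 100)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = xgb.XGBRegressor(n_estimators=300, max_depth=5, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("R² on test set:", round(model.score(X_te, y_te), 3))

explainer = shap.TreeExplainer(model)              # per-feature contributions
shap_values = explainer.shap_values(X_te)
print(pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns).sort_values(ascending=False))
```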
Procedia PDF Downloads 61
5694 A Novel PSO Based Decision Tree Classification
Authors: Ali Farzan
Abstract:
Classification of data objects or patterns is a major part of most decision-making systems. One of the popular and commonly used classification methods is the Decision Tree (DT). It is a hierarchical decision-making system in which a binary tree is constructed and, starting from the root, some of the classes are rejected at each node until a leaf node is reached. Each leaf node is a representative of one specific class. Finding the splitting criterion at each node for constructing or training the tree is a major problem. Particle Swarm Optimization (PSO) has been adopted as a metaheuristic searching method for finding the best splitting criteria. Results of evaluating the proposed method over benchmark datasets indicate the higher accuracy of the new PSO-based decision tree.
Keywords: decision tree, particle swarm optimization, splitting criteria, metaheuristic
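A hedged sketch of the central idea follows: PSO searching for the splitting threshold at a single node that minimises the weighted Gini impurity of the two children. A full PSO-built tree would repeat this search (over features and thresholds) at every node; the swarm constants below are illustrative, not the paper's settings.

```python
# Hedged sketch: PSO search for one node's split threshold on one feature,
# minimising weighted Gini impurity of the two child nodes.
import numpy as np

def weighted_gini(feature, labels, threshold):
    left, right = labels[feature <= threshold], labels[feature > threshold]
    def gini(part):
        if part.size == 0:
            return 0.0
        _, counts = np.unique(part, return_counts=True)
        p = counts / part.size
        return 1.0 - np.sum(p ** 2)
    n = labels.size
    return (left.size / n) * gini(left) + (right.size / n) * gini(right)

def pso_split(feature, labels, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    lo, hi = feature.min(), feature.max()
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, n_particles)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_cost = np.array([weighted_gini(feature, labels, p) for p in pos])
    gbest = pbest[np.argmin(pbest_cost)]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        cost = np.array([weighted_gini(feature, labels, p) for p in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        gbest = pbest[np.argmin(pbest_cost)]
    return gbest

# Toy usage: two classes separated near x = 0.5
rng = np.random.default_rng(1)
x = rng.random(500)
y = (x > 0.5).astype(int)
print("PSO split threshold:", round(pso_split(x, y), 3))   # expected ~0.5
```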
Procedia PDF Downloads 408
5693 Selection of Variogram Model for Environmental Variables
Authors: Sheikh Samsuzzhan Alam
Abstract:
The present study investigates the selection of a variogram model for analyzing spatial variations of environmental variables with a trend. Sometimes, the automatically fitted theoretical variogram does not really capture the true nature of the empirical semivariogram, so proper exploration and analysis are needed to select the best variogram model. For this study, an open-source dataset collected from the California Soil Resource Lab is used to illustrate the problems that arise when fitting a theoretical variogram. The five most commonly used variogram models (linear, Gaussian, exponential, Matérn, and spherical) were fitted to the experimental semivariogram. Ordinary kriging methods were considered to evaluate the accuracy of the selected variograms through cross-validation. This study is beneficial for selecting an appropriate theoretical variogram model for environmental variables.
Keywords: anisotropy, cross-validation, environmental variables, kriging, variogram models
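A hedged sketch of how candidate theoretical models can be fitted to an empirical semivariogram is shown below: spherical, exponential, and Gaussian models are fitted by least squares and compared by residual error. The lag values and semivariances are placeholders, and in practice the kriging cross-validation used in the study should drive the final choice.

```python
# Hedged sketch: fit spherical, exponential and Gaussian variogram models to an
# empirical semivariogram by least squares and compare residual error.
import numpy as np
from scipy.optimize import curve_fit

def spherical(h, nugget, sill, rng_):
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng_ - 0.5 * (h / rng_) ** 3)
    return np.where(h < rng_, g, nugget + sill)

def exponential(h, nugget, sill, rng_):
    return nugget + sill * (1.0 - np.exp(-3.0 * h / rng_))

def gaussian(h, nugget, sill, rng_):
    return nugget + sill * (1.0 - np.exp(-3.0 * (h / rng_) ** 2))

lags = np.array([5, 10, 20, 40, 60, 80, 100, 150, 200], dtype=float)
gamma = np.array([0.12, 0.20, 0.35, 0.55, 0.68, 0.75, 0.79, 0.82, 0.83])  # empirical

for name, model in [("spherical", spherical), ("exponential", exponential), ("gaussian", gaussian)]:
    params, _ = curve_fit(model, lags, gamma, p0=[0.1, 0.7, 100.0], maxfev=10000)
    rss = np.sum((gamma - model(lags, *params)) ** 2)
    print(f"{name}: nugget={params[0]:.3f}, sill={params[1]:.3f}, range={params[2]:.1f}, RSS={rss:.4f}")
```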
Procedia PDF Downloads 336
5692 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)
Authors: Salvatore Luongo, Carlo Luongo
Abstract:
This paper discusses the implementation solutions to reduce the computational load for the Traffic Situational Awareness with Alerts (TSAA) application, based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred during visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial aviation aircraft are already equipped with a collision avoidance system called TCAS, which is based on classic transponder technology. This system has dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction in mid-air collisions has not occurred, so this reduction is the main objective of the TSAA application. The major difference between the original conflict detection application and the TSAA application is that conflict detection is focused on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases flight crew traffic situation awareness by providing alerts for traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Considerable effort has been spent on the design process and the code generation in order to maximize the efficiency and performance in terms of computational load and memory consumption reduction. The TSAA architecture is divided into two high-level systems: the “Threats database” and the “Conflict detector”. The first receives the traffic data from the ADS-B device and stores the targets' data history. The conflict detector module estimates ownship and target trajectories in order to detect possible future losses of separation between ownship and each target. Finally, the alerts are verified by additional conflict verification logic, in order to prevent possible undesirable behaviors of the alert flag. In order to reduce the computational load, a pre-check evaluation module is used. This pre-check is only a computational optimization, so the performance of the conflict detector system is not modified in terms of the number of alerts detected. The pre-check module uses analytical trajectory propagation for both target and ownship. This provides greater accuracy and avoids step-by-step propagation, which requires a greater computational load. Furthermore, the pre-check permits excluding targets that are certainly not threats, using an analytical and efficient geometrical approach, in order to decrease the computational load for the following modules. This software improvement is not suggested by FAA documents, and so it is the main innovation of this work. The efficiency and efficacy of this enhancement are verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard.
The computational load reduction allows the installation of the TSAA application also on devices with multiple applications and/or low capacity in terms of available memory and computational capabilities.
Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification
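A hedged sketch of the kind of analytical pre-check described above follows: assuming straight-line (constant-velocity) trajectories, the closest point of approach between ownship and a target has a closed form, so targets that can never breach the protection volume are discarded cheaply before the step-by-step conflict detector runs. The thresholds and state vectors are illustrative, not the certified TSAA logic.

```python
# Hedged sketch of an analytical pre-check for conflict detection: with
# constant-velocity trajectories, the time and distance of closest point of
# approach (CPA) have a closed form, so non-threatening targets are excluded
# before the full detector runs. Values are illustrative.
import numpy as np

def cpa(own_pos, own_vel, tgt_pos, tgt_vel, horizon_s=120.0):
    """Return (t_cpa, d_cpa): time [s] and distance [m] of closest approach
    within the look-ahead horizon, for constant-velocity trajectories."""
    dr = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    dv = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    dv2 = np.dot(dv, dv)
    t = 0.0 if dv2 < 1e-9 else float(np.clip(-np.dot(dr, dv) / dv2, 0.0, horizon_s))
    return t, float(np.linalg.norm(dr + dv * t))

def needs_full_check(own_pos, own_vel, tgt_pos, tgt_vel, protect_radius_m=2000.0):
    _, d_min = cpa(own_pos, own_vel, tgt_pos, tgt_vel)
    return d_min <= protect_radius_m          # only these go to the full detector

# Toy usage: one converging target and one diverging target (2-D, metres, m/s)
own = ([0.0, 0.0], [70.0, 0.0])
converging = ([8000.0, 3000.0], [-60.0, -25.0])
diverging = ([8000.0, 9000.0], [40.0, 50.0])
print(needs_full_check(*own, *converging))    # True  -> run step-by-step detector
print(needs_full_check(*own, *diverging))     # False -> safely excluded by pre-check
```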
Procedia PDF Downloads 287