Search results for: cointegration techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6589

5149 A Survey of Dynamic QoS Methods in Software Defined Networking

Authors: Vikram Kalekar

Abstract:

Modern Internet Protocol (IP) networks deploy traditional and modern Quality of Service (QoS) management methods to ensure the smooth flow of network packets during regular operations. SDN (Software-defined networking) networks have also made headway into better service delivery by means of novel QoS methodologies. While many of these techniques are experimental, some of them have been tested extensively in controlled environments, and few of them have the potential to be deployed widely in the industry. With this survey, we plan to analyze the approaches to QoS and resource allocation in SDN, and we will try to comment on the possible improvements to QoS management in the context of SDN.

Keywords: QoS, policy, congestion, flow management, latency, delay, SDN

Procedia PDF Downloads 183
5148 Comparative Studies on Thin Film of ZnO Deposited by Spray Pyrolysis and Sputtering Technique

Authors: Musa Momoh, A. U. Moreh, A. M. Bayawa, Sanusi Abdullahi, I. Atiku

Abstract:

In this study, thin films of ZnO were synthesized by two techniques, namely RF sputtering and spray pyrolysis. The films were deposited on Corning glass. The primary materials used are 99.99% pure. The optical and structural properties of the samples were studied. The samples deposited by spray pyrolysis have an average transmittance, refractive index and extinction coefficient of 80-90%, 1.33-1.44 and 13.11-27.52, respectively; for those deposited by the sputtering method, the corresponding values are 34-80%, 1.51-1.52 and 3.15-3.28. The XRD patterns of the samples show that they are polycrystalline.

Keywords: zinc oxide, spray pyrolysis, rf sputtering, optical properties, electrical properties

Procedia PDF Downloads 244
5147 Parametric Approach for Reserve Liability Estimate in Mortgage Insurance

Authors: Rajinder Singh, Ram Valluru

Abstract:

The Chain Ladder (CL) method, the Expected Loss Ratio (ELR) method and the Bornhuetter-Ferguson (BF) method, in addition to more complex transition-rate modeling, are commonly used actuarial reserving methods in general insurance. There is limited published research about their relative performance in the context of Mortgage Insurance (MI). In our experience, these traditional techniques pose unique challenges and do not provide stable claim estimates for medium- to longer-term liabilities. The relative strengths and weaknesses among the various alternative approaches revolve around: stability in the recent loss development pattern, sufficiency and reliability of loss development data, and agreement/disagreement between reported losses to date and the ultimate loss estimate. The CL method results in volatile reserve estimates, especially for accident periods with little development experience. The ELR method breaks down especially when ultimate loss ratios are not stable and predictable. While the BF method provides a good tradeoff between the loss development approach (CL) and ELR, the approach generates claim development and ultimate reserves that are disconnected from the ever-to-date (ETD) development experience for some accident years that have more development experience. Further, BF is based on a subjective a priori assumption. The fundamental shortcoming of these methods is their inability to model exogenous factors, like the economy, which impact various cohorts at the same chronological time but at staggered points along their lifetime development. This paper proposes an alternative approach of parametrizing the loss development curve and using logistic regression to generate the ultimate loss estimate for each homogeneous group (accident year or delinquency period). The methodology was tested on an actual MI claim development dataset where various cohorts followed a sigmoidal trend, but levels varied substantially depending upon the economic and operational conditions during the development period spanning many years. The proposed approach provides the ability to indirectly incorporate such exogenous factors and produces more stable loss forecasts for reserving purposes compared to the traditional CL and BF methods.
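
The parametric idea can be illustrated in a few lines: fit a sigmoidal (logistic) curve to a cohort's cumulative loss development and extrapolate it to an ultimate loss estimate. This is only a minimal sketch using nonlinear least squares and invented figures, not the authors' exact model or data.

```python
# Sketch: fit a logistic (sigmoidal) loss-development curve for one cohort.
# All development ages and cumulative losses below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, ultimate, k, t0):
    """Cumulative losses at development age t following a sigmoidal trend."""
    return ultimate / (1.0 + np.exp(-k * (t - t0)))

dev_age = np.array([1, 2, 3, 4, 5, 6], dtype=float)            # development years
cum_loss = np.array([12, 40, 95, 150, 180, 195], dtype=float)  # ever-to-date losses

params, _ = curve_fit(logistic, dev_age, cum_loss, p0=[200.0, 1.0, 3.0])
ultimate, k, t0 = params
print(f"estimated ultimate loss: {ultimate:.1f}, reserve: {ultimate - cum_loss[-1]:.1f}")
```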

Keywords: actuarial loss reserving techniques, logistic regression, parametric function, volatility

Procedia PDF Downloads 112
5146 Mitigation of Offshore Piling Noise Effects on Marine Mammals

Authors: Waled A. Dawoud, Abdelazim M. Negm, Nasser M. Saleh

Abstract:

Offshore piling generates underwater sound at levels high enough to cause physical damage or hearing impairment to marine mammals. Several methods can be used to mitigate the effect of underwater noise from offshore pile driving on marine mammals; these can be divided into three main approaches. The first approach is to keep the mammal out of the high-risk area by using aversive sound waves produced by acoustic mitigation devices, such as the playback of a mammal's natural predator vocalization, alarm or distress sounds, and anthropogenic sound. The second approach is to reduce the amount of underwater noise from pile driving using noise mitigation techniques such as bubble curtains, isolation casings, and hydro-sound dampers. The third approach is to eliminate the overlap of underwater waves by using a prolonged construction process. To investigate the effectiveness of different noise mitigation methods, a pile driven with a 235 kJ rated-energy diesel hammer near the Jeddah coast, Kingdom of Saudi Arabia, was used. Using an empirical sound exposure model based on Red Sea characteristics and the limits of the National Oceanic and Atmospheric Administration, it was found that the aversive sound waves should extend to 1.8 km around the pile location. Bubble curtains can reduce the behavioral disturbance area by up to 28%, temporary threshold shift by up to 36%, permanent threshold shift by up to 50%, and physical injury by up to 70%. Isolation casing can reduce the behavioral disturbance range by up to 12%, temporary threshold shift by up to 21%, permanent threshold shift by up to 29%, and physical injury by up to 46%. The efficiency of hydro-sound dampers depends mainly on the technology used; they can reduce the behavioral disturbance range by 10% to 33%, temporary threshold shift by 18% to 25%, permanent threshold shift by 32% to 50%, and physical injury by 46% to 60%. To prolong the construction process, it was found that single-pile construction, the use of a soft start, and keeping the time between two successive hammer strikes above 3 seconds are the most effective techniques.
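
For a quick feel of what such reductions mean in practice, the reported upper-bound percentages can be applied to baseline impact ranges; the baseline radii below are purely hypothetical and are not values from the paper.

```python
# Illustrative only: apply the quoted bubble-curtain reduction factors to
# hypothetical baseline impact ranges around the pile.
baseline_range_m = {"behavioral disturbance": 1800, "TTS": 1200, "PTS": 600, "injury": 150}
reduction = {  # upper-bound reductions quoted above for bubble curtains
    "behavioral disturbance": 0.28, "TTS": 0.36, "PTS": 0.50, "injury": 0.70}

for zone, r in baseline_range_m.items():
    mitigated = r * (1.0 - reduction[zone])
    print(f"{zone}: {r} m -> {mitigated:.0f} m with a bubble curtain")
```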

Keywords: offshore pile driving, sound propagation models, noise effects on marine mammals, Underwater noise mitigation

Procedia PDF Downloads 525
5145 Consideration of Uncertainty in Engineering

Authors: A. Mohammadi, M. Moghimi, S. Mohammadi

Abstract:

Engineers need computational methods that provide solutions less sensitive to environmental effects, so techniques should be used that take uncertainty into account to control and minimize the risk associated with design and operation. In order to consider uncertainty in an engineering problem, the optimization problem should be solved for a suitable range of each uncertain input variable instead of just one estimated point. With a deterministic optimization formulation, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, the different methods are presented and analyzed.
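
One of the approaches mentioned, Monte Carlo simulation, can be sketched as follows: uncertain inputs are sampled from assumed distributions and propagated through a design function to obtain risk measures. The objective function and the input distributions below are illustrative assumptions only.

```python
# Minimal Monte Carlo uncertainty-propagation sketch with invented inputs.
import numpy as np

rng = np.random.default_rng(0)

def utilisation(load, strength):
    """Hypothetical design metric: how close the load comes to the strength."""
    return load / strength

n = 100_000
load = rng.normal(100.0, 10.0, n)       # uncertain input 1
strength = rng.normal(150.0, 15.0, n)   # uncertain input 2

u = utilisation(load, strength)
print(f"mean utilisation:   {u.mean():.3f}")
print(f"95th percentile:    {np.quantile(u, 0.95):.3f}")
print(f"P(load > strength): {(load > strength).mean():.4f}")   # failure-risk estimate
```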

Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method

Procedia PDF Downloads 392
5144 Simulation and Characterization of Organic Light Emitting Diodes and Organic Photovoltaics Using Physics Based Tool

Authors: T. A. Shahul Hameed, P. Predeep, Anju Iqbal, M. R. Baiju

Abstract:

Research and development in organic photovoltaic cells and organic light emitting diodes have gained wider acceptance due to the advent of many advanced techniques to enhance efficiency and operational hours. Here we report our work on the design, simulation and characterization of bulk heterojunction organic photovoltaic cells and polymer light emitting diodes in different layer configurations using ATLAS, a licensed device simulation tool. Bulk heterojunction and multilayer devices were simulated to compare their performance parameters.

Keywords: HOMO, LUMO, PLED, OPV

Procedia PDF Downloads 569
5143 Gammarus: Asellus Ratio as an Index of Organic Pollution: A Case Study in Markeaton, Kedleston Hall, and Allestree Park Lakes Derby, UK

Authors: Usman Bawa

Abstract:

Macro-invertebrates have been used to monitor organic pollution in rivers and streams. Several biotic indices based on macro-invertebrates have been developed over the years, including the Biological Monitoring Working Party (BMWP) score. A new biotic index, the Gammarus:Asellus ratio, has recently been proposed as an index of organic pollution. This study tested the validity of the Gammarus:Asellus ratio as an index of organic pollution by examining the relationship between the Gammarus:Asellus ratio, physical-chemical parameters, and other biotic indices such as BMWP and Average Score Per Taxon (ASPT) from lakes and streams at Markeaton Park, Allestree Park, and Kedleston Hall, Derbyshire. Macro-invertebrates were sampled using the standard five-minute kick sampling technique; physical and chemical environmental variables were obtained using standard sampling techniques. Eighteen sites were sampled: six sites from Markeaton Park (three across the stream and three across the lake), and six sites each from the Allestree Park and Kedleston Hall lakes. Contrary to expectation, the Gammarus:Asellus ratio showed significant positive correlations with parameters indicative of organic pollution, such as the levels of nitrates, phosphates, and calcium, and significant negative correlations with the other biotic indices (BMWP/ASPT). The BMWP score correlated significantly and positively with some water quality parameters such as dissolved oxygen and flow rate, but showed no correlations with the other chemical environmental variables. The BMWP score was significantly higher in the stream than in the lake at Markeaton Park, and the ASPT scores were significantly higher in the upper lakes than in the middle and lower lakes. This study has further strengthened the use of the BMWP/ASPT score as an index of organic pollution, but additional application is required to validate the use of the Gammarus:Asellus ratio as a rapid biomonitoring tool.
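
The three indices compared in the study are simple to compute from a kick-sample taxa list, as sketched below for one hypothetical sample; the family scores shown follow the standard BMWP scoring scheme, while the counts are invented.

```python
# Illustrative calculation of BMWP, ASPT and the Gammarus:Asellus ratio for one sample.
bmwp_scores = {"Gammaridae": 6, "Asellidae": 3, "Baetidae": 4, "Chironomidae": 2}
counts = {"Gammaridae": 42, "Asellidae": 14, "Baetidae": 8, "Chironomidae": 120}

present = [fam for fam, n in counts.items() if n > 0 and fam in bmwp_scores]
bmwp = sum(bmwp_scores[fam] for fam in present)        # Biological Monitoring Working Party score
aspt = bmwp / len(present)                             # Average Score Per Taxon
ga_ratio = counts["Gammaridae"] / counts["Asellidae"]  # Gammarus:Asellus ratio

print(f"BMWP = {bmwp}, ASPT = {aspt:.2f}, Gammarus:Asellus = {ga_ratio:.1f}")
```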

Keywords: Asellus, biotic index, Gammarus, macro invertebrates, organic pollution

Procedia PDF Downloads 327
5142 Beneficial Effect of Micropropagation Coupled with Mycorrhization on Enhancement of Growth Performance of Medicinal Plants

Authors: D. H. Tejavathi

Abstract:

Medicinal plants are globally valuable sources of herbal products. Wild populations of many medicinal plants are facing the threat of extinction because of their narrow distribution, endemicity, and degradation of specific habitats. Micropropagation is an established in vitro technique by which a large number of clones can be obtained from a small explant in a short span of time within a limited space. Mycorrhization can minimize the transient transplantation shock experienced by the micropropagated plants when they are transferred from lab to land. AM fungal association improves the physiological status of the host plants through better uptake of water and nutrients, particularly phosphorus. Consequently, the growth performance and biosynthesis of active principles are significantly enhanced in AM fungal treated plants. Bacopa monnieri, Andrographis paniculata, Agave vera-curz, Drymaria cordata and Majorana hortensis, important medicinal plants used in various indigenous systems of medicine, are selected for the present study. They form the main constituents of many herbal formulations. Standard in vitro techniques were followed to obtain the micropropagated plants. Shoot tips and nodal segments were used as explants. Explants were cultured on Murashige and Skoog, and Phillips and Collins media supplemented with various combinations of growth regulators. Multiple shoots were obtained on media containing both auxins and cytokinins at various concentrations and combinations. Multiple shoots were then transferred to rooting media containing auxins for root induction. The in vitro regenerated plants thus obtained were subjected to brief acclimatization before being transferred to land. One-month-old in vitro plants were treated with AM fungi, and the symbiotic effect on the overall growth parameters was analyzed. It was found that micropropagation coupled with mycorrhization has a significant effect on the enhancement of biomass and biosynthesis of active principles in these selected medicinal plants. In vitro techniques coupled with mycorrhization have opened the possibility of obtaining better clones with respect to the enhancement of biomass and biosynthesis of active principles. Beneficial effects of AM fungal association with medicinal plants are discussed.

Keywords: cultivation, medicinal plants, micropropagation, mycorrhization

Procedia PDF Downloads 156
5141 Machine Learning Framework: Competitive Intelligence and Key Drivers Identification of Market Share Trends among Healthcare Facilities

Authors: Anudeep Appe, Bhanu Poluparthi, Lakshmi Kasivajjula, Udai Mv, Sobha Bagadi, Punya Modi, Aditya Singh, Hemanth Gunupudi, Spenser Troiano, Jeff Paul, Justin Stovall, Justin Yamamoto

Abstract:

The necessity of data-driven decisions in healthcare strategy formulation is rapidly increasing. A reliable framework which helps identify factors impacting the market share of a healthcare provider facility or a hospital (from here on termed a facility) is of key importance. This pilot study aims at developing a data-driven machine learning-regression framework which aids strategists in formulating key decisions to improve the facility's market share, which in turn helps improve the quality of healthcare services. The US (United States) healthcare business is chosen for the study, and data spanning 60 key facilities in Washington State and about 3 years of history is considered. In the current analysis, market share is defined as the ratio of the facility's encounters to the total encounters among the group of potential competitor facilities. The current study proposes a two-pronged approach of competitor identification and regression to evaluate and predict market share, respectively. A model-agnostic technique, SHAP, is leveraged to quantify the relative importance of features impacting market share. Typical techniques in the literature to quantify the degree of competitiveness among facilities use an empirical method to calculate a competitive factor to interpret the severity of competition. The proposed method identifies a pool of competitors, develops Directed Acyclic Graphs (DAGs) and feature-level word vectors, and evaluates the key connected components at the facility level. This technique is robust since it is data-driven, which minimizes the bias of empirical techniques. The DAGs factor in partial correlations at various segregations and key demographics of facilities, along with a placeholder to factor in various business rules (for example, quantifying patient exchanges, provider references, and sister facilities). Multiple groups of competitors among facilities are identified. Leveraging the identified competitors, a Random Forest regression model is developed and fine-tuned to predict market share. To identify key drivers of market share at an overall level, the permutation feature importance of the attributes was calculated. For relative quantification of features at the facility level, SHAP (SHapley Additive exPlanations), a model-agnostic explainer, is incorporated. This helps to identify and rank the attributes at each facility which impact market share. This approach proposes an amalgamation of two popular and efficient modeling practices, viz., machine learning with graphs and tree-based regression techniques, to reduce bias. Together, these help drive strategic business decisions.
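
The regression portion of such a framework can be condensed into a short sketch: a Random Forest predicts market share, permutation importance ranks drivers overall, and SHAP attributes the prediction for a single facility. The feature names and data below are synthetic stand-ins, not the study's Washington State dataset, and the shap package is assumed to be installed.

```python
# Sketch of RF regression + permutation importance + facility-level SHAP attribution.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
import shap  # assumed installed: pip install shap

rng = np.random.default_rng(42)
n = 500
X = pd.DataFrame({
    "encounters_prior_year": rng.gamma(5, 1000, n),
    "n_competitors": rng.integers(1, 15, n),
    "avg_patient_distance_km": rng.uniform(1, 60, n),
    "bed_count": rng.integers(50, 800, n),
})
y = 1 / (1 + X["n_competitors"]) + 0.01 * rng.normal(size=n)   # synthetic market share

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Overall key drivers
pi = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(pd.Series(pi.importances_mean, index=X.columns).sort_values(ascending=False))

# Facility-level attribution with SHAP
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X.iloc[[0]])
print(dict(zip(X.columns, shap_values[0].round(4))))
```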

Keywords: competition, DAGs, facility, healthcare, machine learning, market share, random forest, SHAP

Procedia PDF Downloads 77
5140 Efficacy of the Culturally Adapted Stepping Stones Positive Parenting Program on Parents of Children with Autism and Down Syndrome

Authors: Afsheen Masood, Sumaira Rashid, Shama Mazahir

Abstract:

The main aim of this research is to evaluate the efficacy of a culturally adapted management program, the Stepping Stones Positive Parenting Program (Triple P; SSTP), for caregivers of children with autism spectrum disorders and Down syndrome. Positive psychology has added new dimensions to traditional perspectives on parenting. The current study was designed to determine the adoption of positive parenting elements such as parenting styles, parental satisfaction, parental competency, and management of parental stress in alignment with the behavioral problems of children with special needs after their parents are trained in Positive Parenting Techniques. This research study was devised in liaison with a rehabilitation institute that extends services to children with Autism Spectrum Disorder and Down syndrome. A quasi-experimental pre-test/post-test control group design was employed in order to evaluate the changes in the parenting patterns of parents of children with autism and Down syndrome. Caregivers of children diagnosed with autism and Down syndrome, aged 25 to 45 years (n=20 from the autism group and 20 from the Down syndrome group, with their children with special needs aged 8 to 14 years), participated in the current research. A parenting scale encompassing parental efficacy and parental satisfaction was used, in addition to the Parenting Stress Index (SF), an indigenously developed Child Behavior Problems Checklist, and a demographic sheet. Findings revealed a statistically significant improvement for caregivers in the intervention group from pre-test to post-test. There was a considerable decrease in the reported mean behavioral issues of children with Down syndrome when parents in the experimental group started practicing Positive Parenting Techniques with their special needs children. This change was, however, not recorded in parents of children with autism. These findings thus establish the efficacy of a culturally adapted parenting program that is evidence-based and established in Western empirical research. This carries significant implications for practitioners in the special needs domain and for school psychologists in Pakistan.

Keywords: autism and parenting, Down syndrome and parenting, positive parenting, Stepping Stones Positive Parenting Program, management of behavioral problems with positive parenting

Procedia PDF Downloads 231
5139 Liquid Unloading of Wells with Scaled Perforation via Batch Foamers

Authors: Erwin Chan, Aravind Subramaniyan, Siti Abdullah Fatehah, Steve Lian Kuling

Abstract:

Foam assisted lift technology is proven across the industry to provide efficient deliquification in gas wells. Such deliquification is typically achieved by delivering the foamer chemical downhole via capillary strings. In highly liquid loaded wells where capillary strings are not readily available, foamer can be delivered via batch injection or bull-heading. The latter techniques differ from the former in that cap strings allow for liquid to be unloaded continuously, whereas foamer batches require that periodic batching be conducted for the liquid to be unloaded. Although batch injection allows for liquid to be unloaded in wells with a suitable water to gas ratio (WGR) and condensate to gas ratio (CGR) without well intervention for capillary string installation, this technique comes with its own set of challenges - for foamer to de-liquify liquids, the chemical needs to reach perforation locations where gas bubbling is observed. In highly scaled perforation zones in certain wells, foamer delivered in batches is unable to reach the gas bubbling zone, thus achieving poor lift efficiency. This paper aims to discuss the techniques and challenges for unloading liquid via batch injection in scaled perforation wells X and Y, whose WGR is 6 bbl/MMscf, whose scale build-up is observed at the bottom of the perforation interval, whose water column is 400 feet, and whose ‘bubbling zone’ is less than 100 feet. Variables such as foamer Z dosage, batching technique, and well flow control valve opening times are manipulated over the duration of the trial to achieve maximum liquid unloading and gas rates. During the field trial, the team found optimal values of the three aforementioned parameters for best unloading results; each cycle’s gas and liquid rates are compared with baselines at similar flowing tubing head pressures (FTHP). It is discovered that, amongst other factors, a good agitation technique is a primary determinant of efficient liquid unloading. An average increment of 2 MMscf/d against an average production of 4 MMscf/d at stable FTHP is recorded during the trial.

Keywords: foam, foamer, gas lift, liquid unloading, scale, batch injection

Procedia PDF Downloads 165
5138 On the Volume of Ganglion Cell Stimulation in Visual Prostheses by Finite Element Discretization

Authors: Diego Luján Villarreal

Abstract:

Visual prostheses are designed to restore some eyesight in patients blinded by photoreceptor diseases, such as retinitis pigmentosa (RP) and age-related macular degeneration (AMD). Electrode-to-cell proximity has drawn attention due to its implications for secure, single-localized stimulation. Yet, few techniques are available for understanding the relationship between the number of cells activated and the injected current. We propose a technique to answer this question by solving the governing equation for time-dependent electrical currents using finite element discretization to obtain the volume of stimulation.
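
A highly simplified stand-in for this idea is sketched below: instead of a full finite-element model of time-dependent currents, a static point-current source in a homogeneous medium is discretized on a regular 3-D grid with finite differences, and the stimulated volume is taken as the tissue where the potential exceeds an assumed activation threshold. Conductivity, current and threshold values are illustrative only.

```python
# Finite-difference sketch of a "volume of stimulation" estimate (not the paper's FEM).
import numpy as np

h = 20e-6            # grid spacing [m]
n = 61               # grid nodes per axis
sigma = 1.0          # assumed tissue conductivity [S/m]
I = 50e-6            # injected current [A]
v_threshold = 0.05   # assumed activation threshold [V]

V = np.zeros((n, n, n))
src = np.zeros_like(V)
c = n // 2
src[c, c, c] = I / (sigma * h)   # discretized point-source term of sigma*Laplace(V) = -I*delta

for _ in range(2000):            # Jacobi relaxation with V = 0 on the outer boundary
    V[1:-1, 1:-1, 1:-1] = (
        V[2:, 1:-1, 1:-1] + V[:-2, 1:-1, 1:-1] +
        V[1:-1, 2:, 1:-1] + V[1:-1, :-2, 1:-1] +
        V[1:-1, 1:-1, 2:] + V[1:-1, 1:-1, :-2] +
        src[1:-1, 1:-1, 1:-1]) / 6.0

stim_voxels = np.count_nonzero(V >= v_threshold)
print(f"stimulated volume ~ {stim_voxels * h**3 * 1e9:.4f} mm^3")
```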

Keywords: visual prosthetic devices, volume for stimulation, FEM discretization, 3D simulation

Procedia PDF Downloads 58
5137 Effects of Sensory Integration Techniques in Science Education of Autistic Students

Authors: Joanna Estkowska

Abstract:

Sensory integration methods are very useful and improve the daily functioning of autistic and mentally disabled children. Autism is a neurobiological disorder that impairs one's ability to communicate with and relate to others, as well as the sensory system. Children with autism, even highly functioning kids, can find it difficult to process language with surrounding noise or smells. They are hypersensitive to things we can ignore, such as sights, sounds and touch. Adolescents with highly functioning autism or Asperger Syndrome can study science and math, but the social aspect is difficult for them. Nature science is an area of study that attracts many of these children. It is a systematic field in which the children can focus on a small aspect: if the rules are followed, an expected result can be reached. The sensory integration program and systematic classroom observation are quantitative methods of measuring classroom functioning and behaviors from direct observations. These methods specify both the events and behaviors that are to be observed and how they are to be recorded. Our students with and without autism attended the lessons in the nature science classroom at the school and in the laboratory of the University of Science and Technology in Bydgoszcz. The aim of this study is to investigate the effects of sensory integration methods in teaching students with autism. They were observed during experimental lessons in the classroom and in the laboratory. Their physical characteristics, sensory dysfunction, and behavior in class were taken into consideration by comparing their similarities and differences. In the chemistry classroom, every autistic student is paired with a mentor from their school. In the laboratory, the children are expected to wear goggles, gloves and a lab coat. The chemistry classes in the laboratory were held for four hours with a lunch break, and according to the assistants, the children were engaged the whole time. In the nature science classroom, the students are encouraged to use the interactive exhibition of chemical, physical and mathematical models constructed by the author of this paper. Our students with and without autism attended the lessons in those laboratories. The teacher's goals are to assist the child in inhibiting and modulating sensory information and to support the child in processing a response to sensory stimulation.

Keywords: autism spectrum disorder, science education, sensory integration techniques, student with special educational needs

Procedia PDF Downloads 178
5136 Spatial Mental Imagery in Students with Visual Impairments when Learning Literal and Metaphorical Uses of Prepositions in English as a Foreign Language

Authors: Natalia Sáez, Dina Shulfman

Abstract:

There is an important research gap regarding accessible pedagogical techniques for teaching foreign languages to adults with visual impairments. English as a foreign language (EFL), in particular, is needed in many countries to expand occupational opportunities and improve living standards. Within EFL research, teaching and learning prepositions have only recently gained momentum, considering that they constitute one of the most difficult structures to learn in a foreign language and are fundamental for communicating about spatial relations in the world, both on the physical and imaginary levels. Learning to use prepositions would not only facilitate communication when referring to the surrounding tangible environment but also when conveying ideas about abstract topics (e.g., justice, love, society), for which students’ sociocultural knowledge about space could play an important role. By potentiating visually impaired students’ ability to construe mental spatial imagery, this study made efforts to explore pedagogical techniques that cater to their strengths, helping them create new worlds by welcoming and expanding their sociocultural funds of knowledge as they learn to use English prepositions. Fifteen visually impaired adults living in Chile participated in the study. Their first language was Spanish, and they were learning English at the intermediate level of proficiency in an EFL workshop at La Biblioteca Central para Ciegos (The Central Library for the Blind). Within this workshop, a series of activities and interviews were designed and implemented with the intention of uncovering students’ spatial funds of knowledge when learning literal/physical uses of three English prepositions, namely “in,” “at,” and “on”. The activities and interviews also explored whether students used their original spatial funds of knowledge when learning metaphorical uses of these prepositions and if their use of spatial imagery changed throughout the learning activities. Over the course of approximately half a year, it soon became clear that the students construed mental images of space when learning both literal/physical and metaphorical uses of these prepositions. This research could inform a new approach to inclusive language education using pedagogical methods that are relevant and accessible to students with visual impairments.

Keywords: EFL, funds of knowledge, prepositions, spatial cognition, visually impaired students

Procedia PDF Downloads 64
5135 A Newspapers Expectations Indicator from Web Scraping

Authors: Pilar Rey del Castillo

Abstract:

This document describes the building of an average indicator of the general sentiments about the future exposed in the newspapers in Spain. The raw data are collected through the scraping of the Digital Periodical and Newspaper Library website. Basic tools of natural language processing are later applied to the collected information to evaluate the sentiment strength of each word in the texts using a polarized dictionary. The last step consists of summarizing these sentiments to produce daily indices. The results are a first insight into the applicability of these techniques to produce periodic sentiment indicators.
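
A toy version of this pipeline is sketched below: scraped article texts are scored against a polarity dictionary and the scores are averaged per day. The dictionary entries, texts and dates are invented; the original work scrapes the Digital Periodical and Newspaper Library and uses a full polarized lexicon for Spanish.

```python
# Dictionary-based sentiment scoring and daily aggregation on made-up articles.
import pandas as pd

polarity = {"growth": 1.0, "recovery": 0.8, "crisis": -1.0, "unemployment": -0.7, "optimism": 0.9}

articles = pd.DataFrame({
    "date": ["2024-03-01", "2024-03-01", "2024-03-02"],
    "text": [
        "signs of recovery and optimism in the markets",
        "unemployment figures point to a deeper crisis",
        "growth expected despite the crisis",
    ],
})

def score(text: str) -> float:
    words = text.lower().split()
    hits = [polarity[w] for w in words if w in polarity]
    return sum(hits) / len(hits) if hits else 0.0

articles["sentiment"] = articles["text"].apply(score)
daily_index = articles.groupby("date")["sentiment"].mean()   # the daily indicator
print(daily_index)
```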

Keywords: natural language processing, periodic indicator, sentiment analysis, web scraping

Procedia PDF Downloads 115
5134 Rheometer Enabled Study of Tissue/biomaterial Frequency-Dependent Properties

Authors: Polina Prokopovich

Abstract:

Despite the well-established dependence of cartilage mechanical properties on the frequency of the applied load, most research in the field is carried out in either load-free or constant-load conditions because of the complexity of the equipment required for the determination of time-dependent properties. These simpler analyses provide a limited representation of cartilage properties, thus greatly reducing the impact of the information gathered and hindering the understanding of the mechanisms involved in this tissue's replacement, development and pathology. More complex techniques could represent better investigative methods, but their uptake in cartilage research is limited by the highly specialised training required and the cost of the equipment. There is, therefore, a clear need for alternative experimental approaches to cartilage testing to be deployed in research and clinical settings using more user-friendly and financially accessible devices. Frequency-dependent material properties can be determined through rheometry, which is easy to use and requires a relatively inexpensive device; we present how a commercial rheometer can be adapted to determine the viscoelastic properties of articular cartilage. Frequency-sweep tests were run at various applied normal loads on immature, mature and trypsinised (as a model of osteoarthritis) cartilage samples to determine the dynamic shear moduli (G*, G′, G″) of the tissues. Moduli increased with increasing frequency and applied load; mature cartilage generally had the highest moduli and GAG-depleted samples the lowest. Hydraulic permeability (KH) was estimated from the rheological data and decreased with applied load; GAG-depleted cartilage exhibited higher hydraulic permeability than either immature or mature tissues. The rheometer-based methodology developed was validated by the close comparison of the rheometer-obtained cartilage characteristics (G*, G′, G″, KH) with results obtained with more complex testing techniques available in the literature. Rheometry is relatively simple, does not require highly capital-intensive machinery, and staff training is more accessible; thus, the use of a rheometer would represent a cost-effective approach for the determination of the frequency-dependent properties of cartilage, yielding more comprehensive and impactful results for both healthcare professionals and R&D.
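
The moduli reported here follow from the standard small-amplitude oscillatory shear relations; the sketch below applies them to made-up values of stress amplitude, strain amplitude and phase lag, simply to show how G*, G′ and G″ are obtained from rheometer output.

```python
# Standard oscillatory-shear relations with illustrative (invented) measured values.
import numpy as np

tau_0 = 12e3             # shear stress amplitude [Pa]
gamma_0 = 0.01           # shear strain amplitude [-]
delta = np.deg2rad(15)   # phase lag between stress and strain [rad]

G_star = tau_0 / gamma_0           # magnitude of the complex modulus |G*|
G_prime = G_star * np.cos(delta)   # storage modulus G'
G_dprime = G_star * np.sin(delta)  # loss modulus G''
print(f"|G*| = {G_star/1e6:.2f} MPa, G' = {G_prime/1e6:.2f} MPa, G'' = {G_dprime/1e6:.2f} MPa")
```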

Keywords: tissue, rheometer, biomaterial, cartilage

Procedia PDF Downloads 59
5133 Ischemic Stroke Detection in Computed Tomography Examinations

Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina

Abstract:

Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types: ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used due to its wide availability and rapid diagnosis. Detection depends on the size and severity of lesions and the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke lesions and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed on two axial slices, one at the level of the thalamus and basal ganglion and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques such as morphological filters, the discrete wavelet transform and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery region. These subjective analysis results were compared with objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters actually improve the ischemic areas for subjective evaluations. The comparison between the area of the ischemic region contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviations greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software to assist neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
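
Part of this processing chain can be sketched compactly: median filtering followed by Fuzzy C-means clustering of voxel intensities inside a brain window, applied here to a synthetic slice. It is not the authors' full pipeline (no wavelet step, no ASPECTS regions), and all intensities are invented Hounsfield-like values.

```python
# Median filtering + plain NumPy Fuzzy C-means on a synthetic "CT" slice.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(1)
slice_hu = rng.normal(35, 3, (128, 128))                # "normal" parenchyma around 35 HU
slice_hu[40:70, 50:90] = rng.normal(28, 3, (30, 40))    # hypodense ischemic-like region

smooth = median_filter(slice_hu, size=3)
x = smooth[(smooth > 0) & (smooth < 100)].reshape(-1, 1)   # brain window

def fuzzy_cmeans(x, c=2, m=2.0, iters=50):
    """1-D Fuzzy C-means; returns cluster centers and membership matrix."""
    u = rng.dirichlet(np.ones(c), size=len(x))             # initial memberships
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.abs(x - centers.T) + 1e-9                   # distances to each center
        u = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
    return centers.ravel(), u

centers, u = fuzzy_cmeans(x)
ischemic_cluster = int(np.argmin(centers))                 # hypodense = lower-HU center
candidate_px = np.sum(np.argmax(u, axis=1) == ischemic_cluster)
print(f"cluster centers (HU): {centers.round(1)}, candidate ischemic pixels: {candidate_px}")
```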

Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means

Procedia PDF Downloads 350
5132 Optimization of a High-Growth Investment Portfolio for the South African Market Using Predictive Analytics

Authors: Mia Françoise

Abstract:

This report aims to develop a strategy for assisting short-term investors to benefit from the current economic climate in South Africa by utilizing technical analysis techniques and predictive analytics. As part of this research, value investing and technical analysis principles will be combined to maximize returns for South African investors while optimizing volatility. As an emerging market, South Africa offers many opportunities for high growth in sectors where developed countries cannot grow at the same rate. Investing in South African companies with significant growth potential can be extremely rewarding. Although the risk involved is more significant in countries with less developed markets and infrastructure, there is more room for growth in these countries. According to recent research, the offshore market is expected to outperform the local market over the long term; however, short-term investments in the local market will likely be more profitable, as the Johannesburg Stock Exchange is predicted to outperform the S&P500 over the short term. The instabilities in the economy contribute to increased market volatility, which can benefit investors if appropriately utilized. Price prediction and portfolio optimization comprise the two primary components of this methodology. As part of this process, statistical and other predictive modeling techniques will be used to predict the future performance of stocks listed on the Johannesburg Stock Exchange. Following predictive data analysis, Modern Portfolio Theory, based on Markowitz's Mean-Variance Theorem, will be applied to optimize the allocation of assets within an investment portfolio. By combining different assets within an investment portfolio, this optimization method produces a portfolio with an optimal ratio of expected risk to expected return. This methodology aims to provide short-term investors with a stock portfolio that offers the best risk-to-return profile for stocks listed on the JSE by combining price prediction and portfolio optimization.
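
The portfolio-optimization step can be illustrated in a few lines: given predicted expected returns and a covariance matrix (here invented, in practice produced by the price-prediction models), find long-only weights minimizing variance for a target return. This is a generic mean-variance sketch, not the study's calibrated model.

```python
# Markowitz mean-variance allocation with a target-return constraint (hypothetical inputs).
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.12, 0.08, 0.15, 0.05])       # predicted annual returns of 4 JSE stocks (assumed)
cov = np.array([[0.10, 0.02, 0.04, 0.00],
                [0.02, 0.08, 0.01, 0.00],
                [0.04, 0.01, 0.20, 0.01],
                [0.00, 0.00, 0.01, 0.05]])
target = 0.10

def variance(w):
    return w @ cov @ w

cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},
        {"type": "eq", "fun": lambda w: w @ mu - target})
bounds = [(0.0, 1.0)] * len(mu)                # long-only
w0 = np.full(len(mu), 1.0 / len(mu))

res = minimize(variance, w0, method="SLSQP", bounds=bounds, constraints=cons)
print("weights:", res.x.round(3),
      "expected return:", round(res.x @ mu, 3),
      "volatility:", round(np.sqrt(variance(res.x)), 3))
```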

Keywords: financial stocks, optimized asset allocation, prediction modelling, South Africa

Procedia PDF Downloads 74
5131 Approximate-Based Estimation of Single Event Upset Effect on Static Random-Access Memory-Based Field-Programmable Gate Arrays

Authors: Mahsa Mousavi, Hamid Reza Pourshaghaghi, Mohammad Tahghighi, Henk Corporaal

Abstract:

Recently, Static Random-Access Memory-based (SRAM-based) Field-Programmable Gate Arrays (FPGAs) have become widely used in aeronautics and space systems, where high dependability is demanded and considered a mandatory requirement. Since the design's circuit is stored in configuration memory in SRAM-based FPGAs, they are very sensitive to Single Event Upsets (SEUs). In addition, the adverse effects of SEUs on the electronics used in space are much higher than on Earth. Thus, developing fault-tolerant techniques plays a crucial role in the use of SRAM-based FPGAs in space. However, fault tolerance techniques introduce additional penalties in system parameters, e.g., area, power, performance and design time. In this paper, an accurate estimation of configuration memory vulnerability to SEUs is proposed for approximate-tolerant applications. This vulnerability estimation is highly required for compromising between the overhead introduced by fault tolerance techniques and system robustness. In this paper, we study applications in which the exact final output value is not necessarily always a concern, meaning that some of the SEU-induced changes in output values are negligible. We therefore define and propose Approximate-based Configuration Memory Vulnerability Factor (ACMVF) estimation to avoid overestimating configuration memory vulnerability to SEUs. We assess the vulnerability of configuration memory by injecting SEUs into configuration memory bits and comparing the output values of a given circuit in the presence of SEUs with the expected correct output. In contrast to conventional vulnerability factor calculation methods, which count any deviation from the expected value as a failure, in our proposed method a threshold margin is considered depending on the use-case application. Given the proposed threshold margin in our model, a failure occurs only when the difference between the erroneous output value and the expected output value is more than this margin. The ACMVF is subsequently calculated by acquiring the ratio of failures with respect to the total number of SEU injections. In our paper, a test-bench for emulating SEUs and calculating ACMVF is implemented on the Zynq-7000 FPGA platform. This system makes use of the Single Event Mitigation (SEM) IP core to inject SEUs into configuration memory bits of the target design implemented in the Zynq-7000 FPGA. Experimental results for a 32-bit adder show that, when 1% to 10% deviation from the correct output is considered, the counted number of failures is reduced by 41% to 59% compared with the number of failures counted by conventional vulnerability factor calculation. This means that the estimation accuracy of the configuration memory vulnerability to SEUs is improved by up to 58% in the case that 10% deviation is acceptable in the output results. Note that less than 10% deviation in an addition result is reasonably tolerable for many applications in the approximate computing domain, such as Convolutional Neural Networks (CNNs).
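
The scoring idea behind ACMVF is simple to illustrate: the same set of SEU-injection outputs is counted once with the conventional "any deviation is a failure" rule and once with a relative threshold margin. The output values below are fabricated purely for illustration.

```python
# Conventional CMVF versus threshold-margin ACMVF on fabricated injection results.
import numpy as np

rng = np.random.default_rng(7)
expected = 1_000_000                                              # golden output of a 32-bit adder
observed = expected + rng.integers(-200_000, 200_000, size=500)   # outputs under SEU injection
observed[rng.random(500) < 0.6] = expected                        # many injections leave the output intact

conventional_cmvf = np.mean(observed != expected)                 # every deviation counts as a failure

margin = 0.10                                                     # 10 % tolerated deviation
acmvf = np.mean(np.abs(observed - expected) > margin * expected)  # only large deviations count

print(f"conventional CMVF = {conventional_cmvf:.3f}, ACMVF (10% margin) = {acmvf:.3f}")
```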

Keywords: fault tolerance, FPGA, single event upset, approximate computing

Procedia PDF Downloads 178
5130 Behavior Loss Aversion Experimental Laboratory of Financial Investments

Authors: Jihene Jebeniani

Abstract:

We propose an approach combining the techniques of experimental economics with the flexibility of discrete choice models in order to test loss aversion. Our main objective was to test the loss aversion of Cumulative Prospect Theory (CPT). We developed a laboratory experiment in the context of financial investments aimed at analyzing investors' attitude towards risk. The study uses lotteries and is based on econometric modeling. The estimated model was an ordered probit.
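
The kind of model estimated here can be sketched with statsmodels' ordered-outcome support (assuming statsmodels 0.12 or later); the hypothetical data are lottery choices coded as a 3-level ordinal risk attitude explained by two invented covariates, not the study's experimental data.

```python
# Ordered probit on synthetic lottery-choice data (illustrative only).
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.integers(20, 60, n),
    "stake": rng.uniform(10, 100, n),   # lottery stake offered
})
latent = -0.02 * df["age"] + 0.01 * df["stake"] + rng.normal(size=n)
df["risk_choice"] = pd.cut(latent, [-np.inf, -0.5, 0.5, np.inf],
                           labels=["averse", "neutral", "seeking"])  # ordered categories

model = OrderedModel(df["risk_choice"], df[["age", "stake"]], distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```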

Keywords: risk aversion, behavioral finance, experimental economic, lotteries, cumulative prospect theory

Procedia PDF Downloads 454
5129 Students' Perception of a Gamified Student Engagement Platform as Supportive Technology in Learning

Authors: Pinn Tsin Isabel Yee

Abstract:

Students are increasingly turning towards online learning materials to supplement their education. One such approach is the use of gamified student engagement platforms (GSEPs) to instill a new learning culture. Data were collected from closed-ended questions via content analysis techniques. About 81.8% of college students from the Monash University Foundation Year agreed that a GSEP (Quizizz) was an effective tool for learning. Approximately 85.5% of students disagreed that games were a waste of time. GSEPs were highly effective in facilitating the learning process among students.

Keywords: engagement, gamified, Quizizz, technology

Procedia PDF Downloads 89
5128 On the Study of the Electromagnetic Scattering by Large Obstacle Based on the Method of Auxiliary Sources

Authors: Hidouri Sami, Aguili Taoufik

Abstract:

We consider fast and accurate solutions of scattering problems by large perfectly conducting (PEC) objects, formulated through an optimization of the Method of Auxiliary Sources (MAS). We present various techniques used to reduce the total computational cost of the scattering problem. The first technique is based on replacing the object with an array of a finite number of small PEC objects of the same shape. The second reduces the problem by considering only half of the object. These two solutions are compared with results from the reference bibliography.
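
The basic mechanics of MAS can be shown on a problem far smaller than the large objects targeted here: auxiliary line sources placed inside the surface of a 2-D PEC circular cylinder, with the boundary condition enforced by least squares at collocation points. Geometry, wavenumber and source placement below are arbitrary illustrative choices.

```python
# Minimal 2-D MAS sketch: TM plane wave scattered by a PEC circular cylinder.
import numpy as np
from scipy.special import hankel2

k = 2 * np.pi            # free-space wavenumber (wavelength = 1)
a = 1.0                  # cylinder radius
N_src, N_bnd = 40, 80    # auxiliary sources / boundary collocation points

phi_s = np.linspace(0, 2 * np.pi, N_src, endpoint=False)
phi_b = np.linspace(0, 2 * np.pi, N_bnd, endpoint=False)
src = 0.7 * a * np.exp(1j * phi_s)      # auxiliary sources on an inner circle
bnd = a * np.exp(1j * phi_b)            # points on the PEC surface

# Each auxiliary source radiates Ez = H0^(2)(k r); enforce Ez_scat + Ez_inc = 0 on the surface.
A = hankel2(0, k * np.abs(bnd[:, None] - src[None, :]))
Ez_inc = np.exp(-1j * k * bnd.real)     # plane wave travelling along +x (e^{jwt} convention)
coeff, *_ = np.linalg.lstsq(A, -Ez_inc, rcond=None)

residual = np.max(np.abs(A @ coeff + Ez_inc))
print(f"max boundary-condition residual: {residual:.2e}")

# Scattered field at an observation point outside the cylinder
obs = 5.0 + 0.0j
Ez_scat = np.sum(coeff * hankel2(0, k * np.abs(obs - src)))
print("scattered Ez at (5, 0):", np.round(Ez_scat, 4))
```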

Keywords: method of auxiliary sources, scattering, large object, RCS, computational resources

Procedia PDF Downloads 225
5127 Design of Broadband Power Divider for 3G and 4G Applications

Authors: A. M. El-Akhdar, A. M. El-Tager, H. M. El-Hennawy

Abstract:

This paper presents a broadband power divider with an equal power division ratio. Two sections of transmission line transformers based on coupled microstrip lines are applied to obtain broadband performance. In addition, a design methodology is proposed for the novel structure. A prototype is designed and simulated to operate in the band from 2.1 to 3.8 GHz to fulfill the requirements of 3G and 4G applications. The proposed structure features reduced size and fewer resistors than other conventional techniques. Simulation verifies the proposed idea and design methodology.

Keywords: power dividers, coupled lines, microstrip, 4G applications

Procedia PDF Downloads 457
5126 Deep Learning-Based Liver 3D Slicer for Image-Guided Therapy: Segmentation and Needle Aspiration

Authors: Ahmedou Moulaye Idriss, Tfeil Yahya, Tamas Ungi, Gabor Fichtinger

Abstract:

Image-guided therapy (IGT) plays a crucial role in minimally invasive procedures for liver interventions. Accurate segmentation of the liver and precise needle placement are essential for successful interventions such as needle aspiration. In this study, we propose a deep learning-based liver 3D slicer designed to enhance segmentation accuracy and facilitate needle aspiration procedures. The developed 3D slicer leverages state-of-the-art convolutional neural networks (CNNs) for automatic liver segmentation in medical images. The CNN model is trained on a diverse dataset of liver images obtained from various imaging modalities, including computed tomography (CT) and magnetic resonance imaging (MRI). The trained model demonstrates robust performance in accurately delineating liver boundaries, even in cases with anatomical variations and pathological conditions. Furthermore, the 3D slicer integrates advanced image registration techniques to ensure accurate alignment of preoperative images with real-time interventional imaging. This alignment enhances the precision of needle placement during aspiration procedures, minimizing the risk of complications and improving overall intervention outcomes. To validate the efficacy of the proposed deep learning-based 3D slicer, a comprehensive evaluation is conducted using a dataset of clinical cases. Quantitative metrics, including the Dice similarity coefficient and Hausdorff distance, are employed to assess the accuracy of liver segmentation. Additionally, the performance of the 3D slicer in guiding needle aspiration procedures is evaluated through simulated and clinical interventions. Preliminary results demonstrate the effectiveness of the developed 3D slicer in achieving accurate liver segmentation and guiding needle aspiration procedures with high precision. The integration of deep learning techniques into the IGT workflow shows great promise for enhancing the efficiency and safety of liver interventions, ultimately contributing to improved patient outcomes.
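
The evaluation metrics named here are straightforward to compute from two binary masks; the sketch below uses small synthetic masks in place of the study's ground-truth and predicted liver segmentations.

```python
# Dice similarity coefficient and (symmetric) Hausdorff distance between two binary masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

gt = np.zeros((64, 64), dtype=bool)
gt[20:50, 15:45] = True                       # "ground-truth" region
pred = np.zeros_like(gt)
pred[22:52, 13:43] = True                     # slightly shifted "prediction"

dice = 2 * np.logical_and(gt, pred).sum() / (gt.sum() + pred.sum())

gt_pts = np.argwhere(gt)
pred_pts = np.argwhere(pred)
hausdorff = max(directed_hausdorff(gt_pts, pred_pts)[0],
                directed_hausdorff(pred_pts, gt_pts)[0])

print(f"Dice = {dice:.3f}, Hausdorff distance = {hausdorff:.1f} px")
```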

Keywords: deep learning, liver segmentation, 3D slicer, image guided therapy, needle aspiration

Procedia PDF Downloads 28
5125 Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset

Authors: Essam Al Daoud

Abstract:

Gradient boosting methods have proven to be a very important strategy. Many successful machine learning solutions were developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. In addition, new features are generated, and several techniques are used to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost using varying numbers of features and records.
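
A minimal reproduction of this comparison set-up (not on the Home Credit data) trains the three libraries on the same synthetic binary-classification task and reports wall-clock time and AUC; the xgboost, lightgbm and catboost packages are assumed to be installed.

```python
# Compare XGBoost, LightGBM and CatBoost on a synthetic task (speed and AUC).
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = make_classification(n_samples=50_000, n_features=50, n_informative=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=300),
    "LightGBM": LGBMClassifier(n_estimators=300),
    "CatBoost": CatBoostClassifier(n_estimators=300, verbose=0),
}

for name, model in models.items():
    t0 = time.perf_counter()
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name:8s}  AUC = {auc:.4f}  fit time = {time.perf_counter() - t0:.1f} s")
```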

Keywords: gradient boosting, XGBoost, LightGBM, CatBoost, home credit

Procedia PDF Downloads 149
5124 Rice Area Determination Using Landsat-Based Indices and Land Surface Temperature Values

Authors: Burçin Saltık, Levent Genç

Abstract:

This study aimed to determine a route for the identification of rice cultivation areas within the Thrace and Marmara regions of Turkey using remote sensing and GIS. Landsat 8 (OLI-TIRS) imagery acquired in the 2013 production season with Path/Row 181/32 was used. Four different seasonal images were generated utilizing original bands and different transformation techniques. All images were classified individually using supervised classification techniques, and Land Use Land Cover (LULC) maps were generated with 8 classes. The area (ha, %) of each class was calculated. In addition, district-based rice distribution maps were developed, and their results were compared with the Turkish Statistical Institute's (TurkSTAT; TSI) actual rice cultivation area records. Accuracy assessments were conducted, and the most accurate map was selected based on the accuracy assessment and coherence with the TSI results. Additionally, rice areas on slopes over 4° were considered mis-classified pixels and were eliminated using a slope map and GIS tools. Finally, randomized rice zones were selected to obtain the maximum-minimum value ranges of the NDVI, LSWI, and LST images for each date (May, June, July, August, and September separately) to test whether they may be used for rice area determination via the raster calculator tool of ArcGIS. The most accurate classification for rice determination was obtained from the seasonal LSWI LULC map; considering the TSI data and accuracy assessment results, mis-classified pixels were eliminated from this map. According to the results, 83,151.5 ha of rice area exists within the study area; however, this is 12,702.3 ha higher than the TSI records. The use of the maximum-minimum ranges of rice-area NDVI, LSWI, and LST was tested in the Meric district. It was seen that using the value ranges obtained from the July imagery gave the closest results to the TSI records, with a difference of only 206.4 ha. This difference is expected due to the relatively low resolution of the images. Thus, the employment of images with higher spectral, spatial, temporal and radiometric resolutions may provide more reliable results.
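
The spectral indices used here are simple band ratios; the sketch below computes NDVI and LSWI from Landsat 8 OLI reflectance arrays (band 4 = red, band 5 = NIR, band 6 = SWIR1) and flags pixels falling inside an illustrative min-max range. The reflectance arrays and the threshold range are synthetic placeholders, not the study's values.

```python
# NDVI / LSWI computation and range-based masking on synthetic reflectance arrays.
import numpy as np

rng = np.random.default_rng(0)
red = rng.uniform(0.02, 0.3, (400, 400))     # stand-in for Landsat 8 band 4 (red) reflectance
nir = rng.uniform(0.10, 0.6, (400, 400))     # band 5 (NIR)
swir1 = rng.uniform(0.05, 0.4, (400, 400))   # band 6 (SWIR1)

eps = 1e-6
ndvi = (nir - red) / (nir + red + eps)
lswi = (nir - swir1) / (nir + swir1 + eps)

# Flag pixels inside (placeholder) min-max ranges derived from training rice zones
rice_mask = (ndvi > 0.4) & (ndvi < 0.8) & (lswi > 0.1) & (lswi < 0.5)
pixel_area_ha = 30 * 30 / 10_000             # Landsat 30 m pixels
print(f"candidate rice area: {rice_mask.sum() * pixel_area_ha:.1f} ha")
```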

Keywords: landsat 8 (OLI-TIRS), LST, LSWI, LULC, NDVI, rice

Procedia PDF Downloads 209
5123 Detect Circles in Image: Using Statistical Image Analysis

Authors: Fathi M. O. Hamed, Salma F. Elkofhaifee

Abstract:

The aim of this work is to detect geometrical objects in an image. In this paper, the object is considered to be circular in shape. The identification requires finding three characteristics: the number, size, and location of the objects. To achieve the goal of this work, this paper presents an algorithm that combines several statistical approaches and image analysis techniques. This algorithm has been implemented to achieve the major objectives of this paper. The algorithm has been evaluated using simulated data, where it yields good results, and has then been applied to real data.
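
A compact sketch in the spirit of this kind of algorithm is shown below: median filtering, thresholding and segmentation, after which the number, size and location of circular objects are read from the labelled regions. It operates on a synthetic test image rather than the paper's data, and is a generic illustration, not the authors' statistical procedure.

```python
# Median filter + threshold + labelling to count/locate/size circular objects.
import numpy as np
from scipy.ndimage import median_filter
from skimage.draw import disk
from skimage.measure import label, regionprops

rng = np.random.default_rng(3)
img = rng.normal(0.1, 0.05, (200, 200))                  # noisy background
for centre, r in [((60, 50), 18), ((140, 120), 30)]:     # two synthetic circles
    rr, cc = disk(centre, r, shape=img.shape)
    img[rr, cc] += 0.8

smooth = median_filter(img, size=5)
binary = smooth > 0.5                                    # global threshold
labels = label(binary)

for region in regionprops(labels):
    cy, cx = region.centroid
    radius = (region.area / np.pi) ** 0.5                # radius of an equal-area circle
    print(f"circle at ({cx:.1f}, {cy:.1f}), radius ~ {radius:.1f} px, area = {region.area} px")
```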

Keywords: image processing, median filter, projection, scale-space, segmentation, threshold

Procedia PDF Downloads 415
5122 Rapid and Sensitive Detection: Biosensors as an Innovative Analytical Tools

Authors: Sylwia Baluta, Joanna Cabaj, Karol Malecha

Abstract:

The evolution of biosensors has been driven by the need for faster and more versatile analytical methods for application in important areas including clinical diagnostics, food analysis and environmental monitoring, with minimum sample pretreatment. Rapid and sensitive neurotransmitter detection is extremely important in modern medicine. These compounds mainly occur in the brain and central nervous system of mammals. Any change in neurotransmitter concentration may lead to many diseases, such as Parkinson's disease or schizophrenia. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements.

Keywords: adrenaline, biosensor, dopamine, laccase, tyrosinase

Procedia PDF Downloads 128
5121 Development of Antioxidant Rich Bakery Products by Applying Lysine and Maillard Reaction Products

Authors: Attila Kiss, Erzsébet Némedi, Zoltán Naár

Abstract:

Due to the rapidly growing number of conscious customers in recent years, more and more people look for products with positive physiological effects that may contribute to the preservation of their health. In response to these demands, the Food Science Research Institute of Budapest develops and introduces into the market new functional foods of guaranteed positive effect that contain bioactive agents. New, efficient technologies are also elaborated in order to preserve the maximum biological effect of the produced foods. The main objective of our work was the development of new functional biscuits fortified with physiologically beneficial ingredients. Bakery products constitute the base of the food nutrients' pyramid; thus, they might be regarded as the foodstuffs consumed in the largest quantity. In addition to the well-known and certified physiological benefits of lysine as an essential amino acid, a series of antioxidant-type compounds is formed as a consequence of the occurring Maillard reaction. The progress of the evoked Maillard reaction was studied by applying diverse sugars (glucose, fructose, saccharose, isosugar) and lysine at several temperatures (120-170°C). The interval of thermal treatment was also varied (10-30 min). The composition and production technologies were tailored in order to reach the maximum possible biological benefit, that is, the highest antioxidant capacity in the biscuits. Out of the examined sugar components, the extent of the Maillard-reaction-driven transformation of glucose was the most pronounced at both applied temperatures. For the precise assessment of the antioxidant activity of the products, the FRAP and DPPH methods were adapted and optimised. To establish an authentic and extensive mechanism of the occurring transformations, Maillard reaction products were identified, and the relevant reaction pathways were revealed. GC-MS and HPLC-MS techniques were applied for the analysis of the 60 generated MRPs and the characterisation of the actual transformation processes. Three plausible major transformation routes could be suggested based on the analytical results and the deduced sequence of possible conversions between lysine and the sugars.

Keywords: Maillard-reaction, lysine, antioxidant activity, GC-MS and HPLC-MS techniques

Procedia PDF Downloads 467
5120 Studies on the Effect of Dehydration Techniques, Treatments, Packaging Material and Methods on the Quality of Buffalo Meat during Ambient Temperature Storage

Authors: Tariq Ahmad Safapuri, Saghir Ahmad, Farhana Allai

Abstract:

The present study was conducted to evaluate the effect of dehydration techniques (polyhouse and tray drying), different treatments (SHMP, SHMP + salt, salt + turmeric), different packaging materials (HDPE, combination film), and different packaging methods (air, vacuum, CO2 flush) on the quality of dehydrated buffalo meat during ambient temperature storage. The quality parameters measured included physico-chemical characteristics, i.e., pH, rehydration ratio and moisture content, and microbiological characteristics, viz. total plate count. It was found that the treatments (SHMP, SHMP + salt, salt + turmeric) increased the pH. The moisture content of the dehydrated meat samples was found to be between 5.54% and 7.20%. The rehydration ratio was highest for the salt + turmeric treated sample and lowest for the control meat sample. The bacterial count (log TPC/g) of the salt + turmeric treated, tray-dried sample was the lowest, i.e., 1.80. During ambient temperature storage, there was no considerable change in the pH of the dehydrated samples up to 150 days; however, the moisture content of the samples increased differently in the different packaging systems. The highest moisture rise was found for the control meat sample packed in HDPE with air, while the lowest increase was reported for the SHMP + salt treated sample vacuum-packed in combination film. The rehydration ratio was considerably affected for the HDPE, air-packed samples dehydrated in the polyhouse after 150 days of ambient storage, while there was very little change in the rehydration ratio of meat samples packed in the combination film CO2 flush system. The TPC was found to be under the safe limit even after 150 days of storage, and the microbial count was lowest for the salt + turmeric treated samples after 150 days of storage.

Keywords: ambient temperature, dehydration technique, rehydration ratio, SHMP (sodium hexametaphosphate), HDPE (high density polyethylene)

Procedia PDF Downloads 401