Search results for: minimum root mean square (RMS) error matching algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9333


5613 Planning Fore Stress II: Study on Resiliency of New Architectural Patterns in Urban Scale

Authors: Amir Shouri, Fereshteh Tabe

Abstract:

Master planning and thoughtful, sequential design of urban infrastructure play a major role in reducing the damage that natural disasters, war, and social or population-related conflicts inflict on cities. Defensive strategies have been revised throughout the history of mankind after damage from natural disasters, war experiences, and terrorist attacks on cities, with lessons learnt from earthquakes, from the casualties of the two World Wars in the 20th century, and from terrorist activities of all times. Particularly after Hurricane Sandy in New York in 2012 and the September 11th attack on New York's World Trade Centre (WTC) in the 21st century, there have been serious collaborations between law-making authorities, urban planners and architects, and defence-related organizations to, firstly, prepare for and/or prevent such events and, secondly, reduce human loss and economic damage to a minimum. This study develops a planning model for New York City in which citizens experience minimum impact during threatening times and the city suffers minimum economic damage after the stress has passed. The main discussion in this proposal focuses on pre-hazard, hazard-time and post-hazard transformative policies and strategies that will reduce life casualties and ease economic recovery in post-hazard conditions. This proposal scrutinizes the idea that one of the key solutions may be overlaying the architectural platforms of three fundamental infrastructures, transportation, power-related sources and defensive abilities, on a dynamic-transformative framework that provides maximum safety, a high level of flexibility and the fastest action-reaction opportunities in stressful periods. "Planning Fore Stress" is carried out in an analytical, qualitative and quantitative framework that studies cases from all over the world. Technology, organic design, materiality, urban forms, city politics and sustainability are discussed in different cases at an international scale: from the modern strategies of Copenhagen for living in harmony with nature to the traditional approaches of old Indonesian urban planning patterns, from the "Iron Dome" of Israel to the "Tunnels" in Gaza, from the "ultra-high-performance quartz-infused concrete" of Iran to the peaceful and nature-friendly strategies of Switzerland, and from "urban geopolitics" in cities, war and terrorism to the "design of sustainable cities" worldwide. All are studied with references and a detailed analysis of each case in order to propose the most resourceful, practical and realistic solutions to questions on new city divisions, new city planning and social activities, and new strategic architecture for safe cities. This study is a developed version of a proposal announced as a winner at MoMA in 2013 in the call for ideas for Rockaway after Hurricane Sandy.

Keywords: urban scale, city safety, natural disaster, war and terrorism, city divisions, architecture for safe cities

Procedia PDF Downloads 484
5612 The Association between Acupuncture Treatment and a Decreased Risk of Irritable Bowel Syndrome in Patients with Depression

Authors: Greg Zimmerman

Abstract:

Background: Major depression is a common illness that affects millions of people globally. It is the leading cause of disability and is projected to become the number one cause of the global burden of disease by 2030. Many of those who suffer from depression also suffer from Irritable Bowel Syndrome (IBS). Acupuncture has been shown to help depression. The aim of this study was to investigate the effectiveness of acupuncture in reducing the risk of IBS in patients with depression. Methods: We enrolled patients diagnosed with depression through the Taiwanese National Health Insurance Research Database (NHIRD). Propensity score matching was used to match equal numbers (n=32971) of the acupuncture cohort and no-acupuncture cohort based on characteristics including sex, age, baseline comorbidity, and medication. The Cox regression model was used to compare the hazard ratios (HRs) of IBS in the two cohorts. Results: The basic characteristics of the two groups were similar. The cumulative incidence of IBS was significantly lower in the acupuncture cohort than in the no-acupuncture cohort (Log-rank test, p<0.001). Conclusion: The results provided real-world evidence that acupuncture may have a beneficial effect on IBS risk reduction in patients with depression.

Keywords: acupuncture, depression, irritable bowel syndrome, national health insurance research database, real-world evidence

Procedia PDF Downloads 106
5611 Delineation of Green Infrastructure Buffer Areas with a Simulated Annealing: Consideration of Ecosystem Services Trade-Offs in the Objective Function

Authors: Andres Manuel Garcia Lamparte, Rocio Losada Iglesias, Marcos Boullón Magan, David Miranda Barros

Abstract:

The biodiversity strategy of the European Union for 2030 identifies climate change as one of the key drivers of biodiversity loss and considers green infrastructure one of the solutions to this problem. In this line, the European Commission has developed a green infrastructure strategy which commits member states to consider green infrastructure in their territorial planning. This green infrastructure is aimed at granting the provision of a wide range of ecosystem services to support biodiversity and human well-being by countering the effects of climate change. Yet, few tools are available to delimit green infrastructure. The available ones consider the potential of the territory to provide ecosystem services. However, these methods usually aggregate several maps of ecosystem service potential without considering possible trade-offs, which can lead to excluding areas with a high potential for providing ecosystem services that have many trade-offs with other ecosystem services. In order to tackle this problem, a methodology is proposed to consider ecosystem service trade-offs in the objective function of a simulated annealing algorithm aimed at delimiting multifunctional green infrastructure buffer areas. To this end, the provision potential maps of the regulating ecosystem services considered to delimit the multifunctional buffer areas are clustered into groups, so that ecosystem services that create trade-offs are excluded in each group. The normalized provision potential maps of the ecosystem services in each group are added to obtain a potential map per group, which is normalized again. The potential maps for each group are then combined into a raster map that shows the highest provision potential value in each cell. The combined map is then used in the objective function of the simulated annealing algorithm, which is run both using the proposed methodology and considering the ecosystem services individually. The results are analyzed with spatial statistics and landscape metrics to check the number of ecosystem services that the delimited areas produce, as well as their regularity and compactness. The proposed methodology increases the number of ecosystem services produced by the delimited areas, improving their multifunctionality and increasing their effectiveness in preventing climate change impacts.
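
A minimal sketch of the map-combination and buffer-selection step described above, assuming the ecosystem-service potential rasters are available as numpy arrays and ignoring the compactness and regularity criteria of the full method (all function and parameter names here are illustrative, not the authors' implementation):

```python
import numpy as np

def combined_potential(groups):
    """Combine normalized ecosystem-service maps.

    groups: list of lists of 2-D arrays; services within a group are
    assumed free of mutual trade-offs (assumption for this sketch).
    """
    group_maps = []
    for maps in groups:
        total = sum((m - m.min()) / (m.max() - m.min() + 1e-9) for m in maps)
        total = (total - total.min()) / (total.max() - total.min() + 1e-9)
        group_maps.append(total)
    # cell-wise maximum across groups -> highest provision potential per cell
    return np.maximum.reduce(group_maps)

def anneal_buffer(potential, target_cells, n_iter=20000, t0=1.0, cooling=0.9995, seed=None):
    """Simulated annealing that selects `target_cells` cells maximizing total potential."""
    rng = np.random.default_rng(seed)
    flat = potential.ravel()
    sel = rng.choice(flat.size, size=target_cells, replace=False)
    unsel = np.setdiff1d(np.arange(flat.size), sel)
    t = t0
    for _ in range(n_iter):
        i, j = rng.integers(sel.size), rng.integers(unsel.size)
        # energy change when swapping a selected cell for an unselected one
        delta = flat[sel[i]] - flat[unsel[j]]
        if delta < 0 or rng.random() < np.exp(-delta / t):
            sel[i], unsel[j] = unsel[j], sel[i]
        t *= cooling
    mask = np.zeros(flat.size, dtype=bool)
    mask[sel] = True
    return mask.reshape(potential.shape)
```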

Keywords: ecosystem services trade-offs, green infrastructure delineation, multifunctional buffer areas, climate change

Procedia PDF Downloads 174
5610 Online Compressor Washing for Gas Turbine Power Output

Authors: Enyia James Diwa, Isaiah Thank-God Ebi, Dodeye Ina Igbong

Abstract:

The privatization of utilities has brought about very strong competition in industries such as petrochemicals and gas distribution, among others, considering the continuous increase in the cost of fuel. This gives gas turbine owners and operators a strong incentive to reduce and control performance degradation of the engine in order to minimize cost. The most common and crucial problem of the gas turbine is compressor fouling, which causes a reduction in flow capacity, compressor efficiency and pressure ratio; this, in turn, leads to engine compressor re-matching and a reduction in output power and thermal efficiency. This paper presents in detail the major causes, effects and control mechanisms of fouling. The major emphasis is on compressor water washing to enable power augmentation. A gas turbine similar to the GE LM6000 is modelled for the current study using TURBOMATCH, a Cranfield University software package specifically made for gas turbine performance simulation and fouling detection. The compounded and intricate challenges of online water washing of large-output gas turbine compressors are examined. The treatment is applied to axial compressors used in the petrochemical and hydrocarbon industry.

Keywords: gas turbine, fouling, degradation, compressor washing

Procedia PDF Downloads 348
5609 Effective Training System for Riding Posture Using Depth and Inertial Sensors

Authors: Sangseung Kang, Kyekyung Kim, Suyoung Chi

Abstract:

A good posture is the most important factor in riding. In this paper, we present an effective posture correction system for a riding simulator environment that provides position error detection and customized training functions. The proposed system detects and analyzes the rider's posture using depth data and inertial sensing data. Our experiments show that these functions help users improve their seat for riding.

Keywords: posture correction, posture training, riding posture, riding simulator

Procedia PDF Downloads 476
5608 Activity of Commonly Used Intravenous Nutrient and Bisolvon in Neonatal Intensive Care Units against Biofilm Cells and Their Synergetic Effect with Antibiotics

Authors: Marwa Fady Abozed, Hemat Abd El Latif, Fathy Serry, Lotfi El Sayed

Abstract:

The purpose of this study was to investigate the efficacy of intravenous nutrients (soluvit, vitalipid, aminoven infant, lipovenos) and bisolvon, commonly used in neonatal intensive care units, against biofilm cells of Staphylococcus aureus, Staphylococcus epidermidis, Pseudomonas aeruginosa and Klebsiella pneumoniae, as they are the most commonly isolated organisms and are biofilm producers. Also, the synergistic activity of soluvit, heparin and bisolvon with antibiotics and its effect on the minimum biofilm eradication concentration (MBEC) was tested. Intravenous nutrients and bromhexine are widely used in newborns. The numbers of viable cells released from biofilms after treatment with intravenous nutrients and bromhexine were counted to compare the efficacy. The percentage of reduction in biofilm regrowth when using soluvit was 43-51% and 36-42% for Gram-positive and Gram-negative organisms respectively; on adding vitalipid the percentage was 45-50% and 37-41% for Gram-positive and Gram-negative respectively. When using bisolvon the percentage was 46-52% and 47-48% for Gram-positive and Gram-negative respectively. Adding lipovenos gave a reduction percentage of 48-52% and 48-49% for Gram-positive and Gram-negative respectively, while adding aminoven infant gave 10-15% and 9-11% for Gram-positive and Gram-negative respectively. Adding soluvit, heparin and bisolvon to antibiotics had a synergistic effect. Soluvit with ciprofloxacin gave an 8-16 times decrease in the minimum biofilm eradication concentration (MBEC) compared with ciprofloxacin alone, while adding soluvit to vancomycin reduced the MBEC 16 times compared with vancomycin alone. In the case of combining soluvit with cefotaxime, amikacin and gentamicin, the reduction in MBEC was 16, 8 and 6-32 times respectively. The synergistic effect of adding heparin to ciprofloxacin, vancomycin, cefotaxime, amikacin and gentamicin was a 2 times reduction with all, except that for Gram-negative organisms the range of reduction was 0-2 with both gentamicin and ciprofloxacin. Bisolvon exhibited a synergistic effect with ciprofloxacin, vancomycin, cefotaxime, amikacin and gentamicin, with 16, 32, 32, 8, 32-64 and 32 times decreases in MBEC respectively.

Keywords: biofilm, neonatal intensive care units, antibiofilm agents, intravenous nutrient

Procedia PDF Downloads 328
5607 Cluster-Based Exploration of System Readiness Levels: Mathematical Properties of Interfaces

Authors: Justin Fu, Thomas Mazzuchi, Shahram Sarkani

Abstract:

A key factor in technological immaturity in defense weapons acquisition is a lack of understanding of critical integrations at the subsystem and component level. To address this shortfall, recent research combines integration readiness level (IRL) with technology readiness level (TRL) to form a system readiness level (SRL). SRL can be enriched with more robust quantitative methods to provide the program manager with a useful tool prior to committing to major weapons acquisition programs. This research harnesses previous mathematical models based on graph theory, Petri nets, and tropical algebra and proposes a modification of the desirable SRL mathematical properties such that a tightly integrated subsystem (with a multitude of interfaces) can display a lower SRL than an inherently less coupled subsystem. The synthesis of these methods informs an improved decision tool for the program manager before committing to expensive technology development. This research ties the separately developed manufacturing readiness level (MRL) into the network representation of the system and addresses shortfalls in previous frameworks, including the lack of integration weighting and the over-importance of a single extremely immature component. Tropical algebra (based on the minimum of a set of TRLs or IRLs) allows one low IRL or TRL value to diminish the SRL of the entire system, which may not reflect reality if that component is not critical or tightly coupled. Integration connections can therefore be weighted according to importance, and readiness levels are modified to a cardinal scale (based on an analytic hierarchy process). The importance of an integration arc depends on the connected nodes and the additional integration arcs connected to those nodes. Lack of integration is not represented by zero, but by a perfect integration maturity value; naturally, the importance (or weight) of such an arc would be zero. To further explore the impact of grouping subsystems, a multi-objective genetic algorithm is then used to find clusters or communities that can be optimized for the most representative subsystem SRL. This novel calculation is then benchmarked through simulation and with past defense acquisition program data, focusing on the newly introduced Middle Tier of Acquisition (rapid fielding of prototypes). The model remains a relatively simple, accessible tool, but with higher fidelity and validated with past data, for the program manager to use in deciding major defense acquisition program milestones.
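
A toy illustration of the kind of weighted, min-based (tropical) aggregation discussed above, assuming readiness values have already been mapped onto a cardinal [0, 1] scale; the node names, weights and blending rule are hypothetical and are not the authors' formulation:

```python
# Hypothetical illustration: nodes are components with TRL-like maturity on a
# cardinal [0, 1] scale; arcs are integrations with an IRL-like maturity and a
# weight expressing how critical the interface is (0 = effectively no coupling).
trl = {"A": 0.9, "B": 0.6, "C": 0.8}
arcs = [  # (node_i, node_j, irl, weight)
    ("A", "B", 0.5, 1.0),
    ("B", "C", 0.9, 0.3),
    ("A", "C", 1.0, 0.0),   # no real integration -> perfect maturity, zero weight
]

def subsystem_srl(nodes, arcs):
    """Weighted min-style (tropical) readiness: the weakest critical link dominates,
    but unimportant interfaces (low weight) cannot drag the subsystem down."""
    candidates = [trl[n] for n in nodes]
    for i, j, irl, w in arcs:
        if i in nodes and j in nodes and w > 0:
            # blend interface maturity toward 1 as its weight decreases
            candidates.append(1.0 - w * (1.0 - min(irl, trl[i], trl[j])))
    return min(candidates)

print(subsystem_srl({"A", "B", "C"}, arcs))   # dominated by the critical A-B interface
```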

Keywords: readiness, maturity, system, integration

Procedia PDF Downloads 92
5606 Design of 900 MHz High Gain SiGe Power Amplifier with Linearity Improved Bias Circuit

Authors: Guiheng Zhang, Wei Zhang, Jun Fu, Yudong Wang

Abstract:

A 900 MHz three-stage SiGe power amplifier (PA) with high power gain is presented in this paper. A Volterra series is applied to clearly analyze the nonlinearity sources of the SiGe HBT device model. Meanwhile, the influence of the operating current on IMD3 is discussed. A β-helper current mirror bias circuit is then applied to improve linearity, since it can offer a stable base biasing voltage. It can also work as a predistortion circuit when the biasing voltages of the three bias circuits are fine-tuned; in this way, the power gain and operating current of the PA are optimized for best linearity. The three power stages, fabricated in a 0.18 μm SiGe technology, are bonded to a printed circuit board (PCB) to obtain impedances with a load-pull system, and matching networks are then designed for best linearity with discrete passive components on the PCB. The final measured three-stage PA exhibits 21.1 dBm of output power at the 1 dB compression point (OP1dB) with a power added efficiency (PAE) of 20.6% and 33 dB power gain under a 3.3 V supply voltage.

Keywords: high gain power amplifier, linearization bias circuit, SiGe HBT model, Volterra series

Procedia PDF Downloads 340
5605 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of an ensemble of individual decision trees whose combined votes yield a predicted segment with more robust precision and recall scores than a single tree. A random 70-30 stratified sampling was used for training the algorithm, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
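
A brief sketch of how such a classifier could be reproduced with scikit-learn, assuming a hypothetical survey_features.csv with a segment column; the parameters shown (500 trees, depth 10, top-20 features) mirror the figures quoted in the abstract, but the code is illustrative only:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# hypothetical data: features per individual plus the assigned segment label
df = pd.read_csv("survey_features.csv")
X, y = df.drop(columns=["segment"]), df["segment"]

# 70-30 stratified split, as in the study design
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30,
                                           stratify=y, random_state=42)

rf = RandomForestClassifier(n_estimators=500, max_depth=10, random_state=42)
rf.fit(X_tr, y_tr)

# keep only the 20 most important features and refit a lighter model
top20 = X.columns[np.argsort(rf.feature_importances_)[::-1][:20]]
rf_small = RandomForestClassifier(n_estimators=500, max_depth=10, random_state=42)
rf_small.fit(X_tr[top20], y_tr)

print(classification_report(y_te, rf_small.predict(X_te[top20])))
```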

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 94
5604 Comparative Study of Impedance Parameters for 42CrMo4 Steel Nitrided and Exposed to Electrochemical Corrosion

Authors: M. H. Belahssen, S. Benramache

Abstract:

This paper presents the corrosion behavior of plasma-nitrided 42CrMo4 steel. Different nitrided samples were tested. The corrosion behavior was evaluated by electrochemical impedance spectroscopy, and the tests were carried out in a 1 M acid chloride solution. The best corrosion protection was observed for the nitrided samples. The aim of this work is to compare the equivalent circuits corresponding to the simulated and experimental Nyquist curves and to select the one that gives the best impedance parameters with the lowest error.
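
For readers unfamiliar with equivalent-circuit fitting, the sketch below shows one common way to fit a simple Randles-type circuit to impedance data and report the residual error; the circuit choice, parameter values and synthetic "measured" data are assumptions for illustration, not the circuits used in the paper:

```python
import numpy as np
from scipy.optimize import least_squares

def randles_z(params, f):
    """Impedance of a simple Randles cell: Rs in series with (Rct parallel CPE)."""
    rs, rct, q, n = params
    w = 2 * np.pi * f
    z_cpe = 1.0 / (q * (1j * w) ** n)
    return rs + (rct * z_cpe) / (rct + z_cpe)

def residuals(params, f, z_meas):
    z_fit = randles_z(params, f)
    # stack real and imaginary residuals so the fit matches the whole Nyquist curve
    return np.concatenate([z_fit.real - z_meas.real, z_fit.imag - z_meas.imag])

# f (Hz) and z_meas (complex impedance) would come from the EIS measurement
f = np.logspace(-2, 5, 60)                          # hypothetical frequencies
z_meas = randles_z([20.0, 5000.0, 1e-4, 0.85], f)   # synthetic "measured" data

fit = least_squares(residuals, x0=[10.0, 1000.0, 1e-5, 0.9],
                    args=(f, z_meas),
                    bounds=([0, 0, 0, 0], [np.inf, np.inf, np.inf, 1.0]))
rms_error = np.sqrt(np.mean(residuals(fit.x, f, z_meas) ** 2))
print(fit.x, rms_error)
```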

Keywords: plasma nitriding, steel, alloy 42CrMo4, electrochemistry, corrosion behavior

Procedia PDF Downloads 371
5603 Rewritten Oedipus Complex: Huo Datong’s Complex of Generation

Authors: Xinyu Chen

Abstract:

This article reviews the Chinese psychoanalytic theorist Dr. Huo Datong's notion of the complex of generation, around which Huo conceptualizes a localized framework to recapitulate the unconscious structure of Chinese people. Psychoanalysis underwent constant localization, influenced by the socio-cultural milieu and carried forward by scholars trained in different psychoanalytic schools. Dr. Huo Datong is one of the representatives with a Sino-French background of psychoanalytic training, whose enterprise has demonstrated psychoanalysis's cultural and ideological accommodability. Insufficient academic attention has been paid to this concept as the core of Huo's re-framework. The notion is put forward through a western psychoanalytic reading of Chinese mythologies that contours the Chinese unconscious. Regarding Huo's interpretation of the Chinese kinship network as the basis for proposing an omnipotent symbolic mother rather than an Oedipal father, this article reviews the notion in terms of its mythological roots in order to evaluate its theoretical practicality.

Keywords: psychoanalysis, China, Huo Datong, mythology

Procedia PDF Downloads 252
5602 Rice Area Determination Using Landsat-Based Indices and Land Surface Temperature Values

Authors: Burçin Saltık, Levent Genç

Abstract:

In this study, the aim was to determine a route for the identification of rice cultivation areas within the Thrace and Marmara regions of Turkey using remote sensing and GIS. Landsat 8 (OLI-TIRS) imagery acquired in the 2013 production season, with Path/Row number 181/32, was used. Four different seasonal images were generated utilizing original bands and different transformation techniques. All images were classified individually using supervised classification techniques, and Land Use Land Cover (LULC) maps were generated with 8 classes. The area (ha, %) of each class was calculated. In addition, district-based rice distribution maps were developed, and the results of these maps were compared with the actual rice cultivation area records of the Turkish Statistical Institute (TurkSTAT; TSI). Accuracy assessments were conducted, and the most accurate map was selected based on the accuracy assessment and its coherence with the TSI results. Additionally, rice areas on slopes above 4° were considered mis-classified pixels and were eliminated using a slope map and GIS tools. Finally, randomized rice zones were selected to obtain the maximum-minimum value ranges of the NDVI, LSWI and LST images for each date (the May, June, July, August and September images separately), to test whether these ranges may be used for rice area determination via the raster calculator tool of ArcGIS. The most accurate classification for rice determination was obtained from the seasonal LSWI LULC map, considering the TSI data and the accuracy assessment results, and mis-classified pixels were eliminated from this map. According to the results, 83,151.5 ha of rice areas exist within the study area. However, this result is higher than the TSI records by an area of 12,702.3 ha. The use of the maximum-minimum ranges of rice-area NDVI, LSWI and LST was tested in Meric district. It was seen that using the value ranges obtained from the July imagery gave the closest results to the TSI records, and the difference was only 206.4 ha. This difference is expected given the relatively low resolution of the images. Thus, employment of images with higher spectral, spatial, temporal and radiometric resolutions may provide more reliable results.
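
A compact sketch of the index-range test described above, assuming the Landsat bands, LST and slope layers are co-registered numpy arrays and that the value ranges have already been extracted from known rice zones (band names and thresholds are placeholders):

```python
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

def lswi(nir, swir1):
    return (nir - swir1) / (nir + swir1 + 1e-9)

def rice_mask(nir, red, swir1, lst, slope_deg,
              ndvi_rng, lswi_rng, lst_rng, max_slope=4.0):
    """Flag cells whose NDVI/LSWI/LST fall inside the ranges observed over
    known rice zones for a given date, excluding slopes above 4 degrees."""
    nd, lw = ndvi(nir, red), lswi(nir, swir1)
    return ((nd >= ndvi_rng[0]) & (nd <= ndvi_rng[1]) &
            (lw >= lswi_rng[0]) & (lw <= lswi_rng[1]) &
            (lst >= lst_rng[0]) & (lst <= lst_rng[1]) &
            (slope_deg <= max_slope))

# area in hectares, assuming 30 m Landsat pixels (0.09 ha per pixel):
# area_ha = rice_mask(...).sum() * 0.09
```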

Keywords: landsat 8 (OLI-TIRS), LST, LSWI, LULC, NDVI, rice

Procedia PDF Downloads 228
5601 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms

Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao

Abstract:

Earth's environment and its evolution can be seen through satellite images in near real-time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format. The data are then pre-processed using data pre-processing techniques, fed into the proposed algorithm, and the obtained result is analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
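
A minimal sketch of running a DeepLabv3 model with a ResNet-50 backbone from torchvision for multi-class land cover segmentation; the class count, input size and untrained weights are assumptions for illustration, and training on the DeepGlobe dataset is omitted:

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# DeepGlobe-style setup: 7 land-cover classes assumed here
model = deeplabv3_resnet50(weights=None, num_classes=7)
model.eval()

# a batch of pre-processed satellite tiles: (N, 3, H, W), float in [0, 1]
x = torch.rand(2, 3, 512, 512)

with torch.no_grad():
    out = model(x)["out"]            # (N, num_classes, H, W) logits
pred = out.argmax(dim=1)             # (N, H, W) class index per pixel

# per-class pixel counts for tile 0; multiply by the ground area per pixel
# to get class areas once the model has actually been trained
areas = torch.bincount(pred[0].flatten(), minlength=7)
print(areas)
```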

Keywords: area calculation, atrous convolution, deep globe land cover classification, deepLabv3, land cover classification, resnet 50

Procedia PDF Downloads 140
5600 Rapid and Sensitive Detection: Biosensors as an Innovative Analytical Tools

Authors: Sylwia Baluta, Joanna Cabaj, Karol Malecha

Abstract:

The evolution of biosensors was driven by the need for faster and more versatile analytical methods for application in important areas including clinical diagnostics, food analysis and environmental monitoring, with minimum sample pretreatment. Rapid and sensitive neurotransmitter detection is extremely important in modern medicine. These compounds mainly occur in the brain and central nervous system of mammals. Any changes in neurotransmitter concentrations may lead to many diseases, such as Parkinson's disease or schizophrenia. Classical techniques of chemical analysis, despite many advantages, do not permit immediate results or automation of measurements.

Keywords: adrenaline, biosensor, dopamine, laccase, tyrosinase

Procedia PDF Downloads 142
5599 Mathematical and Numerical Analysis of a Nonlinear Cross Diffusion System

Authors: Hassan Al Salman

Abstract:

We consider a nonlinear parabolic cross diffusion model arising in applied mathematics. A fully practical piecewise linear finite element approximation of the model is studied. By using entropy-type inequalities and compactness arguments, existence of a global weak solution is proved. Providing further regularity of the solution of the model, some uniqueness results and error estimates are established. Finally, some numerical experiments are performed.

Keywords: cross diffusion model, entropy-type inequality, finite element approximation, numerical analysis

Procedia PDF Downloads 383
5598 Height of Highway Embankment for Tolerable Residual Settlement of Loose Cohesionless Subsoil Overlain by Stronger Soil

Authors: Sharifullah Ahmed

Abstract:

The residual settlement of cohesionless or non-plastic subsoil of different strengths underlying a highway embankment and overlain by a stronger soil layer is studied. A parametric study is carried out for different heights of embankment and different ESAL factors. The sum of the elastic settlements of the cohesionless subsoil due to axle-induced stress and due to the self-weight of the pavement layers is termed the residual settlement. Values of residual settlement (Sr) for different heights of road embankment (He) are obtained and presented as design charts for different SPT values (N60) and ESAL factors. For rigid pavement and for flexible pavement in the approach to a bridge or culvert, the tolerable residual settlement is 0.100 m. This limit is taken as 0.200 m for flexible pavement in general highway sections without a bridge or culvert approach. A simplified guideline is developed for the design of highway embankments underlain by very loose to loose cohesionless subsoil overlain by a stronger soil layer for a limiting value of the residual settlement. In the current research study, the range of the ESAL factor is 1-10 and the range of the SPT value (N60) is 1-10. It is found that ground improvement is not required if the overlying stronger layer is at least 1.5 m thick for general road sections of flexible pavement (except bridge or culvert approaches) and at least 4.0 m thick for rigid pavement or flexible pavement in a bridge or culvert approach. Tables and charts are included in the prepared guideline to obtain the minimum allowable height of highway embankment that limits the residual settlement to within the mentioned tolerable limit. Allowable values of the embankment height (He) are obtained corresponding to the tolerable or limiting level of the residual settlement of the loose subsoil for different SPT values, thicknesses of the stronger layer (d) and ESAL factors. The developed guideline may be used to assess the necessity of ground improvement in the case of cohesionless subsoil underlying a highway embankment and overlain by a stronger subsoil layer, for limiting residual settlement. Ground improvement is only required if the residual settlement of the subsoil exceeds the tolerable limit.

Keywords: axle pressure, equivalent single axle load, ground improvement, highway embankment, tolerable residual settlement

Procedia PDF Downloads 127
5597 Lotus Mechanism: Validation of Deployment Mechanism Using Structural and Dynamic Analysis

Authors: Parth Prajapati, A. R. Srinivas

Abstract:

The purpose of this paper is to validate the concept of the Lotus Mechanism using Computer Aided Engineering (CAE) tools considering the statics and dynamics through actual time dependence involving inertial forces acting on the mechanism joints. For a 1.2 m mirror made of hexagonal segments, with simple harnesses and three-point supports, the maximum diameter is 400 mm, minimum segment base thickness is 1.5 mm, and maximum rib height is considered as 12 mm. Manufacturing challenges are explored for the segments using manufacturing research and development approaches to enable use of large lightweight mirrors required for the future space system.

Keywords: dynamics, manufacturing, reflectors, segmentation, statics

Procedia PDF Downloads 373
5596 Image Analysis for Obturator Foramen Based on Marker-controlled Watershed Segmentation and Zernike Moments

Authors: Seda Sahin, Emin Akata

Abstract:

The obturator foramen is a specific structure in pelvic bone images, and its recognition is a new concept in medical image processing. Moreover, segmentation of bone structures such as the obturator foramen plays an essential role in clinical research in orthopedics. In this paper, we present a novel method to analyze the similarity between the substructures of the imaged region and a hand-drawn template on hip radiographs, in order to detect the obturator foramen accurately with integrated use of marker-controlled watershed segmentation and the Zernike moment feature descriptor. Marker-controlled watershed segmentation is applied to separate the obturator foramen from the background effectively. The Zernike moment feature descriptor is used to match the binary template image against the segmented binary images of obturator foramens for final extraction. The proposed method is tested on 100 randomly selected hip radiographs. The experimental results show that our method is able to segment obturator foramens with 96% accuracy.
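
A rough sketch of the two building blocks named above, marker-controlled watershed segmentation (scikit-image) and Zernike moment matching (mahotas); the file names, seed placement and distance-based matching rule are hypothetical simplifications of the paper's pipeline:

```python
import numpy as np
import mahotas
from skimage import io, filters, segmentation

def zernike_signature(binary_region, radius=64, degree=8):
    """Zernike moment feature vector of a binary shape (rotation invariant)."""
    return mahotas.features.zernike_moments(binary_region.astype(np.uint8),
                                            radius, degree=degree)

def candidate_regions(radiograph, markers):
    """Marker-controlled watershed: markers is a labelled image with seeds
    inside the candidate structures and in the background."""
    gradient = filters.sobel(radiograph)
    return segmentation.watershed(gradient, markers)

# hypothetical usage
img = io.imread("hip_radiograph.png", as_gray=True)       # hypothetical file
markers = np.zeros_like(img, dtype=np.int32)
markers[5, 5] = 1                                          # background seed
markers[img.shape[0] // 2, img.shape[1] // 2] = 2          # foreground seed
labels = candidate_regions(img, markers)

template = io.imread("obturator_template.png", as_gray=True) > 0.5
t_sig = zernike_signature(template)

# pick the watershed region whose Zernike signature is closest to the template
best = min((lab for lab in np.unique(labels) if lab != 1),
           key=lambda lab: np.linalg.norm(zernike_signature(labels == lab) - t_sig))
print("best matching watershed label:", best)
```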

Keywords: medical image analysis, segmentation of bone structures on hip radiographs, marker-controlled watershed segmentation, zernike moment feature descriptor

Procedia PDF Downloads 434
5595 Study on Acoustic Source Detection Performance Improvement of Microphone Array Installed on Drones Using Blind Source Separation

Authors: Youngsun Moon, Yeong-Ju Go, Jong-Soo Choi

Abstract:

Most drones that currently have surveillance/reconnaissance missions are basically equipped with optical equipment, but we also need to use a microphone array to estimate the location of an acoustic source. This can provide additional information in the absence of optical equipment. The purpose of this study is to estimate the Direction of Arrival (DOA), based on Time Difference of Arrival (TDOA) estimation, of an acoustic source from a drone. The problem is that it is impossible to measure the target acoustic source clearly because of the drone noise. The way to overcome this problem is to separate the drone noise and the target acoustic source using Blind Source Separation (BSS) based on Independent Component Analysis (ICA). ICA can be performed assuming that the drone noise and the target acoustic source are independent and that each signal is non-Gaussian. To maximize the non-Gaussianity of each signal, we use negentropy and kurtosis, based on probability theory. As a result, we can improve the TDOA estimation and DOA estimation of the target source in a noisy environment. We simulated the performance of the DOA algorithm with the BSS algorithm applied, and demonstrated the simulation through an experiment in an anechoic wind tunnel.
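
The following sketch shows the individual building blocks mentioned above: FastICA-based source separation, cross-correlation TDOA and far-field DOA for a two-microphone pair. It assumes an instantaneous mixing model and omits the negentropy/kurtosis selection details, so it is a simplified illustration rather than the authors' processing chain:

```python
import numpy as np
from sklearn.decomposition import FastICA

def separate_sources(mic_signals, n_sources=2):
    """mic_signals: (n_samples, n_mics) array; returns estimated sources."""
    ica = FastICA(n_components=n_sources, random_state=0)
    return ica.fit_transform(mic_signals)          # (n_samples, n_sources)

def tdoa(sig_a, sig_b, fs):
    """Time difference of arrival between two channels via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)
    return lag / fs

def doa(tau, mic_distance, c=343.0):
    """Far-field DOA (radians) from a single TDOA for a two-microphone pair."""
    return np.arcsin(np.clip(c * tau / mic_distance, -1.0, 1.0))

# hypothetical usage: x is (n_samples, n_mics) recorded on the drone, fs in Hz
# sources = separate_sources(x)
# pick the non-drone component (e.g. by kurtosis) before estimating TDOA/DOA:
# kurt = ((sources - sources.mean(0)) ** 4).mean(0) / sources.var(0) ** 2 - 3
```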

Keywords: aeroacoustics, acoustic source detection, time difference of arrival, direction of arrival, blind source separation, independent component analysis, drone

Procedia PDF Downloads 162
5594 Deep Reinforcement Learning Approach for Trading Automation in The Stock Market

Authors: Taylan Kabbani, Ekrem Duman

Abstract:

The design of adaptive systems that take advantage of financial markets while reducing risk can bring more stagnant wealth into the global market. However, most efforts made to generate successful trades in financial assets rely on Supervised Learning (SL), which suffers from various limitations. Deep Reinforcement Learning (DRL) offers to solve these drawbacks of SL approaches by combining the financial asset price "prediction" step and the portfolio "allocation" step in one unified process, producing fully autonomous systems capable of interacting with their environment to make optimal decisions through trial and error. In this paper, a continuous action space approach is adopted to give the trading agent the ability to gradually adjust the portfolio's positions at each time step (dynamically re-allocating investments), resulting in better agent-environment interaction and faster convergence of the learning process. In addition, the approach supports managing a portfolio with several assets instead of a single one. This work presents a novel DRL model to generate profitable trades in the stock market, effectively overcoming the limitations of supervised learning approaches. We formulate the trading problem, i.e., the agent-environment interaction, as a Partially Observed Markov Decision Process (POMDP) model, considering the constraints imposed by the stock market, such as liquidity and transaction costs. More specifically, we design an environment that simulates the real-world trading process by augmenting the state representation with ten different technical indicators and sentiment analysis of news articles for each stock. We then solve the formulated POMDP problem using the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm, which can learn policies in high-dimensional and continuous action spaces like those typically found in the stock market environment. From the point of view of stock market forecasting and intelligent decision-making mechanisms, this paper demonstrates the superiority of deep reinforcement learning in financial markets over other types of machine learning, such as supervised learning, and proves its credibility and its advantages for strategic decision-making.
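
A minimal sketch of training a TD3 agent on a continuous-action trading environment with stable-baselines3; the toy environment below uses random observations and a placeholder reward, so it only illustrates the interface (observation window of indicator features, per-asset continuous actions, transaction-cost penalty) rather than the authors' environment:

```python
import gymnasium as gym
import numpy as np
from stable_baselines3 import TD3

class TradingEnv(gym.Env):
    """Toy POMDP-style trading environment: the observation is a vector of
    technical-indicator and sentiment features (random placeholders here)."""
    def __init__(self, n_assets=5, n_features=11):
        super().__init__()
        self.observation_space = gym.spaces.Box(-np.inf, np.inf,
                                                shape=(n_assets * n_features,))
        # continuous portfolio adjustment per asset in [-1, 1]
        self.action_space = gym.spaces.Box(-1.0, 1.0, shape=(n_assets,))
        self.t = 0

    def reset(self, seed=None, options=None):
        super().reset(seed=seed)
        self.t = 0
        return self.observation_space.sample(), {}

    def step(self, action):
        self.t += 1
        obs = self.observation_space.sample()
        # placeholder reward: random portfolio return minus a transaction-cost penalty
        reward = float(np.random.randn() - 0.001 * np.abs(action).sum())
        terminated = self.t >= 252
        return obs, reward, terminated, False, {}

model = TD3("MlpPolicy", TradingEnv(), verbose=0)
model.learn(total_timesteps=10_000)
```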

Keywords: the stock market, deep reinforcement learning, MDP, twin delayed deep deterministic policy gradient, sentiment analysis, technical indicators, autonomous agent

Procedia PDF Downloads 178
5593 A Variable Neighborhood Search with Tabu Conditions for the Roaming Salesman Problem

Authors: Masoud Shahmanzari

Abstract:

The aim of this paper is to present a Variable Neighborhood Search (VNS) with Tabu Search (TS) conditions for the Roaming Salesman Problem (RSP). The RSP is a special case of the well-known traveling salesman problem (TSP) in which a set of cities with time-dependent rewards and a set of campaign days are given. Each city can be visited on any day, and a subset of cities can be visited multiple times. The goal is to determine an optimal campaign schedule consisting of daily open/closed tours that visit some of the cities and maximize the total net benefit, while respecting daily maximum tour duration constraints and the necessity of returning to the campaign base frequently. This problem arises in several real-life applications, particularly in election logistics where depots are not fixed. We formulate the problem as a mixed integer linear program (MILP), in which we capture as many real-world aspects of the RSP as possible. We also present a hybrid metaheuristic algorithm based on a VNS with TS conditions. The initial feasible solution is constructed via a new matheuristic approach based on the decomposition of the original problem. Next, this solution is improved in terms of the collected rewards using the proposed local search procedure. We consider a set of 81 cities in Turkey and a campaign of 30 days as our largest instance. Computational results on real-world instances show that the developed algorithm finds near-optimal solutions effectively.
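
A bare-bones sketch of a VNS loop with a tabu condition on revisited solutions, for readers who want the general control flow of such a hybrid; the neighborhood functions, objective and tabu policy are placeholders, not the paper's algorithm:

```python
def vns_with_tabu(initial, neighborhoods, objective, max_iter=1000, tabu_len=50):
    """Generic VNS skeleton with a tabu list on visited solutions.

    neighborhoods: list of functions, each mapping a solution to a random
    neighbour (shaking in increasingly large neighbourhoods).
    objective: function to maximize (e.g. total net benefit of a schedule).
    """
    best = current = initial
    tabu = []
    for _ in range(max_iter):
        k = 0
        while k < len(neighborhoods):
            candidate = neighborhoods[k](current)          # shaking
            key = repr(candidate)
            if key in tabu:                                # tabu condition
                k += 1
                continue
            tabu.append(key)
            if len(tabu) > tabu_len:
                tabu.pop(0)
            if objective(candidate) > objective(current):
                current = candidate
                k = 0                                      # restart from the first neighbourhood
            else:
                k += 1
        if objective(current) > objective(best):
            best = current
    return best
```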

Keywords: optimization, routing, election logistics, heuristics

Procedia PDF Downloads 92
5592 From Vertigo to Verticality: An Example of Phenomenological Design in Architecture

Authors: E. Osorio Schmied

Abstract:

Architects commonly attempt a depiction of organic forms when their works are inspired by nature, regardless of the building site. Nevertheless, it is also possible to try to match structures with natural scenery by applying a phenomenological approach in terms of spatial operations, regarding perceptions from nature through architectural aspects such as protection, views, and orientation. This method acknowledges a relationship between place and space, where intentions towards tangible facts then become design statements. Although spaces resulting from such a process may present an effective response to the environment, they can also offer further outcomes beyond the realm of form. The hypothesis is that, in addition to recognising a bond between architecture and nature, it is also plausible to associate such perceptions with the inner ambience of buildings, by analysing features such as daylight. The case study of a single-family house in a rainforest near Valdivia, Chilean Patagonia is presented, with the intention of addressing the above notions through a discussion of the actual effects of inhabiting a place, by way of a series of insights, including a review of diagrams and photographs that assist in understanding the implications of this design practice. In addition, figures based on post-occupancy behaviour and daylighting performance relate both architectural and environmental issues to a decision-making process motivated by the observation of nature.

Keywords: architecture, design statements, nature, perception

Procedia PDF Downloads 342
5591 Quality of Service of Transportation Networks: A Hybrid Measurement of Travel Time and Reliability

Authors: Chin-Chia Jane

Abstract:

In a transportation network, travel time refers to the transmission time from the source node to the destination node, whereas reliability refers to the probability of a successful connection from the source node to the destination node. With an increasing emphasis on quality of service (QoS), both performance indexes are significant in the design and analysis of transportation systems. In this work, we extend the well-known flow network model for transportation networks so that travel time and reliability are integrated into the QoS measurement simultaneously. In the extended model, in addition to the general arc capacities, each intermediate node has a time weight, which is the travel time per unit of commodity going through the node. Meanwhile, arcs and nodes are treated as binary random variables that switch between operation and failure with associated probabilities. For a pre-specified travel time limitation and demand requirement, the QoS of a transportation network is the probability that the source can successfully transport the demand requirement to the destination while the total transmission time stays under the travel time limitation. This work is pioneering: whereas existing literature evaluates travel time reliability via a single optimal path, the proposed QoS measure focuses on the performance of the whole network system. To compute the QoS of transportation networks, we first transform the extended network model into an equivalent min-cost max-flow network model. In the transformed network, each original arc is given a travel time weight of 0, and each intermediate node is replaced by two nodes u and v with an arc directed from u to v. The newly generated nodes u and v are perfect nodes, and the new direct arc has three weights: travel time, capacity, and operation probability. Then the universal set of state vectors is recursively decomposed into disjoint subsets of reliable, unreliable, and stochastic vectors until no stochastic vector is left. The decomposition is made possible by applying an existing efficient min-cost max-flow algorithm. Because the reliable subsets are disjoint, the QoS can be obtained directly by summing the probabilities of these reliable subsets. Computational experiments are conducted on a benchmark network which has 11 nodes and 21 arcs. Five travel time limitations and five demand requirements are set to compute the QoS value. For comparison, we test the exhaustive complete enumeration method. Computational results reveal that the proposed algorithm is much more efficient than the complete enumeration method. In this work, a transportation network is analyzed by an extended flow network model in which each arc has a fixed capacity, each intermediate node has a time weight, and both arcs and nodes are independent binary random variables. The quality of service of the transportation network is an integration of customer demands, travel time, and the probability of connection. We present a decomposition algorithm to compute the QoS efficiently. Computational experiments conducted on a prototype network show that the proposed algorithm is superior to existing complete enumeration methods.
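
A small sketch of the node-splitting transformation and the min-cost max-flow computation using networkx; the example network, capacities and time weights are hypothetical, and the probabilistic state-vector decomposition is not shown:

```python
import networkx as nx

def split_time_nodes(G, time_weight):
    """Turn node travel times into arc weights by splitting each intermediate
    node v into (v, 'in') -> (v, 'out') with weight = time_weight[v]."""
    H = nx.DiGraph()
    for v in G.nodes:
        if v in time_weight:
            # the new internal arc carries the node's travel time per unit flow
            H.add_edge((v, "in"), (v, "out"), weight=time_weight[v])
    for u, v, data in G.edges(data=True):
        u_out = (u, "out") if u in time_weight else u
        v_in = (v, "in") if v in time_weight else v
        # original arcs keep their capacity but have zero travel time
        H.add_edge(u_out, v_in, weight=0, capacity=data["capacity"])
    return H

# hypothetical network: s -> a -> t, node 'a' costs 2 time units per unit flow
G = nx.DiGraph()
G.add_edge("s", "a", capacity=5)
G.add_edge("a", "t", capacity=5)
H = split_time_nodes(G, {"a": 2})

flow = nx.max_flow_min_cost(H, "s", "t")       # max flow at minimum total travel time
print(nx.maximum_flow_value(H, "s", "t"), nx.cost_of_flow(H, flow))
```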

Keywords: quality of service, reliability, transportation network, travel time

Procedia PDF Downloads 221
5590 Development of All-in-One Solar Kit

Authors: Azhan Azhar, Mohammed Sakib, Zaurez Ahmad

Abstract:

The energy we receive from the sun is known as solar energy, and it is a reliable, long-lasting, eco-friendly and the most widely used energy source in the 21st century. There are several techniques for harnessing solar energy, and large utility-scale projects aim to collect the maximum current from the sun using present technologies. Solar PV is now on the rise as a means of harvesting energy from the sun. Moving a step further, our project focuses on designing an all-in-one portable solar-energy-based solution. We considered the minimum load conditions and evaluated the requirements of the various devices used in this study to resolve the power requirements of small stores, hawkers, or travelers.

Keywords: DOD-depth of discharge, pulse width modulation charge controller, renewable energy, solar PV- solar photovoltaic

Procedia PDF Downloads 370
5589 A Randomized, Controlled Trial to Test Behavior Change Techniques to Improve Low Intensity Physical Activity in Older Adults

Authors: Ciaran Friel, Jerry Suls, Mark Butler, Patrick Robles, Samantha Gordon, Frank Vicari, Karina W. Davidson

Abstract:

Physical activity guidelines focus on increasing moderate-intensity activity for older adults, but adherence to recommendations remains low. This is despite the fact that scientific evidence supports that any increase in physical activity is positively correlated with health benefits. Behavior change techniques (BCTs) have demonstrated effectiveness in reducing sedentary behavior and promoting physical activity. This pilot study uses a Personalized Trials (N-of-1) design to evaluate the efficacy of using four BCTs to promote an increase in low-intensity physical activity (2,000 steps of walking per day) in adults aged 45-75 years old. The four BCTs tested were goal setting, action planning, feedback, and self-monitoring. BCTs were tested in random order and delivered by text message prompts requiring participant engagement. The study recruited health system employees in the target age range, without mobility restrictions and demonstrating interest in increasing their daily activity by a minimum of 2,000 steps per day for a minimum of five days per week. Participants were sent a Fitbit® fitness tracker with an established study account and password. Participants were recommended to wear the Fitbit device 24/7 but were required to wear it for a minimum of ten hours per day. Baseline physical activity was measured by Fitbit for two weeks. In the 8-week intervention phase of the study, participants received each of the four BCTs, in random order, for a two-week period. Text message prompts were delivered daily each morning at a consistent time. All prompts required participant engagement to acknowledge receipt of the BCT message. Engagement depended on the BCT message and may have included recording that a detailed plan for walking had been made or confirming a daily step goal (action planning, goal setting). Additionally, participants may have been directed to a study dashboard to view their step counts or compare themselves to their baseline average step count (self-monitoring, feedback). At the end of each two-week testing interval, participants were asked to complete the Self-Efficacy for Walking Scale (SEW_Dur), a validated measure that assesses the participant's confidence in walking incremental distances, and a survey measuring their satisfaction with the individual BCT that they tested. At the end of their trial, participants received a personalized summary of their step data in response to each individual BCT. The analysis will examine the novel individual-level heterogeneity of treatment effects made possible by the N-of-1 design and pool results across participants to efficiently estimate the overall efficacy of the selected behavior change techniques in increasing low-intensity walking by 2,000 steps, five days per week. Self-efficacy will be explored as the likely mechanism of action prompting behavior change. This study will inform providers and demonstrate the feasibility of an N-of-1 study design to effectively promote physical activity as a component of healthy aging.

Keywords: aging, exercise, habit, walking

Procedia PDF Downloads 92
5588 Setting Control Limits For Inaccurate Measurements

Authors: Ran Etgar

Abstract:

The process of rounding off measurements of continuous variables is commonly encountered. Although it usually has minor effects, it can sometimes lead to poor outcomes in statistical process control using the X̄-chart. The traditional control limits can cause incorrect conclusions if applied carelessly. This study looks into the limitations of the classical control limits, particularly the impact of asymmetry. An approach to determining the distribution function of the measured parameter (Ȳ) is presented, resulting in a more precise method to establish the upper and lower control limits. The proposed method, while slightly more complex than Shewhart's original idea, is still user-friendly and accurate and only requires the use of two straightforward tables.
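
One simple way to see the effect described above is to estimate the distribution of the mean of rounded measurements by Monte Carlo and take empirical quantiles as control limits; this is only an illustration of the asymmetry problem, not the analytic method proposed in the study:

```python
import numpy as np

def empirical_limits(mu, sigma, step, n, alpha=0.0027, reps=200_000, seed=0):
    """Monte Carlo control limits for the mean of n measurements rounded to
    a grid of width `step` (the resulting distribution can be asymmetric)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(mu, sigma, size=(reps, n))
    y_bar = (np.round(x / step) * step).mean(axis=1)   # round off, then average
    lcl, ucl = np.quantile(y_bar, [alpha / 2, 1 - alpha / 2])
    return lcl, ucl

# a rounding step comparable to sigma makes classical 3-sigma limits misleading
print(empirical_limits(mu=10.0, sigma=0.05, step=0.1, n=5))
```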

Keywords: quality control, process control, round-off, measurement, rounding error

Procedia PDF Downloads 99
5587 Factors Influencing Capital Structure: Evidence from the Oil and Gas Industry of Pakistan

Authors: Muhammad Tahir, Mushtaq Muhammad

Abstract:

Capital structure is one of the key decisions taken by financial managers. This study aims to investigate the factors influencing the capital structure decision in the oil and gas industry of Pakistan, using secondary data from the published annual reports of listed oil and gas companies of Pakistan. The study covers the period 2008-2014. Capital structure can be affected by profitability, firm size, growth opportunities, dividend payout, liquidity, business risk, and ownership structure. A panel data technique with an ordinary least squares (OLS) regression model has been used in Stata to find the impact of the set of explanatory variables on capital structure. The OLS regression results suggest that dividend payout, firm size and government ownership have the most significant impact on financial leverage. Dividend payout and government ownership are found to have a significant negative association with financial leverage, whereas firm size shows a positive relationship with financial leverage. Other variables having a significant link with financial leverage include growth opportunities, liquidity and business risk. The results reveal a significant positive association between growth opportunities and financial leverage, whereas liquidity and business risk are negatively correlated with financial leverage. Profitability and managerial ownership exhibit an insignificant relationship with financial leverage. This study contributes to the existing managerial finance literature with certain managerial implications. Academically, this research describes the factors affecting the capital structure decisions of oil and gas companies in Pakistan and adds recent empirical evidence to the existing financial literature in Pakistan. Researchers have studied capital structure in Pakistan in general and in specific industries; nevertheless, the literature on this issue is still limited. This study is an attempt to fill this gap in the academic literature. It has practical implications at both the firm level and the individual investor/lender level. The results of this study can be useful for investors and lenders in making investment and lending decisions. Further, the results can be useful for financial managers in framing an optimal capital structure, keeping in consideration the factors that can affect the capital structure decision as revealed by this study. These results will help financial managers decide whether to issue stock or debt for future investment projects.
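
A minimal sketch of the kind of regression reported above, using a pooled OLS on hypothetical panel data with statsmodels (the file and column names are placeholders; the study's actual estimation was run in Stata):

```python
import pandas as pd
import statsmodels.api as sm

# hypothetical panel: one row per firm-year, 2008-2014
df = pd.read_csv("oil_gas_panel.csv")

y = df["leverage"]                               # financial leverage (dependent variable)
X = df[["profitability", "size", "growth", "dividend_payout",
        "liquidity", "business_risk", "gov_ownership", "mgr_ownership"]]
X = sm.add_constant(X)

model = sm.OLS(y, X).fit()                       # pooled OLS on the panel
print(model.summary())                           # coefficient signs and significance
```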

Keywords: capital structure, multicollinearity, ordinary least square (OLS), panel data

Procedia PDF Downloads 293
5586 Dual Band LoRa/GPS Dipole Antenna with Harmonic Suppression Capability

Authors: Amar Danial Abd Azis, Shipun Anuar Hamzah, Mohd Noh Dalimin, Khairun Nidzam Ramli, Mohd Sani Yahya, Fauziahanim Che Seman

Abstract:

This paper discusses the design, simulation results, and testing of a compact dual-band printed dipole antenna operating at frequencies of 916 MHz and 1.57 GHz for LoRa and GPS applications, respectively. The basic design of this antenna uses a linear dipole that operates at 916 MHz and 2.7 GHz. A small triangular-shaped linear balun has been developed as the matching network. Parasitic elements are employed to tune the second frequency to 1.57 GHz through a parametric study. Meanwhile, a stub is used to suppress the undesired 2.6 GHz frequency. This antenna is capable of operating on dual-frequency bands simultaneously with high efficiency in suppressing the unwanted frequency. The antenna exhibits the following parameters: return loss of -18.5 dB at 916 MHz and -14 dB at 1.57 GHz, VSWR of 1.25 at 868 MHz and 1.5 at 1.57 GHz, and gain of 2 dBi at 916 MHz and 2.75 dBi at 1.57 GHz. The radiation pattern of the antenna shows a directional E-plane and an omnidirectional H-plane at both frequencies. With its compact size and dual-band capability, this antenna demonstrates great potential for use in IoT applications that require both LoRa and GPS communication, particularly in applications where a small yet efficient form factor is essential.

Keywords: dual band, dipole antenna, parasitic elements, harmonic suppression, LoRa and GPS

Procedia PDF Downloads 6
5585 Supercomputer Simulation of Magnetic Multilayers Films

Authors: Vitalii Yu. Kapitan, Aleksandr V. Perzhu, Konstantin V. Nefedev

Abstract:

The necessity of studying magnetic multilayer structures is explained by the prospects of their practical application as a technological base for creating new storage media. Magnetic multilayer films have many unique features that contribute to increasing the density of information recording and the speed of storage devices. Multilayer structures are structures of alternating magnetic and nonmagnetic layers. Within the framework of the classical Heisenberg model, lattice spin systems with direct short- and long-range exchange interactions were investigated by Monte Carlo methods. The thermodynamic characteristics of multilayer structures, such as the temperature behavior of the magnetization, energy, and heat capacity, were investigated, as were the processes of magnetization reversal of multilayer structures in external magnetic fields. The developed software is based on the new, promising programming language Rust, an experimental language developed by Mozilla and positioned as an alternative to C and C++. For the Monte Carlo simulation, the Metropolis algorithm and its parallel implementation using MPI, as well as the Wang-Landau algorithm, were used. We are planning to study magnetic multilayer films with asymmetric Dzyaloshinskii–Moriya (DM) interaction, interface effects and skyrmion textures. This work was supported by the state task of the Ministry of Education and Science of Russia, No. 3.7383.2017/8.9.
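
A compact sketch of a Metropolis sweep for a stack of classical Heisenberg layers, written here in Python for brevity (the authors' software is implemented in Rust); the exchange constants, lattice size and temperature are illustrative only:

```python
import numpy as np

def random_unit_vectors(shape, rng):
    v = rng.normal(size=shape + (3,))
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def metropolis_sweep(spins, layer_j, interlayer_j, beta, rng):
    """One Metropolis sweep over a stack of LxL Heisenberg layers.

    spins: (n_layers, L, L, 3) unit vectors; layer_j / interlayer_j are
    in-plane and between-plane exchange constants (a nonmagnetic spacer can
    be modelled by a small or zero interlayer coupling)."""
    n_layers, L, _, _ = spins.shape
    for _ in range(n_layers * L * L):
        k, i, j = rng.integers(n_layers), rng.integers(L), rng.integers(L)
        new = random_unit_vectors((), rng)
        old = spins[k, i, j]
        # exchange field from in-plane (periodic) and adjacent-layer neighbours
        nn = layer_j * (spins[k, (i + 1) % L, j] + spins[k, (i - 1) % L, j] +
                        spins[k, i, (j + 1) % L] + spins[k, i, (j - 1) % L])
        if k + 1 < n_layers:
            nn = nn + interlayer_j * spins[k + 1, i, j]
        if k > 0:
            nn = nn + interlayer_j * spins[k - 1, i, j]
        d_e = -np.dot(new - old, nn)               # E = -J * sum(S_i . S_j)
        if d_e <= 0 or rng.random() < np.exp(-beta * d_e):
            spins[k, i, j] = new
    return spins

rng = np.random.default_rng(0)
spins = random_unit_vectors((4, 16, 16), rng)      # 4 layers of 16x16 spins
for _ in range(100):
    metropolis_sweep(spins, layer_j=1.0, interlayer_j=0.5, beta=2.0, rng=rng)
print("magnetization:", np.linalg.norm(spins.mean(axis=(0, 1, 2))))
```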

Keywords: Monte Carlo methods, Heisenberg model, multilayer structures, magnetic skyrmion

Procedia PDF Downloads 166
5584 Scale up of Isoniazid Preventive Therapy: A Quality Management Approach in Nairobi County, Kenya

Authors: E. Omanya, E. Mueni, G. Makau, M. Kariuki

Abstract:

HIV infection is the strongest risk factor for a person to develop TB. Isoniazid preventive therapy (IPT) for people living with HIV (PLHIV) not only reduces an individual patient's risk of developing active TB but also mitigates cross-infection. In Kenya, six months of IPT was recommended through the National TB, Leprosy and Lung Disease Program to treat latent TB. In spite of this recommendation by the national government, uptake of IPT among PLHIV remained low in Kenya by the end of 2015. The USAID/Kenya and East Africa Afya Jijini project, which supports 42 TB/HIV health facilities in Nairobi County, began addressing low uptake of IPT through Quality Improvement (QI) teams set up at the facility level. Quality is characterized by WHO as one of the four main connectors between health system building blocks and health system outputs. Afya Jijini implements the Kenya Quality Model for Health, in which QI teams are formed at the county, sub-county and facility levels. The teams review facility performance to identify gaps in service delivery and use QI tools to monitor and improve performance. Afya Jijini supported the formation of these teams in 42 facilities and built the teams' capacity to review data and use QI principles to identify and address performance gaps. When the QI teams began working on improving IPT uptake among PLHIV, uptake was at 31.8%. The teams first conducted a root cause analysis using cause-and-effect diagrams, which help the teams brainstorm on and identify barriers to IPT uptake among PLHIV at the facility level. This is a participatory process in which program staff provide technical support to the QI teams in problem identification and problem-solving. The gaps identified were inadequate knowledge and skills on the use of IPT among health care workers, lack of awareness of IPT among patients, inadequate monitoring and evaluation tools, and poor quantification and forecasting of IPT commodities. In response, Afya Jijini trained over 300 health care workers on the administration of IPT, supported patient education, supported quantification and forecasting of IPT commodities, and provided IPT data collection tools to help facilities monitor their performance. The facility QI teams conducted monthly meetings to monitor progress on the implementation of IPT and took corrective action when necessary. IPT uptake improved from 31.8% to 61.2% during the second year of the Afya Jijini project and to 80.1% during the third year of the project's support. The use of QI teams and root cause analysis to identify and address service delivery gaps, in addition to targeted program interventions and continual performance reviews, can be successful in increasing the uptake of TB-related services at health facilities.

Keywords: isoniazid, quality, health care workers, people living with HIV

Procedia PDF Downloads 99