Search results for: three dimensional data acquisition
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26525

22985 Real-Time Pedestrian Detection Method Based on Improved YOLOv3

Authors: Jingting Luo, Yong Wang, Ying Wang

Abstract:

Pedestrian detection in image or video data is an important and challenging task in security surveillance. The difficulty lies in accurately locating and detecting pedestrians of different scales in complex scenes. To solve these problems, a deep neural network (RT-YOLOv3) is proposed to realize real-time pedestrian detection at different scales in security monitoring. RT-YOLOv3 improves the traditional YOLOv3 algorithm. First, a deep residual network is added to extract pedestrian features. Then, six convolutional neural networks with different scales are designed and fused with the corresponding scale feature maps in the residual network to form the final feature pyramid for the pedestrian detection task. This method characterizes pedestrians more effectively. To further improve the accuracy and generalization ability of the model, a hybrid training method is used: pedestrian data are extracted from the VOC data set and trained together with the INRIA pedestrian data set. Experiments show that the proposed RT-YOLOv3 method achieves a mean average precision (mAP) of 93.57% at 46.52 frames per second. In terms of accuracy, RT-YOLOv3 outperforms Fast R-CNN, Faster R-CNN, YOLO, SSD, YOLOv2, and YOLOv3. The method reduces the missed detection and false detection rates, improves localization accuracy, and meets the requirements of real-time pedestrian detection.
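As a quick illustration of the evaluation metric quoted above (not the authors' code), the per-class average precision behind an mAP figure can be computed from ranked detections as follows; the detection tuples and the all-point interpolation scheme are illustrative assumptions:

```python
def average_precision(detections, num_gt):
    """AP from a ranked list of (confidence, is_true_positive) pairs.
    num_gt is the number of ground-truth pedestrian boxes."""
    detections = sorted(detections, key=lambda d: d[0], reverse=True)
    tp = fp = 0
    precisions, recalls = [], []
    for _, is_tp in detections:
        if is_tp:
            tp += 1
        else:
            fp += 1
        precisions.append(tp / (tp + fp))
        recalls.append(tp / num_gt)
    # All-point interpolation: make precision monotone from right to left,
    # then integrate over recall.
    for i in range(len(precisions) - 2, -1, -1):
        precisions[i] = max(precisions[i], precisions[i + 1])
    ap, prev_r = 0.0, 0.0
    for r, p in zip(recalls, precisions):
        ap += (r - prev_r) * p
        prev_r = r
    return ap
```

The mAP reported above would be the mean of this quantity over classes; for single-class pedestrian detection it coincides with AP.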

Keywords: pedestrian detection, feature detection, convolutional neural network, real-time detection, YOLOv3

Procedia PDF Downloads 123
22984 Evaluation of Cyclic Thermo-Mechanical Responses of an Industrial Gas Turbine Rotor

Authors: Y. Rae, A. Benaarbia, J. Hughes, Wei Sun

Abstract:

This paper describes an elasto-visco-plastic computational modelling method which can be used to assess the cyclic plasticity responses of high temperature structures operating under thermo-mechanical loadings. The material constitutive equation used is an improved unified multi-axial Chaboche-Lemaitre model, which takes into account non-linear kinematic and isotropic hardening. The computational methodology is a three-dimensional framework following an implicit formulation and based on a radial return mapping algorithm. The associated user material (UMAT) code is developed and calibrated against isothermal hold-time low cycle fatigue tests for a typical turbine rotor steel for use in finite element (FE) implementation. The model is applied to a realistic industrial gas turbine rotor, where the study focuses its attention on the deformation heterogeneities and critical high stress areas within the rotor structure. The potential improvements of such an FE visco-plastic approach are discussed. An integrated life assessment procedure, based on R5 and visco-plasticity modelling, is also briefly addressed.
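The radial return mapping at the heart of such an implicit UMAT can be illustrated in one dimension with linear isotropic hardening; this is a drastically reduced sketch of the multi-axial Chaboche-Lemaitre update, and all material constants below are assumed for illustration:

```python
def radial_return_1d(strain, eps_p, alpha, E, sigma_y, H):
    """One implicit return-mapping step for 1-D elasto-plasticity with
    linear isotropic hardening.
    strain: total strain; eps_p: plastic strain; alpha: hardening variable;
    E: Young's modulus; sigma_y: initial yield stress; H: hardening modulus."""
    sigma_trial = E * (strain - eps_p)                 # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y + H * alpha)  # trial yield function
    if f_trial <= 0:                                   # elastic step
        return sigma_trial, eps_p, alpha
    dgamma = f_trial / (E + H)                         # plastic multiplier (closed form in 1-D)
    sign = 1.0 if sigma_trial > 0 else -1.0
    sigma = sigma_trial - E * dgamma * sign            # return to the yield surface
    return sigma, eps_p + dgamma * sign, alpha + dgamma
```

After a plastic step the returned stress sits exactly on the updated yield surface, which is the consistency condition the multi-axial algorithm enforces as well.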

Keywords: unified visco-plasticity, thermo-mechanical, turbine rotor, finite element modelling

Procedia PDF Downloads 116
22983 Acceleration of Lagrangian and Eulerian Flow Solvers via Graphics Processing Units

Authors: Pooya Niksiar, Ali Ashrafizadeh, Mehrzad Shams, Amir Hossein Madani

Abstract:

There are many computationally demanding applications in science and engineering which need efficient algorithms implemented on high performance computers. Recently, Graphics Processing Units (GPUs) have drawn much attention as compared to traditional CPU-based hardware and have opened up new improvement venues in scientific computing. One particular application area is Computational Fluid Dynamics (CFD), in which mature CPU-based codes need to be converted to GPU-based algorithms to take advantage of this new technology. In this paper, numerical solutions of two classes of discrete fluid flow models via both CPU and GPU are discussed and compared. Test problems include an Eulerian model of a two-dimensional incompressible laminar flow case and a Lagrangian model of a two-phase flow field. The CUDA programming standard is used to employ an NVIDIA GPU with 480 cores, and a C++ serial code is run on a single core of an Intel quad-core CPU. Up to two orders of magnitude speed-up is observed on the GPU for a certain range of grid resolutions or particle numbers. As expected, the Lagrangian formulation is better suited for parallel computation on the GPU, although the Eulerian formulation also shows significant speed-up.
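The kind of kernel that benefits from one-thread-per-cell GPU parallelism can be sketched with a serial Jacobi sweep for the 2-D Laplace equation; this is an illustrative stand-in for the Eulerian solver, not the authors' code, and a CUDA port would assign each interior cell to one thread:

```python
def jacobi_step(u):
    """One Jacobi sweep for the 2-D Laplace equation on a square grid.
    Every interior cell is updated independently from the OLD field,
    which is exactly what makes the sweep embarrassingly parallel."""
    n = len(u)
    new = [row[:] for row in u]        # boundary values are kept fixed
    for i in range(1, n - 1):
        for j in range(1, n - 1):
            new[i][j] = 0.25 * (u[i - 1][j] + u[i + 1][j]
                                + u[i][j - 1] + u[i][j + 1])
    return new
```

Because each cell reads only the previous iterate, the double loop maps directly onto a CUDA grid of threads with no synchronization inside the sweep.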

Keywords: CFD, Eulerian formulation, graphics processing units, Lagrangian formulation

Procedia PDF Downloads 389
22982 Influence of Bed Depth on Performance of Wire Screen Packed Bed Solar Air Heater

Authors: Vimal Kumar Chouksey, S. P. Sharma

Abstract:

This paper deals with a theoretical analysis of the performance of a solar air collector having its duct packed with blackened wire screen matrices. The heat transfer equations for two-dimensional, fully developed fluid flow under quasi-steady-state conditions have been developed in order to analyze the effect of bed depth on performance. A computer program was developed in C++ to estimate the temperature rise of the entering air by solving the governing equations numerically, using relevant correlations for the heat transfer coefficient in packed bed systems. Results for air temperature rise and thermal efficiency obtained from the analysis have been compared with available experimental results and found to be in fairly close agreement. It has been found that the packed bed collector considerably enhances performance up to a certain total bed depth. The effect of total bed depth on efficiency shows that there is an upper limiting value of total bed depth beyond which the thermal efficiency begins to fall again; this characteristic behavior is observed at all mass flow rates.
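The air temperature rise can be sketched with a lumped Hottel-Whillier-style collector energy balance; this is a far simpler stand-in for the paper's two-dimensional packed-bed model, and every coefficient below is an assumed illustrative value, not the paper's:

```python
def air_temperature_rise(I, area, tau_alpha, U_L, T_in, T_amb, m_dot,
                         cp=1005.0, F_R=0.8):
    """Outlet-air temperature rise (K) from a lumped collector balance.
    I: irradiance (W/m^2); area: collector area (m^2);
    tau_alpha: transmittance-absorptance product; U_L: loss coefficient
    (W/m^2/K); m_dot: air mass flow rate (kg/s); F_R: heat removal factor."""
    q_useful = F_R * area * (tau_alpha * I - U_L * (T_in - T_amb))
    q_useful = max(q_useful, 0.0)          # no useful gain below threshold
    return q_useful / (m_dot * cp)
```

In the paper's model the heat transfer coefficient (and hence the effective F_R) grows with bed depth until pressure-drop and loss effects dominate, producing the optimum depth described above.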

Keywords: plane collector, solar air heater, solar energy, wire screen packed bed

Procedia PDF Downloads 220
22981 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites

Authors: Che-Wei Lee, Bay-Erl Lai

Abstract:

A new steganographic method using numeric data on public websites, with self-authentication capability, is proposed. The proposed technique transforms a secret message into partial shares by Shamir's (k, n)-threshold secret sharing scheme with n = k + 1. The generated k + 1 partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website and extracts each group of k shares among the k + 1 from the stego numeric content to compute k + 1 copies of the secret; the value consistency of the computed copies is taken as evidence of whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
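The share generation and consistency check can be sketched as follows. The prime field and the concrete values are illustrative assumptions (the website-embedding step is omitted), but the (k, n = k + 1) scheme and the self-authentication test follow the description above:

```python
import random
from itertools import combinations

P = 2**31 - 1  # prime modulus for the share field (illustrative choice)

def make_shares(secret, k):
    """Shamir (k, n)-threshold shares with n = k + 1, as in the paper."""
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P)
            for x in range(1, k + 2)]

def recover(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    s = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        s = (s + yi * num * pow(den, P - 2, P)) % P
    return s

def extract_and_authenticate(shares, k):
    """Recover the secret from every k-subset of the k+1 shares; the
    consistency of the k+1 recovered copies authenticates the extraction."""
    copies = [recover(list(c)) for c in combinations(shares, k)]
    return copies[0], all(c == copies[0] for c in copies)
```

Tampering with any single embedded share makes at least one of the k + 1 recovered copies disagree, which is exactly the self-authentication signal the paper relies on.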

Keywords: steganography, data hiding, secret authentication, secret sharing

Procedia PDF Downloads 225
22980 A Novel Approach to the Design of EDDR Architecture for High-Speed Motion Estimation Testing Applications

Authors: T. Gangadhararao, K. Krishna Kishore

Abstract:

Since Motion Estimation (ME) plays a critical role in a video coder, testing such a module is of priority concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to be embedded into ME for video coding testing applications. An error in the processing elements (PEs), i.e., the key components of an ME, can be detected and recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with acceptable area overhead and timing penalty.
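The residue-and-quotient idea can be sketched in a few lines; the modulus and the software check below are illustrative stand-ins for the hardware EDDR circuitry:

```python
M = 64  # RQ modulus (a power of two makes div/mod a shift/mask in hardware)

def rq_encode(x):
    """RQ code of an operand: x -> (quotient, residue) with x = q*M + r."""
    return divmod(x, M)

def rq_check_and_recover(x_out, q, r):
    """Compare a PE output against its RQ code; on mismatch, recover the
    correct value from the code. Returns (value, error_detected)."""
    expected = q * M + r
    if x_out == expected:
        return x_out, False   # no error detected
    return expected, True     # error detected; recovered value returned
```

In the hardware design the (q, r) pair travels alongside the PE datapath, so a corrupted output can be both flagged and replaced in the same cycle.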

Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code

Procedia PDF Downloads 413
22979 Overcoming Open Innovation Challenges with Technology Intelligence: Case of Medium-Sized Enterprises

Authors: Akhatjon Nasullaev, Raffaella Manzini, Vincent Frigant

Abstract:

Prior research has largely discussed open innovation practices in both large and small and medium-sized enterprises (SMEs). Open innovation compels firms to observe and analyze the external environment in order to tap new opportunities for inbound and/or outbound flows of knowledge, ideas, and work-in-progress innovations. As SMEs differ from their larger counterparts, they face several limitations in utilizing open innovation activities, such as resource scarcity, unstructured innovation processes, and underdeveloped innovation capabilities. Technology intelligence, the process of systematic acquisition, assessment, and communication of information about technological trends, opportunities, and threats, can mitigate these limitations by enabling SMEs to identify technological and market opportunities in a timely manner, undertake sound decisions, and realize a 'first-mover advantage'. Several studies have highlighted firm-level barriers to the successful implementation of open innovation practices in SMEs, namely challenges in partner selection, intellectual property rights and trust, and absorptive capacity. This paper investigates how technology intelligence can help SMEs overcome the barriers to effective open innovation. To this end, we conduct case studies of four Estonian life-sciences SMEs. Our findings reveal that technology intelligence can support SMEs not only in inbound open innovation (taking into account the inclination of most firms toward the technology-exploration aspects of open innovation) but also in outbound open innovation. Furthermore, although the SMEs conducted technology intelligence in an unsystematic and uncoordinated manner, it helped them increase their innovative performance.

Keywords: technology intelligence, open innovation, SMEs, life sciences

Procedia PDF Downloads 152
22978 Whole Body Cooling Hypothermia Treatment Modelling Using a Finite Element Thermoregulation Model

Authors: Ana Beatriz C. G. Silva, Luiz Carlos Wrobel, Fernando Luiz B. Ribeiro

Abstract:

This paper presents a thermoregulation model using the finite element method to perform numerical analyses of brain cooling procedures, as a contribution to the investigation of therapeutic hypothermia after ischemia in adults. The use of computational methods can help clinicians observe body temperature under different cooling methods without the need for invasive techniques, and can thus be a valuable tool to assist clinical trials by simulating the different cooling options that can be used for treatment. In this work, we developed an FEM package for the solution of the continuum Pennes bioheat equation. Blood temperature changes were considered using a blood pool approach and a lumped analysis for the intravascular catheter method of blood cooling. Analyses are performed using a three-dimensional mesh based on a complex geometry obtained from computed tomography medical images, considering a cooling blanket and an intravascular catheter. A comparison is made between the results obtained and the effects of each case on brain temperature reduction in a required time, maintenance of body temperature at moderate hypothermia levels, and gradual rewarming.
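The continuum Pennes bioheat equation that the FEM package solves can be illustrated with a one-dimensional explicit finite-difference step; the tissue properties below are assumed round numbers for illustration, not the paper's values:

```python
def pennes_step(T, dt, dx, k=0.5, rho_c=3.7e6, w_b=8.3e3, T_art=37.0, q_m=400.0):
    """One explicit step of the 1-D Pennes equation:
    rho*c dT/dt = k d2T/dx2 + w_b (T_art - T) + q_m,
    where w_b lumps perfusion * rho_blood * c_blood (W/m^3/K), T_art is the
    arterial (blood pool) temperature, and q_m is metabolic heat (W/m^3).
    Boundary nodes are held fixed (e.g., a cooling blanket temperature)."""
    n = len(T)
    Tn = T[:]
    for i in range(1, n - 1):
        diff = k * (T[i - 1] - 2 * T[i] + T[i + 1]) / dx**2
        perf = w_b * (T_art - T[i])
        Tn[i] = T[i] + dt * (diff + perf + q_m) / rho_c
    return Tn
```

Tissue next to the cooled boundary loses heat by conduction, while deep tissue is pulled toward the blood-pool temperature and warmed by metabolism, which is the competition the 3-D model resolves over the whole geometry.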

Keywords: brain cooling, finite element method, hypothermia treatment, thermoregulation

Procedia PDF Downloads 297
22977 An Effective Route to Controlling the Safety of Accessing and Storing Data in Cloud-Based Databases

Authors: Omid Khodabakhshi, Amir Rozdel

Abstract:

Research on cloud computing security faces a number of challenges because data centers hold complex private information and are constantly exposed to risks of information disclosure through hacker attacks or internal adversaries. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. So far, there are many software solutions for improving the security of virtual machines, but software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for the implementation of highly isolated, security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. These security features are provided even in situations where the BIOS, the operating system, and even the hypervisor are compromised. To achieve these goals, we use the hardware support provided by recent Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies ultimately creates a dynamic root of trust and reduces the TCB to the security-sensitive code.

Keywords: code, cloud computing, security, virtual machines

Procedia PDF Downloads 175
22976 Experimental and Numerical Analysis of the Effects of Ball-End Milling Process upon Residual Stresses and Cutting Forces

Authors: Belkacem Chebil Sonia, Bensalem Wacef

Abstract:

The majority of ball-end milling models include only the influence of cutting parameters (cutting speed, feed rate, depth of cut), and most works study this influence on cutting force alone. Therefore, this study proposes an accurate ball-end milling process model which also includes the influence of tool-workpiece inclination. In addition, the residual stresses resulting from thermo-mechanical loading in the workpiece are characterized, and the influence of tool-workpiece inclination and cutting parameters on the residual stress distribution is studied. In order to predict the cutting forces and residual stresses during a milling operation, a thermo-mechanical three-dimensional numerical model of ball-end milling was developed. Furthermore, an experimental campaign of ball-end milling tests was carried out on a 5-axis machining center to determine the cutting forces and characterize the residual stresses. The simulation results are compared with the experiments to validate the finite element model and subsequently identify the optimum inclination angle and cutting parameters.

Keywords: ball end milling, cutting forces, cutting parameters, residual stress, tool-workpiece inclination

Procedia PDF Downloads 290
22975 Identifying the Factors Affecting the Success of Energy Saving in the Municipality of Tehran

Authors: Rojin Bana Derakhshan, Abbas Toloie

Abstract:

For the purpose of optimizing and developing energy efficiency in buildings, the key elements of success in the optimization of energy consumption must be recognized before performing any action. Principal component analysis, one of the most valuable results of linear algebra, offers a simple, non-parametric route to this end. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this essay, using data mining, the key elements influencing energy saving in buildings are determined. The approach is based on statistical data mining techniques using a feature selection method and fuzzy logic, converting the data from a massive to a compressed form in order to strengthen the selected features. On the other side, the share of each energy-consuming element in the energy dissipation, in percent, is recognized as a separate norm, using the results of the energy audit, the measurement of all energy-consuming parameters, and the identified variables. Accordingly, the energy saving solutions are divided into three categories: low-, medium-, and high-expense solutions.
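A minimal stand-in for the feature-selection step (correlation-based ranking rather than the paper's fuzzy data-mining pipeline) might look like this; the data shape, with one column per measured energy parameter, is an illustrative assumption:

```python
import math

def rank_features(X, y):
    """Rank candidate energy-consumption variables (columns of X) by the
    absolute Pearson correlation with the saving outcome y.
    Returns column indices, most relevant first."""
    def pearson(xs, ys):
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
        vx = math.sqrt(sum((a - mx) ** 2 for a in xs))
        vy = math.sqrt(sum((b - my) ** 2 for b in ys))
        return cov / (vx * vy) if vx and vy else 0.0  # constant column -> 0

    cols = list(zip(*X))
    scores = [abs(pearson(col, y)) for col in cols]
    return sorted(range(len(cols)), key=lambda i: scores[i], reverse=True)
```

Compressing the data to the top-ranked columns before further modelling mirrors the "massive to compressed" step described above.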

Keywords: energy saving, key elements of success, optimization of energy consumption, data mining

Procedia PDF Downloads 449
22974 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach

Authors: Kwaku Damoah

Abstract:

This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
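One of the three analyses named above, the percentage-change analysis, can be sketched directly; the year-to-count mapping is an illustrative assumption about the shape of aggregated FAERS counts, not the study's data:

```python
def percent_change_by_year(counts):
    """Year-over-year percentage change of adverse-event report counts.
    counts: dict mapping year -> number of reports."""
    years = sorted(counts)
    return {y: round(100.0 * (counts[y] - counts[prev]) / counts[prev], 1)
            for prev, y in zip(years, years[1:])}
```

Applied per drug-active ingredient over 2000-2023, such a table surfaces the increasing and decreasing AE patterns the study reports.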

Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis

Procedia PDF Downloads 36
22973 Phytochemical and Biological Study of Chrozophora oblongifolia

Authors: Al-Braa Kashegari, Ali M. El-Halawany, Akram A. Shalabi, Sabrin R. M. Ibrahim, Hossam M. Abdallah

Abstract:

Chemical investigation of Chrozophora oblongifolia resulted in the isolation of five major compounds, identified as apigenin-7-O-glucoside (1), quercetin-3-O-glucuronic acid (2), quercetin-3-O-galacturonic acid (3), rutin (4), and 1,3,6-trigalloyl glucose (5). The identity of the isolated compounds was assessed by different spectroscopic methods, including one- and two-dimensional NMR. The isolated compounds were tested for their antioxidant activity using different assays, viz., DPPH, FRAP, ABTS, ORAC, and metal chelation. In addition, the inhibition of target enzymes involved in metabolic syndrome, such as alpha-glucosidase and pancreatic lipase, was examined. Moreover, the effect of the compounds on advanced glycation end-products (AGEs), one of the major complications of oxidative stress and hyperglycemia in metabolic syndrome, was studied using BSA-fructose (bovine serum albumin), BSA-methylglyoxal, and arginine-methylglyoxal models. The pure isolates showed a protective effect against metabolic syndrome as well as promising antioxidant activity. Compound 5 showed potent activity in all measured assays; meanwhile, none of the tested compounds showed activity against pancreatic lipase.

Keywords: Chrozophora oblongifolia, antioxidant, pancreatic lipase, metabolic syndromes

Procedia PDF Downloads 92
22972 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures

Authors: Salima Kouici, Abdelkader Khelladi

Abstract:

In this work, we begin with the presentation of the Tθ family of usual similarity measures concerning multidimensional binary data. Subsequently, some properties of these measures are proposed. Finally, the impact of the use of different inter-elements measures on the results of the Agglomerative Hierarchical Clustering Methods is studied.
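The impact of the inter-element measure on the clustering result can be sketched with the Jaccard coefficient (a common member of such presence/absence similarity families; the exact Tθ parameterization is not reproduced here) and a naive agglomerative merge loop:

```python
def jaccard(a, b):
    """Jaccard similarity for binary vectors: |a AND b| / |a OR b|."""
    both = sum(x and y for x, y in zip(a, b))
    either = sum(x or y for x, y in zip(a, b))
    return both / either if either else 1.0

def single_linkage(items, sim, n_clusters):
    """Naive agglomerative hierarchical clustering: repeatedly merge the
    pair of clusters with the highest single-linkage similarity."""
    clusters = [[i] for i in range(len(items))]
    while len(clusters) > n_clusters:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                s = max(sim(items[i], items[j])
                        for i in clusters[a] for j in clusters[b])
                if best is None or s > best[0]:
                    best = (s, a, b)
        _, a, b = best
        clusters[a].extend(clusters.pop(b))
    return clusters
```

Swapping `jaccard` for another member of the family changes which pairs merge first, which is precisely the sensitivity the paper studies.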

Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering

Procedia PDF Downloads 466
22971 Transport of Analytes under Mixed Electroosmotic and Pressure Driven Flow of Power Law Fluid

Authors: Naren Bag, S. Bhattacharyya, Partha P. Gopmandal

Abstract:

In this study, we analyze the transport of analytes under two-dimensional, steady, incompressible flow of power-law fluids through a rectangular nanochannel. A mathematical model based on the Cauchy momentum, Nernst-Planck, and Poisson equations is considered to study the combined effect of mixed electroosmotic (EO) and pressure-driven (PD) flow. The coupled governing equations are solved numerically by the finite volume method. We study extensively the effect of key parameters, e.g., the flow behavior index, the electrolyte concentration, the surface potential, the imposed pressure gradient, and the imposed electric field strength, on the net average flow across the channel. In addition, to study the effect of mixed EO/PD flow on the analyte distribution across the channel, we consider a nonlinear model based on the general convection-diffusion-electromigration equation. We also present the retention factor for various values of the electrolyte concentration and flow behavior index.
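The role of the flow behavior index can be illustrated with the classical analytic profile for a purely pressure-driven power-law fluid between parallel plates; this is the PD limit of the mixed problem above (the electroosmotic body force, which requires the numerical solution, is omitted in this sketch):

```python
def power_law_velocity(y, H, G, K, n):
    """Velocity u(y) of a power-law fluid between plates at y = +/-H under
    pressure gradient G = -dp/dx, with consistency index K and flow
    behavior index n. For n = 1 this reduces to the Newtonian parabola
    u = (G / 2K) (H^2 - y^2)."""
    m = (n + 1.0) / n
    return (n / (n + 1.0)) * (G / K) ** (1.0 / n) * (H ** m - abs(y) ** m)
```

Shear-thinning fluids (n < 1) develop a blunter, plug-like profile, which is one reason the flow behavior index changes the net average flow and the analyte distribution.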

Keywords: electric double layer, finite volume method, flow behavior index, mixed electroosmotic/pressure driven flow, non-Newtonian power-law fluids, numerical simulation

Procedia PDF Downloads 292
22970 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013

Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran

Abstract:

Dengue outbreaks are affected by biological, ecological, socio-economic, and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from several satellites were used to develop an index comprising rainfall, humidity, and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected from institutions and existing databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis, and space-time clustering analysis. Our results showed that an increase in the number of ecological, socio-economic, and demographic factors that are above average or present contributes to significantly high rates of space-time dengue clusters.

Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka

Procedia PDF Downloads 463
22969 Determination of Safety Distance Around Gas Pipelines Using Numerical Methods

Authors: Omid Adibi, Nategheh Najafpour, Bijan Farhanieh, Hossein Afshin

Abstract:

Energy transmission pipelines are among the most vital infrastructure of any country, and several strict laws have been enacted to enhance the safety of these lines and their vicinity. One of these laws concerns the safety distance around high-pressure gas pipelines. The safety distance refers to the minimum distance from the pipeline at which people and equipment are not exposed to serious damage. In the present study, the safety distance around high-pressure gas transmission pipelines was determined using numerical methods. For this purpose, gas leakage from a cracked pipeline and the resulting jet fire were simulated as continuous-ignition, three-dimensional, unsteady, turbulent cases. The numerical simulations were based on the finite volume method, and flow turbulence was modeled using the k-ω SST model. The combustion of the natural gas and air mixture was modeled using the eddy dissipation method. The results show that, due to the high pressure difference between the pipeline and the environment, the flow chokes at the crack and the velocity of the exhausted gas reaches the speed of sound. Analysis of the incident radiation results shows that the safety distances around a 42-inch high-pressure natural gas pipeline, based on the 5 and 15 kW/m² criteria, are 272 and 205 meters, respectively.
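How a harm criterion in kW/m² translates into a distance can be illustrated with the textbook point-source radiation model; this is a crude stand-in for the paper's CFD incident-radiation field (which accounts for flame shape and atmospheric effects), and the radiative fraction below is an assumed value:

```python
import math

def safety_distance(Q_heat, threshold, eta=0.2, tau=1.0):
    """Distance d at which the point-source incident flux
    q(d) = eta * tau * Q / (4 pi d^2) equals the harm threshold.
    Q_heat: total heat release (W); threshold: flux criterion (W/m^2);
    eta: radiative fraction (assumed); tau: atmospheric transmissivity."""
    return math.sqrt(eta * tau * Q_heat / (4.0 * math.pi * threshold))
```

In this idealized model the distance for the 5 kW/m² criterion exceeds the 15 kW/m² distance by a factor of sqrt(3); the CFD ratio reported above differs because the jet fire is far from a point source.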

Keywords: gas pipelines, incident radiation, numerical simulation, safety distance

Procedia PDF Downloads 318
22968 Statistical Inferences for the GQARCH-Itô-Jumps Model Based on the Realized Range Volatility

Authors: Fu Jinyu, Lin Jinguan

Abstract:

This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for the high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information on price movements. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimates. The asymptotic theory is established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approaches can be practically used on financial data.
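The intra-day information that range-based estimation preserves can be illustrated with the basic realized range variance (Parkinson scaling); this sketch omits the threshold step that the paper uses to separate jumps from the continuous part:

```python
import math

def realized_range_variance(highs, lows):
    """Realized range-based variance over a day: sum of squared log ranges
    of intra-day intervals, scaled by 1/(4 ln 2). Uses the high/low of each
    interval, which return-based estimators discard."""
    return sum(math.log(h / l) ** 2
               for h, l in zip(highs, lows)) / (4.0 * math.log(2.0))
```

A return-based estimator sampling only interval endpoints would assign zero variance to an interval whose price wandered widely but closed where it opened, whereas the range captures that movement.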

Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate

Procedia PDF Downloads 138
22967 Exchanging Radiology Reporting System with Electronic Health Record: Designing a Conceptual Model

Authors: Azadeh Bashiri

Abstract:

Introduction: In order to better design the electronic health record (EHR) system in Iran, the integration of health information systems based on a common language, so that information can be interpreted and exchanged with this system, is required. Background: This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the EHR system. By using this model and designing its services, the system can easily connect to the EHR in Iran and facilitate the transfer of radiology report data. Methods: This cross-sectional study was conducted in 2013. The study population consisted of 22 experts working at the imaging center of Imam Khomeini Hospital in Tehran, and the sample coincided with the population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed using the average index in SPSS, and Visual Paradigm software was used to design the conceptual model. Result: Based on the requirements assessment of the experts and the related literature, administrative, demographic, and clinical data, radiological examination results, and, if an anesthesia procedure was performed, anesthesia data were suggested as the minimum data set for the radiology report, and a class diagram was designed based on it. A use case diagram was also drawn by identifying the radiology reporting process. Conclusion: Given the use of radiology reports in the EHR system for diagnosing and managing the patient's clinical problems, a conceptual model for the radiology reporting system is provided; designing the system systematically on this basis would eliminate the problem of data sharing between these systems and the EHR system.

Keywords: structured radiology report, information needs, minimum data set, electronic health record system in Iran

Procedia PDF Downloads 237
22966 Stress Perception, Ethics and Leadership Styles of Pilots: Implications for Airline Global Talent Acquisition and Talent Management Strategy

Authors: Arif Sikander, Imran Saeed

Abstract:

The behavioral pattern and performance of airline pilots are influenced by their level of stress, their ethical decision-making ability, and, above all, their leadership style as part of the crew management process. Cultural differences among pilots, especially those working for foreign airlines, can influence stress perception. Culture also influences ethical decision making, and leadership style is likewise a variable dimension: pilots need to adapt to the cultural setting while flying with local pilots as part of their team. Studies have found that age, education, gender, and management experience are statistically significant factors in ethical maturity. However, in the decades to come, more studies are required to validate these results repeatedly, thereby providing support for the validity of moral development theory. Leadership style plays a vital role in ethical decision making. This study is grounded in moral development theory and seeks to analyze the leadership styles of airline pilots in relation to ethical decision making, as well as the influence of culture on their stress perception. The sample for the study included commercial pilots from a national airline. These results should provide useful input to the literature in the context of developing appropriate talent management strategies. The authors intend to extend this study (carried out in one country) to major national carriers in many countries, in order to develop an overarching talent management framework that can serve as a benchmark for any international airline, since most of them (e.g., Emirates, Etihad, Cathay Pacific, China Southern) depend on the supply of this scarce resource from other countries.

Keywords: ethics, leadership, pilot, stress

Procedia PDF Downloads 126
22965 A Crowdsourced Homeless Data Collection System and Its Econometric Analysis: Strengthening Inclusive Public Administration Policies

Authors: Praniil Nagaraj

Abstract:

This paper proposes a method to collect homeless data using crowdsourcing and presents an approach to analyze the data, demonstrating its potential to strengthen existing and future policies aimed at promoting socio-economic equilibrium. The 2022 Annual Homeless Assessment Report (AHAR) to Congress highlighted alarming statistics, emphasizing the need for effective decision-making and budget allocation within local planning bodies known as Continuums of Care (CoC). This paper's contributions can be categorized into three main areas. Firstly, a unique method for collecting homeless data is introduced, utilizing a user-friendly smartphone app (currently available for Android). The app enables the general public to quickly record information about homeless individuals, including the number of people and details about their living conditions. The collected data, including date, time, and location, is anonymized and securely transmitted to the cloud. It is anticipated that an increasing number of users motivated to contribute to society will adopt the app, thus expanding the data collection efforts. Duplicate data is addressed through simple classification methods, and historical data is utilized to fill in missing information. The second contribution of this paper is the description of data analysis techniques applied to the collected data. By combining this new data with existing information, statistical regression analysis is employed to gain insights into various aspects, such as distinguishing between unsheltered and sheltered homeless populations, as well as examining their correlation with factors like unemployment rates, housing affordability, and labor demand. Initial data is collected in San Francisco, while pre-existing information is drawn from three cities: San Francisco, New York City, and Washington D.C., facilitating simulation studies. The third contribution focuses on demonstrating the practical implications of the data processing results. 
The challenges faced by key stakeholders, including charitable organizations and local city governments, are taken into consideration. Two case studies are presented as examples. The first case study explores improving the efficiency of the distribution of food and necessities, as well as medical assistance, driven by charitable organizations. The second case study examines the correlation between local city governments' micro-geographic budget expenditures and homelessness data, in order to justify budget allocation and spending. The ultimate objective of this endeavor is to enable the continuous enhancement of the quality of life for the underprivileged. It is hoped that through increased crowdsourcing of data from the public, the Generosity Curve and the Need Curve will intersect, leading to a better world for all.
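The regression step described above can be sketched with ordinary least squares. The sketch below is a minimal illustration on entirely synthetic numbers; the predictor names (unemployment rate, affordability index) and all coefficients are stand-ins for the factors the paper lists, not results from the study.

```python
import numpy as np

# Hypothetical illustration: regress an unsheltered-homeless count on an
# unemployment rate and a housing-affordability index via ordinary least
# squares. All numbers below are synthetic.
rng = np.random.default_rng(0)
n = 120
unemployment = rng.uniform(3, 12, n)        # percent (synthetic)
affordability = rng.uniform(0.2, 1.0, n)    # 1.0 = most affordable (synthetic)
noise = rng.normal(0, 5, n)
unsheltered = 40 + 6.0 * unemployment - 30.0 * affordability + noise

# Design matrix with an intercept column.
X = np.column_stack([np.ones(n), unemployment, affordability])
beta, *_ = np.linalg.lstsq(X, unsheltered, rcond=None)
intercept, b_unemp, b_afford = beta
```

In such a fit, a positive coefficient on unemployment and a negative one on affordability would match the intuition that homelessness rises with joblessness and falls with cheaper housing.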

Keywords: crowdsourcing, homelessness, socio-economic policies, statistical regression

Procedia PDF Downloads 70
22964 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter

Authors: Jisun Lee, Jay Hyoun Kwon

Abstract:

As an alternative way to compensate for INS (inertial navigation system) errors in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined to compensate for the weaknesses of a single geophysical data type and to improve the stability of the positioning. The main process for compensating the INS error using the geophysical database was constructed on the basis of the EKF (Extended Kalman Filter). In detail, two types of combination methods, the centralized and the decentralized filter, were applied to examine the pros and cons of each algorithm and to find the more robust one. The performance of each navigation algorithm was evaluated in simulations in which an aircraft equipped with a precise geophysical DB and sensors flies along nine different trajectories. In particular, the results were compared to those from navigation referenced to a single geophysical database to quantify the improvement due to the combination of heterogeneous geophysical databases. It was found that the overall navigation performance was improved, but not all trajectories produced better navigation results from the combination of gravity gradient and terrain data. It was also found that the centralized filter generally showed more stable results. This is because the weight allocation for the decentralized filter could not be optimized due to local inconsistencies in the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
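The EKF correction at the core of database-referenced navigation can be sketched in one dimension: the INS propagates a position estimate, and comparing a measured geophysical quantity against the database value at the estimated position yields an innovation that pulls the estimate back. The paper's filter is multi-dimensional and fuses gravity-gradient and terrain measurements; the terrain profile and all parameters below are illustrative assumptions.

```python
import numpy as np

# Minimal 1-D sketch of a database-referenced EKF update (illustrative only).
def terrain(x):
    # Stand-in terrain profile playing the role of the geophysical database.
    return 100.0 + 20.0 * np.sin(0.01 * x)

def terrain_slope(x):
    # Jacobian of the measurement model h(x) = terrain(x).
    return 20.0 * 0.01 * np.cos(0.01 * x)

def ekf_step(x_est, P, z, R=1.0, Q=0.01):
    # Predict: INS propagates the state; uncertainty grows by Q.
    P = P + Q
    # Update: compare the predicted terrain height with the measured one.
    H = terrain_slope(x_est)
    S = H * P * H + R
    K = P * H / S
    x_est = x_est + K * (z - terrain(x_est))
    P = (1.0 - K * H) * P
    return x_est, P

# An initial 15-unit position error is pulled back as measurements arrive.
x_true, x_est, P = 50.0, 65.0, 100.0
for _ in range(200):
    x_est, P = ekf_step(x_est, P, terrain(x_true))
```

Broadly, a centralized filter would stack both measurement types into a single update vector, whereas a decentralized scheme runs separate filters and weights their outputs, which is where the weight-allocation difficulty noted above arises.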

Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain

Procedia PDF Downloads 328
22963 An Application of Remote Sensing for Modeling Local Warming Trend

Authors: Khan R. Rahaman, Quazi K. Hassan

Abstract:

Global changes in climate, environment, economies, populations, governments, institutions, and cultures converge in localities. Changes at a local scale, in turn, contribute to global changes as well as being affected by them. Our hypothesis is built on the consideration that temperature varies at the local level (termed local warming) relative to the predictions of models at the regional and/or global scale. To date, the bulk of the research relating local places to global climate change has been top-down, from the global toward the local, concentrating on methods of impact analysis that take as a starting point climate change scenarios derived from global models, even though these have little regional or local specificity. Thus, our focus is to understand such trends over southern Alberta, which will enable decision makers, scientists, the research community, and local people to adapt their policies based on local-level temperature variations and to act accordingly. The specific objectives of this study are: (i) to understand the local warming (temperature in particular) trend in the context of temperature normals during the period 1961-2010 at point locations using meteorological data; (ii) to validate the data by using specific yearly data; and (iii) to delineate the spatial extent of the local warming trends and to understand the influential factors so that local governments can adapt accordingly. Existing data already provide evidence of such changes, and future research will emphasize validating this hypothesis using remotely sensed data (i.e., NASA's MODIS products).
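Objective (i), estimating a point-location warming trend over 1961-2010, amounts to fitting a least-squares slope to an annual temperature series. The sketch below uses a synthetic series; real input would be station records or MODIS-derived temperatures for southern Alberta, and the 0.02 °C/yr trend built into the data is an arbitrary assumption.

```python
import numpy as np

# Hedged sketch: least-squares warming trend for a 1961-2010 annual series.
years = np.arange(1961, 2011)
rng = np.random.default_rng(42)
# Synthetic series: 0.02 deg C/yr warming plus interannual variability.
temps = 4.0 + 0.02 * (years - years[0]) + rng.normal(0, 0.4, years.size)

# Degree-1 polynomial fit returns (slope, intercept); scale to per decade.
slope, intercept = np.polyfit(years, temps, 1)
trend_per_decade = 10 * slope
```

Comparing such per-station slopes against the corresponding regional-model trend is one way to quantify the "local warming" departure the hypothesis describes.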

Keywords: local warming, climate change, urban area, Alberta, Canada

Procedia PDF Downloads 317
22962 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard but could greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine.
We will present the results obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods, allowing conclusions to be consolidated.
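The finding that aggregated predictions often beat individual ones can be illustrated with a toy majority-vote experiment; the labels and classifiers below are synthetic, not sbv IMPROVER data, and the 70% per-model accuracy is an arbitrary assumption.

```python
import numpy as np

# Toy crowd-aggregation experiment: majority-vote several independent,
# modestly accurate binary classifiers (all data synthetic).
rng = np.random.default_rng(1)
n_samples, n_models, p_correct = 2000, 11, 0.7
truth = rng.integers(0, 2, n_samples)

# Each model predicts the true label with probability p_correct.
correct = rng.random((n_models, n_samples)) < p_correct
preds = np.where(correct, truth, 1 - truth)

single_acc = (preds[0] == truth).mean()
# Majority vote across the 11 models.
majority = (preds.sum(axis=0) > n_models / 2).astype(int)
ensemble_acc = (majority == truth).mean()
```

With independent errors, eleven 70%-accurate voters yield a majority that is correct over 90% of the time, mirroring the challenge observation; in practice the gain is smaller when the submitted methods make correlated errors.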

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 265
22961 Scalable Learning of Tree-Based Models on Sparsely Representable Data

Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou

Abstract:

Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated into the current version of scikit-learn, the most popular open-source Python machine learning library.
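The core idea, making split-finding cost scale with the non-zero entries rather than the number of rows, can be sketched as follows. This is an illustrative reconstruction of the principle, not the actual scikit-learn implementation: a CSC-format column exposes only its non-zeros, and the implicit zeros contribute a single extra candidate value.

```python
import numpy as np
import scipy.sparse as sp

# Sketch: enumerate split-threshold candidates for one feature column of a
# sparse matrix, touching only its non-zero entries (plus zero itself).
rng = np.random.default_rng(0)
X = sp.random(10000, 50, density=0.01, format="csc", random_state=0)

col = 3
start, end = X.indptr[col], X.indptr[col + 1]
nonzero_values = X.data[start:end]          # ~100 values, not 10000

# Candidate thresholds: midpoints between sorted distinct values, with 0.0
# included because most entries are implicit zeros.
vals = np.unique(np.concatenate([[0.0], nonzero_values]))
thresholds = (vals[:-1] + vals[1:]) / 2.0
```

Evaluating impurity at these thresholds then costs O(nnz log nnz) per column instead of O(n_rows), which is where the claimed two-orders-of-magnitude speedup on very sparse data comes from.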

Keywords: big data, sparsely representable data, tree-based models, scalable learning

Procedia PDF Downloads 244
22960 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis

Authors: Tawfik Thelaidjia, Salah Chenikher

Abstract:

Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification based on the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by combining the signal's kurtosis with features obtained by preprocessing the vibration signal samples using the db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional feature vector of the vibration signal is obtained. After feature extraction, the support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy for bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
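The PSO search over the SVM penalty parameter C and kernel parameter gamma can be sketched as below. To keep the example self-contained and dependency-free, the cross-validated classification accuracy that would serve as the real fitness is replaced by a smooth stand-in objective with a known optimum; the swarm size, inertia, and acceleration constants are illustrative assumptions, not the paper's settings.

```python
import numpy as np

# Minimal particle-swarm sketch for tuning (C, gamma) of an RBF-kernel SVM.
def fitness(params):
    # Stand-in for CV error; minimized at log10(C)=2, log10(gamma)=-3.
    return (params[..., 0] - 2.0) ** 2 + (params[..., 1] + 3.0) ** 2

rng = np.random.default_rng(7)
n_particles, n_iters = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5                   # inertia, cognitive, social

pos = rng.uniform(-5, 5, (n_particles, 2))  # (log10 C, log10 gamma)
vel = np.zeros_like(pos)
pbest = pos.copy()                          # personal bests
pbest_val = fitness(pbest)
gbest = pbest[np.argmin(pbest_val)].copy()  # global best

for _ in range(n_iters):
    r1, r2 = rng.random((2, n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = fitness(pos)
    improved = val < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

C_opt, gamma_opt = 10 ** gbest[0], 10 ** gbest[1]
```

Searching in log space, as here, is the usual choice for SVM hyperparameters because useful values of C and gamma span several orders of magnitude.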

Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement

Procedia PDF Downloads 419
22959 A Feature Clustering-Based Sequential Selection Approach for Color Texture Classification

Authors: Mohamed Alimoussa, Alice Porebski, Nicolas Vandenbroucke, Rachid Oulad Haj Thami, Sana El Fkihi

Abstract:

Color and texture are highly discriminant visual cues that provide essential information in many types of images. Color texture representation and classification is therefore one of the most challenging problems in computer vision and image processing applications. Color textures can be represented in different color spaces by using multiple image descriptors, which generate a high-dimensional set of texture features. In order to reduce the dimensionality of the feature set, feature selection techniques can be used. The goal of feature selection is to find a relevant subset of an original feature space that can improve the accuracy and efficiency of a classification algorithm. Traditionally, feature selection focuses on removing irrelevant features, neglecting the possible redundancy between relevant ones. This is why some feature selection approaches use feature clustering analysis to aid and guide the search. These techniques can be divided into two categories. i) Feature clustering-based ranking algorithms use feature clustering as an analysis step that precedes feature ranking. Indeed, after dividing the feature set into groups, these approaches perform a feature ranking in order to select the most discriminant feature of each group. ii) Feature clustering-based subset search algorithms can use feature clustering following one of three strategies: as an initial step that precedes the search, coupled and combined with the search, or as an alternative and replacement for the search. In this paper, we propose a new feature clustering-based sequential selection approach for the purpose of color texture representation and classification. Our approach is a three-step algorithm. First, irrelevant features are removed from the feature set thanks to a class-correlation measure. Then, introducing a new automatic feature clustering algorithm, the feature set is divided into several feature clusters.
Finally, a sequential search algorithm, based on a filter model and a separability measure, builds a relevant and non-redundant feature subset: at each step, a feature is selected, and the features of the same cluster are removed and thus not considered thereafter. This significantly speeds up the selection process, since a large number of redundant features is eliminated at each step. The proposed algorithm uses the clustering algorithm coupled and combined with the search. Experiments using a combination of two well-known texture descriptors, namely Haralick features extracted from Reduced Size Chromatic Co-occurrence Matrices (RSCCMs) and features extracted from Local Binary Pattern (LBP) image histograms, on five color texture data sets (Outex, NewBarktex, Parquet, Stex, and USPtex), demonstrate the efficiency of our method compared to seven state-of-the-art methods in terms of accuracy and computation time.
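The select-then-prune loop can be sketched schematically. The sketch below is not the paper's exact algorithm (it uses plain correlation for both relevance and redundancy, where the paper uses a class-correlation measure and a separability measure), but it shows the mechanism by which selecting one feature eliminates its whole cluster; the synthetic data contains three groups of three redundant features each.

```python
import numpy as np

# Schematic sketch of clustering-guided sequential feature selection.
rng = np.random.default_rng(3)
n = 500
base = rng.normal(size=(n, 3))                  # 3 informative signals
y = (base @ np.array([1.0, -1.0, 0.5]) > 0).astype(float)

# Nine features: three noisy copies of each signal (3 redundant groups).
X = np.repeat(base, 3, axis=1) + 0.1 * rng.normal(size=(n, 9))

corr = np.abs(np.corrcoef(X, rowvar=False))     # feature-feature |r|
relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(9)])

selected, removed = [], set()
for j in np.argsort(-relevance):                # most class-relevant first
    if j in removed:
        continue
    selected.append(j)
    # Drop every feature strongly correlated with the one just selected,
    # i.e. the rest of its cluster is never considered again.
    removed.update(k for k in range(9) if corr[j, k] > 0.8 and k != j)
```

On this synthetic data, the loop keeps exactly one representative per redundant group, which is the source of the speedup claimed above.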

Keywords: feature selection, color texture classification, feature clustering, color LBP, chromatic cooccurrence matrix

Procedia PDF Downloads 114
22958 Track Initiation Method Based on Multi-Algorithm Fusion Learning of 1DCNN and Bi-LSTM

Authors: Zhe Li, Aihua Cai

Abstract:

Under high-density clutter and interference in ECM and complex radar missions, traditional radar target track initiation methods struggle to adapt. To this end, we propose a multi-algorithm fusion learning track initiation algorithm, which transforms the track initiation problem into a true/false track discrimination problem and designs an algorithm based on a 1DCNN (one-dimensional CNN) combined with a Bi-LSTM (bidirectional long short-term memory network) for fusion classification. The experimental dataset consists of real trajectories obtained from measurements of a certain type of three-coordinate radar, and the experiments are compared with traditional track initiation methods such as the rule-based, logic-based, and Hough-transform-based methods. The simulation results show that the overall performance of the multi-algorithm fusion learning track initiation algorithm is significantly better than that of the traditional methods, and the true track initiation rate can be effectively improved under high clutter density, with an average initiation time similar to that of the logic-based method.
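For reference, the logic-based baseline mentioned above is conventionally an M-of-N rule: a tentative track is confirmed once at least M of N consecutive scans produce an associated detection. A minimal sketch, with illustrative parameters m=3, n=4 (the paper does not state its baseline settings):

```python
# Classical M-of-N ("logic-based") track initiation rule, as a baseline.
def confirm_track(hits, m=3, n=4):
    """hits: per-scan flags (1 if a detection was associated, else 0).

    Returns True if any window of n consecutive scans holds >= m hits.
    """
    if len(hits) < n:
        return False
    return any(sum(hits[i:i + n]) >= m for i in range(len(hits) - n + 1))

# Dense hits confirm a track; sparse clutter hits do not.
dense = confirm_track([1, 1, 0, 1])          # 3 hits in 4 scans -> confirmed
sparse = confirm_track([1, 0, 0, 1, 0, 0])   # never 3-of-4 -> rejected
```

Under heavy clutter this rule confirms many false tracks, which is exactly the failure mode the learned 1DCNN/Bi-LSTM discriminator is designed to reduce.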

Keywords: track initiation, multi-algorithm fusion, 1DCNN, Bi-LSTM

Procedia PDF Downloads 58
22957 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of people of a population classified as poor. This indicator is generally unknown, and for this reason, it is estimated from survey data obtained by official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain additional variables, known as auxiliary variables, related to the variable of interest, and when this is the case, they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real data sets obtained from the 2011 European Union Statistics on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
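The kind of Monte Carlo comparison described can be sketched as follows. The population is synthetic, the low income line is taken as 60% of the median income (the usual Eurostat convention), and the single auxiliary variable with a regression-type difference estimator below is a simplified stand-in for the paper's multi-auxiliary estimators.

```python
import numpy as np

# Monte Carlo sketch: naive vs auxiliary-assisted estimation of the low
# income proportion over repeated samples (synthetic population).
rng = np.random.default_rng(11)
N, n, n_sims = 50000, 500, 400

income = rng.lognormal(mean=10, sigma=0.6, size=N)
poverty_line = 0.6 * np.median(income)
y = (income < poverty_line).astype(float)      # poverty indicator
# Auxiliary variable: a noisy proxy correlated with the indicator, whose
# population mean is assumed known (as with register-based auxiliaries).
x = np.clip(y + rng.normal(0, 0.4, N), 0, 1)
true_p, x_mean_pop = y.mean(), x.mean()

naive, assisted = [], []
for _ in range(n_sims):
    s = rng.choice(N, size=n, replace=False)
    ys, xs = y[s], x[s]
    naive.append(ys.mean())
    # Difference (regression-type) estimator using the known mean of x.
    b = np.cov(ys, xs)[0, 1] / np.var(xs, ddof=1)
    assisted.append(ys.mean() + b * (x_mean_pop - xs.mean()))

naive_mse = np.mean((np.array(naive) - true_p) ** 2)
assisted_mse = np.mean((np.array(assisted) - true_p) ** 2)
```

The variance reduction grows with the correlation between the auxiliary variable and the poverty indicator, which is why well-chosen auxiliaries make the assisted estimators more accurate than the naive one.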

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 434
22956 TessPy – Spatial Tessellation Made Easy

Authors: Jonas Hamann, Siavash Saki, Tobias Hagen

Abstract:

Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It aids the understanding of spatial structure and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular tessellations and irregular tessellations. While regular tessellation methods, like square grids or hexagon grids, are suitable for addressing pure geometry problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, however, allow the borders between subareas to be defined more realistically based on urban features like a road network or Points of Interest (POI). Even though Python is one of the most used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare different techniques. To close this gap, we propose TessPy, an open-source Python package that combines all the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy represent the five different tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. With the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used is open-source and provided by OpenStreetMap; it can be easily extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed.
All dependencies can be installed using conda or pip; the former is recommended.
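The Voronoi-polygon method among the five can be illustrated with SciPy; TessPy's own API is not reproduced here, and the generator points below are synthetic stand-ins for the OpenStreetMap POIs that TessPy would use.

```python
import numpy as np
from scipy.spatial import Voronoi

# Illustration of the irregular-tessellation idea behind the Voronoi method:
# each POI generates one tile, the region of space closest to it.
rng = np.random.default_rng(5)
poi_coords = rng.uniform(0, 10, size=(25, 2))  # stand-in POI locations

vor = Voronoi(poi_coords)
# Each generator point maps to exactly one Voronoi region (tile).
n_tiles = len(vor.point_region)
```

Because tile density follows point density, areas rich in POIs are tessellated more finely, which is the behavior that regular square or hexagon grids cannot provide.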

Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies

Procedia PDF Downloads 110