Search results for: Sentinel-1A data
23273 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube
Authors: Dan Kanmegne
Abstract:
Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have a great potential for carbon sequestration; therefore, it can be integrated into carbon emission reduction mechanisms. Particularly in sub-Saharan Africa, the constraint lies in the lack of information about both the areas under agroforestry and the characterization (composition, structure, and management) of each agroforestry system at the country level. This study describes and quantifies “what is where?” as a first step toward the quantification of carbon stock in different systems. Remote sensing (RS) is the most efficient approach to map such a dynamic practice as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfill the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) for use in carbon estimation. Satellite data are becoming more and more accessible, and the archives are growing exponentially. To retrieve useful information to support decision-making out of this large amount of data, satellite data need to be organized so as to ensure fast processing, quick accessibility, and ease of use. A new solution is a data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels used for efficient access and analysis. A data cube for Burkina Faso has been set up through the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach in its initial stage is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on the vegetation. Fifteen strata were identified, and four sample locations per stratum were randomly assigned to define the sampling units. For safety reasons, the northern part will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing different agroforestry systems and qualitative interviews. A multi-temporal supervised image classification will be done with a random forest algorithm, and the field data will be used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics, (ii) characteristics of different systems (main species, management, area, etc.); and (iii) an assessment report of the Burkina Faso data cube.
Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification
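A minimal sketch of the two-stage workflow described in this abstract (unsupervised stratification of an NDVI time series, then a random forest trained on field labels), using synthetic arrays in place of the Burkina Faso data cube; the class count, array shapes, and labels are illustrative assumptions, not the study's actual inputs.

```python
# Illustrative sketch (not the study's code): stratify a pixel-by-time NDVI
# stack with k-means, then classify agroforestry systems with a random forest
# trained on field-campaign labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
ndvi_series = rng.random((10000, 108))      # pixels x monthly NDVI 2010-2018 (synthetic)

# Step 1: unsupervised stratification of the country into vegetation strata
strata = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(ndvi_series)

# Step 2: supervised classification using field data (labels are hypothetical)
labelled_idx = rng.choice(len(ndvi_series), 500, replace=False)
labels = rng.integers(0, 4, size=500)       # e.g. 4 agroforestry system classes
rf = RandomForestClassifier(n_estimators=300, random_state=0)
rf.fit(ndvi_series[labelled_idx], labels)

pred = rf.predict(ndvi_series[labelled_idx])
print("training accuracy (placeholder):", accuracy_score(labels, pred))
print("pixels per stratum:", np.bincount(strata, minlength=15))
```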
Procedia PDF Downloads 145
23272 Artificial Intelligence for Traffic Signal Control and Data Collection
Authors: Reggie Chandra
Abstract:
Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized. The reason behind that is insufficient resources to create and implement timing plans. In this work, we will discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect 24/7/365 accurate traffic data using a vehicle detection system. We will discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow for that is. Apart from that, this paper will showcase how Artificial Intelligence makes signal timing affordable. We will introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain. It consists of millions of densely connected processing nodes. It is a form of machine learning where the neural net learns to recognize vehicles through training - which is called Deep Learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but also, in cases such as classifying objects into fine-grained categories, outperform humans. Safety is of primary importance to traffic professionals, but they don't have the studies or data to support their decisions. Currently, one-third of transportation agencies do not collect pedestrian and bike data. We will discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, snapshots of limited handpicked data, and multiple systems requiring additional work for adaptation. The methodologies used and proposed in the research contain a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets acquired through a variety of daily real-world road conditions and compared with the performance of the commonly used methods, which require collecting data by counting, evaluating, and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by Artificial Intelligence can benefit your community and how to translate the complex and often overwhelming benefits into a language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed.
Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal
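For illustration, a toy convolutional classifier for road-user image patches (vehicle / pedestrian / bicycle) is sketched below in PyTorch; it is a hypothetical stand-in, not the detection system described in the abstract, and the patch size and class set are assumptions.

```python
# A minimal, hypothetical CNN classifier for road-user image patches
# (vehicle / pedestrian / bicycle); it is not the vendor's detection system.
import torch
import torch.nn as nn

class RoadUserCNN(nn.Module):
    def __init__(self, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)  # for 64x64 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = RoadUserCNN()
dummy_batch = torch.randn(8, 3, 64, 64)            # 8 synthetic 64x64 RGB patches
logits = model(dummy_batch)
print(logits.shape)                                # torch.Size([8, 3])
```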
Procedia PDF Downloads 169
23271 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior
Authors: Juliana A. Knocikova
Abstract:
Methods of nonlinear signal analysis are based on the finding that random behavior can arise in deterministic nonlinear systems with a few degrees of freedom. Considering dynamical systems, entropy is usually understood as a rate of information production. Changes in the temporal dynamics of physiological data indicate the evolution of the system in time, and thus the level of new signal pattern generation. During the last decades, many algorithms were introduced to assess patterns of physiological responses to external stimuli. However, reflex responses are usually characterized by short periods of time. This characteristic represents a great limitation for the usual methods of nonlinear analysis. To solve the problems of short recordings, the parameter of approximate entropy has been introduced as a measure of system complexity. A low value of this parameter reflects regularity and predictability in the analyzed time series. On the other hand, an increase in this parameter means unpredictability and random behavior, hence a higher system complexity. Reduced neurophysiological data complexity has been observed repeatedly when analyzing electroneurogram and electromyogram activities during defence reflex responses. Quantitative phrenic neurogram changes are also obvious during severe hypoxia, as well as during airway reflex episodes. In conclusion, the approximate entropy parameter serves as a convenient tool for the analysis of reflex behavior characterized by short-lasting time series.
Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex
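A compact implementation of the approximate entropy measure discussed above can look as follows; the embedding dimension m=2 and tolerance r=0.2·SD are common default choices, not values taken from the abstract.

```python
# A compact NumPy implementation of approximate entropy ApEn(m, r),
# illustrating the complexity measure discussed above (not the authors' code).
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    N, r = len(x), r_factor * np.std(x)

    def phi(m):
        # all overlapping templates of length m
        templates = np.array([x[i:i + m] for i in range(N - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        C = np.mean(dist <= r, axis=1)          # fraction of similar templates
        return np.mean(np.log(C))

    return phi(m) - phi(m + 1)

regular = np.sin(np.linspace(0, 20 * np.pi, 500))      # predictable signal
noisy = np.random.default_rng(1).standard_normal(500)  # irregular signal
print(approximate_entropy(regular), approximate_entropy(noisy))
```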
Procedia PDF Downloads 300
23270 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
The near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.
Keywords: complex-valued signal processing, synthetic aperture radar, 2-D radar imaging, compressive sensing, sparse Bayesian learning
Procedia PDF Downloads 132
23269 Application of Building Information Modeling in Energy Management of Individual Departments Occupying University Facilities
Authors: Kung-Jen Tu, Danny Vernatha
Abstract:
To assist individual departments within universities in their energy management tasks, this study explores the application of Building Information Modeling in establishing the ‘BIM based Energy Management Support System’ (BIM-EMSS). The BIM-EMSS consists of six components: (1) sensors installed for each occupant and each piece of equipment, (2) electricity sub-meters (constantly logging lighting, HVAC, and socket electricity consumption of each room), (3) BIM models of all rooms within individual departments’ facilities, (4) a data warehouse (for storing occupancy status and logged electricity consumption data), (5) a building energy management system that provides energy managers with various energy management functions, and (6) an energy simulation tool (such as eQuest) that generates real-time 'standard energy consumption' data against which 'actual energy consumption' data are compared and energy efficiency is evaluated. Through the building energy management system, the energy manager is able to (a) have 3D visualization (BIM model) of each room, in which the occupancy and equipment status detected by the sensors and the logged electricity consumption data are displayed constantly; (b) perform real-time energy consumption analysis to compare the actual and standard energy consumption profiles of a space; (c) obtain energy consumption anomaly detection warnings on certain rooms so that energy management corrective actions can be taken (a data mining technique is employed to analyze the relation between the space occupancy pattern and the current space equipment setting to indicate an anomaly, such as when appliances turn on without occupancy); and (d) perform historical energy consumption analysis to review monthly and annual energy consumption profiles and compare them against historical energy profiles. The BIM-EMSS was further implemented in a research lab in the Department of Architecture of NTUST in Taiwan, and implementation results are presented to illustrate how it can be used to assist individual departments within universities in their energy management tasks.
Keywords: database, electricity sub-meters, energy anomaly detection, sensor
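The occupancy-versus-consumption anomaly rule described in point (c) can be illustrated with a small, hypothetical check: flag a room whose sub-metered consumption stays above a simulated 'standard' baseline while the sensors report it as unoccupied. Column names and the baseline value are assumptions, not BIM-EMSS internals.

```python
# Hypothetical sketch of the anomaly rule described above: flag a room when
# sub-metered consumption stays high although the occupancy sensors report
# the room as empty. Column names are illustrative, not from the BIM-EMSS.
import pandas as pd

log = pd.DataFrame({
    "room":      ["A101"] * 4,
    "timestamp": pd.date_range("2015-06-01 08:00", periods=4, freq="h"),
    "occupied":  [True, False, False, True],
    "kwh":       [1.2, 1.1, 1.3, 1.4],        # logged lighting+HVAC+socket energy
})
standard_kwh_unoccupied = 0.2                  # e.g. from an eQuest baseline run

anomalies = log[(~log["occupied"]) & (log["kwh"] > standard_kwh_unoccupied)]
print(anomalies[["room", "timestamp", "kwh"]])
```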
Procedia PDF Downloads 308
23268 Comparison of Different Machine Learning Models for Time-Series Based Load Forecasting of Electric Vehicle Charging Stations
Authors: H. J. Joshi, Satyajeet Patil, Parth Dandavate, Mihir Kulkarni, Harshita Agrawal
Abstract:
As the world looks towards a sustainable future, electric vehicles have become increasingly popular. Millions worldwide are looking to switch to electric cars over the previously favored combustion engine-powered cars. This demand has driven an increase in Electric Vehicle Charging Stations. The big challenge is that the randomness of electrical energy demand makes it tough for these charging stations to provide an adequate amount of energy over a specific amount of time. Thus, it has become increasingly crucial to model these patterns and forecast the energy needs of power stations. This paper aims to analyze how different machine learning models perform on Electric Vehicle charging time-series data. The data set consists of authentic Electric Vehicle data from the Netherlands. It comprises an overview of ten thousand transactions from public stations operated by EVnetNL.
Keywords: forecasting, smart grid, electric vehicle load forecasting, machine learning, time series forecasting
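A hedged sketch of one such time-series model: lag features (previous hour, day, and week) feeding a random forest regressor. The hourly load series below is synthetic; it stands in for the EVnetNL transaction data, and the feature choices are illustrative.

```python
# Illustrative lag-feature forecasting sketch for hourly charging load;
# the EVnetNL transaction data and column names here are synthetic stand-ins.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = pd.date_range("2019-01-01", periods=24 * 60, freq="h")
load = 5 + 3 * np.sin(2 * np.pi * hours.hour / 24) + rng.normal(0, 0.5, len(hours))
df = pd.DataFrame({"load_kw": load}, index=hours)

for lag in (1, 24, 168):                               # previous hour / day / week
    df[f"lag_{lag}"] = df["load_kw"].shift(lag)
df = df.dropna()

split = int(len(df) * 0.8)
X, y = df.drop(columns="load_kw"), df["load_kw"]
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:split], y[:split])
print("MAE:", mean_absolute_error(y[split:], model.predict(X[split:])))
```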
Procedia PDF Downloads 106
23267 Development of a Miniature and Low-Cost IoT-Based Remote Health Monitoring Device
Authors: Sreejith Jayachandran, Mojtaba Ghods, Morteza Mohammadzaheri
Abstract:
The modern busy world is running behind new embedded technologies based on computers and software; meanwhile, some people forget to look after their health condition and regular medical check-ups. Some of them postpone medical check-ups due to a lack of time and convenience, while others skip these regular evaluations and medical examinations due to huge medical bills and hospital expenses. Engineers and medical experts have come together to develop a new telemonitoring device capable of monitoring, checking, and evaluating the health status of the human body remotely through the internet, for the needs of all kinds of people. The remote health monitoring device is a microcontroller-based embedded unit. Various types of sensors in this device are connected to the human body, and with the help of an Arduino UNO board, the required analogue data are collected from the sensors. The microcontroller on the Arduino board converts the collected analogue data into digital data, transfers that information to the cloud and stores it there, and the processed digital data are instantly displayed on the LCD attached to the device. By accessing the cloud storage with a username and password, the concerned person's health care teams/doctors and other health staff can collect these data for the assessment and follow-up of that patient. Besides that, the family members/guardians can use and evaluate these data for awareness of the patient's current health status. Moreover, the system is connected to a Global Positioning System (GPS) module. In emergencies, the concerned team can locate the patient or the person carrying this device. The setup continuously evaluates and transfers the data to the cloud, and the user can also prefix a normal value range for the evaluation. For example, the normal blood pressure value is universally prefixed at 80/120 mmHg. Similarly, the RHMS also allows fixing the range of values referred to as normal coefficients. This IoT-based miniature system (11×10×10) cm³, with a low weight of 500 g, consumes only 10 mW. This smart monitoring system is manufactured for 100 GBP, and it can be used not only for health systems but also for numerous other applications, including the aerospace and transportation sectors.
Keywords: embedded technology, telemonitoring system, microcontroller, Arduino UNO, cloud storage, global positioning system, remote health monitoring system, alert system
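The 'prefixed normal value range' idea can be sketched as a simple cloud-side check; the ranges below (other than the 80/120 mmHg example quoted in the abstract) are assumed for illustration and are not the device's configured values.

```python
# A hypothetical cloud-side check mirroring the "prefixed normal range" idea:
# readings outside the configured range trigger an alert for the care team.
NORMAL_RANGES = {
    "blood_pressure_mmHg": (80, 120),   # example range quoted in the abstract
    "pulse_bpm":           (60, 100),   # assumed illustrative range
    "spo2_percent":        (95, 100),   # assumed illustrative range
}

def check_reading(metric: str, value: float) -> str:
    low, high = NORMAL_RANGES[metric]
    return "OK" if low <= value <= high else f"ALERT: {metric}={value} outside {low}-{high}"

for metric, value in {"blood_pressure_mmHg": 135, "pulse_bpm": 72}.items():
    print(check_reading(metric, value))
```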
Procedia PDF Downloads 90
23266 A Study of Tourists Satisfaction and Behavior Strategies Case Study: International Tourists in Chatuchak Weekend Market
Authors: Weera Weerasophon
Abstract:
The purpose of this research was to study tourists' satisfaction strategies, in the case of tourists who attended and shopped in Chatuchak weekend market (Bangkok), in order to improve the service operation of Chatuchak weekend market to serve tourists' needs and impress them. The researcher used the marketing mix as the main factor that affects tourist satisfaction. This research employed a quantitative approach, with 400 questionnaires used for collecting data from international tourists around Chatuchak weekend market. The questionnaire was divided into three parts: personal information, satisfaction with marketing/services and facilities, and suggestions. The collected data were then processed and analyzed in the statistical program SPSS. The results show that most international tourists were satisfied with Chatuchak weekend market at level 4 (satisfied), for items such as friendly staff, Chatuchak information, product prices, facilities, and service; the environment of Chatuchak weekend market received the highest satisfaction level.
Keywords: Chatuchak, satisfaction, Thailand tourism, marketing mix, tourists
Procedia PDF Downloads 360
23265 Smart Web Services in the Web of Things
Authors: Sekkal Nawel
Abstract:
The Web of Things (WoT), the integration of smart technologies from the Internet or network into Web architectures or applications, is becoming more complex, larger, and more dynamic. The WoT is associated with various elements such as sensors, devices, networks, protocols, data, functionalities, and architectures to perform services for stakeholders. These services operate in the context of the interaction of stakeholders and the WoT elements. Such context is becoming a key information source, in which data are of various nature and uncertain, thus leading to complex situations. In this paper, we take interest in the development of intelligent Web services. The key ingredients of this “intelligent” notion are context diversity, the necessity of a semantic representation to manage complex situations, and the capacity to reason with uncertain data. In this perspective, we introduce a multi-layered architecture based on a generic intelligent Web service model dealing with various contexts, which proactively predicts future situations and reactively responds to real-time situations in order to support decision-making. For semantic context data representation, we use PR-OWL, which is a probabilistic ontology based on Multi-Entity Bayesian Networks (MEBN). PR-OWL is flexible enough to represent complex, dynamic, and uncertain contexts, the key requirements for the development of intelligent Web services. A case study was carried out using the proposed architecture for intelligent plant watering to show the role of proactive and reactive contextual reasoning in the WoT.
Keywords: smart web service, the web of things, context reasoning, proactive, reactive, multi-entity Bayesian networks, PR-OWL
Procedia PDF Downloads 71
23264 Firm Performance and Stock Price in Nigeria
Authors: Tijjani Bashir Musa
Abstract:
The recent global crisis, which suddenly resulted in the Nigerian stock market crash, revealed some peculiarities of Nigerian firms. Some firms in Nigeria are performing well, but their stock prices are not increasing, while some firms are on the brink of collapse, but their stock prices are increasing. Thus, this study examines the relationship between firm performance and stock price in Nigeria. The study covered the period of 2005 to 2009. This period covers the stock boom and also the stock market crash that resulted from the global financial meltdown. The study is a panel study. A total of 140 firms were sampled from 216 firms listed on the Nigerian Stock Exchange (NSE). Data were collected from secondary sources. These data were divided into four strata comprising the best performing stocks, the least performing stocks, the best performing firms, and the least performing firms. Each stratum contains 35 firms with the characteristics of best performing stocks, best performing firms, least performing stocks, and least performing firms, respectively. Multiple linear regression models were used to analyse the data, while the statistical/econometric package Stata 11.0 was used to run the data. The study found that a relationship exists between the selected firm performance parameters (operating efficiency, firm profit, earnings per share, and working capital) and stock price. As such, firm performance gave sufficient information, or has predictive power, on stock price movements in Nigeria for all the years under study. The study recommends, among others, that managers of firms in Nigeria should formulate policies and exert efforts geared towards improving firm performance, which will enhance stock price movements.
Keywords: firm, Nigeria, performance, stock price
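A hedged sketch of the kind of regression reported above, fitting stock price on the four performance parameters with ordinary least squares in statsmodels; the data are synthetic and the coefficients are not the study's estimates.

```python
# Hedged sketch of the regression reported above: stock price regressed on
# firm performance measures with statsmodels OLS. Data values are synthetic.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 140 * 5                                   # 140 firms x 5 years (2005-2009)
df = pd.DataFrame({
    "operating_efficiency": rng.normal(1.0, 0.2, n),
    "profit":               rng.normal(5.0, 2.0, n),
    "eps":                  rng.normal(1.5, 0.5, n),
    "working_capital":      rng.normal(10.0, 3.0, n),
})
df["stock_price"] = (2 + 0.8 * df["eps"] + 0.1 * df["profit"]
                     + rng.normal(0, 0.5, n))            # synthetic relationship

X = sm.add_constant(df[["operating_efficiency", "profit", "eps", "working_capital"]])
model = sm.OLS(df["stock_price"], X).fit()
print(model.summary().tables[1])
```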
Procedia PDF Downloads 477
23263 Modelling Dengue Disease With Climate Variables Using Geospatial Data For Mekong River Delta Region of Vietnam
Authors: Thi Thanh Nga Pham, Damien Philippon, Alexis Drogoul, Thi Thu Thuy Nguyen, Tien Cong Nguyen
Abstract:
The Mekong River Delta region of Vietnam is recognized as one of the regions most vulnerable to climate change due to flooding and seawater rise, and therefore faces an increased burden of climate change-related diseases. Changes in temperature and precipitation are likely to alter the incidence and distribution of vector-borne diseases such as dengue fever. In this region, the peak of the dengue epidemic period is around July to September, during the rainy season. It is believed that climate is an important factor for dengue transmission. This study aims to enhance the capacity of dengue prediction through the relationship of dengue incidence with climate and environmental variables for the Mekong River Delta of Vietnam during 2005-2015. Mathematical models for vector-host infectious disease, including larva, mosquito, and human compartments, were used to calculate the impacts of climate on dengue transmission, incorporating geospatial data as model input. Monthly dengue incidence data were collected at the provincial level. Precipitation data were extracted from satellite observations of GSMaP (Global Satellite Mapping of Precipitation); land surface temperature and land cover data were from MODIS. The value of the seasonal reproduction number was estimated to evaluate the potential, severity, and persistence of dengue infection, while the final infected number was derived to check for dengue outbreaks. The results show that dengue infection depends on the seasonal variation of climate variables, with the peak during the rainy season, and the predicted dengue incidence follows this dynamic well for the whole studied region. However, the highest outbreak, in 2007, was not captured by the model, reflecting nonlinear dependences of transmission on climate. Other possible effects will be discussed to address the limitations of the model. This suggests the need to consider both climate variables and other variability across temporal and spatial scales.
Keywords: infectious disease, dengue, geospatial data, climate
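A minimal vector-host transmission sketch with a seasonally forced mosquito recruitment term gives a feel for how climate seasonality can drive dengue dynamics; the compartment structure and parameter values are illustrative assumptions, not the calibrated model used in the study.

```python
# A minimal vector-host transmission sketch (Ross-Macdonald style) with a
# seasonally varying mosquito recruitment term, to illustrate how climate
# forcing can drive seasonal dengue dynamics. Parameter values are illustrative.
import numpy as np
from scipy.integrate import odeint

def model(y, t, beta_hv, beta_vh, gamma, mu_v):
    Sh, Ih, Rh, Sv, Iv = y
    Nh = Sh + Ih + Rh
    recruit = mu_v * (Sv + Iv) * (1 + 0.5 * np.sin(2 * np.pi * t / 365))  # "rainy season"
    dSh = -beta_hv * Sh * Iv / Nh
    dIh = beta_hv * Sh * Iv / Nh - gamma * Ih
    dRh = gamma * Ih
    dSv = recruit - beta_vh * Sv * Ih / Nh - mu_v * Sv
    dIv = beta_vh * Sv * Ih / Nh - mu_v * Iv
    return [dSh, dIh, dRh, dSv, dIv]

t = np.linspace(0, 730, 731)                    # two years, daily steps
y0 = [1e5 - 10, 10, 0, 2e5, 100]
sol = odeint(model, y0, t, args=(0.3, 0.3, 1 / 7, 1 / 14))
print("peak infected humans:", sol[:, 1].max())
```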
Procedia PDF Downloads 383
23262 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance
Authors: Loai AbdAllah, Mahmoud Kaiyal
Abstract:
Missing values in real-world datasets are a common problem. Many algorithms were developed to deal with this problem; most of them replace the missing values with a fixed value that is computed based on the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, whereas, when one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve the prevention of chronic diseases such as diabetes and cancer. In order for Wikaya's recommendation system to work, distances between users need to be measured. Since there are missing values in the collected data, there is a need to develop a distance function between incomplete user profiles. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between different objects, when some of them contain missing values, we integrated it within the framework of the k nearest neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm over the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that the kNN classifier using our proposed distance function outperforms the kNN using other existing methods.
Keywords: missing values, incomplete data, distance, incomplete diabetes data
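A simplified per-coordinate sketch of the idea: known-versus-known values use a (diagonal) Mahalanobis-type term, while a missing value contributes the expected distance to the observed distribution of that coordinate; this is an illustration of the principle, not the authors' exact Bhattacharyya-based formula.

```python
# Simplified sketch: known-vs-known values use a standardized (diagonal
# Mahalanobis-type) difference, while a missing value is replaced by the
# expected squared distance to the observed distribution of that coordinate.
import numpy as np

def incomplete_distance(a, b, col_mean, col_var):
    d2 = 0.0
    for j in range(len(a)):
        if not np.isnan(a[j]) and not np.isnan(b[j]):
            d2 += (a[j] - b[j]) ** 2 / col_var[j]              # standardized difference
        else:
            known = a[j] if np.isnan(b[j]) else b[j]
            if np.isnan(known):                                # both values missing
                d2 += 2.0                                      # expected standardized distance
            else:                                              # E[(known - V)^2] / var
                d2 += ((known - col_mean[j]) ** 2 + col_var[j]) / col_var[j]
    return np.sqrt(d2)

X = np.array([[5.1, 3.5], [4.9, np.nan], [6.3, 3.3], [np.nan, 3.0]])
mu, var = np.nanmean(X, axis=0), np.nanvar(X, axis=0)
query = X[1]
dists = [incomplete_distance(query, row, mu, var) for row in X]
print("3 nearest neighbours of row 1:", np.argsort(dists)[:3])   # kNN step
```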
Procedia PDF Downloads 225
23261 Decoding the Natural Hazards: The Data Paradox, Juggling Data Flows, Transparency and Secrets, Analysis of Khuzestan and Lorestan Floods of Iran
Authors: Kiyanoush Ghalavand
Abstract:
We have a complex paradox in the agriculture and environment sectors in the age of technology. On one side, the achievements of the science and information ages are shaping a world to come that may be more dangerous than in the last decades. The progress of the past decades is historic, connecting people, empowering individuals, groups, and states, and lifting thousands of people out of poverty in the process. Floods are the most frequent, damaging, and recurring of all natural hazards in Iran. Additionally, floods have been morphing into new and even more devastating forms in recent years. Khuzestan and Lorestan Provinces experienced heavy rains that began on March 28, 2019, and led to unprecedented widespread flooding and landslides across the provinces. The study was based on both secondary and primary data. For the present study, a questionnaire-based primary survey was conducted. Data were collected by using a specially designed questionnaire and other instruments, such as focus groups, interview schedules, inception workshops, and roundtable discussions with stakeholders at different levels. Farmers in Khuzestan and Lorestan provinces were the statistical population for this study. Data were analyzed with several software packages, such as ATLAS.ti, NVivo, SPSS Win, and E-Views. According to a factor analysis conducted for the present study, 10 groups of factors were categorized: climatic, economic, cultural, supportive, instructive, planning, military, policymaking, geographical, and human factors. Together, they accounted for 71.6 percent of the explanatory factors of flood management obstacles in the agricultural sector in Lorestan and Khuzestan provinces. Several recommendations were finally made based on the study findings.
Keywords: chaos theory, natural hazards, risks, environmental risks, paradox
Procedia PDF Downloads 146
23260 Techniques to Characterize Subpopulations among Hearing Impaired Patients and Its Impact for Hearing Aid Fitting
Authors: Vijaya K. Narne, Gerard Loquet, Tobias Piechowiak, Dorte Hammershoi, Jesper H. Schmidt
Abstract:
BEAR, which stands for Better Hearing Rehabilitation, is a large-scale project in Denmark designed and executed by three national universities, three hospitals, and the hearing aid industry with the aim of improving hearing aid fitting. A total of 1963 hearing-impaired people were included and were segmented into subgroups based on hearing loss, demographics, audiological data, and questionnaire data (i.e., the Speech, Spatial and Qualities of Hearing Scale [SSQ-12] and the International Outcome Inventory for Hearing Aids [IOI-HA]). With the aim of providing a better hearing-aid fit to individual patients, we applied modern machine learning techniques together with traditional audiogram rule-based systems. Results show that age, speech discrimination scores, and audiogram configurations emerged as important parameters in characterizing sub-populations from the data set. The attempt to characterize sub-populations reveals a clearer picture of the individual hearing difficulties encountered and the benefits derived from more individualized hearing aids.
Keywords: hearing loss, audiological data, machine learning, hearing aids
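An illustrative sketch of how such sub-populations might be extracted by clustering standardized audiological features; the feature set and values below are placeholders, not the BEAR dataset or the exact segmentation method.

```python
# Illustrative clustering sketch: group hearing-impaired listeners by age,
# speech discrimination score, and audiogram slope with k-means.
# Feature names and values are placeholders, not the BEAR dataset.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 300
features = np.column_stack([
    rng.normal(70, 10, n),        # age (years)
    rng.normal(80, 15, n),        # speech discrimination score (%)
    rng.normal(15, 8, n),         # audiogram slope (dB/octave)
])

X = StandardScaler().fit_transform(features)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print("patients per sub-population:", np.bincount(labels))
```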
Procedia PDF Downloads 154
23259 Female Criminality in Lagos State: A Case of Armed Robbery
Authors: Ebobo Urowoli Christiana
Abstract:
The Nigerian Prison Service statistics of 2007 and 2009 revealed that, though crime in the past was ascribed to men, today there is a steady increase in the population of women involved in crime. This study focused on the investigation of female criminality in Lagos State, taking armed robbery as a case. Its major objective was to find out whether there is an increase or decrease in female involvement in armed robbery and its growth rate. The major research question is 'Is there an increase in the perpetration of armed robbery by females in Lagos State?', and the null hypothesis is 'There is no significant increase in the perpetration of armed robbery by females in Lagos State.' As a result, this study adopted a survey design, the purposive sampling method, and a sample size of 120 respondents. Rational choice theory was used to explain the reasons for female involvement in armed robbery. Both primary and secondary data were generated for this study; the primary data were collected from the criminal records of the Lagos State Police Command, Panti, while the quantitative data were collected using a questionnaire administered to 120 female detainees and inmates. The data collected were analyzed using simple frequency tables and percentages, and chi-square was used to test for relationships. The study revealed a persistent rise in the prevalence of female armed robbery and recommended that youths should be equipped with educational/vocational skills in order to lead responsible lives.
Keywords: criminality, armed robbery, female, police commands, panti, nature
Procedia PDF Downloads 406
23258 Empirical Roughness Progression Models of Heavy Duty Rural Pavements
Authors: Nahla H. Alaswadko, Rayya A. Hassan, Bayar N. Mohammed
Abstract:
Empirical deterministic models have been developed to predict the roughness progression of heavy duty spray sealed pavements for a dataset representing rural arterial roads. The dataset provides a good representation of the relevant network and covers a wide range of operating and environmental conditions. A large sample of historical time series data for many pavement sections has been collected and prepared for use in multilevel regression analysis. The modelling parameters include road roughness as the performance parameter, and traffic loading, time, initial pavement strength, reactivity level of subgrade soil, climate condition, and condition of the drainage system as predictor parameters. The purpose of this paper is to report the approaches adopted for model development and validation. The study presents multilevel models that can account for the correlation among time series data of the same section and capture the effect of unobserved variables. Study results show that the models fit the data very well. The contribution and significance of the relevant influencing factors in predicting roughness progression are presented and explained. The paper concludes that the analysis approach used for developing the models confirmed their accuracy and reliability through a good fit to the validation data.
Keywords: roughness progression, empirical model, pavement performance, heavy duty pavement
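A hedged sketch of a multilevel (random-intercept) roughness model in statsmodels, grouping repeated observations by pavement section to account for within-section correlation; the variable names and synthetic data are illustrative, not the study's dataset or final model form.

```python
# Hedged sketch of a multilevel (random-intercept) roughness progression model:
# repeated roughness observations are grouped by pavement section so that the
# within-section correlation is accounted for. Variable names are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for section in range(40):
    intercept = rng.normal(2.0, 0.3)                  # section-specific initial roughness
    for age in range(15):
        rows.append({"section": section,
                     "age": age,
                     "cum_traffic": age * rng.uniform(0.2, 0.6),
                     "roughness": intercept + 0.05 * age + rng.normal(0, 0.05)})
df = pd.DataFrame(rows)

model = smf.mixedlm("roughness ~ age + cum_traffic", df, groups=df["section"]).fit()
print(model.summary())
```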
Procedia PDF Downloads 168
23257 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor
Authors: Panupong Makvichian
Abstract:
Global Navigation Satellite System (GNSS) is nowadays a common technology that improves navigation functions in our life. Additionally, GNSS is now also being employed as an accurate atmospheric sensor. Meteorology is a practical application of GNSS that goes unnoticed in the background of people's lives. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver and precise information about satellite positions and satellite clocks. In addition, careful attention to mitigating various error sources is required. All the above data are combined in a sophisticated mathematical algorithm. At this point, the research is going to demonstrate how GNSS and the PPP method are capable of providing high-precision estimates, such as 3D positions or zenith tropospheric delays (ZTDs). ZTDs combined with pressure and temperature information allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, we can create thematic maps that allow us to extract water content information at any location within the network area. All of the above are possible thanks to the advances in GNSS data processing. Therefore, we are able to use GNSS data for climatic trend analysis and for acquiring further knowledge about the atmospheric water content.
Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor
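The ZTD-to-PWV chain mentioned above can be sketched as follows (Saastamoinen-type hydrostatic delay plus a Bevis-type conversion factor); the constants and the weighted-mean-temperature relation are standard literature values quoted from memory and should be verified against the original references before use.

```python
# Hedged sketch of the ZTD -> PWV conversion chain described above
# (Saastamoinen hydrostatic delay + Bevis-type conversion factor).
# Constants are standard literature values quoted from memory; verify before use.
import numpy as np

def ztd_to_pwv_mm(ztd_m, pressure_hpa, temp_k, lat_deg=13.5, height_km=0.3):
    # Zenith hydrostatic delay (Saastamoinen), metres
    zhd = 0.0022768 * pressure_hpa / (
        1 - 0.00266 * np.cos(2 * np.radians(lat_deg)) - 0.00028 * height_km)
    zwd = ztd_m - zhd                                   # zenith wet delay, metres

    tm = 70.2 + 0.72 * temp_k                           # weighted mean temperature (Bevis)
    k2p, k3 = 22.1, 3.776e5                             # K/hPa and K^2/hPa
    rho_w, r_v = 1000.0, 461.5                          # kg/m^3, J/(kg K)
    # dimensionless conversion factor (~0.15); /100 converts hPa-based constants to Pa
    pi_factor = 1e6 / (rho_w * r_v * (k3 / tm + k2p) / 100.0)
    return pi_factor * zwd * 1000.0                     # PWV in millimetres

print(ztd_to_pwv_mm(ztd_m=2.45, pressure_hpa=1005.0, temp_k=300.0))
```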
Procedia PDF Downloads 198
23256 Assessment of Social Vulnerability of Urban Population to Floods – a Case Study of Mumbai
Authors: Sherly M. A., Varsha Vijaykumar, Subhankar Karmakar, Terence Chan, Christian Rau
Abstract:
This study aims at proposing an indicator-based framework for assessing the social vulnerability of any coastal megacity to floods. The final set of indicators of social vulnerability is chosen from a set of feasible and available indicators, which are prepared using a Geographic Information System (GIS) framework on a smaller scale, considering a 1-km grid cell, to provide an insight into the spatial variability of vulnerability. The optimal weight for each individual indicator is assigned using data envelopment analysis (DEA), as it avoids subjective weights and improves the confidence in the results obtained. In order to de-correlate and reduce the dimension of the multivariate data, principal component analysis (PCA) has been applied. The proposed methodology is demonstrated on twenty-four wards of Mumbai under the jurisdiction of the Municipal Corporation of Greater Mumbai (MCGM). This framework of vulnerability assessment is not limited to the present study area and may be applied to other urban damage centers.
Keywords: urban floods, vulnerability, data envelopment analysis, principal component analysis
Procedia PDF Downloads 361
23255 Narcissism in the Life of Howard Hughes: A Psychobiographical Exploration
Authors: Alida Sandison, Louise A. Stroud
Abstract:
Narcissism is a personality configuration which has both normal and pathological personality expressions. Narcissism is highly complex and is linked to a broad field of research. There are both dimensional and categorical conceptualisations of narcissism, and a variety of theoretical formulations have been put forward to understand the narcissistic personality configuration. Currently, Kernberg's Object Relations theory is well supported for this purpose. The complexity and particular defense mechanisms at play in the narcissistic personality make it a difficult personality configuration to study, and one worth further research. Psychobiography as a methodology allows for the exploration of the lived life and is thus a useful methodology to surmount these inherent challenges. Narcissism has been a focus of academic interest for a long time, and although there is a lot of research done in this area, to the researchers' knowledge, narcissistic dynamics have never been explored within a psychobiographical format. Thus, the primary aim of the research was to explore and describe narcissism in the life of Howard Hughes, with the objective of gaining further insight into narcissism through the use of this unconventional research approach. Hughes was chosen as the subject for the study as he is renowned as an eccentric billionaire who had a revolutionary effect on the world but was concurrently disturbed by his personal pathologies. Hughes was dynamic in three different sectors, namely motion pictures, aviation, and gambling. He became more and more reclusive as he entered middle age. From his early fifties he was agoraphobic, and the social network of connectivity that could reasonably be expected from someone at the top of their field was notably distorted. Due to his strong narcissistic personality configuration and the interpersonal difficulties he experienced, Hughes represents an ideal figure to explore narcissism. The study used a single case study design, and purposive sampling to select Hughes. Qualitative data were sampled, using secondary data sources. Given that Hughes was a famous figure, there is a plethora of information on his life, which is primarily biographical. This includes books written about his life, and archival material in the form of newspaper articles, interviews and movies. Gathered data were triangulated to avoid the effect of author bias and increase the credibility of the data used. Data were collected using Yin's guidelines for data collection. Data were analysed using Miles and Huberman's strategy of data analysis, which consists of three steps, namely, data reduction, data display, and conclusion drawing and verification. Patterns which emerged in the data highlighted the defense mechanisms used by Hughes, in particular splitting and projection, in defending his sense of self. These defense mechanisms help us to understand the high levels of entitlement and paranoia experienced by Hughes. The findings provide further insight into his sense of isolation and difference, and the consequent difficulty he experienced in maintaining connections with others. The findings furthermore confirm the effectiveness of Kernberg's theory in understanding narcissism through observing an individual life.
Keywords: Howard Hughes, narcissism, narcissistic defenses, object relations
Procedia PDF Downloads 357
23254 The Use of Piezocone Penetration Test Data for the Assessment of Iron Ore Tailings Liquefaction Susceptibility
Authors: Breno M. Castilho
Abstract:
The Iron Ore Quadrangle, located in the state of Minas Gerais, Brazil, is responsible for most of the country's iron ore production. As a result, some of the biggest tailings dams in the country are located in this area. In recent years, several major failure events have happened in Tailings Storage Facilities (TSF) located in the Iron Ore Quadrangle. Some of these failures were found to be caused by liquefaction flowslides. This paper presents Piezocone Penetration Test (CPTu) data that were used, by applying the Olson and Peterson methods, for the liquefaction susceptibility assessment of the iron ore tailings that are typically found in most TSF in the area. Piezocone data were also used to determine the steady-state strength of the tailings so as to allow for comparison with their drained strength. Results have shown a great susceptibility for liquefaction to occur in the studied tailings and, more importantly, a large reduction in their strength. These results are key to understanding the failures that took place over the last few years.
Keywords: Piezocone Penetration Test CPTu, iron ore tailings, mining, liquefaction susceptibility assessment
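As an illustration of a CPT-based liquefaction strength correlation of the kind applied in such assessments, a hedged sketch of an Olson-and-Stark-style liquefied strength ratio is given below; the coefficients are quoted from memory and may not match the exact Olson and Peterson formulation used in the paper, so they must be verified before any real assessment.

```python
# Hedged sketch of an Olson-type CPT correlation for the liquefied strength
# ratio of the tailings. The coefficients are literature values quoted from
# memory (Olson & Stark-style correlation); verify before any real assessment.
def liquefied_strength_ratio(qc_mpa, sigma_v0_eff_kpa, pa_kpa=101.3):
    cq = 1.8 / (0.8 + sigma_v0_eff_kpa / pa_kpa)       # overburden normalization
    qc1 = min(cq * qc_mpa, 6.5)                        # correlation capped at ~6.5 MPa
    return 0.03 + 0.0143 * qc1                         # su(liq) / sigma'_v0

for qc, sv in [(1.5, 100.0), (3.0, 200.0), (5.0, 300.0)]:
    ratio = liquefied_strength_ratio(qc, sv)
    print(f"qc={qc} MPa, sigma'v0={sv} kPa -> su(liq) ~ {ratio * sv:.1f} kPa")
```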
Procedia PDF Downloads 233
23253 A Case Study for User Rating Prediction on Automobile Recommendation System Using Mapreduce
Authors: Jiao Sun, Li Pan, Shijun Liu
Abstract:
Recommender systems have been widely used in contemporary industry, and plenty of work has been done in this field to help users identify items of interest. The Collaborative Filtering (CF, for short) algorithm is an important technology in recommender systems. However, less work has been done on automobile recommendation systems, despite the sharp increase in the number of automobiles. What’s more, computational speed is a major weakness of collaborative filtering technology. Therefore, using the MapReduce framework to optimize the CF algorithm is a vital solution to this performance problem. In this paper, we present a recommendation of users' comments on industrial automobiles with various properties, based on real-world industrial datasets of user-automobile comment data, and provide recommendations for automobile providers to help them predict users' comments on automobiles with new-coming properties. Firstly, we address the sparseness of the matrix using a previous construction of the score matrix. Secondly, we solve the data normalization problem by removing dimensional effects from the raw automobile data, where different dimensions of automobile properties bring great error to the calculation of CF. Finally, we use the MapReduce framework to optimize the CF algorithm, and the computational speed has been improved considerably. UV decomposition, used in this paper, is a commonly used matrix factorization technique in CF algorithms that does not require calculating the interpolation weights of neighbors, which is more convenient in industry.
Keywords: collaborative filtering, recommendation, data normalization, mapreduce
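A compact single-machine sketch of UV decomposition (latent-factor matrix factorization) on a synthetic user-automobile rating matrix; in the paper the analogous updates are distributed with MapReduce, so this serves only to illustrate the factorization step.

```python
# A compact single-machine sketch of UV (matrix factorization) decomposition
# for the user-automobile rating matrix; in the paper the equivalent updates
# are distributed with MapReduce. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(0)
R = rng.integers(1, 6, size=(20, 15)).astype(float)    # users x automobiles
mask = rng.random(R.shape) < 0.3                       # only 30% of ratings observed
k, lr, reg = 3, 0.01, 0.05
U = rng.normal(0, 0.1, (R.shape[0], k))
V = rng.normal(0, 0.1, (R.shape[1], k))

for epoch in range(200):
    for i, j in zip(*np.where(mask)):
        err = R[i, j] - U[i] @ V[j]
        U[i] += lr * (err * V[j] - reg * U[i])          # stochastic gradient step
        V[j] += lr * (err * U[i] - reg * V[j])

rmse = np.sqrt(np.mean((R[mask] - (U @ V.T)[mask]) ** 2))
print("training RMSE on observed ratings:", round(rmse, 3))
```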
Procedia PDF Downloads 217
23252 ANOVA-Based Feature Selection and Machine Learning System for IoT Anomaly Detection
Authors: Muhammad Ali
Abstract:
Cyber-attacks and anomaly detection on Internet of Things (IoT) infrastructure are an emerging concern in the domain of data-driven intrusion detection. Rapidly increasing IoT risk is now making headlines around the world. Denial of service, malicious control, data type probing, malicious operation, DDoS, scan, spying, and wrong setup are attacks and anomalies that can cause an IoT system failure. Everyone talks about cyber security, connectivity, smart devices, and real-time data extraction. IoT devices expose a wide variety of new cyber security attack vectors in network traffic. For further IoT development, and mainly for smart IoT applications, there is a necessity for intelligent processing and analysis of data; our approach, therefore, is to secure it. We train and compare several machine learning models to accurately predict attacks and anomalies on IoT systems, considering IoT applications, using ANOVA-based feature selection to obtain fewer prediction features for evaluating network traffic and helping protect IoT devices. The machine learning (ML) algorithms used here are KNN, SVM, NB, D.T., and R.F., selected for the most satisfactory test accuracy with fast detection. The evaluation of ML metrics includes precision, recall, F1 score, FPR, NPV, G.M., MCC, and AUC & ROC. The Random Forest algorithm achieved the best results with less prediction time, with an accuracy of 99.98%.
Keywords: machine learning, analysis of variance, Internet of Things, network security, intrusion detection
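An illustrative sketch of the pipeline named in the abstract, ANOVA F-test feature selection followed by a random forest classifier, on synthetic stand-in data for the IoT traffic features; the feature counts and printed accuracy are not the paper's results.

```python
# Illustrative sketch of the pipeline described above: ANOVA F-test feature
# selection followed by a random forest classifier. Synthetic data stand in
# for the IoT network-traffic features.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=2000, n_features=40, n_informative=12,
                           n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pipe = make_pipeline(SelectKBest(f_classif, k=15),
                     RandomForestClassifier(n_estimators=200, random_state=0))
pipe.fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, pipe.predict(X_te)))
```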
Procedia PDF Downloads 125
23251 Design of Bidirectional Wavelength Division Multiplexing Passive Optical Network in Optisystem Environment
Authors: Ashiq Hussain, Mahwash Hussain, Zeenat Parveen
Abstract:
Nowadays, the demand for broadband services has increased, and researchers are trying to find solutions to provide a large amount of service capacity. There is a shortage of bandwidth because of the downloading of video, voice, and data. One of the solutions to overcome this shortage of bandwidth is to provide the communication system with passive optical components. We have increased the data rate in this system. From the experimental results, we have concluded that the quality factor increased by adding passive optical networks.
Keywords: WDM-PON, optical fiber, BER, Q-factor, eye diagram
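The Q-factor reported by such simulations maps to a bit error rate via BER = 0.5·erfc(Q/√2); a small helper illustrating that relation is sketched below.

```python
# Small helper relating the Q-factor reported by the simulation to the
# corresponding bit error rate, BER = 0.5 * erfc(Q / sqrt(2)).
from math import erfc, sqrt

def q_to_ber(q):
    return 0.5 * erfc(q / sqrt(2))

for q in (4, 6, 7):
    print(f"Q = {q}: BER ~ {q_to_ber(q):.2e}")
```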
Procedia PDF Downloads 510
23250 Analyzing the Social, Cultural and Economic Impacts of Indigenous Tourism on the Indigenous Communities: Case Study of the Nubian Community in Egypt
Authors: M. Makary
Abstract:
Indigenous tourism is nowadays one of the fastest growing segments of the tourism industry. Nevertheless, it does not yet receive attention on the agenda of public tourism policies in Egypt; however, there are various tourism initiatives in indigenous areas throughout the country, mainly in the Nubia region, which is located in Upper Egypt, where most of Egypt's indigenous Nubians are concentrated. Considering that indigenous tourism can lead to both positive and negative impacts on indigenous communities, the main aim of this study is to analyze the socio-cultural and economic impacts of indigenous tourism on indigenous communities in Egypt, taking the Nubians as a case study. Qualitative and quantitative approaches to data collection were designed and applied in conducting this study. Semi-structured interviews, focus groups, and observations are the main primary data collection techniques used in this study, while the secondary data were sourced from articles, statistics, dissertations, and websites. The research concludes that indigenous tourism offers a strong motivation to save the identity of the indigenous communities and to foster their economic development. However, it also has negative impacts on their society.
Keywords: indigenous tourism, sustainable tourism, Indigenous communities, Nubians
Procedia PDF Downloads 245
23249 Risk Analysis of Flood Physical Vulnerability in Residential Areas of Mathare Nairobi, Kenya
Authors: James Kinyua Gitonga, Toshio Fujimi
Abstract:
Vulnerability assessment and analysis is essential for establishing the degree of damage and loss that results from natural disasters. Urban flooding causes major economic losses and casualties in the Mathare residential area of Nairobi, Kenya. A high population caused by rural-urban migration, unemployment, and unplanned urban development are among the factors that increase flood vulnerability in the Mathare area. This study aims to analyse the physical vulnerability to flood risk in Mathare based on scientific data. Research data, including rainfall data, Mathare River discharge rate data, water runoff data, field survey data, and a questionnaire survey based on sampling of the study area, have been used to develop the risk curves. Three structural types of building were identified in the study area, and vulnerability and risk curves were made for these three structural types by plotting the relationship between flood depth and damage for each structural type. The results indicate that the structural type with mud walls and a mud floor is the building type most vulnerable to flooding, while the structural type with stone walls and a concrete floor is the least vulnerable. The vulnerability of building contents is mainly determined by the number of floors, where households with two floors are least vulnerable and households with one floor are most vulnerable. Therefore, more than 80% of the residential buildings, including the property in the buildings, are highly vulnerable to floods and consequently exposed to high risk. When estimating the potential casualties/injuries, we discovered that the structural type of house was a major determinant: the mud/adobe structural type accounted for 83.7% of casualties, while the masonry structural type accounted for 10.71% of casualties among the people living in these houses. This research concludes that flood awareness, warnings, and observing the building codes will help reduce damage to the structural types of buildings, deaths, and damage to building contents.
Keywords: flood loss, Mathare Nairobi, risk curve analysis, vulnerability
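The depth-damage (vulnerability) curve idea can be sketched as follows, interpolating a damage ratio at a given flood depth for each structural type; the curve values and the intermediate structure label are placeholders, not the Mathare survey results.

```python
# Illustrative depth-damage (vulnerability) curves for the three structural
# types, interpolated to estimate the damage ratio at a given flood depth.
# The curve values are placeholders, not the Mathare survey results.
import numpy as np

depths_m = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
damage_ratio = {                                   # fraction of building value lost
    "mud wall / mud floor":        np.array([0.0, 0.40, 0.75, 0.95, 1.00]),
    "intermediate structural type": np.array([0.0, 0.25, 0.55, 0.80, 0.95]),
    "stone wall / concrete floor": np.array([0.0, 0.10, 0.25, 0.45, 0.65]),
}

flood_depth = 1.2                                  # metres, example event
for structure, curve in damage_ratio.items():
    d = np.interp(flood_depth, depths_m, curve)
    print(f"{structure}: expected damage ratio {d:.2f}")
```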
Procedia PDF Downloads 239
23248 The Happiness Pulse: A Measure of Individual Wellbeing at a City Scale, Development and Validation
Authors: Rosemary Hiscock, Clive Sabel, David Manley, Sam Wren-Lewis
Abstract:
As part of the Happy City Index Project, Happy City have developed a survey instrument to measure experienced wellbeing: how people are feeling and functioning in their everyday lives. The survey instrument, called the Happiness Pulse, was developed in partnership with the New Economics Foundation (NEF) with the dual aim of collecting citywide wellbeing data and engaging individuals and communities in the measurement and promotion of their own wellbeing. The survey domains and items were selected through a review of the academic literature and a stakeholder engagement process, including local policymakers, community organisations, and individuals. The Happiness Pulse was included in the Bristol pilot of the Happy City Index (n=722). The experienced wellbeing items were subjected to factor analysis. A reduced number of items to be included in a revised scale for future data collection were again entered into a factor analysis. These revised factors were tested for reliability and validity. Among the items to be included in a revised scale for future data collection, three factors emerged: Be, Do and Connect. The Be factor had good reliability and good convergent and criterion validity. The Do factor had good discriminant validity. The Connect factor had adequate reliability and good discriminant and criterion validity. Some age, gender and socioeconomic differentiation was found. The properties of a new scale to measure experienced wellbeing, intended for use by municipal authorities, are described. Happiness Pulse data can be combined with local data on wellbeing conditions to determine what matters for people's wellbeing across a city, and why.
Keywords: city wellbeing, community wellbeing, engaging individuals and communities, measuring wellbeing and happiness
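Reliability of item groups such as Be, Do and Connect is commonly checked with Cronbach's alpha; a small sketch on synthetic item scores is given below (the data and the resulting alpha are not the Happiness Pulse results).

```python
# A small Cronbach's alpha helper of the kind used to check the reliability
# of the Be / Do / Connect item groups; the item scores below are synthetic.
import numpy as np

def cronbach_alpha(items):                 # items: respondents x items matrix
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
be_items = latent + rng.normal(scale=0.6, size=(200, 4))   # 4 correlated items
print("alpha (Be factor, synthetic):", round(cronbach_alpha(be_items), 2))
```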
Procedia PDF Downloads 261
23247 Evaluation of Deformation for Deep Excavations in the Greater Vancouver Area Through Case Studies
Authors: Boris Kolev, Matt Kokan, Mohammad Deriszadeh, Farshid Bateni
Abstract:
Due to the increasing demand for real estate and the need for efficient land utilization in Greater Vancouver, developers have been increasingly considering the construction of high-rise structures with multiple below-grade parking levels. The temporary excavations required to allow for the construction of underground levels have recently reached up to 40 meters in depth. One of the challenges with deep excavations is the prediction of wall displacements and ground settlements, due to their effect on the integrity of city utilities, infrastructure, and adjacent buildings. A large database of survey monitoring data has been collected for deep excavations in various soil conditions and shoring systems. The majority of the data collected is for tie-back anchor and shotcrete lagging systems. The data were categorized and analyzed, and the results were evaluated to find a relationship between displacement and the most dominant parameters controlling it, such as depth of excavation, soil properties, and tie-back anchor loading and arrangement. For a select number of deep excavations, finite element modeling was considered for the analyses. The lateral displacements from the simulation results were compared to the recorded survey monitoring data. The study concludes with a discussion and comparison of the available empirical and numerical modeling methodologies for evaluating lateral displacements in deep excavations.
Keywords: deep excavations, lateral displacements, numerical modeling, shoring walls, tieback anchors
Procedia PDF Downloads 182
23246 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models
Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru
Abstract:
Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, as compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of the total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva numbers were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a logarithmic link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess the models' performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones in all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly when using the January and the February image data. We conclude that reflectance data from RapidEye can be used to estimate stem borer larvae density. The developed models could improve decision making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.
Keywords: maize, stem borers, density, RapidEye, GLM
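A hedged sketch of the count-model comparison described above, fitting Poisson and negative binomial GLMs with a log link in statsmodels on synthetic stand-ins for the spectral predictors; the predictor names, dispersion, and fit statistics are illustrative, not the study's results.

```python
# Hedged sketch of the count-regression comparison described above: stem borer
# larvae counts modelled from spectral predictors with Poisson and negative
# binomial GLMs (log link). Data are synthetic stand-ins for the RapidEye bands.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120
X = sm.add_constant(np.column_stack([rng.normal(0.3, 0.05, n),    # e.g. red-edge
                                     rng.normal(0.6, 0.08, n)]))  # e.g. NIR
mu = np.exp(1.0 + 3.0 * X[:, 1] - 2.0 * X[:, 2])
counts = rng.negative_binomial(n=2, p=2 / (2 + mu))               # overdispersed counts

po = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
nb = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()

for name, fit in [("Poisson", po), ("NegBin", nb)]:
    rmse = np.sqrt(np.mean((counts - fit.fittedvalues) ** 2))
    print(f"{name}: AIC={fit.aic:.1f}, RMSE={rmse:.2f}")
```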
Procedia PDF Downloads 497
23245 Social Construction of Gender: Comparison of Gender Stereotypes among Bureaucrats and Non- Bureaucrats
Authors: Arshad Ali
Abstract:
This study aims to highlight the comparative patterns of the social construction of gender among bureaucrats and non-bureaucrats. For the purpose of this study, a purposive sample of 8 respondents, including both male and female bureaucrats and non-bureaucrats, was collected from Gujranwala and Lahore. The measures for collecting data included an indigenous demographic information sheet and an interview protocol related to gender roles, the social construction of gender, and managerial performance. The collected data were analyzed using NVivo version 11, and the analysis reveals that there are diverse perceptions regarding male and female stereotyping among bureaucrats and non-bureaucrats, as different kinds of social environments lead to the modification of stereotypes. The research contributes to gender studies, specifically in the context of Pakistani society. There are very few studies available, and empirical data about gender construction are scanty, so the study provides an impetus for future research. It is suggested that future research explore the phenomenon at a larger scale, including more respondents and other dimensions, keeping in view the socio-economic factors and the policies of the government regarding the elimination of gender discrimination in Pakistan.
Keywords: social construction, gender, bureaucrats, gender perception
Procedia PDF Downloads 75
23244 Secure E-Voting Using Blockchain Technology
Authors: Barkha Ramteke, Sonali Ridhorkar
Abstract:
An election is an important event in all countries. Traditional voting has several drawbacks, including the expense of time and effort required for tallying and counting results, the cost of papers, arrangements, and everything else required to complete a voting process. Many countries are now considering online e-voting systems, but traditional e-voting systems suffer from a lack of trust. It is not known whether a vote is counted correctly or has been tampered with. A lack of transparency means that the voter has no assurance that his or her vote will be counted as cast in elections. As blockchain technology grows in popularity, electronic voting systems are increasingly using it as an underlying storage mechanism to make the voting process more transparent and to assure data immutability. The transparency feature, on the other hand, may reveal critical information about applicants because all system users have the same entitlement to the data. Furthermore, because of blockchain's pseudo-anonymity, voters' privacy may be revealed, and third parties involved in the voting process, such as registration institutions, may be able to tamper with data. To overcome these difficulties, we apply Ethereum smart contracts to blockchain-based voting systems.
Keywords: blockchain, AMV chain, electronic voting, decentralized
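To illustrate the immutability property that a blockchain ledger brings to recorded votes, a conceptual hash-chained ledger is sketched below in Python (not Solidity, and not the AMV chain or Ethereum contract used in the paper): each block commits to the previous block's hash, so altering a recorded vote invalidates the chain.

```python
# Conceptual sketch (in Python, not Solidity) of the immutability property a
# blockchain ledger gives to recorded votes: each block commits to the hash of
# the previous one, so any tampering breaks the chain.
import hashlib, json, time

def make_block(prev_hash, vote):
    body = {"prev_hash": prev_hash, "vote": vote, "ts": time.time()}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

def chain_is_valid(chain):
    for prev, block in zip(chain, chain[1:]):
        payload = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True

chain = [make_block("genesis", None)]
for ballot in ("candidate_A", "candidate_B", "candidate_A"):
    chain.append(make_block(chain[-1]["hash"], ballot))

print("valid before tampering:", chain_is_valid(chain))
chain[1]["vote"] = "candidate_B"                     # attempt to alter a recorded vote
print("valid after tampering: ", chain_is_valid(chain))
```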
Procedia PDF Downloads 138