Search results for: well data integration
23031 The Role of Japan's Land-Use Planning in Farmland Conservation: A Statistical Study of Tokyo Metropolitan District
Authors: Ruiyi Zhang, Wanglin Yan
Abstract:
Strict land-use plans are issued under the City Planning Act to control urbanization and conserve semi-natural landscapes. Agrarian land in the suburbs has indispensable socio-economic value and contributes to the sustainability of the regional environment. However, the agrarian hinterland of the metropolis is witnessing severe farmland conversion and abandonment, while the contribution of land-use planning to farmland conservation remains unclear in those areas. We hypothesize that the current land-use plan contributes to farmland loss. This research therefore investigated the relationship between farmland loss and land-use planning at the municipality level, to provide base data for zoning in the metropolitan suburbs and to help develop a sustainable land-use plan that conserves the agrarian hinterland. As data and methods: 1) farmland data from the Census of Agriculture and Forestry for 2005 to 2015 and population data for 2015 and 2018 were used to investigate the spatial distribution features of farmland loss in the Tokyo Metropolitan District (TMD) for two periods, 2005-2010 and 2010-2015; 2) the samples were divided by four urbanization factors; 3) DID data and zoning data for 2006 to 2018 were used to specify the urbanization level of zones describing the land-use plan; 4) multiple regression was then conducted between farmland loss (both abandonment and conversion amounts) and the described land-use plan in each urbanization scenario and each period. The results reveal that the land-use plan has an unignorable relation with farmland loss in the metropolitan suburbs at the ward-city-town-village level: 1) urban promotion areas planned larger than necessary and unregulated urbanization promote both farmland conversion and abandonment, and the effect weakens from the inner to the outer suburbs; 2) the effect of the land-use plan on farmland abandonment is more obvious than that on farmland conversion. The study advocates that optimizing the land-use plan will help farmland conservation in the metropolitan suburbs, contributing to sustainable regional policy making.
Keywords: agrarian land resource, land-use planning, urbanization level, multiple regression
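A minimal sketch of the municipality-level regression step, using pandas and statsmodels; the file name and variable names below are hypothetical stand-ins, since the study's exact variable definitions are not given in the abstract.

```python
# A minimal sketch of the municipality-level regression; column names are
# hypothetical placeholders for the study's land-use plan descriptors.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("tmd_municipalities.csv")  # one row per ward/city/town/village

# Dependent variable: farmland loss (abandonment or conversion) in one period.
y = df["farmland_abandonment_2010_2015"]

# Descriptors of the land-use plan, e.g. urban promotion area share and DID coverage.
X = df[["urban_promotion_area_share", "did_area_share", "population_density"]]
X = sm.add_constant(X)  # intercept term

model = sm.OLS(y, X, missing="drop").fit()
print(model.summary())  # coefficients and p-values per urbanization scenario
```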
Procedia PDF Downloads 149
23030 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms in data mining is recognized as being of great value for knowledge discovery in databases. Very often the number of rules generated is high, sometimes even in databases of small volume, so the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated with association algorithms. A computational algorithm was therefore developed using the Weka Application Programming Interface, which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where even the worst case achieved a reduction of more than 50%, considering support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the association rules generated by these algorithms.
Keywords: data mining, association rules, rules reduction, artificial intelligence
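The authors implemented their method against the Weka API; as a rough Python analogue, a sketch with mlxtend shows the same idea of pruning generated rules on support, confidence, and lift (the thresholds and file name are illustrative only).

```python
# Rough Python analogue of rule generation plus threshold-based reduction.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# One-hot encoded transactions (rows = transactions, columns = items).
df = pd.read_csv("transactions_onehot.csv").astype(bool)

frequent = apriori(df, min_support=0.1, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)

before = len(rules)
# Reduction step: keep only rules that also clear a lift threshold.
reduced = rules[rules["lift"] > 1.2]
print(f"rules: {before} -> {len(reduced)} "
      f"({1 - len(reduced) / before:.0%} reduction)")
```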
Procedia PDF Downloads 161
23029 Techno-Economic Assessments of Promising Chemicals from a Sugar Mill Based Biorefinery
Authors: Kathleen Frances Haigh, Mieke Nieder-Heitmann, Somayeh Farzad, Mohsen Ali Mandegari, Johann Ferdinand Gorgens
Abstract:
Lignocellulose can be converted to a range of biochemicals and biofuels. Where it is derived from agricultural waste, issues of competition with food are virtually eliminated. One such source of lignocellulose is the South African sugar industry. Lignocellulose could be accessed through changes to current farming practices and investments in more efficient boilers. The South African sugar industry is struggling due to falling sugar prices and increasing costs, and it is proposed that annexing a biorefinery to a sugar mill will broaden the product range and improve viability. Process simulations of the selected chemicals were generated using Aspen Plus®. It was envisaged that a biorefinery would be annexed to a typical South African sugar mill. Bagasse would be diverted from the existing boilers to the biorefinery and mixed with harvest residues. This biomass would provide the feedstock for the biorefinery and the process energy for the biorefinery and sugar mill. Thus, in all scenarios, a portion of the biomass was diverted to a new, efficient combined heat and power (CHP) plant. The Aspen Plus® simulations provided the mass and energy balance data to carry out an economic assessment of each scenario. The net present value (NPV), internal rate of return (IRR), and minimum selling price (MSP) were calculated for each scenario. As a starting point, scenarios were generated to investigate the production of ethanol, ethanol and lactic acid, ethanol and furfural, butanol, methanol, and Fischer-Tropsch syncrude. The bypass to the CHP plant is a useful indicator of the energy demands of the chemical processes. An iterative approach was used to identify a suitable bypass, because increasing this value has the combined effect of increasing the amount of energy available and reducing the capacity of the chemical plant. Bypass values ranged from 30% for syncrude production to 50% for combined ethanol and furfural production. A hurdle rate of 15.7% was selected for the IRR. The butanol, combined ethanol and furfural, and Fischer-Tropsch syncrude scenarios are unsuitable for investment, with IRRs of 4.8%, 7.5%, and 11.5%, respectively. This provides valuable insights into research opportunities. For example, furfural from sugarcane bagasse is an established process, although the integration of furfural production with ethanol is less well understood. The IRR for the ethanol scenario was 14.7%, which is below the investment criterion, but given its technological maturity it may still be considered for investment. The scenarios that met the investment criteria were the combined ethanol and lactic acid, and the methanol scenarios, with IRRs of 20.5% and 16.7%, respectively. These assessments show that the production of biochemicals from lignocellulose can be commercially viable. In addition, this assessment has provided valuable insights for research to improve the commercial viability of additional chemicals and scenarios. This has led to further assessments of the production of itaconic acid, succinic acid, citric acid, xylitol, polyhydroxybutyrate, polyethylene, glucaric acid, and glutamic acid.
Keywords: biorefineries, sugar mill, methanol, ethanol
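A toy illustration of the investment criteria used (NPV and IRR against the 15.7% hurdle rate), with invented cash flows; in the study these figures follow from the Aspen Plus® mass and energy balances.

```python
# Toy NPV/IRR check against the study's 15.7% hurdle rate; cash flows invented.
import numpy_financial as npf

capex = -250e6            # year-0 investment (hypothetical, USD)
annual_cash_flow = 45e6   # net annual revenue (hypothetical)
years = 20
cash_flows = [capex] + [annual_cash_flow] * years

hurdle_rate = 0.157       # 15.7% IRR hurdle rate from the study
npv = npf.npv(hurdle_rate, cash_flows)
irr = npf.irr(cash_flows)

print(f"NPV at hurdle rate: {npv / 1e6:.1f} M, IRR: {irr:.1%}")
print("invest" if irr >= hurdle_rate else "reject")
```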
Procedia PDF Downloads 197
23028 Using a Robot Companion to Detect and Visualize the Indicators of Dementia Progression and Quality of Life of People Aged 65 and Older
Authors: Jeoffrey Oostrom, Robbert James Schlingmann, Hani Alers
Abstract:
This document presents research into the indicators of dementia progression, the automation of quality-of-life assessments, and their visualization. To do this, the Smart Teddy project was initiated to build a smart companion that both monitors the senior citizen and processes the captured data into an insightful dashboard. With around 50 million diagnoses worldwide, dementia proves again and again to be a bothersome strain on the lives of many individuals, their relatives, and society as a whole. In 2015 it was estimated that dementia care cost 818 billion US dollars globally. The Smart Teddy project aims to take away a portion of the burden from caregivers by automating the collection of certain data, like movement, geolocation, and sound levels. This paper shows that the Smart Teddy has the potential to become a useful tool for caregivers, but it will not be a complete solution. The Smart Teddy still faces some problems in terms of emotional privacy, but its non-intrusive nature, as well as its diversity in usability, can make up for it.
Keywords: dementia care, medical data visualization, quality of life, smart companion
Procedia PDF Downloads 139
23027 The Social Aspects of Code-Switching in Online Interaction: The Case of Saudi Bilinguals
Authors: Shirin Alabdulqader
Abstract:
This research aims to investigate the concept of code-switching (CS) between English and Arabic and the CS practices of Saudi online users, via a Translanguaging (TL) lens, for a more inclusive view of the nature of the data from the study. It employs Digitally Mediated Communication (DMC), specifically the WhatsApp and Twitter platforms, in order to understand how users employ online resources to communicate with others on a daily basis. This project looks beyond language and considers the multimodal affordances (visual and audio means) that interlocutors utilise in their online communicative practices to shape their online social existence. This exploratory study is based on a data-driven interpretivist epistemology, as it aims to understand how meaning (reality) is created by individuals within different contexts. The project used a mixed-method approach, combining qualitative and quantitative methods. In the former, data were collected from online chats and interview responses, while in the latter a questionnaire was employed to understand the frequency of, and relations between, the participants' linguistic and non-linguistic practices and their social behaviours. The participants were eight bilingual Saudi nationals (both men and women, aged between 20 and 50 years old) who interacted with others online. These participants provided their online interactions, participated in an interview, and responded to a questionnaire. The study data were gathered from 194 WhatsApp chats and 122 tweets, and were analysed and interpreted on three levels: conversational turn-taking and CS; the linguistic description of the data; and CS and persona. This project contributes to the emerging field of systematically analysing online Arabic data and to the fields of multimodality and bilingual sociolinguistics. The findings are reported for each of the three levels. For conversational turn-taking, the CS analysis revealed that CS was used to accomplish negotiation and develop meaning in the conversation. With regard to the linguistic practices in the CS data, the majority of the code-switched words were content morphemes. The third level of interpretation concerns CS and its relationship with identity; two types of identity were indexed: absolute identity and contextual identity. The study contributes to the DMC literature and bridges some of the existing gaps. Most of the findings, if not all, support the notion of TL that multiliteracy is one's ability to decode multimodal communication and that this multimodality contributes to meaning, whether the online affordances are used by monolinguals or multilinguals, and whether they are perceived by specific generations or by any online multiliterates. The study also provides the linguistic features of CS utilised by Saudi bilinguals and determines the relationship between these features and the contexts in which they appear.
Keywords: social media, code-switching, translanguaging, online interaction, Saudi bilinguals
Procedia PDF Downloads 131
23026 Predicting the Compressive Strength of Geopolymer Concrete Using Machine Learning Algorithms: Impact of Chemical Composition and Curing Conditions
Authors: Aya Belal, Ahmed Maher Eltair, Maggie Ahmed Mashaly
Abstract:
Geopolymer concrete is gaining recognition as a sustainable alternative to conventional Portland cement concrete due to its environmentally friendly nature, which is a key goal of Smart City initiatives. It has demonstrated its potential as a reliable material for the design of structural elements. However, the production of geopolymer concrete is hindered by batch-to-batch variations, which presents a significant challenge to its widespread adoption. Machine learning has had a profound impact on various fields by enabling models to learn from large datasets and predict outputs accurately. This paper proposes an integration between the current drive towards Artificial Intelligence and the composition of geopolymer mixtures to predict their mechanical properties. The study employs Python to develop a machine learning model, specifically decision trees. The research uses the oxide percentages and the chemical composition of the alkali solution, along with the curing conditions, as the independent input parameters, irrespective of the waste products used in the mixture, with the compressive strength of the mix as the output parameter. The results showed 90% agreement between the predicted and actual values, with the ratio of sodium silicate to sodium hydroxide solution being the dominant parameter in the mixture.
Keywords: decision trees, geopolymer concrete, machine learning, smart cities, sustainability
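A minimal sketch of such a decision-tree model with scikit-learn; the feature names and data file are hypothetical, standing in for the oxide percentages, alkali solution chemistry, and curing conditions described above.

```python
# Decision-tree regression sketch; feature names are hypothetical placeholders.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

df = pd.read_csv("geopolymer_mixes.csv")
features = ["SiO2_pct", "Al2O3_pct", "CaO_pct", "Na2SiO3_to_NaOH_ratio",
            "NaOH_molarity", "curing_temp_C", "curing_hours"]
X, y = df[features], df["compressive_strength_MPa"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
tree = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out mixes:", r2_score(y_te, tree.predict(X_te)))
```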
Procedia PDF Downloads 88
23025 The Challenge of Characterising Drought Risk in Data Scarce Regions: The Case of the South of Angola
Authors: Natalia Limones, Javier Marzo, Marcus Wijnen, Aleix Serrat-Capdevila
Abstract:
In this research, we developed a structured approach for the detection of areas under the highest levels of drought risk that is suitable for data-scarce environments. The methodology is based on recent scientific outcomes and methods and can be easily adapted to different contexts in successive exercises. The research reviews the history of drought in the south of Angola and characterizes the hazard experienced in the episode beginning in 2012, focusing on the meteorological and hydrological drought types. Only global open data from modeling or remote sensing were used to describe the hydroclimatological variables, since there are almost no ground data in this part of the country. The study also portrays the socioeconomic vulnerabilities and the exposure to the phenomenon in the region, to fully understand the risk. As a result, a map of the areas under the highest risk in the south of the country is produced, which is one of the main outputs of this work. It was also possible to confirm that the set of indicators used revealed different drought vulnerability profiles in the south of Angola; as a result, several varieties of priority areas prone to distinctive impacts were recognized. The results demonstrated that most of the region experienced a severe multi-year meteorological drought that triggered an unprecedented exhaustion of the surface water resources, and that the majority of its socioeconomic impacts started soon after the identified onset of these processes.
Keywords: drought risk, exposure, hazard, vulnerability
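The abstract does not name the drought index used; a common way to characterize meteorological drought from a satellite rainfall record such as GSMaP is the Standardized Precipitation Index (SPI), sketched here as an illustrative assumption.

```python
# SPI sketch: fit a gamma distribution to accumulated rainfall and map the
# CDF to standard-normal quantiles. An illustrative assumption, not the
# study's documented index.
import numpy as np
from scipy import stats

def spi(precip, window=12):
    """SPI from a monthly precipitation series (one grid cell or station)."""
    # Accumulate precipitation over the chosen time scale (e.g. 12 months).
    acc = np.convolve(precip, np.ones(window), mode="valid")
    # Fit a gamma distribution to the accumulations, then standardize.
    shape, loc, scale = stats.gamma.fit(acc[acc > 0], floc=0)
    cdf = stats.gamma.cdf(acc, shape, loc=loc, scale=scale)
    return stats.norm.ppf(np.clip(cdf, 1e-6, 1 - 1e-6))

monthly_rain = np.random.gamma(2.0, 40.0, size=240)  # stand-in for GSMaP data
print("months with SPI < -1.5 (severe):", (spi(monthly_rain) < -1.5).sum())
```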
Procedia PDF Downloads 191
23024 Charting Sentiments with Naive Bayes and Logistic Regression
Authors: Jummalla Aashrith, N. L. Shiva Sai, K. Bhavya Sri
Abstract:
The swift progress of web technology has not only amassed a vast reservoir of internet data but also triggered a substantial surge in data generation. The internet has metamorphosed into one of the dynamic hubs for online education, idea dissemination, as well as opinion-sharing. Notably, the widely utilized social networking platform Twitter is experiencing considerable expansion, providing users with the ability to share viewpoints, participate in discussions spanning diverse communities, and broadcast messages on a global scale. The upswing in online engagement has sparked a significant curiosity in subjective analysis, particularly when it comes to Twitter data. This research is committed to delving into sentiment analysis, focusing specifically on the realm of Twitter. It aims to offer valuable insights into deciphering information within tweets, where opinions manifest in a highly unstructured and diverse manner, spanning a spectrum from positivity to negativity, occasionally punctuated by neutrality expressions. Within this document, we offer a comprehensive exploration and comparative assessment of modern approaches to opinion mining. Employing a range of machine learning algorithms such as Naive Bayes and Logistic Regression, our investigation plunges into the domain of Twitter data streams. We delve into overarching challenges and applications inherent in the realm of subjectivity analysis over Twitter.
Keywords: machine learning, sentiment analysis, visualisation, python
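A compact sketch of the comparison described, using scikit-learn on a toy stand-in for the labelled tweet corpus; the real pipeline's preprocessing of Twitter data streams is omitted.

```python
# Naive Bayes vs Logistic Regression on toy labelled "tweets".
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

tweets = ["love this new phone", "great service and friendly staff",
          "worst traffic jam ever", "the app keeps crashing, awful",
          "the meeting is at noon", "it is raining today"]
labels = ["positive", "positive", "negative", "negative", "neutral", "neutral"]

X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(tweets)
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.33,
                                          random_state=0)

for clf in (MultinomialNB(), LogisticRegression(max_iter=1000)):
    clf.fit(X_tr, y_tr)
    print(type(clf).__name__, accuracy_score(y_te, clf.predict(X_te)))
```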
Procedia PDF Downloads 56
23023 Sustainability in Hospitality: An Inevitable Necessity in New Age with Big Environmental Challenges
Authors: Majid Alizadeh, Sina Nematizadeh, Hassan Esmailpour
Abstract:
The mutual effects of hospitality and the environment are undeniable: the tourism industry has major harmful effects on the environment, and hotels, as one of the most important pillars of the hospitality industry, contribute significantly to them. Green marketing is a promising strategy in response to the growing concerns about the environment. A green hotel marketing model was proposed using a grounded theory approach in the hotel industry. The study was carried out as a mixed-method study. Data gathering in the qualitative phase was done through a literature review and in-depth, semi-structured interviews with 10 experts in green marketing, selected using the snowball technique. Following primary analysis, open, axial, and selective coding was performed on the data, which yielded 69 concepts, 18 categories, and six dimensions. The green hotel (green product) was adopted as the core phenomenon. In the quantitative phase, data were gleaned from 384 questionnaires filled out by hotel guests, and descriptive statistics and structural equation modeling (SEM) were used for data analysis. The results indicated that the mediating role of behavioral response between ecological literacy, trust, the marketing mix, and performance was significant. The green marketing mix, as a strategy, had a significant and positive effect on guests' behavioral response, corporate green image, and the financial and environmental performance of hotels.
Keywords: green marketing, sustainable development, hospitality, grounded theory, structural equations model
Procedia PDF Downloads 81
23022 The Potential Impact of Big Data Analytics on Pharmaceutical Supply Chain Management
Authors: Maryam Ziaee, Himanshu Shee, Amrik Sohal
Abstract:
Big Data Analytics (BDA) in supply chain management has recently drawn the attention of academics and practitioners. Big data refers to a massive amount of data from different sources, in different formats, generated at high speed through transactions in business environments and supply chain networks. Traditional statistical tools and techniques find it difficult to analyse this massive data. BDA can assist organisations to capture, store, and analyse data, specifically in the field of supply chain. Currently, there is a paucity of research on BDA in the pharmaceutical supply chain context. In this research, the Australian pharmaceutical supply chain was selected as the case study. This industry is highly significant, since the right medicine must reach the right patients, at the right time, in the right quantity, in good condition, and at the right price to save lives. However, drug shortages remain a substantial problem for hospitals across Australia, with implications for patient care, staff resourcing, and expenditure. Furthermore, a massive volume and variety of data is generated at high speed from multiple sources in the pharmaceutical supply chain, which needs to be captured and analysed to benefit operational decisions at every stage of supply chain processes. As the pharmaceutical industry lags behind other industries in using BDA, this raises the question of whether the use of BDA can improve transparency across the pharmaceutical supply chain by enabling the partners to make informed decisions across their operational activities. This presentation explores the impacts of BDA on supply chain management. An exploratory qualitative approach was adopted to analyse data collected through interviews. The study also explores the potential of BDA in the whole pharmaceutical supply chain rather than focusing on a single entity. Twenty semi-structured interviews were undertaken with top managers in fifteen organisations (five pharmaceutical manufacturers, five wholesalers/distributors, and five public hospital pharmacies) to investigate their views on the use of BDA. The findings revealed that BDA can give pharmaceutical entities improved visibility over the whole supply chain and the market; it enables entities, especially manufacturers, to monitor consumption and the demand rate in real time and make accurate demand forecasts, which reduces drug shortages. Timely and precise decision-making can allow the entities to source and manage their stocks more effectively. This can likely address the drug demand at hospitals and respond to unanticipated issues such as drug shortages. Earlier studies explored BDA in the context of clinical healthcare; this presentation, however, investigates the benefits of BDA in the Australian pharmaceutical supply chain. Furthermore, this research enhances managers' insight into the potential of BDA at every stage of supply chain processes and helps to improve decision-making in their supply chain operations. The findings will turn the rhetoric of data-driven decision-making into a reality, where managers may opt for analytics for improved decision-making in supply chain processes.
Keywords: big data analytics, data-driven decision, pharmaceutical industry, supply chain management
Procedia PDF Downloads 106
23021 Clutter Suppression Based on Singular Value Decomposition and Fast Wavelet Algorithm
Authors: Ruomeng Xiao, Zhulin Zong, Longfa Yang
Abstract:
Aiming at the problem that the target signal is difficult to detect in a strong ground clutter environment, this paper proposes a clutter suppression algorithm based on the combination of singular value decomposition and the Mallat fast wavelet algorithm. The method first carries out singular value decomposition on the radar echo data matrix and realizes the initial separation of target and clutter through threshold processing of the singular values. It then carries out wavelet decomposition on the echo data to locate the target, adopting the discard method to select the appropriate decomposition layer for reconstructing the target signal, which ensures minimum loss of target information while suppressing the clutter. Verification on measured data shows that the method has a significant effect on target extraction under low SCR, that target reconstruction can be realized without prior position information of the target, and that the method also provides a certain enhancement of the output SCR compared with the traditional single-wavelet processing method.
Keywords: clutter suppression, singular value decomposition, wavelet transform, Mallat algorithm, low SCR
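A simplified sketch of the two-stage scheme using NumPy and PyWavelets: zero the dominant singular values (the clutter subspace), then keep only a selected wavelet decomposition layer. The thresholds, wavelet choice, and layer selection below are placeholders, not the paper's tuned values.

```python
# SVD clutter-subspace removal followed by a Mallat-style wavelet
# decomposition with the "discard method" on detail layers.
import numpy as np
import pywt

def suppress_clutter(echo, n_clutter=2, wavelet="db4", level=3):
    # Stage 1: strong ground clutter concentrates in the largest singular
    # values, so zero those components of the echo data matrix.
    U, s, Vt = np.linalg.svd(echo, full_matrices=False)
    s[:n_clutter] = 0.0
    residue = (U * s) @ Vt

    # Stage 2: wavelet-decompose each range line and discard the finer
    # detail layers before reconstructing the target signal.
    out = np.empty_like(residue)
    for i, line in enumerate(residue):
        coeffs = pywt.wavedec(line, wavelet, level=level)
        coeffs[2:] = [np.zeros_like(c) for c in coeffs[2:]]  # discard method
        out[i] = pywt.waverec(coeffs, wavelet)[: line.size]
    return out

echo = np.random.randn(64, 256)  # stand-in for a radar echo data matrix
cleaned = suppress_clutter(echo)
```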
Procedia PDF Downloads 118
23020 Winter Wheat Yield Forecasting Using Sentinel-2 Imagery at the Early Stages
Authors: Chunhua Liao, Jinfei Wang, Bo Shan, Yang Song, Yongjun He, Taifeng Dong
Abstract:
Winter wheat is one of the main crops in Canada. Forecasting the within-field variability of winter wheat yield at the early stages is essential for precision farming. However, crop yield modelling based on high-spatial-resolution satellite data is generally affected by the lack of continuous satellite observations, which reduces the generalization ability of the models and increases the difficulty of crop yield forecasting at the early stages. In this study, the correlations between Sentinel-2 data (vegetation indices and reflectance) and yield data collected by combine harvester were investigated, and a generalized multivariate linear regression (MLR) model was built and tested with data acquired in different years. It was found that the four-band reflectance (blue, green, red, near-infrared) performed better than the vegetation indices (NDVI, EVI, WDRVI, and OSAVI) in wheat yield prediction. The optimum phenological stage for wheat yield prediction with the highest accuracy was the growing stage from the end of flowering to the beginning of the filling stage. The best MLR model was therefore built to predict wheat yield before harvest using Sentinel-2 data acquired at the end of the flowering stage. Further, to improve yield prediction at the early stages, three simple unsupervised domain adaptation (DA) methods were adopted to transform the reflectance data at the early stages to the optimum phenological stage. Winter wheat yield prediction using multiple vegetation indices showed higher accuracy than using a single vegetation index. The optimum stage for winter wheat yield forecasting varied between fields when using vegetation indices, while it was consistent when using multispectral reflectance, where the optimum stage was the end of the flowering stage. The average testing RMSE of the MLR model at the end of the flowering stage was 604.48 kg/ha. Near the booting stage, the average testing RMSE of yield prediction using the best MLR was reduced to 799.18 kg/ha when applying the mean matching domain adaptation approach to transform the data to the target domain (the end of flowering), compared to an RMSE of 1140.64 kg/ha using the original data with models developed at the booting stage directly ('MLR at the early stage'). This study demonstrated that simple mean matching (MM) performed better than the other DA methods and that 'DA then MLR at the optimum stage' outperformed 'MLR directly at the early stages' for winter wheat yield forecasting, indicating that simple domain adaptation methods have great potential for near real-time crop yield forecasting at the early stages using remote sensing data.
Keywords: wheat yield prediction, domain adaptation, Sentinel-2, within-field scale
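A sketch of "DA then MLR at the optimum stage" with the mean matching idea: shift each early-stage band so its mean matches the flowering-stage distribution, then apply the flowering-stage regression. The data below are synthetic stand-ins for the Sentinel-2 reflectance and combine-harvester yields.

```python
# Mean matching DA followed by the flowering-stage MLR; synthetic data only.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
# Four-band Sentinel-2 reflectance (blue, green, red, NIR) per pixel.
X_flowering = rng.normal([0.04, 0.07, 0.05, 0.35], 0.01, size=(500, 4))
yield_kg_ha = X_flowering @ [5e3, 8e3, -9e3, 1.2e4] + rng.normal(0, 300, 500)

mlr = LinearRegression().fit(X_flowering, yield_kg_ha)  # optimum-stage model

# Early-stage (booting) observations with a systematic spectral offset.
X_booting = rng.normal([0.05, 0.09, 0.07, 0.25], 0.01, size=(500, 4))

# Mean matching: move each band's mean onto the flowering-stage mean.
X_adapted = X_booting - X_booting.mean(0) + X_flowering.mean(0)
early_forecast = mlr.predict(X_adapted)  # early-stage yield forecast
```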
Procedia PDF Downloads 64
23019 Efficient Delivery of Biomaterials into Living Organism by Using Noble Metal Nanowire Injector
Authors: Kkochorong Park, Keun Cheon Kim, Hyoban Lee, Eun Ju Lee, Bongsoo Kim
Abstract:
The introduction of biomaterials such as DNA, RNA, and proteins is important for many research areas. There are many methods to introduce biomaterials into living organisms, such as tissues and cells. To introduce biomaterials, several indirect methods have been used, including virus-mediated delivery, chemical reagents (i.e., lipofectamine), and electrophoresis. Such methods are passive deliveries relying on the endocytosis process of the cell, which reduces the efficiency of delivery. Unlike indirect delivery methods, it has been reported that the direct delivery of exogenous biomolecules into the nucleus is more efficient for the expression or integration of biomolecules. Nano-sized materials are beneficial for detecting signals from cells or delivering stimuli/materials into cells at the cellular and molecular levels, due to their similar physical scale. In particular, because one-dimensional (1D) nanomaterials such as nanotubes, nanorods, and nanowires with high aspect ratios have nanoscale geometry and excellent mechanical, electrical, and chemical properties, they can play an important role in molecular and cellular biology. In this study, using single-crystalline 1D noble metal nanowires, we fabricated nano-sized 1D injectors that can successfully interface with living cells and directly deliver biomolecules into several types of cell lines (i.e., stem cells, mammalian embryos) without inducing detrimental damage to the living cells. This nano-bio technology could be a promising and robust tool for introducing exogenous biomaterials into living organisms.
Keywords: DNA, gene delivery, nanoinjector, nanowire
Procedia PDF Downloads 275
23018 Characterization of Agroforestry Systems in Burkina Faso Using an Earth Observation Data Cube
Authors: Dan Kanmegne
Abstract:
Africa will become the most populated continent by the end of the century, with around 4 billion inhabitants. Food security and climate change will become continental issues, since agricultural practices depend on climate but also contribute to global emissions and land degradation. Agroforestry has been identified as a cost-efficient and reliable strategy to address these two issues. It is defined as the integrated management of trees and crops/animals in the same land unit. Agroforestry provides benefits in terms of goods (fruits, medicine, wood, etc.) and services (windbreaks, fertility, etc.), and is acknowledged to have great potential for carbon sequestration; it can therefore be integrated into mechanisms for reducing carbon emissions. In sub-Saharan Africa in particular, the constraint is the lack of information, at the country level, about both the areas under agroforestry and the characterization (composition, structure, and management) of each agroforestry system. This study describes and quantifies 'what is where?', as a step prior to the quantification of carbon stocks in different systems. Remote sensing (RS) is the most efficient approach to map such a dynamic technology as agroforestry, since it gives relatively adequate and consistent information over a large area at nearly no cost. RS data fulfill the good practice guidelines of the Intergovernmental Panel on Climate Change (IPCC) for use in carbon estimation. Satellite data are getting more and more accessible, and the archives are growing exponentially. To retrieve useful information that supports decision-making from this large amount of data, satellite data need to be organized to ensure fast processing, quick accessibility, and ease of use. One new solution is a data cube, which can be understood as a multi-dimensional stack (space, time, data type) of spatially aligned pixels, used for efficient access and analysis. A data cube for Burkina Faso has been set up by the cooperation project between the international service provider WASCAL and Germany, which provides an accessible exploitation architecture for multi-temporal satellite data. The aim of this study is to map and characterize agroforestry systems using the Burkina Faso earth observation data cube. The approach, in its initial stage, is based on an unsupervised image classification of a normalized difference vegetation index (NDVI) time series from 2010 to 2018, to stratify the country based on vegetation. Fifteen strata were identified, and four samples per location were randomly assigned to define the sampling units. For safety reasons, the northern part will not be part of the fieldwork. A total of 52 locations will be visited by the end of the dry season in February-March 2020. The field campaigns will consist of identifying and describing different agroforestry systems, plus qualitative interviews. A multi-temporal supervised image classification will be done with a random forest algorithm, and the field data will be used both for training the algorithm and for accuracy assessment. The expected outputs are (i) map(s) of agroforestry dynamics, (ii) characteristics of the different systems (main species, management, area, etc.), and (iii) an assessment report of the Burkina Faso data cube.
Keywords: agroforestry systems, Burkina Faso, earth observation data cube, multi-temporal image classification
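A condensed version of the stratification step: cluster pixels on their NDVI time series into fifteen strata with k-means. The array shapes stand in for a data-cube extract; the actual cube access API is not shown.

```python
# Unsupervised stratification of an NDVI time-series stack with k-means.
import numpy as np
from sklearn.cluster import KMeans

# ndvi: (rows, cols, time) stack, e.g. monthly composites for 2010-2018.
ndvi = np.random.rand(100, 100, 108).astype("float32")  # stand-in data
pixels = ndvi.reshape(-1, ndvi.shape[-1])  # one NDVI profile per pixel

strata = KMeans(n_clusters=15, n_init=10, random_state=0).fit_predict(pixels)
strata_map = strata.reshape(ndvi.shape[:2])  # 15 vegetation strata, as in text
```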
Procedia PDF Downloads 145
23017 Artificial Intelligence for Traffic Signal Control and Data Collection
Authors: Reggie Chandra
Abstract:
Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized; the reason behind that is insufficient resources to create and implement timing plans. In this work, we discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect accurate traffic data 24/7/365 using a vehicle detection system. We discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection, how timing plans are created, and the best workflow for that. Apart from that, this paper showcases how Artificial Intelligence makes signal timing affordable. We introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain: it consists of millions of densely connected processing nodes. It is a form of machine learning where the neural net learns to recognize vehicles through training, which is called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but also, in cases such as classifying objects into fine-grained categories, outperform humans. Safety is of primary importance to traffic professionals, but they don't have the studies or data to support their decisions; currently, one-third of transportation agencies do not collect pedestrian and bike data. We discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, snapshots of limited handpicked data, and multiple systems requiring additional adaptation work. The methodology used and proposed in this research includes a camera-model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets, acquired under a variety of daily real-world road conditions, and compared with the performance of the commonly used methods, which require collecting data by counting, evaluating and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by Artificial Intelligence can benefit your community, and how to translate the complex and often overwhelming benefits into a language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed.
Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal
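An illustrative Keras CNN of the kind described for classifying road users from camera frames; the deployed detector's architecture, classes, and training data are not specified in the text, so everything below is an assumption.

```python
# Small CNN classifier for road-user categories; illustrative only.
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(96, 96, 3)),          # camera frame crop
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(4, activation="softmax"),    # car / truck / pedestrian / bike
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(frames, labels, ...) would then learn the detection classes.
```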
Procedia PDF Downloads 169
23016 Alpha: A Groundbreaking Avatar Merging User Dialogue with OpenAI's GPT-3.5 for Enhanced Reflective Thinking
Authors: Jonas Colin
Abstract:
Standing at the vanguard of AI development, Alpha represents an unprecedented synthesis of logical rigor and human abstraction, meticulously crafted to mirror the user's unique persona and personality, a feat previously unattainable. Alpha, an avant-garde artefact in the realm of artificial intelligence, epitomizes a paradigmatic shift in personalized digital interaction, amalgamating user-specific dialogic patterns with the sophisticated algorithmic prowess of OpenAI's GPT-3.5 to engender a platform for enhanced metacognitive engagement and an individualized user experience. Underpinned by a sophisticated algorithmic framework, Alpha integrates vast datasets through a complex interplay of neural network models and symbolic AI, facilitating a dynamic, adaptive learning process. This integration enables the system to construct a detailed user profile, encompassing linguistic preferences, emotional tendencies, and cognitive styles, tailoring interactions to align with individual characteristics and conversational contexts. Furthermore, Alpha incorporates advanced metacognitive elements, enabling real-time reflection and adaptation in communication strategies. This self-reflective capability ensures continuous refinement of its interaction model, positioning Alpha not just as a technological marvel but as a harbinger of a new era in human-computer interaction, where machines engage with us on a deeply personal and cognitive level, transforming our interaction with the digital world.
Keywords: chatbot, GPT-3.5, metacognition, symbiosis
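A minimal sketch of grounding a reply in a stored persona summary via OpenAI's chat API (shown in the legacy 0.x SDK call style); the profile string and prompts are invented, and Alpha's actual profiling, dataset integration, and metacognitive loop are far richer than this.

```python
# Persona-conditioned GPT-3.5 call; legacy openai 0.x SDK style, invented prompts.
import openai

openai.api_key = "YOUR_KEY"  # placeholder

user_profile = ("Prefers concise answers, reflective questions, "
                "and examples drawn from music.")  # learned persona summary

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": f"Mirror this user's persona when replying: {user_profile}"},
        {"role": "user", "content": "Help me think through my career change."},
    ],
)
print(response.choices[0].message["content"])
```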
Procedia PDF Downloads 70
23015 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior
Authors: Juliana A. Knocikova
Abstract:
Methods of nonlinear signal analysis are based on the finding that random behavior can arise in deterministic nonlinear systems with only a few degrees of freedom. In dynamical systems, entropy is usually understood as a rate of information production. Changes in the temporal dynamics of physiological data indicate the evolution of the system in time, and thus a level of new signal pattern generation. During the last decades, many algorithms were introduced to assess patterns of physiological responses to an external stimulus. However, reflex responses are usually characterized by short periods of time, which represents a great limitation for the usual methods of nonlinear analysis. To solve the problem of short recordings, the approximate entropy parameter was introduced as a measure of system complexity. A low value of this parameter reflects regularity and predictability in the analyzed time series. On the other hand, an increase in this parameter means unpredictability and random behavior, hence higher system complexity. Reduced complexity of neurophysiological data has been observed repeatedly when analyzing electroneurogram and electromyogram activity during defence reflex responses. Quantitative phrenic neurogram changes are also obvious during severe hypoxia, as well as during airway reflex episodes. In conclusion, the approximate entropy parameter serves as a convenient tool for the analysis of reflex behavior characterized by short-lasting time series.
Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex
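A small implementation of approximate entropy in the spirit of Pincus' ApEn, suitable for short recordings; m = 2 and r = 0.2·SD are conventional choices, not necessarily the settings used in the studies cited.

```python
# Approximate entropy (ApEn): phi(m) - phi(m+1) over pattern similarity counts.
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()  # tolerance as a fraction of the signal's SD

    def phi(m):
        n = len(x) - m + 1
        patterns = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of m-length patterns.
        dist = np.max(np.abs(patterns[:, None] - patterns[None, :]), axis=2)
        c = (dist <= r).mean(axis=1)  # fraction of similar patterns
        return np.log(c).mean()

    return phi(m) - phi(m + 1)

regular = np.sin(np.linspace(0, 20 * np.pi, 300))   # predictable series
random_like = np.random.randn(300)                  # high-complexity series
print(approximate_entropy(regular), "<", approximate_entropy(random_like))
```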
Procedia PDF Downloads 300
23014 A Generalized Sparse Bayesian Learning Algorithm for Near-Field Synthetic Aperture Radar Imaging: By Exploiting Impropriety and Noncircularity
Authors: Pan Long, Bi Dongjie, Li Xifeng, Xie Yongle
Abstract:
Near-field synthetic aperture radar (SAR) imaging is an advanced nondestructive testing and evaluation (NDT&E) technique. This paper investigates the complex-valued signal processing related to the near-field SAR imaging system, where the measurement data turn out to be noncircular and improper, meaning that the complex-valued data are correlated with their complex conjugate. Furthermore, we discover that the degree of impropriety of the measurement data and that of the target image can be highly correlated in near-field SAR imaging. Based on these observations, a modified generalized sparse Bayesian learning algorithm is proposed, taking impropriety and noncircularity into account. Numerical results show that the proposed algorithm provides a performance gain with the help of the noncircular assumption on the signals.
Keywords: complex-valued signal processing, synthetic aperture radar, 2-D radar imaging, compressive sensing, sparse Bayesian learning
Procedia PDF Downloads 132
23013 Application of Building Information Modeling in Energy Management of Individual Departments Occupying University Facilities
Authors: Kung-Jen Tu, Danny Vernatha
Abstract:
To assist individual departments within universities in their energy management tasks, this study explores the application of Building Information Modeling in establishing the 'BIM-based Energy Management Support System' (BIM-EMSS). The BIM-EMSS consists of six components: (1) sensors installed for each occupant and each piece of equipment; (2) electricity sub-meters (constantly logging the lighting, HVAC, and socket electricity consumption of each room); (3) BIM models of all rooms within individual departments' facilities; (4) a data warehouse (for storing occupancy status and logged electricity consumption data); (5) a building energy management system that provides energy managers with various energy management functions; and (6) an energy simulation tool (such as eQuest) that generates real-time 'standard energy consumption' data against which 'actual energy consumption' data are compared and energy efficiency evaluated. Through the building energy management system, the energy manager is able to (a) have a 3D visualization (BIM model) of each room, in which the occupancy and equipment status detected by the sensors and the logged electricity consumption data are displayed constantly; (b) perform real-time energy consumption analysis to compare the actual and standard energy consumption profiles of a space; (c) obtain energy consumption anomaly detection warnings for certain rooms so that corrective energy management actions can be taken (a data mining technique is employed to analyze the relation between the space occupancy pattern and the current equipment setting to indicate an anomaly, such as when appliances turn on without occupancy); and (d) perform historical energy consumption analysis to review monthly and annual energy consumption profiles and compare them against historical profiles. The BIM-EMSS was further implemented in a research lab in the Department of Architecture of NTUST in Taiwan, and implementation results are presented to illustrate how it can be used to assist individual departments within universities in their energy management tasks.
Keywords: database, electricity sub-meters, energy anomaly detection, sensor
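A simplified version of the anomaly rules in (c): compare the logged consumption of a room against the simulated standard profile and flag large deviations or load without occupancy. The file and column names are illustrative, not the BIM-EMSS schema.

```python
# Actual-vs-standard consumption comparison with two simple anomaly rules.
import pandas as pd

log = pd.read_csv("room_305_hourly.csv")            # occupancy flag + sub-meter kWh
standard = pd.read_csv("room_305_equest_standard.csv")  # eQuest simulation output

merged = log.merge(standard, on="hour", suffixes=("_actual", "_standard"))
merged["deviation"] = ((merged["kwh_actual"] - merged["kwh_standard"])
                       / merged["kwh_standard"])

# Rule 1: consumption far above the simulated baseline.
over = merged[merged["deviation"] > 0.5]
# Rule 2: appliances drawing power while the room is unoccupied.
ghost_load = merged[(merged["occupied"] == 0) & (merged["kwh_actual"] > 0.1)]
print(len(over), "over-consumption hours;",
      len(ghost_load), "unoccupied-load hours")
```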
Procedia PDF Downloads 307
23012 Comparison of Different Machine Learning Models for Time-Series Based Load Forecasting of Electric Vehicle Charging Stations
Authors: H. J. Joshi, Satyajeet Patil, Parth Dandavate, Mihir Kulkarni, Harshita Agrawal
Abstract:
As the world looks towards a sustainable future, electric vehicles have become increasingly popular. Millions worldwide are looking to switch from the previously favored combustion-engine cars to electric cars. This demand has driven an increase in electric vehicle charging stations. The big challenge is that the randomness of electrical energy demand makes it tough for these charging stations to provide an adequate amount of energy over a specific amount of time. Thus, it has become increasingly crucial to model these patterns and forecast the energy needs of charging stations. This paper aims to analyze how different machine learning models perform on electric vehicle charging time-series data. The data set consists of authentic electric vehicle data from the Netherlands, covering ten thousand transactions from public stations operated by EVnetNL.
Keywords: forecasting, smart grid, electric vehicle load forecasting, machine learning, time series forecasting
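A skeleton of such a model comparison on lag features; the EVnetNL transactions would first have to be aggregated into an hourly load series (not shown), and the file name is a placeholder.

```python
# Compare two regressors on lagged hourly load; chronological train/test split.
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

load = pd.read_csv("evnetnl_hourly_load.csv", parse_dates=["time"],
                   index_col="time")["kwh"]

# Lag features: the previous 24 hours predict the next hour.
frame = pd.concat({f"lag_{h}": load.shift(h) for h in range(1, 25)}, axis=1)
frame["target"] = load
frame = frame.dropna()

split = int(len(frame) * 0.8)  # chronological split, no shuffling
train, test = frame.iloc[:split], frame.iloc[split:]

for model in (LinearRegression(),
              RandomForestRegressor(n_estimators=200, random_state=0)):
    model.fit(train.drop(columns="target"), train["target"])
    pred = model.predict(test.drop(columns="target"))
    print(type(model).__name__, mean_absolute_error(test["target"], pred))
```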
Procedia PDF Downloads 106
23011 Designing Floor Planning in 2D and 3D with an Efficient Topological Structure
Authors: V. Nagammai
Abstract:
Very-large-scale integration (VLSI) is the process of creating an integrated circuit (IC) by combining thousands of transistors into a single chip. Advances in technology increase the complexity of IC manufacturing, which may vary the power consumption and increase the size and latency period. A topology defines the number of connections between nodes in a network. In this project, the NoC topology is generated using the Atlas tool, which increases performance and, in turn, makes the determination of constraints effective. Routing is performed by the XY routing algorithm with wormhole flow control. In NoC topology generation, the values of power, area, and latency are predetermined. In previous work, placement, routing, and shortest-path evaluation were performed using an algorithm called floor planning with cluster reconstruction and path allocation algorithm (FCRPA), with four 3x3 switches, six 4x4 switches, and two 5x5 switches. The use of the 4x4 and 5x5 switches increases the power consumption and area of the block. To avoid this problem, this paper uses one 8x8 switch and four 3x3 switches. The paper uses IPRCA, which consists of three steps: placement, clustering, and shortest-path evaluation. Placement is performed using min-cut placement, clustering is performed using an algorithm called cluster generation, and the shortest path is evaluated using Dijkstra's algorithm. The power consumption of each block is determined. The experimental results show that the area, power, and wire length improved simultaneously.
Keywords: application specific noc, b* tree representation, floor planning, t tree representation
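The shortest-path evaluation step, shown as a plain Dijkstra over a small switch graph; the edge weights are invented and could encode wire length or hop latency in the real floorplan.

```python
# Dijkstra's algorithm over an adjacency-list switch graph; weights invented.
import heapq

def dijkstra(graph, source):
    dist = {node: float("inf") for node in graph}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue  # stale heap entry
        for v, w in graph[u]:
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist

# One 8x8 switch ("S0") fanning out to four 3x3 switches, as in the paper.
noc = {
    "S0": [("A", 2), ("B", 2), ("C", 3), ("D", 3)],
    "A": [("S0", 2)], "B": [("S0", 2)], "C": [("S0", 3)], "D": [("S0", 3)],
}
print(dijkstra(noc, "A"))  # hop costs from switch A to every other switch
```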
Procedia PDF Downloads 393
23010 Development of a Miniature and Low-Cost IoT-Based Remote Health Monitoring Device
Authors: Sreejith Jayachandran, Mojtaba Ghods, Morteza Mohammadzaheri
Abstract:
The modern busy world runs on new embedded technologies based on computers and software; meanwhile, some people forget about their health condition and regular medical check-ups. Some postpone medical check-ups due to a lack of time and convenience, while others skip these regular evaluations and medical examinations because of huge medical bills and hospital expenses. Engineers and medical experts have come together to create a new telemonitoring device capable of monitoring, checking, and evaluating the health status of the human body remotely through the internet, for the needs of all kinds of people. The remote health monitoring device is a microcontroller-based embedded unit. Various types of sensors in this device are connected to the human body, and with the help of an Arduino UNO board, the required analogue data are collected from the sensors. The microcontroller on the Arduino board processes the collected analogue data into digital data and transfers that information to the cloud, where it is stored; the processed digital data are instantly displayed on the LCD attached to the machine. By accessing the cloud storage with a username and password, the patient's healthcare team/doctors and other health staff can collect these data for assessment and follow-up, and family members/guardians can use and evaluate them for awareness of the patient's current health status. Moreover, the system is connected to a Global Positioning System (GPS) module; in emergencies, the care team can locate the patient or the person wearing this device. The setup continuously evaluates and transfers the data to the cloud, and the user can prefix a normal value range for the evaluation. For example, the normal blood pressure value is universally prefixed as 80/120 mmHg. Similarly, the RHMS allows fixing the ranges of values referred to as normal coefficients. This IoT-based miniature system (11×10×10 cm³), with a low weight of only 500 g, consumes just 10 mW. The smart monitoring system is manufactured for 100 GBP and can be used not only in health systems but also for numerous other applications, including the aerospace and transportation sectors.
Keywords: embedded technology, telemonitoring system, microcontroller, Arduino UNO, cloud storage, global positioning system, remote health monitoring system, alert system
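A sketch of the cloud-side evaluation against prefixed normal ranges, such as the 80/120 mmHg blood pressure band cited above; the parameter names and the other thresholds are illustrative.

```python
# Check incoming readings against prefixed normal ranges and raise alerts.
NORMAL_RANGES = {
    "systolic_mmHg": (80, 120),   # band cited in the abstract
    "heart_rate_bpm": (60, 100),  # illustrative threshold
    "spo2_pct": (95, 100),        # illustrative threshold
}

def evaluate(reading: dict) -> list[str]:
    alerts = []
    for key, (low, high) in NORMAL_RANGES.items():
        value = reading.get(key)
        if value is not None and not low <= value <= high:
            alerts.append(f"{key}={value} outside [{low}, {high}]")
    return alerts

sample = {"systolic_mmHg": 135, "heart_rate_bpm": 88, "spo2_pct": 97}
print(evaluate(sample))  # -> ['systolic_mmHg=135 outside [80, 120]']
```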
Procedia PDF Downloads 89
23009 A Study of Tourists Satisfaction and Behavior Strategies Case Study: International Tourists in Chatuchak Weekend Market
Authors: Weera Weerasophon
Abstract:
The purpose of this research was to study tourist satisfaction strategies, in the case of tourists who attended and shopped in Chatuchak weekend market (Bangkok), in order to improve the service operation of Chatuchak weekend market to serve tourists' needs and impress them. The researcher used the marketing mix as the main factor that affects tourist satisfaction. The research was designed as quantitative research: 400 questionnaires were used to collect data from international tourists around Chatuchak weekend market. The questionnaire was divided into three parts: personal information, satisfaction with marketing/services and facilities, and suggestions. After collection, the data were processed in the SPSS statistical program for later analysis. The results show that most international tourists rated Chatuchak weekend market at level 4 (more satisfied), for example with the friendly staff, Chatuchak information, product prices, facilities, and service; moreover, the environment of Chatuchak weekend market received the highest satisfaction level.
Keywords: Chatuchak, satisfaction, Thailand tourism, marketing mix, tourists
Procedia PDF Downloads 360
23008 Firm Performance and Stock Price in Nigeria
Authors: Tijjani Bashir Musa
Abstract:
The recent global crisis, which suddenly resulted in the Nigerian stock market crash, revealed some peculiarities of Nigerian firms. Some firms in Nigeria are performing well, but their stock prices are not increasing, while some firms are on the brink of collapse, yet their stock prices are increasing. Thus, this study examines the relationship between firm performance and stock price in Nigeria. The study covered the period 2005 to 2009. This period was one of stock boom and also marked the stock market crash resulting from the global financial meltdown. The study is a panel study. A total of 140 firms were sampled from the 216 firms listed on the Nigerian Stock Exchange (NSE). Data were collected from secondary sources and divided into four strata: the most performing stocks, the least performing stocks, the most performing firms, and the least performing firms. Each stratum contains 35 firms with the corresponding characteristic. Multiple linear regression models were used to analyse the data, and the Stata 11.0 statistical/econometric package was used to run the data. The study found that a relationship exists between selected firm performance parameters (operating efficiency, firm profit, earnings per share, and working capital) and stock price. As such, firm performance gave sufficient information, or had predictive power, on stock price movements in Nigeria for all the years under study. The study recommends, among others, that managers of firms in Nigeria should formulate policies and exert effort geared towards improving firm performance, which will enhance stock price movements.
Keywords: firm, Nigeria, performance, stock price
Procedia PDF Downloads 477
23007 Modelling Dengue Disease With Climate Variables Using Geospatial Data For Mekong River Delta Region of Vietnam
Authors: Thi Thanh Nga Pham, Damien Philippon, Alexis Drogoul, Thi Thu Thuy Nguyen, Tien Cong Nguyen
Abstract:
The Mekong River Delta region of Vietnam is recognized as one of the areas most vulnerable to climate change, due to flooding and sea-level rise, and therefore faces an increased burden of climate-change-related diseases. Changes in temperature and precipitation are likely to alter the incidence and distribution of vector-borne diseases such as dengue fever. In this region, the peak of the dengue epidemic period is around July to September, during the rainy season. It is believed that climate is an important factor in dengue transmission. This study aims to enhance the capacity for dengue prediction from the relationship of dengue incidence with climate and environmental variables for the Mekong River Delta of Vietnam during 2005-2015. Mathematical models for vector-host infectious disease, including larvae, mosquitoes, and human beings, were used to calculate the impacts of climate on dengue transmission, incorporating geospatial data as model input. Monthly dengue incidence data were collected at the provincial level. Precipitation data were extracted from the satellite observations of GSMaP (Global Satellite Mapping of Precipitation), while land surface temperature and land cover data came from MODIS. The value of the seasonal reproduction number was estimated to evaluate the potential, severity, and persistence of dengue infection, while the final infected number was derived to check for dengue outbreaks. The results show that dengue infection depends on the seasonal variation of climate variables, with a peak during the rainy season, and the predicted dengue incidence follows this dynamic well for the whole studied region. However, the highest outbreak, in 2007, was not captured by the model, reflecting nonlinear dependences of transmission on climate. Other possible effects will be discussed to address the limitations of the model. This suggests the need to consider both climate variables and other variability across temporal and spatial scales.
Keywords: infectious disease, dengue, geospatial data, climate
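A bare-bones vector-host model of the kind described, integrated with SciPy; the study's full larva-mosquito-human formulation and its climate-driven, seasonally varying parameters are reduced to constants here, so this only illustrates the mechanics of the reproduction number and the final infected fraction.

```python
# Minimal human-vector transmission model with a Ross-Macdonald-style R0.
import numpy as np
from scipy.integrate import odeint

def dengue(y, t, beta_h, beta_v, gamma, mu_v, m):
    sh, ih, sv, iv = y  # susceptible/infected humans and vectors (fractions)
    dsh = -beta_h * m * iv * sh
    dih = beta_h * m * iv * sh - gamma * ih
    dsv = mu_v - beta_v * ih * sv - mu_v * sv   # vector birth/death balance
    div = beta_v * ih * sv - mu_v * iv
    return [dsh, dih, dsv, div]

params = dict(beta_h=0.3, beta_v=0.3, gamma=1 / 7, mu_v=1 / 14, m=2.0)
y0 = [0.999, 0.001, 0.99, 0.01]
t = np.linspace(0, 365, 366)
sol = odeint(dengue, y0, t, args=tuple(params.values()))

# Basic reproduction number for this reduced model.
r0 = np.sqrt(params["m"] * params["beta_h"] * params["beta_v"]
             / (params["gamma"] * params["mu_v"]))
print("R0 =", round(r0, 2),
      "| cumulative infected fraction:", round(1 - sol[-1, 0], 3))
```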
Procedia PDF Downloads 383
23006 Distances over Incomplete Diabetes and Breast Cancer Data Based on Bhattacharyya Distance
Authors: Loai AbdAllah, Mahmoud Kaiyal
Abstract:
Missing values in real-world datasets are a common problem. Many algorithms have been developed to deal with this problem; most of them replace the missing values with a fixed value computed from the observed values. In our work, we used a distance function based on the Bhattacharyya distance, which measures the similarity of two probability distributions, to measure the distance between objects with missing values. The proposed distance distinguishes between known and unknown values: the distance between two known values is the Mahalanobis distance, while when one of them is missing, the distance is computed based on the distribution of the known values for the coordinate that contains the missing value. This method was integrated with Wikaya, a digital health company developing a platform that helps to improve the prevention of chronic diseases such as diabetes and cancer. For Wikaya's recommendation system to work, the distance between users needs to be measured, and since there are missing values in the collected data, a distance function between incomplete user profiles had to be developed. To evaluate the accuracy of the proposed distance function in reflecting the actual similarity between different objects when some of them contain missing values, we integrated it within the framework of a k-nearest-neighbors (kNN) classifier, since its computation is based only on the similarity between objects. To validate this, we ran the algorithm on the diabetes and breast cancer datasets, standard benchmark datasets from the UCI repository. Our experiments show that a kNN classifier using our proposed distance function outperforms kNN using other existing methods.
Keywords: missing values, incomplete data, distance, incomplete diabetes data
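A reduced sketch of the idea: known-known coordinate pairs contribute a (diagonal) Mahalanobis term, while a missing coordinate contributes the expected squared distance under the observed distribution of that feature. This assumes per-feature independence for simplicity; the paper's full Bhattacharyya-based construction is richer.

```python
# Distance for incomplete vectors: Mahalanobis term where both values are
# known, expected squared distance where one (or both) is missing.
import numpy as np

def fit_stats(X):
    # Per-feature mean and variance estimated from observed values only.
    return np.nanmean(X, axis=0), np.nanvar(X, axis=0)

def distance(a, b, mean, var):
    d = 0.0
    for j in range(len(a)):
        a_known, b_known = not np.isnan(a[j]), not np.isnan(b[j])
        if a_known and b_known:
            d += (a[j] - b[j]) ** 2 / var[j]   # diagonal Mahalanobis term
        elif a_known or b_known:
            known = a[j] if a_known else b[j]
            # Expected squared distance to a random draw of feature j.
            d += ((known - mean[j]) ** 2 + var[j]) / var[j]
        else:
            d += 2.0  # E[(x - y)^2] / var for two independent draws
    return np.sqrt(d)

X = np.array([[5.1, np.nan, 1.4],
              [4.9, 3.0, 1.5],
              [6.2, 3.4, np.nan]])
mean, var = fit_stats(X)
print(distance(X[0], X[2], mean, var))  # plugs directly into a kNN classifier
```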
Procedia PDF Downloads 225
23005 Decoding the Natural Hazards: The Data Paradox, Juggling Data Flows, Transparency and Secrets, Analysis of Khuzestan and Lorestan Floods of Iran
Authors: Kiyanoush Ghalavand
Abstract:
We face a complex paradox in the agriculture and environment sectors in the age of technology. On the one hand, the achievements of the science and information ages are shaping a future that may prove more dangerous than in past decades. On the other, the progress of the past decades has been historic: connecting people, empowering individuals, groups, and states, and lifting thousands of people out of poverty in the process. Floods are the most frequent, damaging, and recurring of all natural hazards in Iran. Additionally, floods have been morphing into new and even more devastating forms in recent years. Khuzestan and Lorestan Provinces experienced heavy rains that began on March 28, 2019, and led to unprecedented widespread flooding and landslides across the provinces. The study was based on both secondary and primary data. For the present study, a questionnaire-based primary survey was conducted. Data were collected using a specially designed questionnaire and other instruments, such as focus groups, interview schedules, inception workshops, and roundtable discussions with stakeholders at different levels. Farmers in Khuzestan and Lorestan provinces were the statistical population for this study. Data were analyzed with several software packages, including ATLAS.ti, NVivo, SPSS for Windows, and E-Views. According to a factorial analysis conducted for the present study, 10 groups of factors were categorized: climatic, economic, cultural, supportive, instructive, planning, military, policymaking, geographical, and human factors. Together they explained 71.6 percent of the variance in flood management obstacles in the agricultural sector in Lorestan and Khuzestan provinces. Several recommendations were finally made based on the study findings.
Keywords: chaos theory, natural hazards, risks, environmental risks, paradox
Procedia PDF Downloads 145
23004 Techniques to Characterize Subpopulations among Hearing Impaired Patients and Its Impact for Hearing Aid Fitting
Authors: Vijaya K. Narne, Gerard Loquet, Tobias Piechowiak, Dorte Hammershoi, Jesper H. Schmidt
Abstract:
BEAR, which stands for Better Hearing Rehabilitation, is a large-scale project in Denmark designed and executed by three national universities, three hospitals, and the hearing aid industry, with the aim of improving hearing aid fitting. A total of 1963 hearing-impaired people were included and segmented into subgroups based on hearing loss, demographics, and audiological and questionnaire data (i.e., the Speech, Spatial and Qualities of Hearing scale [SSQ-12] and the International Outcome Inventory for Hearing Aids [IOI-HA]). With the aim of providing a better hearing-aid fit to individual patients, we applied modern machine learning techniques alongside traditional audiogram rule-based systems. Results show that age, speech discrimination scores, and audiogram configurations emerged as important parameters for characterizing subpopulations in the dataset. The attempt to characterize subpopulations reveals a clearer picture of the individual hearing difficulties encountered and the benefits derived from more individualized hearing aids.
Keywords: hearing loss, audiological data, machine learning, hearing aids
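An illustrative take on the segmentation step: cluster patients on audiogram thresholds, age, and speech scores, then inspect the profile of each subgroup. The feature names are hypothetical, and BEAR's actual pipeline also draws on the SSQ-12 and IOI-HA questionnaires.

```python
# Segment patients into subgroups by clustering standardized audiological features.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

df = pd.read_csv("bear_patients.csv")
features = ["age", "speech_discrimination_pct",
            "pta_500Hz", "pta_1kHz", "pta_2kHz", "pta_4kHz"]

X = StandardScaler().fit_transform(df[features])
df["subgroup"] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
print(df.groupby("subgroup")[features].mean())  # audiological profile per group
```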
Procedia PDF Downloads 154
23003 Female Criminality in Lagos State: A Case of Armed Robbery
Authors: Ebobo Urowoli Christiana
Abstract:
The Nigerian Prison Service statistics of 2007 and 2009 revealed that, though crime in the past was ascribed to men, today there is a steady increase in the population of women involved in crime. This study focused on the investigation of female criminality in Lagos State, taking armed robbery as a case. Its major objective was to find out whether there is an increase or decrease in female involvement in armed robbery, and its growth rate. The major research question is 'Is there an increase in the perpetration of armed robbery by females in Lagos State?' The null hypothesis is 'There is no significant increase in the perpetration of armed robbery by females in Lagos State.' The study adopted a survey design, the purposive sampling method, and a sample size of 120 respondents. The rational choice theory was used to explain female involvement in armed robbery. Both primary and secondary data were generated for this study; primary data were collected from the criminal records of the Lagos State Police Command, Panti, while quantitative data were collected using questionnaires from 120 female detainees and inmates. The data collected were analyzed using simple frequency tables and percentages, and chi-square was used to test for relationships. The study revealed a persistent rise in the prevalence of female armed robbery and recommended that youths be equipped with educational/vocational skills in order to lead responsible lives.
Keywords: criminality, armed robbery, female, police commands, Panti, nature
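The chi-square test implied by the null hypothesis, applied to a hypothetical year-by-sex contingency table of armed-robbery suspects; the counts are invented for illustration, not taken from the Panti records.

```python
# Chi-square test of independence on an invented contingency table.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: years; columns: counts of female vs male armed-robbery suspects.
table = np.array([
    [12, 410],   # earlier year (illustrative counts)
    [31, 395],   # later year (illustrative counts)
])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")
if p < 0.05:
    print("reject the null: female involvement changed significantly")
```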
Procedia PDF Downloads 406
23002 Single Atom Manipulation with 4 Scanning Tunneling Microscope Technique
Authors: Jianshu Yang, Delphine Sordes, Marek Kolmer, Christian Joachim
Abstract:
Nanoelectronics, for example calculating circuits integrating logic gates at the molecular scale and atomic-scale circuits, have been constructed and investigated recently. A major challenge is the characterization of their functional properties, because of the problem of connecting from the atomic scale to the micrometer scale. New experimental instruments and new processes have therefore been proposed, and to achieve precise measurement at the atomic scale connected to a micrometer-scale electrical integration controller, technical improvements are ongoing. Our new machine, a low-temperature high-vacuum four-probe scanning tunneling microscope, a custom instrument constructed by Omicron GmbH, is expected to scale down to atomic-scale characterization. Here, we present our first results testifying to the performance of this new instrument. The sample we selected is the Au(111) surface. The measurements were taken at 4.2 K. The atomic-resolution surface structure was observed with each of the four scanners, with a noise level better than 3 pm. With tip-sample distance calibration by I-z spectra, the sample conductance was derived from its atomically local I-V spectra. Furthermore, the surface conductance measurement was performed using two methods: (1) landing two STM tips on the surface with the sample floating; and (2) with the sample floating and one of the landed tips grounded. In addition, single-atom manipulation was achieved with a modified tip design, which is comparable to a conventional LT-STM.
Keywords: low temperature ultra-high vacuum four scanning tunneling microscope, nanoelectronics, point contact, single atom manipulation, tunneling resistance
Procedia PDF Downloads 280