Search results for: solar–climatic data
22311 Accuracy Improvement of Traffic Participant Classification Using Millimeter-Wave Radar by Leveraging Simulator Based on Domain Adaptation
Authors: Tokihiko Akita, Seiichi Mita
Abstract:
Millimeter-wave radar is among the sensors most robust to adverse environments, making it essential for environment recognition in automated driving. However, its reflection signals are sparse and unstable, so high recognition accuracy is difficult to obtain. Deep learning can recognize even such signals with high accuracy, but it requires large-scale datasets with ground truth, and annotation is especially costly for millimeter-wave radar data. An effective solution is to use a simulator that can generate a huge annotated dataset. Radar simulation, however, is harder to match to real-world data than camera imagery, and deep-learning recognition based on higher-order features amplifies this deviation. We therefore attempted to improve the accuracy of traffic participant classification by fusing simulated and real-world data with a domain adaptation technique. Experimental results with the domain adaptation network we created show that classification accuracy can be improved even with only a small amount of real-world data.
Keywords: millimeter-wave radar, object classification, deep learning, simulation, domain adaptation
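The abstract does not specify the authors' network architecture; the sketch below is a minimal, hypothetical domain-adversarial (DANN-style) setup in PyTorch, one common way to fuse labeled simulated data with unlabeled real data. All layer sizes, the input dimension, and the class count are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity on the forward pass,
    negated (scaled) gradient on the backward pass."""
    @staticmethod
    def forward(ctx, x, alpha):
        ctx.alpha = alpha
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.alpha * grad_output, None

class DANN(nn.Module):
    """Shared feature extractor feeding a class head (traffic participant
    type) and a domain head (simulated vs. real), trained adversarially."""
    def __init__(self, in_dim=256, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                      nn.Linear(128, 64), nn.ReLU())
        self.class_head = nn.Linear(64, n_classes)
        self.domain_head = nn.Linear(64, 2)

    def forward(self, x, alpha=1.0):
        f = self.features(x)
        return self.class_head(f), self.domain_head(GradReverse.apply(f, alpha))
```

In such a setup, simulated samples supply the class labels while real samples can contribute through the domain loss even when unlabeled, consistent with the paper's claim that only a few real-world samples are needed.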
Procedia PDF Downloads 93
22310 Microclimate Variations in Rio de Janeiro Related to Massive Public Transportation
Authors: Marco E. O. Jardim, Frederico A. M. Souza, Valeria M. Bastos, Myrian C. A. Costa, Nelson F. F. Ebecken
Abstract:
Urban public transportation in Rio de Janeiro is based on diesel-powered bus lines and four limited metro lines that serve only some neighborhoods. This work presents an infrastructure built to better understand microclimate variations related to massive urban transportation in specific areas of the city. The use of sensor nodes with small analytics capacity provides environmental information to the population and public services. Analyses of data collected from a few small sensors positioned near heavy-traffic streets show the harmful impact of poor bus route planning.
Keywords: big data, IoT, public transportation, public health system
Procedia PDF Downloads 253
22309 Evaluation of the Surveillance System for Rift Valley Fever in Ruminants in Mauritania, 2019
Authors: Mohamed El Kory Yacoub, Ahmed Bezeid El Mamy Beyatt, Djibril Barry, Yanogo Pauline, Nicolas Meda
Abstract:
Introduction: Rift Valley Fever (RVF) is a zoonotic arbovirosis that severely affects ruminants, as well as humans. It causes abortions in pregnant females and deaths in young animals. The disease occurs during heavy rains that are followed by large numbers of mosquito vectors. The objective of this work is to evaluate the surveillance system for Rift Valley Fever. Methods: We conducted an evaluation of the RVF surveillance system. Data were collected by analyzing the national database of the Mauritanian Network of Animal Disease Epidemiological Surveillance at the Ministry of Rural Development, covering RVF cases notified from the whole national territory, together with questionnaires and interviews with all persons involved in RVF surveillance at the central level. The quality of the system was assessed by analyzing the quantitative attributes defined by the Centers for Disease Control and Prevention. Results: In 2019, 443 cases of RVF were notified by the surveillance system, of which 36 were positive. Among the notified cases, the 0-3-year-old age group of small ruminants was the most represented, with 49.21% of cases, followed by 33.33% recorded in large ruminants in the 0-7-year-old age group; 11.11% of cases were older than seven years. The completeness of the data varied between 14.2% (age) and 100% (species). Most positive cases were recorded between October and November 2019 in seven different regions. Attribute analysis showed that 87% of the respondents were able to apply the case definition well, and 78.8% said they were familiar with the reporting and feedback loop of the RVF data. 90.3% of the respondents found the system easy to use, and 95% responded that it was easy to transmit their data to the next level. Conclusions: The epidemiological surveillance system for Rift Valley Fever in Mauritania is simple and representative. However, data quality, stability, and responsiveness are average, as diagnosis of the disease requires laboratory confirmation and the average delay for this confirmation is long (13 days). Consequently, the lack of completeness of the recorded data and of description of cases in terms of time, place, and animal, together with the delays between the stages of the surveillance system, can make prevention, early detection of epidemics, and the initiation of adequate response measures difficult.
Keywords: evaluation, epidemiological surveillance system, Rift Valley Fever, Mauritania, ruminants
Procedia PDF Downloads 148
22308 Assessing the Theoretical Suitability of Sentinel-2 and Worldview-3 Data for Hydrocarbon Mapping of Spill Events, Using Hydrocarbon Spectral Slope Model
Authors: K. Tunde Olagunju, C. Scott Allen, Freek Van Der Meer
Abstract:
Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection and to plan an appropriate cleanup program. Identification on optical sensors allows not only detection but also characterization and quantification. Until recently, quantification and characterization in optical remote sensing were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is currently obtained mainly via airborne surveys. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the hydrocarbon spectral slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. Spectral measurements of seven different hydrocarbon oils (crude and refined) taken on ten different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Owing to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, qualitative analysis on WorldView-3 was statistically significant at the 95% confidence level (p < 0.01) for all studied hydrocarbon oils except diesel. Using multivariate analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations at Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid-response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of spectral bands with key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).
Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon-substrate combination, Sentinel-2, WorldView-3
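The exact HYSS formulation is not given in the abstract; the sketch below only illustrates the two generic steps it describes: convolving a laboratory spectrum to a sensor band with a Gaussian response of a stated FWHM, and taking the slope between the ~1.73 µm and ~2.30 µm band reflectances. Band centers and FWHM values are placeholders, not the true WorldView-3 or Sentinel-2 band parameters.

```python
import numpy as np

def band_reflectance(wl, refl, center, fwhm):
    """Convolve a lab spectrum to one sensor band with a Gaussian SRF."""
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> sigma
    srf = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
    return np.trapz(srf * refl, wl) / np.trapz(srf, wl)

def hydrocarbon_slope(wl, refl, b1=(1730.0, 40.0), b2=(2300.0, 40.0)):
    """Slope between the two diagnostic hydrocarbon bands (wavelengths in
    nm). Band (center, FWHM) pairs are placeholders, not sensor values."""
    r1 = band_reflectance(wl, refl, *b1)
    r2 = band_reflectance(wl, refl, *b2)
    return (r2 - r1) / (b2[0] - b1[0])

# Example with a synthetic spectrum over the ASD FieldSpec range:
wl = np.arange(350.0, 2501.0)
refl = 0.3 + 0.05 * np.sin(wl / 300.0)
print(hydrocarbon_slope(wl, refl))
```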
Procedia PDF Downloads 216
22307 FPGA Implementation of RSA Encryption Algorithm for E-Passport Application
Authors: Khaled Shehata, Hanady Hussien, Sara Yehia
Abstract:
Securing the data stored on an e-passport is a very important issue. The RSA encryption algorithm is suitable for such applications with small data sizes. In this paper, the design and implementation of a 1024-bit-key RSA encryption and decryption module on an FPGA is presented. The module is verified by comparing its results with those obtained from MATLAB tools. The design runs at a frequency of 36.3 MHz on a Virtex-5 Xilinx FPGA. The key size is designed to be 1024 bits to achieve high security for the passport information. The whole design is implemented through VHDL design entry, which makes it portable to any hardware platform.
Keywords: RSA, VHDL, FPGA, modular multiplication, modular exponential
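RSA encryption and decryption reduce to modular exponentiation, typically realized in hardware as iterated modular multiplications (the paper's keywords name both building blocks). A minimal software reference of the left-to-right square-and-multiply loop, the same loop an FPGA datapath unrolls around a modular-multiplier core, is sketched below; the toy operands are illustrative only, whereas the paper's module uses 1024-bit keys.

```python
def mod_exp(base, exp, mod):
    """Left-to-right square-and-multiply modular exponentiation."""
    result = 1
    for bit in bin(exp)[2:]:                   # scan exponent bits MSB -> LSB
        result = (result * result) % mod       # square every step
        if bit == "1":
            result = (result * base) % mod     # multiply on a set bit
    return result

# Toy check against Python's built-in three-argument pow():
m, e, n = 0x1234, 0x10001, 0xC2F7  # illustrative values, not real key data
assert mod_exp(m, e, n) == pow(m, e, n)
```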
Procedia PDF Downloads 391
22306 An Experimental Study for Assessing Email Classification Attributes Using Feature Selection Methods
Authors: Issa Qabaja, Fadi Thabtah
Abstract:
Email phishing classification is one of the vital problems in the online security research domain; it has attracted several scholars due to its impact on users' daily online payments. One way to achieve good performance with detection algorithms for the email phishing problem is to identify the minimal set of features that significantly affect the phishing detection rate. This paper investigates three known feature selection methods, Information Gain (IG), Chi-square, and Correlation-based Feature Selection (CFS), on the email phishing problem, in order to separate highly influential features from weakly influential ones in phishing detection. We measure the degree of influence by applying four data mining algorithms to a large set of features, comparing their accuracy on the complete feature set before and after feature selection. The experiments show that 12 common significant features were chosen among the considered features by the feature selection methods. Further, the average detection accuracy derived by the data mining algorithms on the reduced 12-feature set was only slightly affected when compared with the one derived from the 47-feature set.
Keywords: data mining, email classification, phishing, online security
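The paper's 47-feature phishing dataset is not reproduced here, so the sketch below shows a comparable scikit-learn workflow on placeholder data: rank features by information gain (approximated by mutual information) and by chi-square, then intersect the top-ranked sets, mirroring how the study isolates its common significant features. CFS has no built-in scikit-learn implementation and is omitted from the sketch.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif, chi2

# Placeholder stand-in for the 47 email features (binarized, like most
# phishing indicators); replace with the real dataset.
X, y = make_classification(n_samples=2000, n_features=47, random_state=0)
X = (X > 0).astype(int)

k = 12
ig_rank = np.argsort(mutual_info_classif(X, y, random_state=0))[::-1][:k]
chi_rank = np.argsort(chi2(X, y)[0])[::-1][:k]

common = sorted(set(ig_rank) & set(chi_rank))
print("features ranked highly by both methods:", common)
```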
Procedia PDF Downloads 432
22305 Mathematical Modelling of Human Cardiovascular-Respiratory System Response to Exercise in Rwanda
Authors: Jean Marie Ntaganda, Froduald Minani, Wellars Banzi, Lydie Mpinganzima, Japhet Niyobuhungiro, Jean Bosco Gahutu, Vincent Dusabejambo, Immaculate Kambutse
Abstract:
In this paper, we present a nonlinear dynamic model of the interactive mechanism of the cardiovascular and respiratory systems. The model is designed and analyzed for humans during physical exercise. To verify the adequacy of the designed model, data collected in Rwanda are used for validation. We simulated the impact of heart rate and alveolar ventilation, as the respective controls of the cardiovascular and respiratory systems, on the steady-state response of the main cardiovascular hemodynamic quantities, i.e., systemic arterial and venous blood pressures, arterial oxygen partial pressure, and arterial carbon dioxide partial pressure, for stabilized values of the controls. We used data collected in Rwanda for both males and females during physical activities and obtained good agreement with physiological data in the literature. The model may represent an important tool to improve the understanding of exercise physiology.
Keywords: exercise, cardiovascular/respiratory, hemodynamic quantities, numerical simulation, physical activity, sportsmen in Rwanda, system
Procedia PDF Downloads 244
22304 Knowledge Development: How New Information System Technologies Affect Knowledge Development
Authors: Yener Ekiz
Abstract:
Knowledge development is a proactive process that covers the collection, analysis, storage, and distribution of information and helps to build an understanding of the environment. To transfer knowledge correctly and quickly, new emerging information system technologies must be used. Actionable knowledge is only of value if it is understandable and usable by target users. The purpose of the paper is to show how technology eases and affects the process of knowledge development. In preparing the paper, literature review, survey, and interview methodologies will be used. The hypothesis is that technology and knowledge development are inseparable and that technology will formalize the DIKW hierarchy anew. Today there are huge volumes of data, which must be classified precisely and quickly.
Keywords: DIKW hierarchy, knowledge development, technology
Procedia PDF Downloads 441
22303 Intelligent Technology for Real-Time Monitor and Data Analysis of the Aquaculture Toxic Water Concentration
Authors: Chin-Yuan Hsieh, Wei-Chun Lu, Yu-Hong Zeng
Abstract:
Mass fish die-offs are frequently caused by disease resulting from deteriorating aquaculture water quality. Toxic ammonia is produced by the animals as a byproduct of protein metabolism. The system presented here combines smart sensor technology with a mathematical model to monitor the water parameters 24 hours a day and predict the relationships among twelve water quality parameters for monitoring water quality in aquaculture. All measured data are stored on a cloud server. In productive ponds, the daytime pH may be high enough to be lethal to the fish. Sudden changes in aquaculture conditions often result in increased water pH, lack of dissolved oxygen, water quality deterioration, and yield reduction. In real measurements, the system successfully sent a message to the user's smartphone when water quality conditions were bad. In comparisons between measured data and model simulations at a fish aquaculture site, the difference in parameters is less than 2% and the correlation coefficient is at least 98.34%. The solubility of oxygen decreases exponentially with increasing water temperature, with a correlation coefficient of 98.98%.
Keywords: aquaculture, sensor, ammonia, dissolved oxygen
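As one concrete piece of the abstract's model, the reported exponential decrease of oxygen solubility with water temperature can be expressed as a fitted decay curve. The sketch below, using made-up sample readings, shows how such a relation and its correlation coefficient could be estimated; it is not the authors' actual model or data.

```python
import numpy as np
from scipy.optimize import curve_fit

def do_solubility(T, a, b):
    """Dissolved-oxygen solubility model: exponential decay with temperature."""
    return a * np.exp(-b * T)

# Hypothetical (temperature degC, DO mg/L) sensor readings:
T = np.array([20.0, 22.0, 24.0, 26.0, 28.0, 30.0])
DO = np.array([9.1, 8.7, 8.4, 8.0, 7.7, 7.5])

(a, b), _ = curve_fit(do_solubility, T, DO, p0=(14.0, 0.02))
pred = do_solubility(T, a, b)
r = np.corrcoef(DO, pred)[0, 1]
print(f"DO ~ {a:.2f}*exp(-{b:.4f}*T), correlation {r:.4f}")
```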
Procedia PDF Downloads 283
22302 A Stochastic Diffusion Process Based on the Two-Parameters Weibull Density Function
Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos
Abstract:
Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e., to predict what conditions or decisions might occur under different situations. In the present study, we present a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of this model, such as the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown by the Ricciardi theorem. We then develop the statistical inference of this model using the maximum likelihood methodology. Finally, we analyze, with simulated data, the computational problems associated with the parameters, an issue of great importance for application to real data, using convergence analysis methods. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function
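The abstract specifies only that the trend of the process is proportional to the bi-Weibull density. The sketch below simulates one plausible reading of such a diffusion with the Euler-Maruyama scheme, using a single two-parameter Weibull density as the drift factor and an assumed multiplicative noise term; the drift form, noise structure, and parameter values are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def weibull_pdf(t, k, lam):
    """Two-parameter Weibull density f(t; k, lambda)."""
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-((t / lam) ** k))

def simulate_paths(x0=1.0, k=1.5, lam=2.0, sigma=0.2, T=5.0, n=1000, paths=5):
    """Euler-Maruyama for dX = f_W(t)*X dt + sigma*X dW (assumed form)."""
    rng = np.random.default_rng(0)
    dt = T / n
    t = np.linspace(dt, T, n)
    X = np.full((paths, n + 1), x0)
    for i in range(n):
        drift = weibull_pdf(t[i], k, lam) * X[:, i]
        dW = rng.normal(0.0, np.sqrt(dt), paths)
        X[:, i + 1] = X[:, i] + drift * dt + sigma * X[:, i] * dW
    return X

print(simulate_paths()[:, -1])   # terminal values of five sample paths
```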
Procedia PDF Downloads 308
22301 Knowledge Graph Development to Connect Earth Metadata and Standard English Queries
Authors: Gabriel Montague, Max Vilgalys, Catherine H. Crawford, Jorge Ortiz, Dava Newman
Abstract:
There has never been so much publicly accessible atmospheric and environmental data. The possibilities of these data are exciting, but the sheer volume of available datasets represents a new challenge for researchers. The task of identifying and working with a new dataset has become more difficult with the amount and variety of available data. Datasets are often documented in ways that differ substantially from the common English used to describe the same topics. This presents a barrier not only for new scientists but also for researchers looking for comparisons across multiple datasets and for specialists from other disciplines hoping to collaborate. This paper proposes a method for addressing this obstacle: creating a knowledge graph to bridge the gap between everyday English and the technical language surrounding these datasets. Knowledge graph generation is already a well-established field, although working with Earth data poses some unique challenges. One is the sheer size of the databases: it would be infeasible to replicate or analyze all the data stored by an organization like the National Aeronautics and Space Administration (NASA) or the European Space Agency. Instead, this approach identifies topics from the metadata available for datasets in NASA's Earthdata database, which can then be used to directly request and access the raw data from NASA. By starting with a single metadata standard, this paper establishes an approach that can be generalized to different databases but leaves the challenge of metadata harmonization for future work. Topics generated from the metadata are then linked to topics from a collection of English queries through a variety of standard and custom natural language processing (NLP) methods. The results from this method are compared to a baseline of Elasticsearch applied to the metadata. This comparison shows the benefits of the proposed knowledge graph system over existing methods, particularly in interpreting natural language queries and interpreting topics in metadata. For the research community, this work introduces an application of NLP to the ecological and environmental sciences, expanding the possibilities of how machine learning can be applied in this discipline. Perhaps more importantly, it establishes the foundation for a platform that can enable common English to access knowledge that previously required considerable effort and experience. By making these public data accessible to the full public, this work has the potential to transform environmental understanding, engagement, and action.
Keywords: earth metadata, knowledge graphs, natural language processing, question-answer systems
Procedia PDF Downloads 148
22300 Parabolic Impact Law of High Frequency Exchanges on Price Formation in Commodities Market
Authors: L. Maiza, A. Cantagrel, M. Forestier, G. Laucoin, T. Regali
Abstract:
Evaluation of the impact of High Frequency Trading (HFT) on financial markets is very important for traders who use market analysis to detect winning transaction opportunities. An analysis of HFT data on the tobacco commodity market is discussed here, and an interesting linear relationship is shown between trading frequency and the difference between the averaged trading prices above and below the considered trading frequency. This may open new perspectives on understanding market data and could provide a possible interpretation of Adam Smith's invisible hand.
Keywords: financial market, high frequency trading, analysis, impacts, Adam Smith's invisible hand
Procedia PDF Downloads 359
22299 Improving Data Completeness and Timely Reporting: A Joint Collaborative Effort between Partners in Health and Ministry of Health in Remote Areas, Neno District, Malawi
Authors: Wiseman Emmanuel Nkhomah, Chiyembekezo Kachimanga, Moses Banda Aron, Julia Higgins, Manuel Mulwafu, Kondwani Mpinga, Mwayi Chunga, Grace Momba, Enock Ndarama, Dickson Sumphi, Atupere Phiri, Fabien Munyaneza
Abstract:
Background: Data is key to supporting health service delivery, as stakeholders, including NGOs, rely on it for effective service delivery, decision-making, and system strengthening. Several studies have generated debate on data quality from national health management information systems (HMIS) in sub-Saharan Africa. Poor quality limits the utilization of data in resource-limited settings, which already struggle to meet standards set by the World Health Organization (WHO). We aimed to evaluate data quality improvement in the Neno district HMIS over a 4-year period (2018-2021), following quarterly data reviews introduced in January 2020 by the district health management team and Partners In Health. Methods: An exploratory mixed-methods design was used to examine report rates, followed by in-depth interviews using key informant interviews (KIIs) and focus group discussions (FGDs). We used the WHO desk-review module to assess the quality of HMIS data captured in the Neno district from 2018 to 2021. The metrics assessed were the completeness and timeliness of 34 reports. Completeness was measured as the percentage of non-missing reports; timeliness as the percentage of reports submitted on time. We computed t-tests and recorded p-values, summaries, and percentage changes using R and Excel 2016, analyzed demographics for the key informant interviews in Power BI, and developed themes from 7 FGDs and 11 KIIs using Dedoose software, from which we drew healthcare workers' perceptions, implemented interventions, and improvement suggestions. The study was reviewed and approved by the Malawi National Health Science Research Committee (IRB: 22/02/2866). Results: Overall, the average reporting completeness rate was 83.4% before and 98.1% after, while timeliness was 68.1% and 76.4%, respectively. Completeness of reports increased over time: 2018, 78.8%; 2019, 88%; 2020, 96.3%; and 2021, 99.9% (p < 0.004). The trend for timeliness declined except in 2021, when it improved: 2018, 68.4%; 2019, 68.3%; 2020, 67.1%; and 2021, 81% (p < 0.279). Comparing 2021 reporting rates to the mean of the three preceding years, completeness increased from 88% to 99%, while timeliness increased from 68% to 81%. Sixty-five percent of reports consistently met the national standard of 90%+ for completeness, but only 24% did so for timeliness; thirty-two percent of reports met the national standard. Only 9% improved on both completeness and timeliness, namely the cervical cancer, nutrition care support and treatment, and youth-friendly health services reports. Fifty percent of reports did not improve to standard in timeliness, and only one did not in completeness. Factors associated with improvement included better communications and reminders through internal channels, data quality assessments, checks, and reviews. Decentralizing data entry to the facility level was suggested to improve timeliness. Conclusion: Findings suggest that data quality in the district HMIS has improved following collaborative efforts. We recommend maintaining such initiatives to identify remaining quality gaps, and that results be shared publicly to support increased use of data. These results can inform the Ministry of Health and its partners about effective interventions and guide initiatives for improving data quality.
Keywords: data quality, data utilization, HMIS, collaboration, completeness, timeliness, decision-making
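For readers who want to reproduce the two metrics, a minimal sketch with made-up monthly report counts is shown below: completeness as the percentage of non-missing reports, timeliness as the percentage submitted on time, and a two-sample t-test comparing the before/after periods, loosely mirroring the analysis the paper performs in R and Excel. All numbers are hypothetical.

```python
import numpy as np
from scipy.stats import ttest_ind

def completeness(received, expected):
    """Percentage of expected reports that are non-missing."""
    return 100.0 * received / expected

# Hypothetical per-facility completeness rates (%), before vs. after
# the quarterly data reviews began in January 2020:
before = np.array([78.0, 81.5, 85.0, 88.2, 80.4, 86.1])
after = np.array([96.0, 97.5, 99.1, 99.9, 98.3, 99.0])

t, p = ttest_ind(after, before)
print(f"completeness for 33 of 34 reports: {completeness(33, 34):.1f}%")
print(f"mean before {before.mean():.1f}%, after {after.mean():.1f}%, p = {p:.4f}")
```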
Procedia PDF Downloads 84
22298 Evaluate the Changes in Stress Level Using Facial Thermal Imaging
Authors: Amin Derakhshan, Mohammad Mikaili, Mohammad Ali Khalilzadeh, Amin Mohammadian
Abstract:
This paper proposes a stress recognition system based on multi-modal bio-potential signals. For stress recognition, Support Vector Machines (SVM) and LDA are applied to design the stress classifiers, and their characteristics are investigated. The classifiers are trained and tested using data gathered in psychological polygraph experiments. The pattern recognition method classifies stressful versus non-stressful subjects based on labels derived from the polygraph data. The successful classification rate is 96% for 12 subjects, meaning that facial thermal imaging, with its non-contact advantage, could be a remarkable alternative to psycho-physiological methods.
Keywords: stress, thermal imaging, face, SVM, polygraph
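A minimal scikit-learn version of the classification step, training SVM and LDA classifiers on extracted features and scoring them by cross-validation, is sketched below with placeholder data; the authors' actual features, kernel choice, and 96% result come from their polygraph-labeled dataset of 12 subjects.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder feature matrix (samples x features) and polygraph-derived labels:
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))
y = rng.integers(0, 2, size=120)   # 1 = stressful, 0 = non-stressful

for name, clf in [("SVM", SVC(kernel="rbf")),
                  ("LDA", LinearDiscriminantAnalysis())]:
    pipe = make_pipeline(StandardScaler(), clf)
    acc = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy {acc:.2f}")
```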
Procedia PDF Downloads 486
22297 Entrepreneurs' Perceptions of the Economic, Social and Physical Impacts of Tourism
Authors: Oktay Emir
Abstract:
The objective of this study is to determine how entrepreneurs perceive the economic, social, and physical impacts of tourism. The study was conducted in the city of Afyonkarahisar, Turkey, which is rich in thermal tourism resources and investments. A survey was used as the data collection method, with the questionnaire administered to 472 entrepreneurs selected by simple random sampling. Independent-samples t-tests and ANOVA tests were used to analyze the data. Statistically significant differences (p < 0.05) were found based on the participants' demographic characteristics regarding their opinions about the social, economic, and physical impacts of tourism activities.
Keywords: tourism, perception, entrepreneurship, entrepreneurs, structural equation modelling
Procedia PDF Downloads 451
22296 Unsupervised Learning and Similarity Comparison of Water Mass Characteristics with Gaussian Mixture Model for Visualizing Ocean Data
Authors: Jian-Heng Wu, Bor-Shen Lin
Abstract:
The temperature-salinity relationship is one of the most important characteristics used for identifying water masses in marine research. Temperature-salinity characteristics, however, may change dynamically with geographic location and are quite sensitive to depth at the same location. When depth is taken into consideration, it is not easy to compare the characteristics of different water masses efficiently over a wide range of ocean areas. In this paper, the Gaussian mixture model is proposed to analyze the temperature-salinity-depth characteristics of water masses, based on which comparisons between water masses may be conducted. A Gaussian mixture model describes the distribution of a random vector and is formulated as a weighted sum of a set of multivariate normal distributions. The temperature-salinity-depth data for different locations are first used to train a set of Gaussian mixture models individually. The distance between two Gaussian mixture models can then be defined as the weighted sum of pairwise Bhattacharyya distances among the Gaussian components. Consequently, the distance between two water masses may be measured quickly, allowing automatic and efficient comparison of water masses over a wide area. The proposed approach not only approximates the distribution of temperature, salinity, and depth directly, without prior knowledge of the regression family, but can also restrict model complexity by controlling the number of mixture components when samples are unevenly distributed. In addition, it is critical for knowledge discovery in marine research to represent, manage, and share temperature-salinity-depth characteristics flexibly and responsively. The proposed approach has been applied to a real-time visualization system of ocean data, which facilitates the comparison of water masses by aggregating the data without degrading the discriminating capability. This system provides an interface for interactively querying geographic locations with similar temperature-salinity-depth characteristics and for tracking specific patterns of water masses, such as the Kuroshio near Taiwan or those in the South China Sea.
Keywords: water mass, Gaussian mixture model, data visualization, system framework
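The distance the paper defines, a weighted sum of pairwise Bhattacharyya distances between the Gaussian components of two fitted mixtures, can be sketched as below. The weighting scheme shown (product of component weights) is one plausible reading of "weighted sum"; the authors' exact weighting is not specified in the abstract, and the input data here are synthetic placeholders for temperature-salinity-depth samples.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def bhattacharyya_gauss(m1, S1, m2, S2):
    """Bhattacharyya distance between two multivariate Gaussians."""
    S = 0.5 * (S1 + S2)
    d = m2 - m1
    term1 = 0.125 * d @ np.linalg.solve(S, d)
    term2 = 0.5 * np.log(np.linalg.det(S) /
                         np.sqrt(np.linalg.det(S1) * np.linalg.det(S2)))
    return term1 + term2

def gmm_distance(g1, g2):
    """Weighted sum of pairwise component distances (assumed weighting)."""
    return sum(w1 * w2 * bhattacharyya_gauss(m1, S1, m2, S2)
               for w1, m1, S1 in zip(g1.weights_, g1.means_, g1.covariances_)
               for w2, m2, S2 in zip(g2.weights_, g2.means_, g2.covariances_))

# Fit mixtures to (synthetic) temperature-salinity-depth samples from two sites:
rng = np.random.default_rng(0)
loc_a, loc_b = rng.normal(size=(500, 3)), rng.normal(1.0, 1.2, size=(500, 3))
g1 = GaussianMixture(n_components=3, random_state=0).fit(loc_a)
g2 = GaussianMixture(n_components=3, random_state=0).fit(loc_b)
print(gmm_distance(g1, g2))
```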
Procedia PDF Downloads 144
22295 Optimal Data Selection in Non-Ergodic Systems: A Tradeoff between Estimator Convergence and Representativeness Errors
Authors: Jakob Krause
Abstract:
The past financial crisis has shown that contemporary risk management models provide an unjustified sense of security and fail miserably in the situations in which they are needed the most. In this paper, we start from the assumption that risk is a notion that changes over time, so past data points have only limited explanatory power for the current situation. Our objective is to derive the optimal amount of representative information by optimizing between two adverse forces: estimator convergence, which incentivizes us to use as much data as possible, and the aforementioned non-representativeness, which does the opposite. In this endeavor, the cornerstone assumption of having access to identically distributed random variables is weakened and replaced by the assumption that the law of the data-generating process changes over time. Hence, this paper gives a quantitative theory of how to perform statistical analysis in non-ergodic systems. As an application, we discuss the impact of a paragraph in the latest iteration of proposals by the Basel Committee on Banking Regulation. We start from the premise that the severity of assumptions should correspond to the robustness of the system they describe; in the formal description of physical systems, the level of assumptions can be much higher. It follows that every concept carried over from the natural sciences to economics must be checked for its plausibility in the new surroundings. Most of probability theory has been developed for the analysis of physical systems and is based on the independent and identically distributed (i.i.d.) assumption. In economics, both parts of the i.i.d. assumption are inappropriate; however, only dependence has so far been weakened to a sufficient degree. In this paper, an appropriate class of non-stationary processes is used, and their law is tied to a formal object measuring representativeness. Subsequently, the data set is identified that on average minimizes the estimation error stemming from both insufficient and non-representative data. Applications are far-reaching in a variety of fields. In the paper itself, we apply the results to analyze a paragraph in the Basel 3 framework on banking regulation with severe implications for financial stability. Beyond finance, other potential applications include the reproducibility crisis in the social sciences (but not in the natural sciences) and modeling limited understanding and learning behavior in economics.
Keywords: banking regulation, non-ergodicity, risk management, semimartingale modeling
Procedia PDF Downloads 148
22294 Sustainable Happiness of Thai People: Monitoring the Thai Happiness Index
Authors: Kalayanee Senasu
Abstract:
This research investigates the influences of different factors, both general and sustainability-related, on the happiness of Thai people. Additionally, the study monitors Thai people's happiness via the Thai Happiness Index (THaI) developed in 2017. Besides reflecting the happiness level of Thai people, this index also identifies related important issues. The data comprised relevant secondary data and primary survey data collected through interviewer-administered questionnaires. The research data came from stratified multi-stage sampling at the region, province, district, and enumeration-area levels, with simple random sampling within each enumeration area. The data cover 20 provinces, including Bangkok and 4-5 provinces in each of the North, Northeastern, Central, and South regions, with 4,960 usable respondents who were at least 15 years old. Statistical analyses included both descriptive and inferential statistics, including hierarchical regression and one-way ANOVA. The Alkire and Foster method was adopted to develop and calculate the Thai Happiness Index. The results reveal that the quality of the household economy plays the most important role in predicting happiness, while quality of family, quality of health, and effectiveness of public administration at the provincial level have positive effects on happiness at similar levels. Among the socio-economic factors, age, education level, and household revenue have significant effects on happiness. The computed 2018 THaI value is 0.556. When people are divided into four groups by degree of happiness, a total of 21.1% of the population are happy, with 6.0% deeply happy and 15.1% extensively happy; 78.9% of the population are not-yet-happy, with 31.8% narrowly happy and 47.1% unhappy. The happy population yields a THaI value of 0.789, much higher than the 0.494 of the not-yet-happy population. Overall, Thai people have higher happiness compared to 2017, when the happiness index was 0.506.
Keywords: happiness, quality of life, sustainability, Thai Happiness Index
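The Alkire-Foster method computes an adjusted index of the form M0 = H x A: the headcount ratio H of people past a cutoff in at least a weighted share k of dimensions, times the average intensity A among them. A minimal generic sketch with hypothetical survey rows is given below; the actual THaI dimensions, weights, cutoffs, and the mapping from deprivations to happiness achievements are those of the 2017 index and are not reproduced here.

```python
import numpy as np

def alkire_foster(deprivations, weights, k):
    """Adjusted headcount M0 = H * A.
    deprivations: (people x dimensions) 0/1 matrix; weights sum to 1;
    k: weighted-score cutoff for counting a person as deprived."""
    score = deprivations @ weights                   # weighted deprivation score
    poor = score >= k                                # censored headcount
    H = poor.mean()                                  # incidence
    A = score[poor].mean() if poor.any() else 0.0    # intensity among the poor
    return H * A, H, A

# Four hypothetical respondents, three equally weighted dimensions:
d = np.array([[1, 1, 0],
              [0, 0, 0],
              [1, 1, 1],
              [0, 1, 0]])
w = np.array([1/3, 1/3, 1/3])
print(alkire_foster(d, w, k=2/3))
```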
Procedia PDF Downloads 168
22293 Using Electrical Impedance Tomography to Control a Robot
Authors: Shayan Rezvanigilkolaei, Shayesteh Vefaghnematollahi
Abstract:
Electrical impedance tomography (EIT) is a non-invasive imaging technique suitable for medical applications. This paper describes an EIT device with the ability to navigate a robotic arm to manipulate a target object. The design of the device includes hardware and software sections to perform medical imaging and control the robotic arm. In the hardware section, an image is formed using 16 electrodes located around a container; this image is used to navigate a 3-DOF robotic arm to the exact location of the target object. The data set used to form the impedance image is obtained by repeated current injections and voltage measurements between all electrode pairs. After the necessary impedance calculations, the information is transmitted to the computer. These data are fed into MATLAB, which is interfaced with EIDORS (Electrical Impedance Tomography Reconstruction Software) to reconstruct the image from the acquired data. In the next step, the coordinates of the center of the target object are calculated using the MATLAB Image Processing Toolbox (IPT). Finally, these coordinates are used to calculate the angles of each joint of the robotic arm, and the arm moves to the desired tissue on the user's command.
Keywords: electrical impedance tomography, EIT, surgeon robot, image processing of electrical impedance tomography
Procedia PDF Downloads 272
22292 Introduction of Robust Multivariate Process Capability Indices
Authors: Behrooz Khalilloo, Hamid Shahriari, Emad Roghanian
Abstract:
Process capability indices (PCIs) are important concepts in statistical quality control; they measure the capability of processes and the extent to which processes meet certain specifications. An important issue in statistical quality control is parameter estimation. Under the assumption of multivariate normality, the distribution parameters, the mean vector and the variance-covariance matrix, must be estimated when they are unknown. Classic estimation methods, such as method-of-moments estimation (MME) and maximum likelihood estimation (MLE), estimate the population parameters well when the data are not contaminated, but when outliers exist in the data, they yield weak estimates. Estimators that perform well in the presence of outliers are therefore needed. In this work, robust M-estimators are used to estimate these parameters, and robust process capability indices based on them are introduced. The performance of these robust estimators in the presence of outliers, and their effects on the process capability indices, are evaluated with real and simulated multivariate data. The results indicate that the proposed robust capability indices perform much better than the existing process capability indices.
Keywords: multivariate process capability indices, robust M-estimator, outlier, multivariate quality control, statistical quality control
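A sketch of the estimation step is shown below: the classical MLE mean/covariance versus a robust alternative on contaminated data. scikit-learn's MinCovDet (minimum covariance determinant) stands in for the paper's M-estimators, which have no standard library implementation; a capability index would then be computed from whichever robust location/scatter estimates are plugged in. The data are synthetic.

```python
import numpy as np
from sklearn.covariance import EmpiricalCovariance, MinCovDet

rng = np.random.default_rng(0)
clean = rng.multivariate_normal([0, 0], [[1.0, 0.3], [0.3, 1.0]], size=200)
outliers = rng.multivariate_normal([6, 6], np.eye(2), size=10)
X = np.vstack([clean, outliers])

mle = EmpiricalCovariance().fit(X)      # classical estimate, outlier-sensitive
mcd = MinCovDet(random_state=0).fit(X)  # robust estimate

print("MLE mean:", np.round(mle.location_, 2))
print("MCD mean:", np.round(mcd.location_, 2))
print("MLE cov:\n", np.round(mle.covariance_, 2))
print("MCD cov:\n", np.round(mcd.covariance_, 2))
```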
Procedia PDF Downloads 283
22291 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers
Authors: Neliswa Dyosi
Abstract:
Purpose - Using an integrated Technology-Organization-Environment (TOE) framework and the Model of Technology Appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach - The study adopts the interpretivist philosophical paradigm, with multiple case studies as the research strategy. For data collection, the study follows a qualitative approach: qualitative data will be collected from six retailers in various industries through semi-structured interviews and documents. Purposive and snowballing sampling techniques will be used to identify participants within the organizations, and data will be analyzed using thematic analysis. Originality/value - Using the deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. It contributes to theory development by integrating the MTA and TOE frameworks as a means to understand the technology adoption behaviors of organizations, in this case retailers. It is also the first study to apply this integrated TOE-MTA approach to understanding the adoption and use of a payment method. South Africa is ranked among the top ten countries in the world for cryptocurrency adoption, yet there is still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will contribute to the existing literature, as the bitcoin cryptocurrency is gaining popularity as an alternative payment method across the globe.
Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation
Procedia PDF Downloads 136
22290 Simulation of Turbulent Flow in Channel Using Generalized Hydrodynamic Equations
Authors: Alex Fedoseyev
Abstract:
This study explores the Generalized Hydrodynamic Equations (GHE) for the simulation of turbulent flows. The GHE were derived by Alexeev (1994) from the Generalized Boltzmann Equation (GBE), which was obtained from first principles from the chain of Bogolubov kinetic equations and considers particles of finite dimensions. The GHE contain new terms, temporal and spatial fluctuations, compared to the Navier-Stokes equations (NSE). These new terms carry a timescale multiplier τ, and the GHE reduce to the NSE when τ is zero. The nondimensional τ is the product of the Reynolds number and the squared length-scale ratio, τ = Re·(l/L)², where l is the apparent Kolmogorov length scale and L is a hydrodynamic length scale. The turbulence phenomenon is not well understood and is not described by the NSE, which typically require an additional one- or two-equation turbulence model that may have to be tuned for specific problems. We show that, in the case of the GHE, no additional turbulence model is needed, and the turbulent velocity profile is obtained from the GHE directly. Two-dimensional turbulent channel and circular pipe flows were investigated using a numerical solution of the GHE for several cases. The solutions are compared with experimental data for circular pipes and 2D channels by Nikuradse (1932, Prandtl Lab), Hussain and Reynolds (1975), Wei and Willmarth (1989), and Van Doorne (2007), with the theory of Wosnik, Castillo, and George (2000), and with the relevant experiments on the Superpipe setup at Princeton by Zagarola (1996) and Zagarola and Smits (1998), covering Reynolds numbers from Re = 7,200 to Re = 960,000. The numerical solutions compare well with the experimental data, as well as with the approximate analytical solution for turbulent flow in a channel (Fedoseyev, 2023). The obtained results confirm that Alexeev's generalized hydrodynamic theory (GHE) is in good agreement with the experiments for turbulent flows. The proposed approach is limited to 2D and 3D axisymmetric channel geometries; further work will extend it to channels with square and rectangular cross-sections.
Keywords: comparison with experimental data, generalized hydrodynamic equations, numerical solution, turbulent boundary layer, turbulent flow in channel
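As a quick numerical illustration of the nondimensional multiplier, τ = Re·(l/L)² can be evaluated for the flow regimes cited; the length-scale ratios below are assumed values purely to show orders of magnitude and the limit τ → 0 in which the GHE reduce to the NSE.

```python
def tau(Re, l_over_L):
    """Nondimensional timescale multiplier in the GHE: tau = Re*(l/L)^2."""
    return Re * l_over_L ** 2

for Re in (7_200, 960_000):          # Reynolds-number range from the study
    for r in (1e-4, 1e-3):           # assumed Kolmogorov-to-hydrodynamic ratios
        print(f"Re={Re:>7}, l/L={r:g}: tau={tau(Re, r):.3g}")
# As l/L -> 0, tau -> 0 and the extra GHE terms vanish, recovering the NSE.
```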
Procedia PDF Downloads 65
22289 Analyzing Current Transformers Saturation Characteristics for Different Connected Burden Using LabVIEW Data Acquisition Tool
Authors: D. Subedi, S. Pradhan
Abstract:
Current transformers (CTs) are an integral part of the power system because they provide a proportionally scaled, safe current for protection and measurement applications. However, when the power system experiences an abnormal situation leading to huge current flows, this huge current is proportionally injected into the protection and metering circuits. Since protection and metering equipment is designed to withstand only a certain amount of current for a given time, these high currents pose a risk to people and equipment. During such instances, the CT saturation characteristics therefore have a huge influence on the safety of both people and equipment, and on the reliability of the protection and metering system. This paper shows the effect of burden on the accuracy limiting factor/instrument security factor of current transformers, as well as the change in the CTs' saturation characteristics. The response of the CT to varying levels of overcurrent at different connected burdens is captured using the LabVIEW data acquisition software, and analysis is performed on the real-time data gathered. The variation of current transformer saturation characteristics with changes in burden is discussed.
Keywords: accuracy limiting factor, burden, current transformer, instrument security factor, saturation characteristics
Procedia PDF Downloads 415
22288 Crowdsensing Project in the Brazilian Municipality of Florianópolis for the Number of Visitors Measurement
Authors: Carlos Roberto De Rolt, Julio da Silva Dias, Rafael Tezza, Luca Foschini, Matteo Mura
Abstract:
Seasonal population fluctuation presents a challenge to touristic cities, since the number of inhabitants can double depending on the season. The aim of this work is to develop a model that correlates the waste collected with the population of the city and also allows cooperation between the inhabitants and the local government. The model allows public managers to evaluate the impact of seasonal population fluctuation on waste generation and to improve resource utilization planning throughout the year. The study uses data from the company that collects the garbage in Florianópolis, a Brazilian city that attracts tourists with its numerous beaches and warm weather. The fluctuations are caused by the number of people who come to the city throughout the year for holidays, summer vacations, or business events. Crowdsensing will be accomplished through smartphones with access to a data collection app, with voluntary participation of the population; participants can access the information collected in waves through a portal. Crowdsensing represents an innovative, participatory approach that involves the population in gathering information to improve quality of life. The management of crowdsensing solutions plays an essential role, given the complexity of fostering collaboration, establishing available sensors, and collecting and processing the collected data. Practical implications of the tool described in this paper include the management of seasonal tourism in a large municipality whose public services are impacted by the floating population. Crowdsensing and big data support managers in predicting the arrival, permanence, and movement of people in a given urban area. Also, by linking crowdsourced data to databases from other public service providers, e.g., water, garbage collection, electricity, public transport, and telecommunications, it is possible to estimate the floating population of an urban area affected by seasonal tourism. This approach supports the municipality in increasing the effectiveness of resource allocation while, at the same time, increasing the quality of service as perceived by citizens and tourists.
Keywords: big data, dashboards, floating population, smart city, urban management solutions
Procedia PDF Downloads 287
22287 Debris Flow Mapping Using Geographical Information System Based Model and Geospatial Data in Middle Himalayas
Authors: Anand Malik
Abstract:
The Himalayas, with their high tectonic activity, pose a great threat to human life and property. Climate change is another factor triggering extreme events, with a multi-fold effect on the high-mountain glacial environment: rock falls, landslides, debris flows, flash floods, and snow avalanches. One such extreme event, a cloudburst together with a breach of the moraine-dammed Chorabari Lake, occurred from June 14 to June 17, 2013, and triggered flooding of the Saraswati and Mandakini rivers in the Kedarnath Valley of Rudraprayag district of Uttarakhand state, India. The huge volume of fast-moving water created a catastrophe of the century, resulting in the loss of many human and animal lives and heavy damage to pilgrimage, tourism, agriculture, and property. A comprehensive assessment of debris flow hazards thus requires GIS-based modeling using numerical methods. The aim of the present study is the analysis and mapping of debris flow movements using geospatial data with Flow-R (developed by a team at IGAR, University of Lausanne). The model is based on combined probabilistic and energetic algorithms for assessing the spreading of flow with maximum runout distances. An ASTER Digital Elevation Model (DEM) with 30 m x 30 m cell size (resolution) is used as the main geospatial data for preparing the runout assessment, while Landsat data are used to analyze land use/land cover change in the study area. The results show that the model can be applied with great accuracy, as it is very useful in delineating debris flow areas; the results are compared with existing landslide/debris flow maps. ArcGIS software is used to prepare runout susceptibility maps, which can be used in debris flow mitigation and future land use planning.
Keywords: debris flow, geospatial data, GIS based modeling, Flow-R
Procedia PDF Downloads 273
22286 Causes of Terrorism: Perceptions of University Students of Teacher Training Institutions
Authors: Saghir Ahmad, Abid Hussain Ch, Misbah Malik, Ayesha Batool
Abstract:
Terrorism is a phenomenon in which a dreadful situation is created by a group of people who view themselves as oppressed by society. It is the unlawful use of force or violence by a person or an organized group against people or property, with the intention of intimidating or coercing societies or governments, often for ideological or political reasons. Terrorism is as old as humanity. The main aim of the study was to find out the causes of terrorism through the perceptions of university students at teacher training institutions. This study was quantitative in nature; the survey method was used to collect data from a sample of 267 students selected from public universities, using a five-point Likert scale. Means, standard deviations, independent-samples t-tests, and one-way ANOVA were applied to analyze the data. The major findings indicate that students perceive the main causes of terrorism to be poverty, foreign interference, a distorted concept of Islamization, and social injustice. Most students also think that drone attacks promote terrorist activities. Education is key to eliminating terrorism; there is a need to educate people, especially youngsters, to bring peace to the world.
Keywords: dreadful circumstance, governments, power, students, terrorism
Procedia PDF Downloads 548
22285 Experimental Investigation on the Lithium-Ion Battery Thermal Management System Based on Micro Heat Pipe Array in High Temperature Environment
Authors: Ruyang Ren, Yaohua Zhao, Yanhua Diao
Abstract:
The intermittent and unstable output of renewable energy sources such as solar energy can be effectively smoothed through battery energy storage systems. Lithium-ion batteries are widely used in battery energy storage because of their high energy density, small internal resistance, low self-discharge rate, absence of memory effect, and long service life. However, the performance and service life of lithium-ion batteries are seriously affected by their operating temperature; the safe operation of a lithium-ion battery module is therefore inseparable from an effective thermal management system (TMS). In this study, a new type of TMS based on a micro heat pipe array (MHPA) is established for lithium-ion batteries and applied to a battery energy storage box that must operate in a high-temperature environment of 40 °C all year round. The MHPA is a flat metal body with high thermal conductivity and excellent temperature uniformity. The battery energy storage box is composed of four battery modules, with a nominal voltage of 51.2 V and a nominal capacity of 400 Ah. Through the excellent heat transfer characteristics of the MHPA, the heat generated during charge and discharge can be quickly transferred out of the battery module. In addition, if the MHPA alone cannot meet the heat dissipation requirements, the TMS automatically controls an external fan according to the battery temperature to further enhance heat dissipation. The thermal management performance of the MHPA-based TMS is studied experimentally under different ambient temperatures, with and without the fan. Results show that at an ambient temperature of 40 °C with the fan off during the whole charge-discharge process, the maximum battery temperature in the energy storage box is 53.1 °C and the maximum temperature difference in the battery module is 2.4 °C. With the fan on during the whole charge-discharge process, the maximum temperature is reduced to 50.1 °C and the maximum temperature difference to 1.7 °C. The MHPA-based TMS thus not only keeps the maximum battery temperature below 55 °C but also ensures excellent temperature uniformity of the battery module, enabling safe and stable operation of the battery energy storage box in a high-temperature environment.
Keywords: heat dissipation, lithium-ion battery thermal management, micro heat pipe array, temperature uniformity
Procedia PDF Downloads 181
22284 Effect of Climate Change on the Genomics of Invasiveness of the Whitefly Bemisia tabaci Species Complex by Estimating the Effective Population Size via a Coalescent Method
Authors: Samia Elfekih, Wee Tek Tay, Karl Gordon, Paul De Barro
Abstract:
Invasive species represent an increasing threat to food biosecurity, causing significant economic losses in agricultural systems. An example is the sweet potato whitefly, Bemisia tabaci, a complex of morphologically indistinguishable species causing average annual global damage estimated at US$2.4 billion. The Bemisia complex represents an interesting model for evolutionary studies because of its extensive distribution and potential for invasiveness and population expansion. Within this complex, two species, Middle East-Asia Minor 1 (MEAM1) and Mediterranean (MED), have invaded well beyond their home ranges, whereas others, such as Indian Ocean (IO) and Australia (AUS), have not. To understand why some Bemisia species have become invasive, genome-wide sequence scans were used to estimate population dynamics over time and relate these to climate. The Bayesian Skyline Plot (BSP) method, as implemented in BEAST, was used to infer the historical effective population size, and populations were combined based on geographic origin to reduce sampling bias. The datasets used for this analysis are genome-wide SNPs (single nucleotide polymorphisms) called separately in each of the following groups: Sub-Saharan Africa (Burkina Faso), Europe (Spain, France, Greece, and Croatia), USA (Arizona), Mediterranean-Middle East (Israel, Italy), Middle East-Central Asia (Turkmenistan, Iran), and Reunion Island. The non-invasive AUS species, endemic to Australia, was used as an outgroup. The main findings show that the BSP for the Sub-Saharan African MED population differs from that observed in MED populations from the Mediterranean Basin, suggesting evolution under a different set of environmental conditions. For MED, the effective size of the African (Burkina Faso) population showed a rapid expansion ≈250,000-310,000 years ago (YA), preceded by a period of slower growth, whereas the European MED populations (Spain, France, Croatia, and Greece) showed a single burst of expansion ≈160,000-200,000 YA. The MEAM1 populations from Israel and Italy and those from Iran and Turkmenistan are similar, both showing the earlier expansion at ≈250,000-300,000 YA. The single IO population lacked the later expansion but had the earlier one, a pattern shared with the Sub-Saharan African (Burkina Faso) MED, suggesting that IO faced a similar history of environmental change, which seems plausible given their relatively close geographic distributions. In conclusion, populations within the invasive species MED and MEAM1 exhibit signatures of population expansion during the Pleistocene, a geological epoch marked by repeated climatic oscillations with cycles of glacial and interglacial periods, that are lacking in the non-invasive species (IO and AUS). These expansions strongly suggest that the genomes of some Bemisia species have the potential to affect their adaptability and invasiveness.
Keywords: whitefly, RADseq, invasive species, SNP, climate change
Procedia PDF Downloads 126
22283 Globalization of Pesticide Technology and Sustainable Agriculture
Authors: Gagandeep Kaur
Abstract:
The pesticide industry is a major supplier of agricultural inputs. Pesticides control weeds, fungal diseases, and other causes of yield losses in agricultural production. Globalization of markets, competition, and innovation are the dominant trends in agribusiness and the agrichemical industry, yet innovation in the agrichemical industry is limited by its tradition of increasing the productivity of agro-systems through generic, universally applicable technologies. The marketing of agricultural technology must also contend with other trends, such as locally organized forces that envision a regionalized, sustainable agriculture in the future. Agricultural production has changed dramatically over the past century. Before World War II, agricultural production was characterized by low monetary inputs, high labor, mixed farming, and low yields. Although mineral fertilizers were already applied in the second half of the 19th century, most crops were restricted by local climatic, geological, and ecological conditions. After World War II, in the period of reconstruction, political and socioeconomic pressures changed the nature of agricultural production: food security at low prices for a growing population and securing farmer incomes at acceptable levels became political priorities. The current agricultural policy, the new European Common Agricultural Policy, aims to reduce overproduction, liberalize world trade, and protect landscapes and natural habitats. Farmers have to increase the quality of their production and control costs because of increased competition from the world market. Pesticides should be more effective at lower application doses, less toxic, and pose no threat to groundwater. A big debate is taking place about how, and whether, to mitigate the intensive use of pesticides; this debate concerns the future of agriculture, namely sustainable agriculture. Sustainability is possible by moving away from conventional agriculture, which is characterized by high inputs and high yields and implies wide-ranging pesticide use in crop production. Moving away from conventional agriculture is possible through the gradual adoption of less disturbing and polluting agricultural practices at the level of the cropping system. A healthy environment for future crop production requires the maintenance of its chemical, physical, and biological properties, and the emission of volatile compounds into the atmosphere must be minimized. Companies are limiting themselves to a particular interpretation of sustainable development, characterized by technological optimism and production maximization. The main objective of the paper is therefore to present the trends in the pesticide industry and in agricultural production in the era of globalization; the second objective is to analyze sustainable agriculture. Pesticide companies seem to have identified biotechnology as a promising alternative and supplement to the conventional business of selling pesticides, and the agricultural sector is in the process of transforming its conventional mode of operation, with some experts advising farmers to move towards precision farming and others suggesting organic farming. The methodology of the paper is historical and analytical, using both primary and secondary sources.
Keywords: globalization, pesticides, sustainable development, organic farming
Procedia PDF Downloads 98
22282 Lateral Capacity of Helical-Pile Groups Subjected to Bearing Combined Loads
Authors: Hesham Hamdy Abdelmohsen, Ahmed Shawky Abdul Aziz, Mona Fawzy Aldaghma
Abstract:
Helical piles have earned considerable attention as an effective deep-foundation alternative due to their rapid installation process and their dual purpose in compression and tension. These piles are commonly used as foundations for structures such as solar panels, wind turbines, offshore platforms, and some kinds of retaining walls. These structures usually transfer different combinations of axial and lateral loads to their helical-pile foundations. Extensive research has been conducted to investigate the behavior of these piles under either axial or lateral loads; however, loading patterns that combine axial compression and lateral loads still require further research. This paper presents the results of an experimental (laboratory tests) and numerical (PLAXIS 3D) study of vertical helical-pile groups under combined loads, with axial compression (bearing loads) acting successively with lateral (horizontal) loads. The study aims to clarify the effects of key factors, such as helix location and lateral load direction, on the lateral capacity of helical-pile groups and, consequently, on group efficiency. Besides the variation of helix location and lateral load direction, three patterns of successive bearing combined loads were considered, in which the axial vertical compression load was either zero, V1, or V2, while the lateral horizontal loads were varied under each vertical compression load. The study concluded that the lateral capacity of a helical-pile group is significantly affected by the helix location along the pile shaft, with optimal lateral performance achieved with helices at a depth ratio of H/L = 0.4. Furthermore, groups with a rectangular plan layout exhibit greater lateral capacity when subjected to a lateral horizontal load in the direction of their long axis. Additionally, the research emphasizes that the presence of vertical compression loading can enhance the lateral capacity of the group; this enhancement depends on the value of the vertical compression load, the lateral load direction, and the helix location, which highlights the complex interaction of these factors in the efficiency of helical-pile groups.
Keywords: helical piles, experimental, numerical, lateral loading, group efficiency
Procedia PDF Downloads 32