Search results for: real-time data collection
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24869

24809 Data Collection in Protected Agriculture for Subsequent Big Data Analysis: Methodological Evaluation in Venezuela

Authors: Maria Antonieta Erna Castillo Holly

Abstract:

During the last decade, data analysis, strategic decision making, and the use of artificial intelligence (AI) tools in Latin American agriculture have been a challenge. In some countries, the availability, quality, and reliability of historical data, in addition to the current data recording methodology in the field, make it difficult to use information systems, complete data analysis, and rely on them for making the right strategic decisions. This is essential in Agriculture 4.0, where the increase in global demand for fresh agricultural products of tropical origin throughout the year requires a change in the production model and greater agility in responding to consumer market demands for quality, quantity, traceability, and sustainability – all of which require extensive data. Having quality information available and updated in real time on what, how much, how, when, where, at what cost, and with what compliance with production quality standards represents the greatest challenge for sustainable and profitable agriculture in the region. The objective of this work is to present a methodological proposal for the collection of georeferenced data from the protected agriculture sector, specifically in production units (UP) with tall structures (greenhouses), initially for Venezuela, taking the state of Mérida as the geographical framework and horticultural products as target crops. The document presents some background information and explains the methodology and tools used in the three phases of the work: diagnosis, data collection, and analysis. As a result, an evaluation of the process is carried out, relevant data and dashboards are displayed, and the first satellite maps integrated with layers of information in a geographic information system are presented. Finally, some improvement proposals and tentatively recommended applications are added to the process, with the understanding that their objective is to provide better qualified and traceable georeferenced data for subsequent analysis of the information and more agile and accurate strategic decision making. One of the main points of this study is the lack of quality data treatment in Latin America, and especially in the Caribbean basin; one of the most important issues is how to manage the lack of complete official data. The methodology has been tested with horticultural products, but it can be extended to other tropical crops.
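
To illustrate the kind of georeferenced record this methodology targets, the following minimal sketch (not from the paper) shows how production-unit data could be structured and exported as a GIS layer with geopandas; all column names, identifiers, and coordinates are illustrative assumptions.

```python
# Minimal sketch (not the paper's system): structuring georeferenced greenhouse
# records and exporting them as a GIS layer. Column names and values are assumed.
import pandas as pd
import geopandas as gpd

records = pd.DataFrame({
    "up_id": ["UP-001", "UP-002"],          # production unit identifier (assumed)
    "crop": ["tomato", "sweet pepper"],      # target horticultural crop
    "area_m2": [1200, 800],                  # covered greenhouse area
    "lat": [8.5897, 8.6021],                 # Mérida-area coordinates (illustrative)
    "lon": [-71.1561, -71.1449],
})

# Build a point layer in WGS84 so it can be overlaid on satellite imagery
gdf = gpd.GeoDataFrame(
    records,
    geometry=gpd.points_from_xy(records["lon"], records["lat"]),
    crs="EPSG:4326",
)

# Export to GeoPackage for use in a desktop GIS (e.g., QGIS)
gdf.to_file("production_units.gpkg", layer="greenhouses", driver="GPKG")
print(gdf.head())
```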

Keywords: greenhouses, protected agriculture, data analysis, geographic information systems, Venezuela

Procedia PDF Downloads 106
24808 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process, so data imputation has become an essential matter. This work makes use of the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey method and the Maximal Overlap Discrete Wavelet Transform (MODWT) are used to detect and impute the outlier values.
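
As a minimal sketch of the Tukey part of this approach (not the authors' code), the snippet below flags outliers in a price series with Tukey's IQR fences and imputes them with the median of the remaining values; the price series is invented, not real ASE data, and the wavelet (MODWT) step is omitted.

```python
# Minimal sketch (not the authors' code): Tukey's fences applied to a series
# of closing prices, with detected outliers imputed by the series median.
import numpy as np

def tukey_outliers(prices, k=1.5):
    """Return a boolean mask of outliers using Tukey's IQR fences."""
    q1, q3 = np.percentile(prices, [25, 75])
    iqr = q3 - q1
    lower, upper = q1 - k * iqr, q3 + k * iqr
    return (prices < lower) | (prices > upper)

# Illustrative closing-price series (not real ASE data)
closes = np.array([1.02, 1.05, 1.04, 1.06, 3.90, 1.03, 1.01, 0.15, 1.07])

mask = tukey_outliers(closes)
imputed = closes.copy()
imputed[mask] = np.median(closes[~mask])   # simple imputation of flagged values

print("outlier indices:", np.where(mask)[0])
print("imputed series :", imputed)
```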

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 60
24807 Optimization of Marine Waste Collection Considering Dynamic Transport and Ship’s Wake Impact

Authors: Guillaume Richard, Sarra Zaied

Abstract:

Marine waste quantities are increasing steadily: 5 million tons of plastic waste enter the ocean every year. Their spatiotemporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment, as well as the size and location of the waste. As part of optimizing the collection of marine plastic waste, it is important to measure and monitor its evolution over time. In this context, diverse studies have been dedicated to describing waste behavior in order to identify its accumulation in ocean areas. None of the existing tools that track objects at sea was designed to track down a slick of waste. Moreover, applications related to marine waste are in the minority compared to rescue applications or oil slick tracking applications. These approaches are able to accurately simulate an object's behavior over time, but not during the collection mission of a waste slick. This paper presents numerical modeling of a boat's wake impact on the behavior of floating marine waste during a collection mission. The aim is to predict the trajectory of a marine waste slick in order to optimize its collection, using data on ocean currents, wind, and possibly waves. We chose Ocean Parcels, a Python library suited to computing particle trajectories in the ocean. The modeling results showed the important role of advection and diffusion processes in the spatiotemporal distribution of floating plastic litter. The performance of the proposed method was evaluated on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). The results of the evaluation at the Cape of Good Hope (South Africa) prove that the proposed approach can effectively predict the position and velocity of marine litter during collection, which made it possible to optimize collection time and to collect more than 90% of the waste.
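
For orientation, the sketch below shows the general shape of an advection run with the Ocean Parcels library, assuming a Parcels v2-style API and a CMEMS current file; the file name, variable names, dimension mapping, and release coordinates are all assumptions, not the authors' configuration (which also models wake and wind effects not shown here).

```python
# Minimal sketch (not the authors' model): advecting virtual litter particles
# with the OceanParcels library on CMEMS surface currents. File names, variable
# names and dimensions are assumptions and depend on the downloaded product.
from datetime import timedelta
from parcels import FieldSet, ParticleSet, JITParticle, AdvectionRK4

filenames = {"U": "cmems_currents.nc", "V": "cmems_currents.nc"}    # hypothetical file
variables = {"U": "uo", "V": "vo"}                                   # assumed CMEMS names
dimensions = {"lat": "latitude", "lon": "longitude", "time": "time"}

fieldset = FieldSet.from_netcdf(filenames, variables, dimensions)

# Seed a small cluster of particles where a waste slick was last observed
pset = ParticleSet(fieldset=fieldset, pclass=JITParticle,
                   lon=[18.40, 18.41, 18.42], lat=[-34.10, -34.11, -34.12])

output = pset.ParticleFile(name="litter_tracks.zarr", outputdt=timedelta(hours=1))
pset.execute(AdvectionRK4,                      # 4th-order Runge-Kutta advection
             runtime=timedelta(days=2),
             dt=timedelta(minutes=10),
             output_file=output)
```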

Keywords: marine litter, advection-diffusion equation, sea current, numerical model

Procedia PDF Downloads 60
24806 Artificial Intelligence for Traffic Signal Control and Data Collection

Authors: Reggie Chandra

Abstract:

Traffic accidents and traffic signal optimization are correlated. However, 70-90% of the traffic signals across the USA are not synchronized. The reason is insufficient resources to create and implement timing plans. In this work, we discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and collect accurate traffic data 24/7/365 using a vehicle detection system. We discuss recent advances in Artificial Intelligence technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow for that is. This paper also showcases how Artificial Intelligence makes signal timing affordable. We introduce a technology that uses Convolutional Neural Networks (CNN) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes in the visual cortex. A neural net is modeled after the human brain: it consists of millions of densely connected processing nodes. It is a form of machine learning where the neural net learns to recognize vehicles through training, which is called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but, in cases such as classifying objects into fine-grained categories, also outperform humans. Safety is of primary importance to traffic professionals, but they often do not have the studies or data to support their decisions. Currently, one-third of transportation agencies do not collect pedestrian and bike data. We discuss how the use of Artificial Intelligence for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it provides traffic engineers with tools that allow them to unleash their potential, instead of dealing with constant complaints, snapshots of limited handpicked data, and multiple systems requiring additional adaptation work. The methodologies used and proposed in the research include a camera model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our datasets, acquired under a variety of daily real-world road conditions, and compared with the performance of commonly used methods that require collecting data by counting, evaluating and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by Artificial Intelligence can benefit a community and how to translate the complex and often overwhelming benefits into a language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that Artificial Intelligence brings to traffic signal control and data collection are unsurpassed.
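
To make the CNN idea concrete, the following minimal sketch (not the vendor's detection system) defines a tiny convolutional classifier for image crops labelled as vehicle, pedestrian, or bike; the architecture, input size, and class list are illustrative assumptions.

```python
# Minimal sketch (not the described product): a small convolutional classifier
# for road-user image crops. Architecture and class list are assumptions.
import torch
import torch.nn as nn

class TinyDetectorHead(nn.Module):
    def __init__(self, num_classes=3):        # vehicle / pedestrian / bike (assumed)
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # for 64x64 inputs

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

model = TinyDetectorHead()
dummy_batch = torch.randn(8, 3, 64, 64)        # eight 64x64 RGB crops
logits = model(dummy_batch)
print(logits.shape)                             # torch.Size([8, 3])
```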

Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal

Procedia PDF Downloads 127
24805 Improved Safety Science: Utilizing a Design Hierarchy

Authors: Ulrica Pettersson

Abstract:

Collection of information on incidents is regularly done through pre-printed incident report forms. These tend to be incomplete and frequently lack essential information. One consequence is that reports with inadequate information, which do not fulfil analysts' requirements, are transferred into the analysis process. To improve an incident reporting form, theory from design science, witness psychology, and interview and questionnaire research has been used. Previously, three experiments were conducted to evaluate the form and showed significantly improved results. The form has proved to capture knowledge regardless of the incidents' character or context. The aim of this paper is to describe how design science, and more specifically a design hierarchy, can be used to construct a collection form for improvements in safety science.

Keywords: data collection, design science, incident reports, safety science

Procedia PDF Downloads 189
24804 Ontological Modeling Approach for Statistical Databases Publication in Linked Open Data

Authors: Bourama Mane, Ibrahima Fall, Mamadou Samba Camara, Alassane Bah

Abstract:

At the level of the National Statistical Institutes, there is a large volume of data which is generally stored in a format that conditions how the information it contains is published. Each household or business data collection project includes a dissemination platform for its implementation. These previously used dissemination methods do not promote rapid access to information and, in particular, do not offer the option of linking data for in-depth processing. In this paper, we present an approach to modeling these data in order to publish them in a format intended for the Semantic Web. Our objective is to publish all these data on a single platform and offer the option of linking them with other external data sources. The approach will be applied to data from major national surveys, such as those on employment, poverty, and child labor, and to the general census of the population of Senegal.
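
A minimal sketch of what such a publication step could look like (not the authors' ontology): one statistical observation expressed as RDF Data Cube triples with rdflib; the base URI, indicator name, and property names are illustrative assumptions, not the actual survey vocabulary.

```python
# Minimal sketch (not the authors' model): one statistical observation as
# RDF Data Cube triples with rdflib. URIs and indicator names are assumed.
from rdflib import Graph, Namespace, Literal, RDF
from rdflib.namespace import XSD

QB = Namespace("http://purl.org/linked-data/cube#")
EX = Namespace("http://example.org/stats/")        # hypothetical base URI

g = Graph()
g.bind("qb", QB)
g.bind("ex", EX)

obs = EX["obs/employment/2019/dakar"]
g.add((obs, RDF.type, QB.Observation))
g.add((obs, EX.indicator, Literal("employment_rate")))
g.add((obs, EX.region, EX["region/dakar"]))
g.add((obs, EX.year, Literal(2019, datatype=XSD.gYear)))
g.add((obs, EX.value, Literal(47.2, datatype=XSD.decimal)))

print(g.serialize(format="turtle"))
```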

Keywords: Semantic Web, linked open data, database, statistic

Procedia PDF Downloads 152
24803 Estimating Destinations of Bus Passengers Using Smart Card Data

Authors: Hasik Lee, Seung-Young Kho

Abstract:

Nowadays, automatic fare collection (AFC) systems are widely used in many countries. However, smart card data from many cities do not contain alighting information, which is necessary to build OD matrices. Therefore, in order to utilize smart card data, the destinations of passengers should be estimated. In this paper, kernel density estimation was used to forecast the probabilities of bus passengers' alighting stations and was applied to smart card data from Seoul, Korea, which contain both boarding and alighting information. The method was also validated with actual data. In some cases, the stochastic method was more accurate than the deterministic method. Therefore, it is sufficiently accurate to be used to build OD matrices.
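
As a minimal sketch of the idea (not the authors' implementation), the snippet below fits a Gaussian kernel density estimate over historical alighting positions, represented here as a one-dimensional distance along the route, and scores candidate alighting stops for a new passenger; the distances are illustrative, not Seoul AFC data.

```python
# Minimal sketch (not the authors' implementation): kernel density estimation
# over historical alighting positions to score candidate alighting stops.
import numpy as np
from scipy.stats import gaussian_kde

# Historical alighting positions (km from the route origin) for one boarding stop
historical_alightings = np.array([2.1, 2.3, 2.2, 5.8, 6.0, 5.9, 6.1, 9.4])

kde = gaussian_kde(historical_alightings)

# Candidate downstream stops for a new passenger boarding at the same stop
candidate_stops_km = np.array([2.2, 4.0, 6.0, 9.5])
scores = kde(candidate_stops_km)
probs = scores / scores.sum()            # normalise over the candidate set

for stop, p in zip(candidate_stops_km, probs):
    print(f"stop at {stop:.1f} km -> estimated alighting probability {p:.2f}")
```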

Keywords: destination estimation, Kernel density estimation, smart card data, validation

Procedia PDF Downloads 327
24802 Exploring Influence Range of Tainan City Using Electronic Toll Collection Big Data

Authors: Chen Chou, Feng-Tyan Lin

Abstract:

Big Data has attracted a lot of attention in many fields for analyzing research issues based on large volumes of data. Electronic Toll Collection (ETC) is one of the Intelligent Transportation System (ITS) applications in Taiwan, used to record the starting point, end point, distance, and travel time of vehicles on the national freeway. This study, taking advantage of ETC big data combined with urban planning theory, attempts to explore various phenomena of inter-city transportation activities. ETC data, part of the government's open data, are voluminous, complete, and frequently updated. One may recall that living areas have been delimited by location, population, area, and subjective consciousness. However, these factors cannot appropriately reflect people's movement paths in daily life. In this study, the concept of "Living Area" is replaced by "Influence Range" to show dynamics and variation with time and with the purposes of activities. This study uses data mining with Python and Excel, and visualizes the number of trips with GIS, to explore the influence range of Tainan City and the purposes of trips, and to discuss how living areas are currently delimited. It creates a dialogue between the concepts of "Central Place Theory" and "Living Area", presents a new point of view, and integrates the application of big data, urban planning, and transportation. The findings will be valuable for resource allocation and land apportionment in spatial planning.
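
To suggest the flavour of the Python data-mining step, the sketch below (not the study's pipeline) aggregates hypothetical ETC trip records into origin-destination counts that could then be joined to a GIS layer; gantry names and values are illustrative assumptions.

```python
# Minimal sketch (not the study's pipeline): aggregating ETC trip records into
# origin-destination counts for mapping. Gantry names and values are assumed.
import pandas as pd

# Hypothetical ETC extract: one row per vehicle trip on the freeway
trips = pd.DataFrame({
    "origin_gantry": ["Tainan", "Tainan", "Kaohsiung", "Tainan"],
    "dest_gantry": ["Kaohsiung", "Chiayi", "Tainan", "Kaohsiung"],
    "travel_time_min": [42, 35, 44, 40],
})

od_counts = (trips
             .groupby(["origin_gantry", "dest_gantry"])
             .agg(trip_count=("travel_time_min", "size"),
                  mean_travel_time=("travel_time_min", "mean"))
             .reset_index())

print(od_counts)   # table that can be exported and symbolised in GIS
```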

Keywords: Big Data, ITS, influence range, living area, central place theory, visualization

Procedia PDF Downloads 247
24801 Mobile Crowdsensing Scheme by Predicting Vehicle Mobility Using Deep Learning Algorithm

Authors: Monojit Manna, Arpan Adhikary

Abstract:

Mobile crowdsensing is an emerging paradigm around the globe in which users are selected to carry out sensing tasks. In today's urban cities, mobile vehicles are well suited to performing data sensing and data collection tasks because of their universality and mobility. In this work, we focus on selecting the optimal set of mobile nodes in order to collect the maximum amount of data from urban areas and fulfill the required data for a future period within a couple of minutes. We formulate the vehicle requirements as a maximum data collection optimization problem under a budget. The application implementation is set up to generalize a realistic online platform in which vehicles move continuously in real time. The data center has the authority to select a set of vehicles immediately. A deep learning-based scheme with the help of mobile vehicles (DLMV) is proposed to collect sensing data from the urban environment. For the future time perspective, this work proposes a deep learning-based offline algorithm to predict mobility. We then propose a greedy approach that applies an online algorithm to select a subset of vehicles for this NP-complete problem with a limited budget. Extensive experimental evaluations are conducted on a real mobility dataset from Rome. The results of the experiment not only confirm the efficiency of our proposed solution but also prove the validity of DLMV and show an improvement in the quantity of collected sensing data compared with other algorithms.
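
As a minimal sketch of a budgeted greedy selection of this kind (not the DLMV algorithm itself), the snippet below repeatedly picks the vehicle with the best ratio of newly covered sensing cells to recruitment cost; the vehicle names, cell sets, and costs are illustrative assumptions.

```python
# Minimal sketch (not the DLMV algorithm): greedy budgeted vehicle selection
# maximising newly covered sensing cells per unit cost. Inputs are assumed.
def greedy_vehicle_selection(vehicles, budget):
    """vehicles: dict name -> (cost, set of predicted covered cells)."""
    selected, covered, spent = [], set(), 0.0
    while True:
        best, best_ratio = None, 0.0
        for name, (cost, cells) in vehicles.items():
            if name in selected or spent + cost > budget:
                continue
            gain = len(cells - covered)
            if cost > 0 and gain / cost > best_ratio:
                best, best_ratio = name, gain / cost
        if best is None:
            break
        cost, cells = vehicles[best]
        selected.append(best)
        covered |= cells
        spent += cost
    return selected, covered, spent

vehicles = {
    "taxi_1": (3.0, {"c1", "c2", "c3"}),
    "bus_7":  (5.0, {"c3", "c4", "c5", "c6"}),
    "taxi_9": (2.0, {"c6", "c7"}),
}
print(greedy_vehicle_selection(vehicles, budget=7.0))
```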

Keywords: mobile crowdsensing, deep learning, vehicle recruitment, sensing coverage, data collection

Procedia PDF Downloads 53
24800 Development of Drying System for Dew Collection to Supplement Minimum Water Required for Grazing Plants in Arid Regions

Authors: Mohamed I. Alzarah

Abstract:

Passive dew harvesting and rainwater collection require a very small financial investment while exploiting a free and clean source of water in rural or remote areas. Dew condensation on greenhouse dryer cladding and assorted other surfaces was frequently noticed. Accordingly, this study was performed in order to measure the quantity of condensation in arid regions. Dew was measured using three different kinds of collectors: the glass of a flat plate solar collector, the tempered glass of a photovoltaic (PV) module, and the double-sloped (25°) acrylic Plexiglas of a greenhouse dryer. The total amount of dew collected by the three different types of collectors was measured from December 2013 to March 2014 in Alahsa, Saudi Arabia. Meteorological data were collected for one year. The condensed dew drops were collected naturally (before scraping) and by scraping once and twice. Dew most likely began to condense between 12:00 am and 6:30 am, and its intensity reached its peak about 45 min before sunrise. The cumulative dew yield on the double-sloped test roof varied with wind speed and direction. Results indicated that wiping twice gave a higher dew yield than wiping once or collection by gravity. Dew and rain pH were neutral (close to 7), and the total mineralization was considerable. The ion concentrations agree with the World Health Organization recommendations for potable water. Using the existing drying system for dew and rain harvesting could provide a potable water source for arid regions.

Keywords: PV module, flat plate solar collector, greenhouse, drying system, dew collection, water vapor, rainwater harvesting

Procedia PDF Downloads 305
24799 The Development of Research Based Model to Enhance Critical Thinking, Cognitive Skills and Culture and Local Wisdom Knowledge of Undergraduate Students

Authors: Nithipattara Balsiri

Abstract:

The purpose of this research was to develop an instructional model using research-based learning to enhance the critical thinking, cognitive skills, and culture and local wisdom knowledge of undergraduate students. The sample consisted of 307 undergraduate students. Critical thinking and cognitive skills tests were employed for data collection. Second-order confirmatory factor analysis, the t-test, and one-way analysis of variance were employed for data analysis using the SPSS and LISREL programs. The major research results were as follows: 1) the instructional model using research-based learning to enhance critical thinking, cognitive skills, and culture and local wisdom knowledge should consist of six sequential steps, namely (1) setting the research problem, (2) setting the research hypothesis, (3) data collection, (4) data analysis, (5) drawing the research conclusion, and (6) applying the results to problem solving; and 2) after the treatment, undergraduate students had higher scores in critical thinking and cognitive skills than before the treatment, at the 0.05 level of significance.

Keywords: critical thinking, cognitive skills, culture and local wisdom knowledge

Procedia PDF Downloads 335
24798 An Analysis of Privacy and Security for Internet of Things Applications

Authors: Dhananjay Singh, M. Abdullah-Al-Wadud

Abstract:

The Internet of Things (IoT) is a concept of a large-scale ecosystem of wireless actuators. The actuators are defined as the things in the IoT: those which contribute or produce data for the ecosystem. However, ubiquitous data collection, data security, privacy preservation, large-volume data processing, and intelligent analytics are some of the key challenges in IoT technologies. In order to address the security requirements, challenges, and threats in the IoT, we discuss a message authentication mechanism for IoT applications. Finally, we discuss a data encryption mechanism for authenticating messages before they are propagated into IoT networks.
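
To make the message authentication idea concrete, here is a minimal sketch (not the paper's protocol) that attaches an HMAC tag to an IoT sensor payload and verifies it on receipt; the pre-shared key, device identifiers, and field names are illustrative assumptions.

```python
# Minimal sketch (not the paper's protocol): HMAC-based message authentication
# for an IoT payload using a pre-shared key. Key handling is simplified.
import hmac
import hashlib
import json

PRE_SHARED_KEY = b"replace-with-provisioned-device-key"   # hypothetical key

def sign_message(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(PRE_SHARED_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_message(message: dict) -> bool:
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(PRE_SHARED_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])   # constant-time check

msg = sign_message({"device_id": "node-17", "temp_c": 21.4, "seq": 102})
print(verify_message(msg))        # True for an untampered message
msg["payload"]["temp_c"] = 99.9
print(verify_message(msg))        # False once the payload is altered
```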

Keywords: Internet of Things (IoT), message authentication, privacy, security

Procedia PDF Downloads 348
24797 Statistical Investigation Projects: A Way for Pre-Service Mathematics Teachers to Actively Solve a Campus Problem

Authors: Muhammet Şahal, Oğuz Köklü

Abstract:

As statistical thinking and problem-solving processes have become increasingly important, teachers need to be more rigorously prepared with statistical knowledge to teach their students effectively. This study examined preservice mathematics teachers' development of statistical investigation projects using data and exploratory data analysis tools, following a design-based research perspective and statistical investigation cycle. A total of 26 pre-service senior mathematics teachers from a public university in Turkiye participated in the study. They formed groups of 3-4 members voluntarily and worked on their statistical investigation projects for six weeks. The data sources were audio recordings of pre-service teachers' group discussions while working on their projects in class, whole-class video recordings, and each group’s weekly and final reports. As part of the study, we reviewed weekly reports, provided timely feedback specific to each group, and revised the following week's class work based on the groups’ needs and development in their project. We used content analysis to analyze groups’ audio and classroom video recordings. The participants encountered several difficulties, which included formulating a meaningful statistical question in the early phase of the investigation, securing the most suitable data collection strategy, and deciding on the data analysis method appropriate for their statistical questions. The data collection and organization processes were challenging for some groups and revealed the importance of comprehensive planning. Overall, preservice senior mathematics teachers were able to work on a statistical project that contained the formulation of a statistical question, planning, data collection, analysis, and reaching a conclusion holistically, even though they faced challenges because of their lack of experience. The study suggests that preservice senior mathematics teachers have the potential to apply statistical knowledge and techniques in a real-world context, and they could proceed with the project with the support of the researchers. We provided implications for the statistical education of teachers and future research.

Keywords: design-based study, pre-service mathematics teachers, statistical investigation projects, statistical model

Procedia PDF Downloads 49
24796 Safety Culture, Mindfulness and Safe Behaviours of Students Residing in the Halls of Residence of Obafemi Awolowo University, Ile Ife, Nigeria

Authors: Olajumoke Adetoun Ojeleye

Abstract:

The study assessed the safety culture, mindfulness, and safe behaviors of students residing in the halls of residence of Obafemi Awolowo University (OAU), Ile Ife, Nigeria. The objectives of the study were to assess the level of safety mindfulness of students residing in the halls of residence of OAU, examine their safety culture, and establish whether these students are involved in unsafe practices. The study employed a cross-sectional research design, and the instrument used for data collection was a self-structured, self-administered questionnaire. The questionnaire was tested for validity and reliability, with a reliability coefficient of 0.71, before being used for data collection. Respondents were selected by a multi-stage sampling technique, and the sample size was 530. Data collection took two weeks, and the data were analysed using descriptive statistical techniques. Results showed that about half of the respondents (49.8%) were between the ages of 20 and 24 years. There were more males (56.2%) than females (43.8%). Although the data demonstrated that the majority (91.7%) of the respondents are highly safety-minded and the safety culture of an equally high proportion (83.4%) was adjudged fair, a lot of improvement is needed in the areas of alerting or informing management of impending dangers and of studying the hall handbook to internalize its contents. The study further showed that only 43.6% of respondents had good safety practices and behaviors, while the majority (56.4%) had fair safety practices and behaviors. One incidental finding of the study is that quite a few students squat with their counterparts. The study recommended establishing a clearly written complaint procedure that is accessible and available to all hall residents, building more hostels with adequate facilities to address overcrowding, and putting systems in place to encourage residents to report incidents/accidents.

Keywords: safe behaviours, safety culture, safety mindfulness, student

Procedia PDF Downloads 237
24795 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 399
24794 Supply Chain Risk Management: A Meta-Study of Empirical Research

Authors: Shoufeng Cao, Kim Bryceson, Damian Hine

Abstract:

The existing supply chain risk management (SCRM) research is currently chaotic and somewhat disorganized, and the topic has been addressed conceptually more often than empirically. This paper, using both qualitative and quantitative data, employs a modified meta-study method to investigate the SCRM empirical research published in quality journals over a period of 12 years (2004-2015). The purpose is to outline the extant research trends and the employed research methodologies (i.e., research method, data collection, and data analysis) across the sub-field in order to guide future research. The synthesized findings indicate that empirical study of the risk ripple effect along an entire supply chain, industry-specific supply chain risk management, and global/export supply chain risk management has not yet been given as much attention as it deserves in the SCRM field. It is also suggested that future empirical research should employ multiple and/or mixed methods and multi-source data collection techniques to reduce common method bias and single-source bias, thus improving research validity and reliability. In conclusion, this paper helps to stimulate more quality empirical research in the SCRM field by identifying promising research directions and providing some methodology guidelines.

Keywords: empirical research, meta-study, methodology guideline, research direction, supply chain risk management

Procedia PDF Downloads 291
24793 Development the Potential of Parking Tax and Parking Retribution Revenues: Case Study in Bekasi City

Authors: Ivan Yudianto

Abstract:

The research objectives are to analyze the factors that impede parking tax and parking retribution collection by the Bekasi City Government, to analyze the factors that can increase local own revenue from the parking tax and parking retribution, to analyze the Bekasi City Government's monitoring of parking retribution collection, and to analyze the Bekasi City Government's strategies through the preparation of a roadmap and action plan to increase parking tax and parking retribution revenues. The approach used in this research is qualitative. A qualitative approach is used because the problem is not yet clear, the object to be studied is holistic, complex, and dynamic, and the relationships between symptoms are interactive. The methods of data collection and analysis were in-depth interviews, participant observation, documentary materials, literature, and triangulation, as well as newer methods such as visual materials and internet browsing. The results showed several obstacles: parking taxpayers do not disclose their actual parking revenue; parking taxpayers pay the parking tax late or not at all; many parking locations are controlled by illegal organizations; there is a shortage of human resources to levy and supervise parking tax and parking retribution collection in the Bekasi City Government; and monitoring of the parking tax and parking retribution is not scheduled on a regular basis. Several strategic priorities are identified in order to develop the potential of the parking tax and parking retribution in the Bekasi City Government: increasing control and monitoring of parking taxpayers, forming a team of auditors to audit parking taxpayers, pursuing persuasive and educative law enforcement to reduce wayward parking taxpayers, providing strict sanctions against disobedient parking taxpayers, revising the mayoral regulations on parking locations in Bekasi City, rationalizing the parking retribution revenue target, attempting to take over roadside parking locations from individuals or specific groups, and drafting regional regulations on parking subscriptions.

Keywords: local own revenue, parking retribution, parking tax, parking taxpayer

Procedia PDF Downloads 301
24792 Effectiveness of a Malaysian Workplace Intervention Study on Physical Activity Levels

Authors: M. Z. Bin Mohd Ghazali, N. C. Wilson, A. F. Bin Ahmad Fuad, M. A. H. B. Musa, M. U. Mohamad Sani, F. Zulkifli, M. S. Zainal Abidin

Abstract:

Physical activity levels are low in Malaysia, and this study was undertaken to determine if a four-week work-based intervention program would be effective in changing physical activity levels. The study was conducted in a Malaysian Government Department and had three stages: baseline data collection, a four-week intervention, and two-month post-intervention data collection. During the intervention and two-month post-intervention phases, physical activity levels (determined by a pedometer) and basic health profiles (BMI, abdominal obesity, blood pressure) were measured. Staff (58 males, 47 females) with an average age of 33 years completed baseline data collection. Pedometer steps averaged 7,102 steps/day at baseline, although male step counts were significantly higher than females' (7,861 vs. 6,114). Health profiles were poor: over 50% were overweight/obese (males 66%, females 40%); hypertension (males 23%, females 6%); excess waist circumference (males 52%, females 17%). While 86 staff participated in the intervention, only 49 regularly reported their steps. There was a significant increase (17%) in average daily steps from 8,965 (week 1) to 10,436 (week 4). Unfortunately, participation in the intervention program was avoided by the less healthy staff. Two months after the intervention there was no significant difference in average steps/day, despite the fact that 89% of staff reported they planned to make long-term changes to their lifestyle. An unexpected average increase of 2 kg in body weight occurred in participants, although this was less than the 5.6 kg in non-participants. A number of recommendations are made for future interventions, including the conclusion that pedometers were a useful tool and popular with participants.

Keywords: pedometers, walking, health, intervention

Procedia PDF Downloads 275
24791 The Effectiveness of Teaching Emotional Intelligence on Reducing Marital Conflicts and Marital Adjustment in Married Students of Tehran University

Authors: Elham Jafari

Abstract:

The aim of this study was to evaluate the effectiveness of emotional intelligence training on reducing marital conflict and on marital adjustment in married students of the University of Tehran. In terms of purpose, this research is applied; in terms of data collection method, it is a semi-experimental pre-test/post-test design with a control group and a follow-up test. The statistical population of the present study consisted of all married students of the University of Tehran. Thirty married students of the University of Tehran were selected as the sample by convenience sampling, with 15 people randomly assigned to the experimental group and 15 to the control group. Data were collected through field and library methods. The data collection tools in the field section were two questionnaires, on marital conflict and marital adjustment. To analyze the collected data, first, at the descriptive level, the demographic characteristics of the sample were described with statistical indicators using SPSS software. In inferential statistics, the statistical method used was analysis of covariance. The results showed that the effect of the independent variable, emotional intelligence, on the reduction of marital conflicts is statistically significant, and it can be inferred that emotional intelligence training reduced the marital conflicts of married students of the University of Tehran in the experimental group compared to the control group. The effect of the independent variable of emotional intelligence on marital adjustment was also statistically significant, and it can be inferred that emotional intelligence training improved the marital adjustment of married students of the University of Tehran in the experimental group compared to the control group.

Keywords: emotional intelligence, marital conflicts, marital compatibility, married students

Procedia PDF Downloads 229
24790 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning

Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga

Abstract:

Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including the issues of noise and air pollution, which are considered hot topics (e.g., the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together because the noise pollution present in cities is often linked to traffic and industries, and these produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution, because NO₂ is mostly created from the oxidation of NO, and these two are notoriously produced by processes of combustion at high temperatures (i.e., car engines or thermal power stations). We can see the same process for industrial plants as well. What has to be investigated – and is the topic of this paper – is whether or not there really is a correlation between noise pollution and air pollution (taking into account NO₂) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise App will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, to stay outdoors, with an external battery to allow it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone, which will be calibrated to collect the most accurate data. For air pollution, measurements will be made with the AirMonitor device, an Arduino board to which the sensors and all the other components are plugged. After assembling the sensors, they will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time in order to have input for both weekdays and weekend days; in this way, it will be possible to see how the situation changes during the week. The novelty is that the data will be compared to check if there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw values obtained from the sensors. To do so, the data will be converted to fit on a scale that goes up to 100% and will be shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help in choosing the right mitigation solutions to be applied in the analysed area, because it will make it possible to address both the noise and the air pollution problems with only one intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
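
As a minimal sketch of the comparison step (not the study's processing chain), the snippet below rescales paired noise and NO2 readings to a 0-100% scale and computes their Pearson correlation; the readings are invented, not measurements from the Mestre campaign.

```python
# Minimal sketch (not the study's processing chain): rescaling paired noise and
# NO2 readings to 0-100% and checking their correlation. Values are illustrative.
import numpy as np

noise_db = np.array([55.0, 61.2, 67.5, 70.1, 58.3, 64.9])     # hourly noise levels (assumed)
no2_ugm3 = np.array([18.0, 25.5, 34.0, 41.2, 21.0, 30.4])     # hourly NO2 (assumed)

def to_percent(x):
    """Min-max rescale a series to 0-100% for comparable plotting/mapping."""
    return 100.0 * (x - x.min()) / (x.max() - x.min())

noise_pct, no2_pct = to_percent(noise_db), to_percent(no2_ugm3)

r = np.corrcoef(noise_db, no2_ugm3)[0, 1]     # Pearson correlation coefficient
print(f"Pearson r between noise and NO2: {r:.2f}")
print("noise %:", np.round(noise_pct, 1))
print("NO2   %:", np.round(no2_pct, 1))
```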

Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter

Procedia PDF Downloads 184
24789 A Web-Based Systems Immunology Toolkit Allowing the Visualization and Comparative Analysis of Publically Available Collective Data to Decipher Immune Regulation in Early Life

Authors: Mahbuba Rahman, Sabri Boughorbel, Scott Presnell, Charlie Quinn, Darawan Rinchai, Damien Chaussabel, Nico Marr

Abstract:

Collections of large-scale datasets made available in public repositories can be used to identify and fill gaps in biomedical knowledge. But first, these data need to be made readily accessible to researchers for analysis and interpretation. Here, a collection of transcriptome datasets was made available to investigate the functional programming of human hematopoietic cells in early life. Thirty-two datasets were retrieved from the NCBI Gene Expression Omnibus (GEO) and loaded into a custom, interactive web application called the Gene Expression browser (GXB), designed for visualization and query of integrated large-scale data. Multiple sample groupings and gene rank lists were created based on the study design and variables in each dataset. Web links to customized graphical views can be generated by users and subsequently be used to graphically present data in manuscripts for publication. The GXB tool also enables browsing of a single gene across datasets, which can provide information on the role of a given molecule across biological systems. The dataset collection is available online. As a proof-of-principle, one of the datasets (GSE25087) was re-analyzed to identify genes that are differentially expressed by regulatory T cells in early life. Re-analysis of this dataset and a cross-study comparison using multiple other datasets in the above-mentioned collection revealed that PMCH, a gene encoding a precursor of melanin-concentrating hormone (MCH), a cyclic neuropeptide, is highly expressed in a variety of other hematopoietic cell types, including neonatal erythroid cells as well as plasmacytoid dendritic cells upon viral infection. Our findings suggest an as yet unrecognized role of MCH in immune regulation, thereby highlighting the unique potential of the curated dataset collection and systems biology approach to generate new hypotheses which can be tested in future mechanistic studies.

Keywords: early-life, GEO datasets, PMCH, interactive query, systems biology

Procedia PDF Downloads 270
24788 Evaluation of the Improve Vacuum Blood Collection Tube for Laboratory Tests

Authors: Yoon Kyung Song, Seung Won Han, Sang Hyun Hwang, Do Hoon Lee

Abstract:

Laboratory testing is a significant part of the diagnosis, prognosis, and treatment of diseases. Blood collection is a simple process but can be a potential source of pre-analytical errors. The vacuum blood collection tubes used to collect and store blood specimens are necessary for accurate test results. The purpose of this study was to validate the Improve serum separator tube (SST) (Guanzhou Improve Medical Instruments Co., Ltd, China) for routine clinical chemistry laboratory testing. Blood specimens were collected from 100 volunteers in three different serum vacuum tubes (Greiner SST, Becton Dickinson SST, Improve SST). The specimens were evaluated for 16 routine chemistry tests using the TBA-200FR NEO (Toshiba Medical Co., Japan). The results were statistically analyzed by the paired t-test and Bland-Altman plots. For the stability test, the initial results for each tube were compared with the results of specimens preserved for 72 hours. Clinical acceptability was evaluated against the biological variation data bank of Ricos. Paired t-test analysis revealed that AST, ALT, K, and Cl showed statistically equivalent results, but calcium (CA), phosphorus (PHOS), glucose (GLU), BUN, uric acid (UA), cholesterol (CHOL), total protein (TP), albumin (ALB), total bilirubin (TB), ALP, creatinine (CRE), and sodium (NA) were different (P < 0.05) between the Improve SST and the Greiner SST. Likewise, CA, PHOS, TP, TB, AST, ALT, NA, K, and Cl showed statistically equivalent results, but GLU, BUN, UA, CHOL, ALB, ALP, and CRE were different between the Improve SST and the Becton Dickinson SST. All statistically different cases were clinically acceptable according to the biological variation data bank of Ricos. The Improve SST tubes showed satisfactory results compared with the Greiner SST and Becton Dickinson SST. We concluded that the tubes are acceptable for routine clinical chemistry laboratory testing.
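
For illustration only (not the laboratory's analysis), the sketch below runs a paired t-test and computes Bland-Altman summary statistics for one analyte measured in two tube types; the values are invented, not the study's specimens.

```python
# Minimal sketch (not the study's analysis): paired t-test and Bland-Altman
# statistics for one analyte measured in two tube types. Values are illustrative.
import numpy as np
from scipy import stats

improve_sst = np.array([4.1, 5.3, 3.9, 6.2, 4.8, 5.0])   # hypothetical analyte results
greiner_sst = np.array([4.0, 5.5, 4.0, 6.1, 4.9, 5.2])

t_stat, p_value = stats.ttest_rel(improve_sst, greiner_sst)

diffs = improve_sst - greiner_sst
bias = diffs.mean()                                  # Bland-Altman mean difference
loa = (bias - 1.96 * diffs.std(ddof=1),              # 95% limits of agreement
       bias + 1.96 * diffs.std(ddof=1))

print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
print(f"bias = {bias:.3f}, limits of agreement = ({loa[0]:.3f}, {loa[1]:.3f})")
```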

Keywords: blood collection, Guanzhou Improve, SST, vacuum tube

Procedia PDF Downloads 217
24787 A Critical Analysis on Gaps Associated with Culture Policy Milieu Governing Traditional Male Circumcision in the Eastern Cape, South Africa

Authors: Thanduxolo Nomngcoyiya, Simon M. Kang’ethe

Abstract:

The paper aimed to critically analyse gaps pertaining to the cultural policy environments governing traditional male circumcision in the Eastern Cape, as exemplified by an empirical case study. The original study from which this paper is derived used a qualitative paradigm and encompassed 28 participants. It used in-depth one-on-one interviews, complemented by focus group discussions and key informants, as methods of data collection, and adopted an interview guide as the data collection instrument. The original study was cross-sectional in nature, and the data were audio recorded and later transcribed during the data analysis and coding process. The data were analysed using thematic content analysis, which identified the following key findings on male circumcision culture policy: lack of clarity on how the male circumcision culture policy operates; myths surrounding the procedures of the male circumcision culture; divergent views on cultural policies between government and male circumcision custodians; unclear cultural policies on the selection criteria for practitioners; and lack of policy enforcement and implementation against transgressors of the male circumcision culture. The study recommended stringent selection criteria for practitioners, a need to carry out death-free male circumcision, and a need for male circumcision stakeholders to work with other culture- and tradition-friendly stakeholders.

Keywords: human rights, policy enforcement, traditional male circumcision, traditional surgeons and nurses

Procedia PDF Downloads 277
24786 Preparedness for Microbial Forensics Evidence Collection on Best Practice

Authors: Victor Ananth Paramananth, Rashid Muniginin, Mahaya Abd Rahman, Siti Afifah Ismail

Abstract:

Safety issues, scene protection, and appropriate evidence collection must be handled at any bio crime scene. In any bio-incident or bio crime event, one or multiple scenes will need to be cordoned off for investigation. Evidence collection is critical in determining the type of microbe or toxin, its lethality, and its source. As a consequence, a proper sampling method is required from the start of the investigation. The most significant challenges for the crime scene officer are deciding where to obtain samples, the best sampling method, and the sample sizes needed. Since evidence at a crime scene may be in liquid, viscous, or powder form, crime scene officers have difficulty determining which tools to use for sampling. To maximize sample collection, tools appropriate to the sampling method are necessary. This study aims to assist the crime scene officer in collecting liquid, viscous, and powder biological samples in sufficient quantity while preserving sample quality. In this research, observational tests of sample collection for liquid, viscous, and powder samples, assessing adequate quantity and sample quality, were performed using UV light. The density of the light emission varies with the collection method and sample type. The best tools for collecting sufficient amounts of liquid, viscous, and powdered samples can be identified by observing the UV light. Instead of active microorganisms, an invisible powder is used to assess whether sufficient sample is collected during a crime scene investigation using various collection tools. The liquid, powdered, and viscous samples collected using different tools were analyzed using Fourier transform infrared - attenuated total reflection (FTIR-ATR) spectroscopy. FTIR spectroscopy is commonly used for rapid discrimination, classification, and identification of intact microbial cells. The liquid, viscous, and powdered samples collected using various tools were successfully observed using UV light. Furthermore, FTIR-ATR analysis showed that the collected samples are sufficient in quantity while preserving their quality.

Keywords: biological sample, crime scene, collection tool, UV light, forensic

Procedia PDF Downloads 171
24785 Kuwait Environmental Remediation Program: Waste Management Data Analytics for Planning and Optimization of Waste Collection

Authors: Aisha Al-Baroud

Abstract:

The United Nations Compensation Commission (UNCC), the Kuwait National Focal Point (KNFP), and the Kuwait Oil Company (KOC) cooperated in a joint project to undertake comprehensive and collaborative efforts to remediate 26 million m³ of crude-oil-contaminated soil resulting from the Gulf War in 1990/1991. These efforts are referred to as the Kuwait Environmental Remediation Program (KERP). KOC has developed a Total Remediation Solution (TRS) for KERP, which will guide the remediation projects; it comprises alternative remedial solutions with treatment techniques, including limited landfills for the disposal of non-treatable soil materials, and relies on treating certain ranges of Total Petroleum Hydrocarbon (TPH) contamination with the most appropriate remediation techniques. The KERP remediation projects will be implemented within KOC's oilfields in North and South East Kuwait. The objective of this remediation project is to clear land for field development and treat all the oil-contaminated features (dry oil lakes, wet oil lakes, and oil-contaminated piles) through the TRS plan, in order to optimize the treatment processes and minimize the volume of contaminated materials to be placed in landfills. The treatment strategy will comprise Excavation and Transportation (E&T) of oil-contaminated soils from contaminated land to remote treatment areas and the use of appropriate remediation technologies, or a combination of treatment technologies, to achieve the remediation target criteria (RTC). KOC has awarded five mega projects to achieve this and is currently in the execution phase. As part of the company's commitment to the environment and in fulfillment of the mandatory HSSEMS procedures, all remediation contractors need to report waste generation data from the various project activities on a monthly basis. Data on waste generation are collected in order to implement cost-efficient and sustainable waste management operations. Data analytics approaches can be built on top of these data to produce more detailed and timely waste generation information as a basis for waste management and collection. The results obtained highlight the potential of advanced data analytics approaches in producing more detailed waste generation information for the planning and optimization of waste collection and recycling.

Keywords: waste, technologies, KERP, data, soil

Procedia PDF Downloads 80
24784 Lagrangian Approach for Modeling Marine Litter Transport

Authors: Sarra Zaied, Arthur Bonpain, Pierre Yves Fravallo

Abstract:

The permanent supply of marine litter implies its accumulation in the oceans, which causes the presence of increasingly compact waste layers. Its spatiotemporal distribution is never homogeneous and depends mainly on the hydrodynamic characteristics of the environment and on the size and location of the waste. As part of optimizing the collection of marine plastic waste, it is important to measure and monitor its evolution over time. For this, many research studies have been dedicated to describing the waste's behavior in order to identify its accumulation in ocean areas. Several models have therefore been developed to understand the mechanisms that drive the accumulation and displacement of marine litter. These models are able to accurately simulate the drift of waste to study its behavior and stranding. However, these works aim to study waste behavior over a long period of time, not at the time of waste collection. This work investigates the transport of floating marine litter (FML) to provide basic information that can help in optimizing waste collection, by proposing a model for predicting its behavior during collection. The proposed study is based on a Lagrangian modeling approach that uses the main factors influencing the dynamics of the waste. The performance of the proposed method was assessed on real data collected from the Copernicus Marine Environment Monitoring Service (CMEMS). Evaluation results in the Java Sea (Indonesia) prove that the proposed model can effectively predict the position and velocity of marine waste during collection.
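
To illustrate what a single Lagrangian time step can look like (a sketch, not the proposed model), the snippet below advects particles with a constant current plus a simple wind-drift term and adds a random-walk displacement standing in for turbulent diffusion; the current, wind, windage factor, and diffusivity values are all illustrative assumptions.

```python
# Minimal sketch (not the proposed model): one Lagrangian advection-diffusion
# step for floating-litter particles. Forcing values are assumed, not CMEMS data.
import numpy as np

rng = np.random.default_rng(0)

def step(positions, current, wind, dt, kh, wind_factor=0.03):
    """Advance particle positions (metres, local plane) by one time step."""
    advection = (current + wind_factor * wind) * dt
    diffusion = rng.normal(0.0, np.sqrt(2.0 * kh * dt), size=positions.shape)
    return positions + advection + diffusion

positions = np.zeros((100, 2))                 # 100 particles released at the origin
current = np.array([0.20, 0.05])               # m/s, from an ocean model (assumed)
wind = np.array([5.0, -2.0])                   # m/s, with 3% windage (assumed)

for _ in range(24):                            # 24 steps of 1 hour
    positions = step(positions, current, wind, dt=3600.0, kh=1.0)

print("mean drift after 24 h (m):", positions.mean(axis=0).round(1))
```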

Keywords: floating marine litter, lagrangian transport, particle-tracking model, wastes drift

Procedia PDF Downloads 168
24783 GPU-Based Back-Projection of Synthetic Aperture Radar (SAR) Data onto 3D Reference Voxels

Authors: Joshua Buli, David Pietrowski, Samuel Britton

Abstract:

Processing SAR data usually requires constraints on extent in the Fourier domain as well as approximations and interpolations onto a planar surface to form an exploitable image. This results in a potential loss of data, requires several interpolative techniques, and restricts visualization to two-dimensional plane imagery. The data can be interpolated into a ground plane projection, with or without terrain as a component, all to better view SAR data in an image domain comparable to what a human would view and to ease interpretation. An alternate but computationally heavy method to make use of more of the data is the basis of this research. Pre-processing of the SAR data is completed first (matched filtering, motion compensation, etc.), the data is then range compressed, and lastly, the contribution from each pulse is determined for each specific point in space by searching the time history data for the reflectivity values for each pulse, summed over the entire collection. This results in a per-3D-point reflectivity using the entire collection domain. New advances in GPU processing have finally allowed this rapid projection of acquired SAR data onto any desired reference surface (called backprojection). Mathematically, the computations are fast and easy to implement, despite limitations in SAR phase history data size and 3D point cloud size. Backprojection processing algorithms are embarrassingly parallel since each 3D point in the scene has the same reflectivity calculation applied for all pulses, independent of all other 3D points and pulse data under consideration. Therefore, given the simplicity of the single backprojection calculation, the work can be spread across thousands of GPU threads, allowing for accurate reflectivity representation of a scene. Furthermore, because reflectivity values are associated with individual three-dimensional points, a plane is no longer the sole permissible mapping base; a digital elevation model or even a cloud of points (collected from any sensor capable of measuring ground topography) can be used as a basis for the backprojection technique. This technique minimizes any interpolations and modifications of the raw data, maintaining maximum data integrity. This innovative processing will allow SAR data to be rapidly brought into a common reference frame for immediate exploitation and data fusion with other three-dimensional data and representations.
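
A minimal sketch of the per-voxel accumulation (not the authors' processor) is shown below with NumPy; on a GPU, the body of the voxel loop is what each thread would execute independently. The geometry, carrier frequency, range sampling, and the random stand-in for range-compressed pulses are illustrative assumptions, and nearest-bin lookup replaces proper range interpolation for brevity.

```python
# Minimal sketch (not the authors' processor): far-field backprojection of
# range-compressed pulses onto 3D reference voxels. All values are assumed.
import numpy as np

c = 299_792_458.0                       # speed of light (m/s)
fc = 10e9                               # assumed carrier frequency (Hz)
range_bins = np.linspace(900.0, 1100.0, 512)        # range axis of compressed pulses

n_pulses, n_voxels = 64, 1000
platform_pos = np.stack([np.linspace(-50, 50, n_pulses),
                         np.full(n_pulses, 0.0),
                         np.full(n_pulses, 500.0)], axis=1)      # pulse positions
voxels = np.random.uniform([-20, 860, -5], [20, 900, 5], (n_voxels, 3))
pulses = (np.random.randn(n_pulses, range_bins.size)
          + 1j * np.random.randn(n_pulses, range_bins.size))     # stand-in data

image = np.zeros(n_voxels, dtype=complex)
for p in range(n_pulses):
    # distance from this pulse's platform position to every voxel
    r = np.linalg.norm(voxels - platform_pos[p], axis=1)
    # nearest range bin for each voxel (interpolation omitted for brevity)
    idx = np.clip(np.searchsorted(range_bins, r), 0, range_bins.size - 1)
    # phase correction for the two-way path, then accumulate the contribution
    image += pulses[p, idx] * np.exp(1j * 4.0 * np.pi * fc * r / c)

reflectivity = np.abs(image)            # per-voxel reflectivity magnitude
print(reflectivity[:5])
```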

Keywords: backprojection, data fusion, exploitation, three-dimensional, visualization

Procedia PDF Downloads 46
24782 Benefit of Waste Collection Route Optimisation

Authors: Bojana Tot, Goran Bošković, Goran Vujić

Abstract:

Route optimisation is the process of planning one or multiple routes with the purpose of minimizing overall costs while achieving the highest possible performance under a set of given constraints. It combines routing or route planning, which is the process of creating the most cost-effective route by minimizing the distance or travel time necessary to reach a set of planned stops, and route scheduling, which is the process of assigning an arrival and service time to each stop, with drivers being given shifts that adhere to their working hours. The objective of this paper is to present the benefits of implementing waste collection route optimisation and thereby achieve economic efficiency for public utility companies, better service for citizens, and positive environmental and health outcomes.
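
As a minimal sketch of route planning (not a full optimiser, and not the paper's method), the snippet below orders a set of collection stops with a nearest-neighbour heuristic and compares the resulting route length with the original visiting order; the depot and stop coordinates are illustrative assumptions.

```python
# Minimal sketch (not a full optimiser): nearest-neighbour ordering of waste
# collection stops versus the original order. Coordinates are illustrative.
import math

depot = (0.0, 0.0)
stops = [(2.0, 1.0), (0.5, 3.0), (4.0, 0.5), (1.0, 1.0), (3.0, 3.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(order):
    points = [depot] + [stops[i] for i in order] + [depot]
    return sum(dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def nearest_neighbour():
    remaining, order, current = set(range(len(stops))), [], depot
    while remaining:
        nxt = min(remaining, key=lambda i: dist(current, stops[i]))
        order.append(nxt)
        current = stops[nxt]
        remaining.remove(nxt)
    return order

baseline = list(range(len(stops)))
optimised = nearest_neighbour()
print(f"original order length  : {route_length(baseline):.2f}")
print(f"nearest-neighbour length: {route_length(optimised):.2f}")
```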

Keywords: waste management, environment, collection route optimisation, GIS

Procedia PDF Downloads 129
24781 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria

Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe

Abstract:

Data has gone from just rows and columns to being an infrastructure in itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools, one of which is the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data from different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. This paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal, a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to access datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools make the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, a wide cross-section of 8,192 users had been created, 2,262 datasets had been downloaded, and 817 maps had been created from the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit about new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, this paper reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
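
To illustrate how such a CKAN-backed portal can be queried programmatically (a sketch, not eHealth Africa's portal code), the snippet below calls the standard CKAN action API to list datasets matching a keyword; the portal URL is a placeholder assumption.

```python
# Minimal sketch (not the portal's code): searching a CKAN-backed data portal
# through the standard action API. The portal URL is a placeholder.
import requests

PORTAL_URL = "https://data.example.org"          # hypothetical CKAN instance

def search_datasets(query, rows=5):
    resp = requests.get(
        f"{PORTAL_URL}/api/3/action/package_search",
        params={"q": query, "rows": rows},
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()["result"]
    return [(pkg["name"], pkg.get("title", "")) for pkg in result["results"]]

for name, title in search_datasets("settlements"):
    print(name, "-", title)
```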

Keywords: data portal, data infrastructure, open source, sustainability

Procedia PDF Downloads 66
24780 Geo-Collaboration Model between a City and Its Inhabitants to Develop Complementary Solutions for Better Household Waste Collection

Authors: Abdessalam Hijab, Hafida Boulekbache, Eric Henry

Abstract:

According to several research studies, the city as a whole is a complex, spatially organized system; its modeling must take into account several factors, socio-economic, political, and geographical, acting at multiple scales of observation and according to varied temporalities. Sustainable management and protection of the environment in this complex system require significant human and technical investment, particularly for monitoring and maintenance. The objective of this paper is to propose an intelligent approach based on the coupling of Geographic Information System (GIS) and Information and Communications Technology (ICT) tools in order to involve the inhabitants in the processes of sustainable management and protection of the urban environment, specifically in the processes of household waste collection in urban areas. We discuss a collaborative 'city/inhabitant' space. Indeed, it is a geo-collaborative approach, based on the spatialization and real-time geo-localization of topological and multimedia data captured by the 'active' inhabitant, in the form of geo-localized alerts related to household waste issues in their city. Our proposal provides a good understanding of the extent to which civil society (the inhabitants) can help and contribute to the development of complementary solutions for the collection of household waste and the protection of the urban environment. Moreover, it allows the inhabitants to contribute to the enrichment of a data bank for future uses. Our geo-collaborative model will be tested in the Lamkansa sampling district of the city of Casablanca, Morocco.

Keywords: geographic information system, GIS, information and communications technology, ICT, geo-collaboration, inhabitants, city

Procedia PDF Downloads 87