Search results for: missing data imputation
25175 Transport Emission Inventories and Medical Exposure Modeling: A Missing Link for Urban Health
Authors: Frederik Schulte, Stefan Voß
Abstract:
The adverse effects of air pollution on public health are an increasingly vital problem in planning for urban regions in many parts of the world. The issue is addressed from various angles and by distinct disciplines in research. Epidemiological studies model the relative increase of numerous diseases in response to an increment of different forms of air pollution. A significant share of air pollution in urban regions is related to transport emissions that are often measured and stored in emission inventories. However, most approaches in transport planning, engineering, and operational design of transport activities are restricted to general emission limits for specific air pollutants and do not consider more nuanced exposure models. We conduct an extensive literature review on exposure models and emission inventories used to study the health impact of transport emissions. Furthermore, we review methods applied in both domains and use emission inventory data of transportation hubs such as ports, airports, and urban traffic for an in-depth analysis of public health impacts deploying medical exposure models. The results reveal specific urban health risks related to transport emissions and may improve urban planning for environmental health by providing insights into actual health effects instead of referring only to general emission limits. Keywords: emission inventories, exposure models, transport emissions, urban health
Procedia PDF Downloads 388
25174 The Role of Digital Technology in Crime Prevention: A Case Study of Cellular Forensics Unit, Capital City Police Peshawar
Authors: Muhammad Ashfaq
Abstract:
Main theme: The prime focus of this study is on the role of digital technology in crime prevention, with special focus on the Cellular Forensic Unit, Capital City Police Peshawar-Khyber Pakhtunkhwa-Pakistan. Objective(s) of the study: The prime objective of this study is to provide statistics, strategies, and patterns of analysis used for crime prevention in the Cellular Forensic Unit of Capital City Police Peshawar, Khyber Pakhtunkhwa-Pakistan. Research Method and Procedure: A qualitative research method was used in the study to obtain secondary data from the research wing and Information Technology (IT) section of Peshawar police. Content analysis was the method used for conducting the study. This study is delimited to the Capital City Police and the Cellular Forensic Unit Peshawar-KP, Pakistan. Major finding(s): It is evident that the old traditional approach will never provide solutions for better management in controlling crimes. The best way to control crime and promote proactive policing is to adopt new technologies. The study reveals that technology has made the police more effective and vigilant compared to traditional policing. Heinous crimes like abduction, missing persons, snatching, burglaries, and blind murder cases are now traceable with the help of technology. Recommendation(s): From the analysis of the data, it is reflected that Information Technology (IT) experts should be recruited along with research analysts to assist and facilitate operational as well as investigation units of police in a timely manner. A mobile locator should be provided to the Cellular Forensic Unit to apprehend criminals in time. The latest digital analysis software should be provided to equip the Cellular Forensic Unit. Keywords: criminology-pakistan, crime prevention-KP, digital forensics, digital technology-pakistan
Procedia PDF Downloads 97
25173 Times2D: A Time-Frequency Method for Time Series Forecasting
Authors: Reza Nematirad, Anil Pahwa, Balasubramaniam Natarajan
Abstract:
Time series data consist of successive data points collected over a period of time. Accurate prediction of future values is essential for informed decision-making in several real-world applications, including electricity load demand forecasting, lifetime estimation of industrial machinery, traffic planning, weather prediction, and the stock market. Due to their critical relevance and wide application, there has been considerable interest in time series forecasting in recent years. However, the proliferation of sensors and IoT devices, real-time monitoring systems, and high-frequency trading data introduce significant intricate temporal variations, rapid changes, noise, and non-linearities, making time series forecasting more challenging. Classical methods such as Autoregressive integrated moving average (ARIMA) and Exponential Smoothing aim to extract pre-defined temporal variations, such as trends and seasonality. While these methods are effective for capturing well-defined seasonal patterns and trends, they often struggle with more complex, non-linear patterns present in real-world time series data. In recent years, deep learning has made significant contributions to time series forecasting. Recurrent Neural Networks (RNNs) and their variants, such as Long short-term memory (LSTMs) and Gated Recurrent Units (GRUs), have been widely adopted for modeling sequential data. However, they often suffer from the locality, making it difficult to capture local trends and rapid fluctuations. Convolutional Neural Networks (CNNs), particularly Temporal Convolutional Networks (TCNs), leverage convolutional layers to capture temporal dependencies by applying convolutional filters along the temporal dimension. Despite their advantages, TCNs struggle with capturing relationships between distant time points due to the locality of one-dimensional convolution kernels. Transformers have revolutionized time series forecasting with their powerful attention mechanisms, effectively capturing long-term dependencies and relationships between distant time points. However, the attention mechanism may struggle to discern dependencies directly from scattered time points due to intricate temporal patterns. Lastly, Multi-Layer Perceptrons (MLPs) have also been employed, with models like N-BEATS and LightTS demonstrating success. Despite this, MLPs often face high volatility and computational complexity challenges in long-horizon forecasting. To address intricate temporal variations in time series data, this study introduces Times2D, a novel framework that parallelly integrates 2D spectrogram and derivative heatmap techniques. The spectrogram focuses on the frequency domain, capturing periodicity, while the derivative patterns emphasize the time domain, highlighting sharp fluctuations and turning points. This 2D transformation enables the utilization of powerful computer vision techniques to capture various intricate temporal variations. To evaluate the performance of Times2D, extensive experiments were conducted on standard time series datasets and compared with various state-of-the-art algorithms, including DLinear (2023), TimesNet (2023), Non-stationary Transformer (2022), PatchTST (2023), N-HiTS (2023), Crossformer (2023), MICN (2023), LightTS (2022), FEDformer (2022), FiLM (2022), SCINet (2022a), Autoformer (2021), and Informer (2021) under the same modeling conditions. The initial results demonstrated that Times2D achieves consistent state-of-the-art performance in both short-term and long-term forecasting tasks. 
Furthermore, the generality of the Times2D framework allows it to be applied to various tasks such as time series imputation, clustering, classification, and anomaly detection, offering potential benefits in any domain that involves sequential data analysis.Keywords: derivative patterns, spectrogram, time series forecasting, times2D, 2D representation
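As a rough, illustrative sketch of the two parallel 2D views described above (a frequency-domain spectrogram and a time-domain derivative heatmap), the following Python snippet uses SciPy and NumPy on a synthetic hourly series; the window length, reshaping, and data are placeholder assumptions, not the authors' actual Times2D configuration.

```python
import numpy as np
from scipy.signal import spectrogram

# Synthetic hourly series with trend, daily seasonality and noise (placeholder data).
t = np.arange(24 * 365)
x = 0.01 * t + 10 * np.sin(2 * np.pi * t / 24) + np.random.normal(0, 1, t.size)

# Frequency-domain view: a 2D spectrogram that exposes periodic structure.
freqs, times, Sxx = spectrogram(x, fs=1.0, nperseg=168)  # weekly windows (assumed)

# Time-domain view: a "derivative heatmap" from first and second differences,
# reshaped to (channel, day, hour) to highlight sharp fluctuations and turning points.
d1 = np.gradient(x)
d2 = np.gradient(d1)
derivative_heatmap = np.stack([d1, d2]).reshape(2, 365, 24)

print(Sxx.shape, derivative_heatmap.shape)
```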
Procedia PDF Downloads 42
25172 Products in Early Development Phases: Ecological Classification and Evaluation Using an Interval Arithmetic Based Calculation Approach
Authors: Helen L. Hein, Joachim Schwarte
Abstract:
As a pillar of sustainable development, ecology has become an important milestone in the research community, especially due to global challenges like climate change. The ecological performance of products can be scientifically assessed with life cycle assessments. In the construction sector, significant amounts of CO2 emissions are assigned to the energy used for building heating purposes. Sustainable construction materials for insulating purposes are therefore essential; aerogels in particular have been explored intensively in recent years due to their low thermal conductivity. The WALL-ACE project thus aims to develop an aerogel-based thermal insulating plaster that would achieve particularly low thermal conductivities. However, as a lot of information is still missing or not yet accessible in the early development phases, the ecological performance of innovative products is increasingly based on uncertain data that can lead to significant deviations in the results. To predict realistically how meaningful the results are and how viable the developed products may be with regard to their respective markets, these deviations have to be considered. Therefore, a classification method is presented in this study, which may allow comparing the ecological performance of modern products with already established and competitive materials. In order to achieve this, an alternative calculation method was used that allows computing with lower and upper bounds to consider all possible values without precise data. The life cycle analysis of the considered products was conducted with an interval arithmetic based calculation method. The results lead to the conclusion that the interval solutions describing the possible environmental impacts are so wide that their usability is limited. Nevertheless, further optimization in reducing the environmental impacts of aerogels seems to be needed for them to become more competitive in the future. Keywords: aerogel-based, insulating material, early development phase, interval arithmetic
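The interval-based idea above can be illustrated with a minimal Python sketch that carries each uncertain inventory value as a (lower, upper) bound and propagates the bounds through an LCA-style impact calculation; the material quantities and emission factors below are invented placeholders, not WALL-ACE data.

```python
# Minimal interval-arithmetic sketch for an LCA-style impact calculation.
# Each uncertain value is carried as a (lower, upper) bound.

def i_mul(a, b):
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

def i_add(a, b):
    return (a[0] + b[0], a[1] + b[1])

# kg of material per m2 of plaster (bounds) and kg CO2-eq per kg of material (bounds);
# all figures are illustrative assumptions.
inventory = {
    "aerogel granulate": ((2.0, 3.5), (3.0, 9.0)),
    "binder":            ((6.0, 8.0), (0.6, 1.1)),
}

total = (0.0, 0.0)
for name, (amount, factor) in inventory.items():
    total = i_add(total, i_mul(amount, factor))

print(f"GWP per m2: {total[0]:.1f} .. {total[1]:.1f} kg CO2-eq")
```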
Procedia PDF Downloads 140
25171 Efficacy and Safety of Electrical Vestibular Stimulation on Adults with Symptoms of Insomnia: A Double-Blind, Randomized, Sham-Controlled Trial
Authors: Teris Cheung, Joyce Yuen Ting Lam, Kwan Hin Fong, Calvin Pak-Wing Cheng, Julie Sittlington, Yu-Tao Xiang, Tim Man Ho Li
Abstract:
Insomnia is one of the most common health problems in the general population. Insomnia can be acute or intermittent and can become chronic, often due to comorbidity with other physical and mental health conditions. Although there are conventional pharmaceutical and psychotherapeutic treatments for symptoms of insomnia, there has been no robust and novel randomized controlled trial (RCT) using transdermal neurostimulation in individuals with insomnia symptoms. This gives us the impetus to execute the first nationwide RCT. Aim: To evaluate the efficacy of Electrical Vestibular Stimulation (VeNS) on individuals with insomnia in Hong Kong. Design: This study was a two-armed, double-blinded, randomized, sham-controlled trial. Sampling: 60 community-dwelling adults aged between 18 and 60 years with moderate insomnia symptoms or above (Insomnia Severity Index > 14) were recruited. All subjects were randomized by computer into either the active VeNS group or the sham VeNS group in a 1:1 ratio. Intervention: All participants received a home-use VeNS device and completed 30-min VeNS sessions on five consecutive days each week across a 4-week period (total treatment hours: 10). Baseline measurements and post-VeNS evaluations of the psychological outcomes, including 1) insomnia severity, 2) sleep quality, and 3) quality of life, were investigated. The short- and long-term sustainability of the VeNS intervention was assessed immediately post-stimulation and at 1-month and 3-month follow-ups. Data analysis: A mixed GEE model was used to analyze the repeated measures data. Missing data were managed by multiple imputation. The level of significance was set to p < 0.05. Significance of the study: This is the first trial to examine the efficacy and safety of VeNS among adults with insomnia symptoms in Hong Kong. The findings that emerged were used to determine whether this VeNS device can be considered a self-help technological device to reduce the severity of insomnia in the community setting and to reduce the global disease burden. Clinical Trial Registration: ClinicalTrials.gov, identifier: NCT04452981. Keywords: adults, insomnia, neuromodulation, rct, vestibular stimulation
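A minimal sketch (not the trial's actual analysis code) of the stated strategy — multiple imputation of missing outcomes followed by a GEE model on repeated measures — in Python with scikit-learn and statsmodels; the file and column names (subject, group, visit, isi) are hypothetical, and visit is assumed to be numerically coded.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Long-format data: one row per subject per visit (placeholder file and columns).
df = pd.read_csv("vens_trial_long.csv")

# Simple multiple imputation: impute the outcome several times and pool the estimates.
estimates = []
for seed in range(5):
    imp = IterativeImputer(random_state=seed)
    filled = df.copy()
    filled[["isi"]] = imp.fit_transform(df[["isi", "visit"]])[:, [0]]
    model = smf.gee("isi ~ group * visit", groups="subject", data=filled,
                    cov_struct=sm.cov_struct.Exchangeable(),
                    family=sm.families.Gaussian()).fit()
    estimates.append(model.params)

pooled = pd.concat(estimates, axis=1).mean(axis=1)  # point estimates pooled by averaging
print(pooled)
```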
Procedia PDF Downloads 82
25170 The Role of Digital Technology in Crime Prevention: A Case Study of Cellular Forensics Unit, Capital City Police Peshawar-Pakistan
Authors: Muhammad Ashfaq
Abstract:
Main theme: The prime focus of this study is on the role of digital technology in crime prevention, with special focus on the Cellular Forensic Unit, Capital City Police Peshawar-Khyber Pakhtunkhwa-Pakistan. Objective(s) of the study: The prime objective of this study is to provide statistics, strategies, and patterns of analysis used for crime prevention in the Cellular Forensic Unit of Capital City Police Peshawar, Khyber Pakhtunkhwa-Pakistan. Research Method and Procedure: A qualitative research method was used in the study to obtain secondary data from the research wing and Information Technology (IT) section of Peshawar police. Content analysis was the method used for conducting the study. This study is delimited to the Capital City Police and the Cellular Forensic Unit Peshawar-KP, Pakistan. Major finding(s): It is evident that the old traditional approach will never provide solutions for better management in controlling crimes. The best way to control crime and promote proactive policing is to adopt new technologies. The study reveals that technology has made the police more effective and vigilant compared to traditional policing. Heinous crimes like abduction, missing persons, snatching, burglaries, and blind murder cases are now traceable with the help of technology. Recommendation(s): From the analysis of the data, it is reflected that Information Technology (IT) experts should be recruited along with research analysts to assist and facilitate operational as well as investigation units of police in a timely manner. A mobile locator should be provided to the Cellular Forensic Unit to apprehend criminals in time. The latest digital analysis software should be provided to equip the Cellular Forensic Unit. Keywords: crime-prevention, cellular-forensic unit-pakistan, crime prevention-digital-pakistan, criminology-pakistan
Procedia PDF Downloads 82
25169 On the Optimization of a Decentralized Photovoltaic System
Authors: Zaouche Khelil, Talha Abdelaziz, Berkouk El Madjid
Abstract:
In this paper, we present a grid-tied photovoltaic system. The studied topology is structured around a seven-level inverter supplying a non-linear load. A three-stage step-up DC/DC converter ensures DC-link balancing. The presented system allows the extraction of all the available photovoltaic power. This extracted energy feeds the local load; the surplus energy is injected into the electrical network. During poor weather conditions, when the photovoltaic panels cannot meet the energy needs of the load, the missing power is supplied by the electrical network. At the point of common connection, the network current shows excellent spectral performance. Keywords: seven-level inverter, multi-level DC/DC converter, photovoltaic, non-linear load
Procedia PDF Downloads 192
25168 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series
Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold
Abstract:
To address the global challenges of climate and environmental change, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys, and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps in CO₂ flux, avoiding the limitations of using a single algorithm and thereby reducing the errors and uncertainties associated with the gap-filling process. In this study, data from five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and over XGB when each of these two methods was used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 provided by the best FFNN). The most significant improvement occurred in the estimation of extreme diurnal values (around midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity compared to Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. In addition, the performance differences between the ensemble model and its individual components were more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) compared to the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher photosynthetic activity of plants, which leads to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and the robustness of the model. Therefore, ensemble machine learning models are potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement using a single algorithm. Keywords: carbon flux, Eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network
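A simplified Python sketch of the two-layer stacking idea described above — five feedforward networks whose predictions feed an XGBoost meta-learner — using scikit-learn and xgboost; the driver column names, file name, and network layouts are assumptions, not the study's OzFlux configuration.

```python
import numpy as np
import pandas as pd
import xgboost as xgb
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Assumed driver columns (radiation, air temperature, VPD, soil water content)
# and the CO2 flux target; the real OzFlux variable names will differ.
df = pd.read_csv("ozflux_tower_2013.csv").dropna(subset=["co2_flux"])
X, y = df[["rad", "tair", "vpd", "swc"]].values, df["co2_flux"].values
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# First layer: five feedforward networks with different structures.
layouts = [(32,), (64,), (32, 16), (64, 32), (128, 64)]
ffnns = [MLPRegressor(hidden_layer_sizes=h, max_iter=2000, random_state=i).fit(X_tr, y_tr)
         for i, h in enumerate(layouts)]

# Second layer: XGBoost takes the five FFNN predictions as its inputs.
meta_tr = np.column_stack([m.predict(X_tr) for m in ffnns])
meta_te = np.column_stack([m.predict(X_te) for m in ffnns])
booster = xgb.XGBRegressor(n_estimators=300, learning_rate=0.05).fit(meta_tr, y_tr)

rmse = np.sqrt(np.mean((booster.predict(meta_te) - y_te) ** 2))
print(f"Ensemble RMSE on held-out data: {rmse:.2f}")
```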
Procedia PDF Downloads 139
25167 Implementation of an IoT Sensor Data Collection and Analysis Library
Authors: Jihyun Song, Kyeongjoo Kim, Minsoo Lee
Abstract:
Due to the development of information technology and wireless Internet technology, various data are being generated in many fields. These data are advantageous in that they provide real-time information to the users themselves. However, when the data are accumulated and analyzed, more diverse information can be extracted. In addition, the development and dissemination of boards such as the Arduino and Raspberry Pi have made it possible to easily test various sensors, and it is possible to collect sensor data directly by using database application tools such as MySQL. These directly collected data can be used for various research purposes and can be useful as data for data mining. However, there are many difficulties in using such boards to collect data, especially when the user is not a computer programmer or is using them for the first time. Even if data are collected, a lack of expert knowledge or experience may cause difficulties in data analysis and visualization. In this paper, we aim to construct a library for sensor data collection and analysis to overcome these problems. Keywords: clustering, data mining, DBSCAN, k-means, k-medoids, sensor data
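As an illustration of the analysis side of such a library, the short Python sketch below applies two of the clustering methods named in the keywords to collected sensor readings; the CSV export and column names are assumptions.

```python
import pandas as pd
from sklearn.cluster import KMeans, DBSCAN
from sklearn.preprocessing import StandardScaler

# Sensor readings previously collected into MySQL could be exported to CSV;
# the column names (timestamp, temperature, humidity) are assumptions.
readings = pd.read_csv("sensor_readings.csv", parse_dates=["timestamp"])
features = StandardScaler().fit_transform(readings[["temperature", "humidity"]])

# Two of the clustering methods listed in the keywords.
readings["kmeans_label"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
readings["dbscan_label"] = DBSCAN(eps=0.5, min_samples=10).fit_predict(features)

print(readings.groupby("kmeans_label")[["temperature", "humidity"]].mean())
```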
Procedia PDF Downloads 378
25166 Government (Big) Data Ecosystem: Definition, Classification of Actors, and Their Roles
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Organizations, including governments, generate (big) data that are high in volume, velocity, and veracity, and come from a variety of sources. Public administrations are using (big) data, implementing base registries, and enforcing data sharing within the entire government to deliver (big) data-related integrated services, provide insights to users, and support good governance. Government (big) data ecosystem actors represent distinct entities that provide data, consume data, manipulate data to offer paid services, and extend data services such as data storage and hosting to other actors. In this research work, we perform a systematic literature review. The key objectives of this paper are to propose a robust definition of the government (big) data ecosystem and a classification of government (big) data ecosystem actors and their roles. We showcase a graphical view of actors, roles, and their relationships in the government (big) data ecosystem. We also discuss our research findings. We did not find many published research articles about the government (big) data ecosystem, including its definition and the classification of actors and their roles. Therefore, we drew ideas for the government (big) data ecosystem from numerous areas in the literature, including scientific research data, humanitarian data, open government data, and industry data. Keywords: big data, big data ecosystem, classification of big data actors, big data actors roles, definition of government (big) data ecosystem, data-driven government, eGovernment, gaps in data ecosystems, government (big) data, public administration, systematic literature review
Procedia PDF Downloads 162
25165 Government Big Data Ecosystem: A Systematic Literature Review
Authors: Syed Iftikhar Hussain Shah, Vasilis Peristeras, Ioannis Magnisalis
Abstract:
Data that are high in volume, velocity, and veracity and come from a variety of sources are usually generated in all sectors, including the government sector. Globally, public administrations are pursuing (big) data as a new technology and trying to adopt a data-centric architecture for hosting and sharing data. Properly executed, big data and data analytics in the government (big) data ecosystem can lead to data-driven government and have a direct impact on the way policymakers work and citizens interact with governments. In this research paper, we conduct a systematic literature review. The main aims of this paper are to highlight essential aspects of the government (big) data ecosystem and to explore the most critical socio-technical factors that contribute to the successful implementation of a government (big) data ecosystem. The essential aspects of the government (big) data ecosystem include its definition, data types, data lifecycle models, and actors and their roles. We also discuss the potential impact of (big) data in public administration and gaps in the government data ecosystems literature. As this is a new topic, we did not find specific articles on the government (big) data ecosystem and therefore focused our research on various relevant areas like humanitarian data, open government data, scientific research data, industry data, etc. Keywords: applications of big data, big data, big data types, big data ecosystem, critical success factors, data-driven government, egovernment, gaps in data ecosystems, government (big) data, literature review, public administration, systematic review
Procedia PDF Downloads 228
25164 Everyday-Life Vocabulary: A Missing Component in Iranian EFL Context
Authors: Yasser Aminifard, Hamdollah Askari
Abstract:
This study aimed at investigating any difference between Iranian senior high school students' performance on Academic Words (AWs) and Everyday-Life Words (ELWs). To this end, in the first phase, a number of 120 male senior high school students were randomly selected from among twelve high schools in Gachsaran to serve as the participants of the study. In the second phase, using purposive sampling, six high school teachers holding an MA in TEFL and with over twenty years of teaching experience were interviewed. Two multiple-choice tests, each comprising 40 items, were given to the participants in order to determine their performance on AWs and ELWs and follow-up semi-structured interviews were conducted to explore teachers' opinions about participants' performance on the two tests. To analyze the data, a paired-samples t-test was carried out to compare the results of both tests and the interviews were also transcribed to pinpoint important themes. The results of the t-test indicated that the participants performed significantly better on AWs than on ELWs. Additionally, results of the interviews boiled down to the fact that the English textbooks designed for Iranian high school students are fundamentally flawed on the grounds that there is a mismatch between students' real language learning needs and what is presented to them as "teaching-to-the-test" materials via these books. Finally, the implications and suggestions for further research are discussed.Keywords: everyday-life words, academic words, textbooks, washback
Procedia PDF Downloads 456
25163 A Machine Learning Decision Support Framework for Industrial Engineering Purposes
Authors: Anli Du Preez, James Bekker
Abstract:
Data is currently one of the most critical and influential emerging technologies. However, the true potential of data is yet to be exploited since, currently, about 1% of generated data are ever actually analyzed for value creation. There is a data gap where data is not explored due to the lack of data analytics infrastructure and the required data analytics skills. This study developed a decision support framework for data analytics by following Jabareen’s framework development methodology. The study focused on machine learning algorithms, which is a subset of data analytics. The developed framework is designed to assist data analysts with little experience, in choosing the appropriate machine learning algorithm given the purpose of their application.Keywords: Data analytics, Industrial engineering, Machine learning, Value creation
Procedia PDF Downloads 168
25162 Providing Security to Private Cloud Using Advanced Encryption Standard Algorithm
Authors: Annapureddy Srikant Reddy, Atthanti Mahendra, Samala Chinni Krishna, N. Neelima
Abstract:
In our present world, we generate a lot of data and need a specific device to store all these data. Generally, we store data in pen drives, hard drives, etc. Sometimes we may lose the data due to the corruption of devices. To overcome all these issues, we implemented a cloud space for storing the data, and it provides more security to the data. We can access the data from anywhere in the world by just using the internet. We implemented all of this in Java using the NetBeans IDE. Once a user uploads the data, they do not have any rights to change the data. Users' uploaded files are stored in the cloud with the system time as the file name, and the directory is created with some random words. The cloud accepts the data only if the size of the file is less than 2 MB. Keywords: cloud space, AES, FTP, NetBeans IDE
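The system described is implemented in Java with the NetBeans IDE; purely for illustration, the Python sketch below shows an AES encryption step (CBC mode assumed, since the abstract does not specify one) together with the 2 MB size check and time-stamped file naming, using the cryptography package.

```python
import os
import time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives import padding

MAX_SIZE = 2 * 1024 * 1024  # the 2 MB upload limit mentioned in the abstract

def encrypt_for_upload(path: str, key: bytes):
    data = open(path, "rb").read()
    if len(data) > MAX_SIZE:
        raise ValueError("file larger than 2 MB is rejected")

    iv = os.urandom(16)
    padder = padding.PKCS7(128).padder()
    padded = padder.update(data) + padder.finalize()
    encryptor = Cipher(algorithms.AES(key), modes.CBC(iv)).encryptor()
    ciphertext = iv + encryptor.update(padded) + encryptor.finalize()

    # Stored object named after the system time, as described in the abstract.
    stored_name = str(int(time.time()))
    return stored_name, ciphertext

name, blob = encrypt_for_upload("report.pdf", os.urandom(32))  # 256-bit key (assumed)
print(name, len(blob))
```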
Procedia PDF Downloads 206
25161 Evaluation of the Impact of Neuropathic Pain on the Quality of Life of Patients
Authors: A. Ibovi Mouondayi, S. Zaher, R. Assadi, K. Erraoui, S. Sboul, J. Daoudim, S. Bousselham, K. Nassar, S. Janani
Abstract:
Introduction: Neuropathic pain (NP) is chronic pain; it can be observed in a large number of clinical situations. This pain results from a lesion of the peripheral or central nervous system. It is a frequent reason for consultations in rheumatology. This pain being chronic, can become disabling for the patient, thereby altering his quality of life. Objective: The objective of this study was to evaluate the impact of neuropathic pain on the quality of life of patients followed-up for chronic neuropathic pain. Material and Method: This is a monocentric, cross-sectional, descriptive, retrospective study conducted in our department over a period of 19 months from October 2020 to April 2022. The missing parameters were collected during phone calls of the patients concerned. The diagnostic tool adopted was the DN4 questionnaire in the dialectal Arabic version. The impact of NP was assessed by the visual analog scale (VAS) on pain, sleep, and function. The impact of PN on mood was assessed by the hospital anxiety, and depression scale (HAD) score in the validated Arabic version. The exclusion criteria were patients followed up for depression and other psychiatric pathologies. Results: A total of 1528 patient data were collected; the average age of the patients was 57 years (standard deviation: 13 years) with extremes ranging from 17 years to 94 years, 91% were women and 9% men with a sex ratio man/woman equal to 0.10. 67% of our patients were married, and 63% of our patients were housewives. 43% of patients were followed-up for degenerative pathology. The NP was cervical radiculopathy in 26%, lumbosacral radiculopathy in 51%, and carpal tunnel syndrome in 20%. 23% of our patients had poor sleep quality, and 54% had average sleep quality. The pain was very intense in 5% of patients; 33% had severe pain, and 58% had moderate pain. The function was limited in 55% of patients. The average HAD score for anxiety and depression was 4.39 (standard deviation: 2.77) and 3.21 (standard deviation: 2.89), respectively. Conclusion: Our data clearly illustrate that neuropathic pain has a negative impact on the quality of sleep and function, as well as the mood of patients, thus influencing their quality of life.Keywords: neuropathic pain, sleep, quality of life, chronic pain
Procedia PDF Downloads 131
25160 A Fast and Robust Protocol for Reconstruction and Re-Enactment of Historical Sites
Authors: Sanaa I. Abu Alasal, Madleen M. Esbeih, Eman R. Fayyad, Rami S. Gharaibeh, Mostafa Z. Ali, Ahmed A. Freewan, Monther M. Jamhawi
Abstract:
This research proposes a novel reconstruction protocol for restoring missing surfaces and low-quality edges and shapes in photos of artifacts at historical sites. The protocol starts with the extraction of a cloud of points. This extraction process is based on four subordinate algorithms, which differ in their robustness and in the amount of output they produce. Moreover, they apply different, but complementary, levels of accuracy to related features and to the way they build a quality mesh. The performance of our proposed protocol is compared with other state-of-the-art algorithms and toolkits. The statistical analysis shows that our algorithm significantly outperforms its rivals in the resultant quality of the object files used to reconstruct the desired model. Keywords: meshes, point clouds, surface reconstruction protocols, 3D reconstruction
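The protocol's four algorithms are not spelled out in the abstract; as a generic stand-in, the Python sketch below shows one common route from a point cloud to a reconstructed mesh (normal estimation followed by Poisson surface reconstruction) with Open3D, with file names and parameters assumed.

```python
import open3d as o3d

# Load an artifact scan exported as a point cloud (placeholder file name).
pcd = o3d.io.read_point_cloud("artifact_scan.ply")
pcd = pcd.voxel_down_sample(voxel_size=0.002)  # light cleanup (assumed voxel size)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.01, max_nn=30))

# Poisson reconstruction builds a watertight mesh that fills small missing surfaces.
mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("artifact_mesh.obj", mesh)
print(mesh)
```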
Procedia PDF Downloads 456
25159 Nimart-trained Nurses' Perspectives Regarding Virally Unsuppressed Children HIV-positive on Antiretroviral Therapy and Missing Scheduled Clinic Visits: Mopani District, Limpopo Province
Authors: Linneth Nkateko Mabila, Patrick Hulisani Demana, Tebogo Maria Mothiba
Abstract:
Background: Sustaining adherence to antiretroviral therapy (ART) over the long term by people, especially children living with Human-Immunodeficiency Virus (HIV), requires accurate and consistent monitoring, and this is a particular challenge for countries in sub-Saharan Africa. However, the regularity and punctuality in monthly antiretroviral treatment collections indicate medication adherence to a certain extent since it has been revealed to be a significant determinant of the outcome of ART. Aim: This study assessed and described the pattern of monthly antiretroviral treatment collections among a cohort of virally unsuppressed HIV-positive children initiated and managed on ART in the rural public clinics of Mopani District, Limpopo, and explored the nurses' perceptions and views of the findings. Methods: A facility-based mixed-methods study was conducted to assess the honoring of scheduled monthly treatment collection practices by a cohort of HIV-positive children under 15 years initiated and managed on ART by Nurse Initiated Management of Antiretroviral Treatment (NIMART)-trained professional nurses (PNs) from 01 January 2015 to 31 December 2015 in public PHC clinics of Mopani District Municipality. This was followed by the exploration of the nurses' perceptions and views regarding this issue to share their experiences and knowledge acquired through managing these children on ART. Results: From a total of 7105 analysable visits, only 44% (3134) were honored as scheduled, with 40% (2828) of children presenting to the clinics after the scheduled appointment date – they missed their appointments, and 11% (768) of treatment collections that took place before the scheduled appointment date. This finding was further confirmed by 90% (97) of the nurses, who reported that they have children who miss scheduled appointments in their public clinics. The primary reasons for children missing appointments were related to caregivers' forgetfulness and conflict between the school schedule and the dates of clinic visits. Conclusion: We confirmed a high prevalence of non-adherence to scheduled monthly ART collections and the existence of health system, social, and caregiver-related factors that threaten treatment adherence and proper clinical outcomes. These findings suggest an urgent need for intervention since non-adherence to ARV therapy can be life-threatening to the child and poses the danger of reduced life expectancy.Keywords: antiretroviral therapy (art), nimart, virally unsuppressed children, missed appointments
Procedia PDF Downloads 100
25158 Business Intelligence for Profiling of Telecommunication Customer
Authors: Rokhmatul Insani, Hira Laksmiwati Soemitro
Abstract:
Business intelligence is a methodology that systematically exploits data to produce information and knowledge; business intelligence can support the decision-making process. Two methods in business intelligence are data warehousing and data mining. A data warehouse can store historical data derived from transactional data. For data modelling in the data warehouse, we apply Kimball's dimensional modelling. Data mining is used to extract patterns from the data and gain insight from the data. Data mining has many techniques, one of which is segmentation. For profiling telecommunication customers, we use customer segmentation according to the customer's usage of services, customer invoices, and customer payments. Customers can be grouped according to their characteristics, and the profitable customers can be identified. We apply the K-Means clustering algorithm for segmentation. As the input variables for that algorithm, we use the RFM (Recency, Frequency, and Monetary) model. For all data mining processes, we use the IBM SPSS Modeler tool. Keywords: business intelligence, customer segmentation, data warehouse, data mining
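A minimal Python sketch of the segmentation step described above — building RFM variables from invoice records and clustering them with K-Means — using pandas and scikit-learn instead of IBM SPSS Modeler; the input file and column names are hypothetical.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Billing records with hypothetical columns: customer_id, invoice_date, amount.
tx = pd.read_csv("telco_invoices.csv", parse_dates=["invoice_date"])
snapshot = tx["invoice_date"].max()

rfm = tx.groupby("customer_id").agg(
    recency=("invoice_date", lambda d: (snapshot - d.max()).days),
    frequency=("invoice_date", "count"),
    monetary=("amount", "sum"),
)

# K-Means on standardized RFM variables, as described in the abstract.
scaled = StandardScaler().fit_transform(rfm)
rfm["segment"] = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scaled)

# Profile each segment, e.g. to spot the profitable customers.
print(rfm.groupby("segment").mean())
```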
Procedia PDF Downloads 483
25157 Trend of Foot and Mouth Disease and Adopted Control Measures in Limpopo Province during the Period 2014 to 2020
Authors: Temosho Promise Chuene, T. Chitura
Abstract:
Background: Foot and mouth disease is a real challenge in South Africa. The disease is a serious threat to the viability of livestock farming initiatives and affects local and international livestock trade. In Limpopo Province, the Kruger National Park and other game reserves are home to the African buffalo (Syncerus caffer), a notorious reservoir of the picornavirus, which causes foot and mouth disease. Out of the virus’s seven (7) distinct serotypes, Southern African Territories (SAT) 1, 2, and 3 are commonly endemic in South Africa. The broad objective of the study was to establish the trend of foot and mouth disease in Limpopo Province over a seven-year period (2014-2020), as well as the adoption and comprehensive reporting of the measures that are taken to contain disease outbreaks in the study area. Methods: The study used secondary data from the World Organization for Animal Health (WOAH) on reported cases of foot and mouth disease in South Africa. Descriptive analysis (frequencies and percentages) and Analysis of variance (ANOVA) were used to present and analyse the data. Result: The year 2020 had the highest prevalence of foot and mouth disease (3.72%), while 2016 had the lowest prevalence (0.05%). Serotype SAT 2 was the most endemic, followed by SAT 1. Findings from the study demonstrated the seasonal nature of foot and mouth disease in the study area, as most disease cases were reported in the summer seasons. Slaughter of diseased and at-risk animals was the only documented disease control strategy, and information was missing for some of the years. Conclusion: The study identified serious underreporting of the adopted control strategies following disease outbreaks. Adoption of comprehensive disease control strategies coupled with thorough reporting can help to reduce outbreaks of foot and mouth disease and prevent losses to the livestock farming sector of South Africa and Limpopo Province in particular.Keywords: livestock farming, African buffalo, prevalence, serotype, slaughter
Procedia PDF Downloads 64
25156 ISIS and Its Impact on Geographical Change in Iraq’s Population
Authors: Pshtiwan Shafiq Ahmed
Abstract:
The ISIS invasion was a turning point for Iraq, destroying the economic infrastructure of several important strategic and historic cities, including Mosul, Anbar, and Diyala, which will take decades to rebuild. It left 18,805 people dead and 37,000 injured, destroyed hundreds of villages and cities, displaced 2.3 million people, and increased the number of orphans. The number of widows also increased, and society and the structure of the population were damaged, so that the proportion of children, women, and the elderly has increased. Religious clashes have increased, and religious cleansing has begun in an attempt to eradicate Christianity, the Yazidis, and the Kakais from the whole of Iraq, causing the largest numbers of Christians, Yazidis, and Kakais to leave Iraq, while many of them went missing. Keywords: ISIS, population change, geographical change, Iraq
Procedia PDF Downloads 92
25155 Survival Analysis after a First Ischaemic Stroke Event: A Case-Control Study in the Adult Population of England.
Authors: Padma Chutoo, Elena Kulinskaya, Ilyas Bakbergenuly, Nicholas Steel, Dmitri Pchejetski
Abstract:
Stroke is associated with a significant risk of morbidity and mortality. There is a scarcity of research on long-term survival after first-ever ischaemic stroke (IS) events in England with regard to the effects of different medical therapies and comorbidities. The objective of this study was to model all-cause mortality after an IS diagnosis in the adult population of England. Using a retrospective case-control design, we extracted the electronic medical records of patients born in or before 1960 in England with a first-ever ischaemic stroke diagnosis from January 1986 to January 2017 within The Health Improvement Network (THIN) database. Participants with a history of ischaemic stroke were matched to three controls by sex, age at diagnosis, and general practice. The primary outcome was all-cause mortality. The hazards of all-cause mortality were estimated using a Weibull-Cox survival model which included both scale and shape effects and a shared random effect of general practice. The model included sex, birth cohort, socio-economic status, comorbidities, and medical therapies. 20,250 patients with a history of IS (cases) and 55,519 controls were followed up to 30 years. From 2008 to 2015, the one-year all-cause mortality for IS patients declined by an absolute 0.5%. Preventive treatments for cases increased considerably over time. These included prescriptions of statins and antihypertensives. However, prescriptions for antiplatelet drugs decreased in routine general practice from 2010 onwards. The survival model revealed a survival benefit of antiplatelet treatment for stroke survivors, with a hazard ratio (HR) of 0.92 (0.90-0.94). IS diagnosis had significant interactions with gender, age at entry, and hypertension diagnosis. IS diagnosis was associated with a high risk of all-cause mortality, with HR = 3.39 (3.05-3.72) for cases compared to controls. Hypertension was associated with poor survival, with HR = 4.79 (4.49-5.09) for hypertensive cases relative to non-hypertensive controls, though the detrimental effect of hypertension did not reach significance for hypertensive controls, HR = 1.19 (0.82-1.56). This study of English primary care data showed that between 2008 and 2015, the rates of prescriptions of stroke preventive treatments increased, and short-term all-cause mortality after IS declined. However, stroke resulted in poor long-term survival. Hypertension, a modifiable risk factor, was found to be associated with poor survival outcomes in IS patients. Antiplatelet drugs were found to be protective of survival. Better efforts are required to reduce the burden of stroke through health service development and primary prevention. Keywords: general practice, hazard ratio, The Health Improvement Network (THIN), ischaemic stroke, multiple imputation, Weibull-Cox model
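As a rough stand-in for the paper's Weibull-Cox model (which includes scale and shape effects and a shared random effect of general practice), the Python sketch below fits a plain Weibull accelerated-failure-time model with lifelines; the data file and column names are hypothetical, the covariates are assumed to be numerically coded, and the shared frailty term is omitted.

```python
import pandas as pd
from lifelines import WeibullAFTFitter

# Hypothetical analysis file: one row per participant with follow-up time in years,
# a death indicator, the case/control flag, and selected (numerically coded) covariates.
df = pd.read_csv("stroke_cohort.csv")[
    ["time_years", "died", "is_case", "sex", "hypertension", "antiplatelet"]
]

# Weibull AFT model: a simplified substitute for the Weibull-Cox model in the study
# (it omits the random effect of general practice and the explicit shape covariates).
aft = WeibullAFTFitter()
aft.fit(df, duration_col="time_years", event_col="died")
aft.print_summary()
```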
Procedia PDF Downloads 186
25154 Partner Selection for Horizontal Logistic Cooperation
Authors: Mario Winkelhaus, Franz Vallée
Abstract:
Many companies see horizontal cooperation as a promising possibility to increase their efficiency in outbound logistics. The selection of suitable partners is of particular importance in the formation of horizontal cooperation. Up until now, the literature has mainly focused on generally applicable methods for the identification of cooperation partners, without a closer examination of the specific area where the cooperation takes place. Thus, specific criteria as a basis for partner selection in the field of logistics cooperation are missing. To close this scientific gap, an explorative research approach is used to answer the open question of the article. To collect the needed criteria, a qualitative experiment with 20 participants from 16 companies was conducted. Within this workshop, general criteria as well as sector-specific requirements were identified and integrated into a partner selection model. Keywords: horizontal cooperation, logistics cooperation, partnering criteria, partner selection
Procedia PDF Downloads 426
25153 PDDA: Priority-Based, Dynamic Data Aggregation Approach for Sensor-Based Big Data Framework
Authors: Lutful Karim, Mohammed S. Al-kahtani
Abstract:
Sensors are being used in various applications such as agriculture, health monitoring, air and water pollution monitoring, and traffic monitoring and control, and hence play a vital role in the growth of big data. However, sensors collect redundant data. Thus, aggregating and filtering sensor data are significantly important in designing an efficient big data framework. Current research does not focus on aggregating and filtering data at multiple layers of a sensor-based big data framework. Thus, this paper introduces (i) a three-layer data aggregation framework for big data and (ii) a priority-based, dynamic data aggregation scheme (PDDA) for the lowest layer at the sensors. Simulation results show that the PDDA outperforms existing tree- and cluster-based data aggregation schemes in terms of overall network energy consumption and end-to-end data transmission delay. Keywords: big data, clustering, tree topology, data aggregation, sensor networks
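A toy Python sketch of the lowest-layer idea — readings are aggregated per sensor and forwarded in priority order — using only the standard library; the priority scheme and sample readings are invented for illustration and do not reproduce the PDDA protocol itself.

```python
import heapq
from collections import defaultdict

# Each reading: (priority, sensor_id, value); lower number = higher priority (assumed).
readings = [(2, "soil-7", 31.0), (1, "pulse-3", 88.0), (2, "soil-7", 33.0),
            (3, "traffic-9", 412.0), (1, "pulse-3", 90.0)]

# Aggregate redundant readings from the same sensor (here: by averaging) ...
buckets = defaultdict(list)
for prio, sensor, value in readings:
    buckets[(prio, sensor)].append(value)

# ... then forward the aggregates in priority order.
queue = [(prio, sensor, sum(vals) / len(vals)) for (prio, sensor), vals in buckets.items()]
heapq.heapify(queue)

while queue:
    prio, sensor, aggregate = heapq.heappop(queue)
    print(f"forward p{prio} {sensor}: {aggregate:.1f}")
```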
Procedia PDF Downloads 345
25152 Understanding the Reasons for Flooding in Chennai and Strategies for Making It Flood Resilient
Authors: Nivedhitha Venkatakrishnan
Abstract:
Flooding in urban areas in India has become a ritual phenomenon and a nightmare for most cities, a consequence of man-made disruption resulting in disaster. City planning in India falls short of withstanding hydro-generated disasters. This has become a barrier and a challenge to the development process driven by urbanization, high population density, expanding informal settlements, and environmental degradation from uncollected and untreated waste that flows into natural drains and water bodies; this has disrupted natural hazard-protection mechanisms such as drainage channels, wetlands, and floodplains. The magnitude and impact of the mishap were high because of the failure of the development policies, strategies, and plans that the city had adopted. In the current scenario, cities are becoming the home of the future, with economic diversification bringing more investment into cities, especially in the domains of urban infrastructure, planning, and design. Urban futures in these low-elevation coastal zones face unprecedented risk and threat. The study focuses on three major pillars of resilience: Recover, Resist, and Restore. This process of getting ready to handle the situation bridges the gap between disaster response management and risk reduction and requires a paradigm shift. The study involved qualitative research and a system design approach (framework). The initial stages involved mapping the urban water morphology with respect to spatial growth, which gave an insight into the water bodies that have gone missing over the years during the process of urbanization. The major finding of the study was that missing links in the traditional water harvesting network were a major reason for the man-made disaster. The research conceptualized a sponge city framework that would guide growth through institutional frameworks at different levels. The next stage was understanding the implementation process at various stages to ensure the paradigm shift. The concepts were then demonstrated at a neighborhood level, showing where each component sits and what its functions and benefits are. The design decisions were quantified in terms of rainwater harvested and surface runoff: how much water is collected, and how it could be collected, stored, and reused. The study closes with a further recommendation for Water Mitigation Spaces that would revive the traditional harvesting network. Keywords: flooding, man made disaster, resilient city, traditional harvesting network, waterbodies
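The kind of quantification mentioned above can be illustrated with a back-of-the-envelope Python calculation of harvestable roof water versus surface runoff using the simple coefficient x rainfall x area relation; all figures are illustrative assumptions, not Chennai planning data.

```python
# Illustrative quantification: harvestable rainwater versus uncaptured surface runoff
# for a single plot, using runoff = coefficient x rainfall x area.
# All figures below are assumed placeholders.

annual_rainfall_m = 1.4                   # assumed average annual rainfall (m)
roof_area_m2 = 120.0                      # roof area of one plot
plot_area_m2 = 400.0                      # total plot area
roof_coeff, ground_coeff = 0.85, 0.55     # assumed runoff coefficients

harvestable_m3 = roof_coeff * annual_rainfall_m * roof_area_m2
surface_runoff_m3 = ground_coeff * annual_rainfall_m * (plot_area_m2 - roof_area_m2)

print(f"Harvestable roof water: {harvestable_m3:.0f} m3/yr")
print(f"Uncaptured surface runoff: {surface_runoff_m3:.0f} m3/yr")
```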
Procedia PDF Downloads 140
25151 Theoretical Study of Gas Adsorption in Zirconium Clusters
Authors: Rasha Al-Saedi, Anthony Meijer
Abstract:
The development of new porous materials has progressed rapidly over the past decade for use in applications such as catalysis, gas storage, and the removal of environmentally unfriendly species, owing to their high surface area and high thermal stability. In this work, a theoretical study of zirconium-based metal-organic frameworks (MOFs) was carried out in order to determine their potential for gas adsorption of various guest molecules: CO2, N2, CH4, and H2. The zirconium cluster consists of an inner Zr6O4(OH)4 core in which the triangular faces of the Zr6 octahedron are alternately capped by O and OH groups, bound to nine formate groups and three benzoate linkers. The general formula is [Zr(μ-O)4(μ-OH)4(HCOO)9((phyO2C)3X)], where X = CH2OH, CH2NH2, CH2CONH2, or n(NH2) (n = 1-3). Three types of adsorption sites on the Zr metal center have been studied, named according to the capping groups involved: the '−O site', where the H of a (μ-OH) site is removed and added to a (μ-O) site; the '−OH site', where a (μ-OH) site is removed; and the 'void site', where an H2O molecule is removed, i.e., a (μ-OH) from one site and an H from another (μ-OH) site; in addition to the no-defect versions. A series of investigations has been performed to address this important issue. First, the density functional theory DFT-B3LYP method with the 6-311G(d,p) basis set was employed using the Gaussian 09 package in order to evaluate the gas adsorption performance of missing-linker defects in the zirconium cluster. Next, the gas adsorption behaviour on differently functionalised zirconium clusters was studied. The functional groups mentioned above include amines, alcohol, and amide, in comparison with non-substituted clusters. Then, dispersion-corrected density functional theory (DFT-D) calculations were performed to further understand the enhanced gas binding on zirconium clusters. Finally, the effect of water on CO2 and N2 adsorption was studied. The small functionalized Zr clusters were found to show good CO2 adsorption over N2, CH4, and H2 due to the quadrupole moment of CO2, while N2, CH4, and H2 are weakly polar or non-polar. The adsorption efficiency was determined using the dispersion-corrected method, where the binding improved because most of the relevant interactions, for example van der Waals interactions, are missing with the conventional DFT method. The enhanced CO2 affinity of the no-defect versions is most likely due to the electrostatic interactions between the negatively charged O of CO2 and the positively charged H of the (μ-OH) metal site. The calculated gas binding strengths on the no-defect site are higher than those on the −O site, the −OH site, and the void site; this difference is especially notable for CO2. The uptake of the gas molecules is not enhanced in the presence of water, as the latter binds to the Zr clusters more strongly than the gas species, which is attributed to competition for adsorption sites. Keywords: density functional theory, gas adsorption, metal-organic frameworks, molecular simulation, porous materials, theoretical chemistry
Procedia PDF Downloads 184
25150 A Comparative Analysis of E-Government Quality Models
Authors: Abdoullah Fath-Allah, Laila Cheikhi, Rafa E. Al-Qutaish, Ali Idri
Abstract:
Many quality models have been used to measure e-government portal quality. However, the absence of an international consensus on e-government portal quality models results in many differences in terms of quality attributes and measures. The aim of this paper is to compare and analyze the existing e-government quality models proposed in the literature (those that are based on ISO standards and those that are not) in order to propose guidelines for building a good and useful e-government portal quality model. Our findings show that there is no e-government portal quality model based on the new international standard ISO 25010. Besides that, the quality models are not based on a best practice model that would allow agencies to both measure e-government portal quality and identify missing best practices for those portals. Keywords: e-government, portal, best practices, quality model, ISO, standard, ISO 25010, ISO 9126
Procedia PDF Downloads 560
25149 Modeling Stream Flow with Prediction Uncertainty by Using SWAT Hydrologic and RBNN Neural Network Models for Agricultural Watershed in India
Authors: Ajai Singh
Abstract:
Simulation of hydrological processes at the watershed outlet through a modelling approach is essential for proper planning and implementation of appropriate soil conservation measures in the Damodar Barakar catchment, Hazaribagh, India, where soil erosion is a dominant problem. This study quantifies the parametric uncertainty involved in the simulation of stream flow using the Soil and Water Assessment Tool (SWAT), a watershed-scale model, and a Radial Basis Neural Network (RBNN), an artificial neural network model. Both models were calibrated and validated based on measured stream flow, and the uncertainty in the SWAT model output was quantified using the Sequential Uncertainty Fitting algorithm (SUFI-2). Though both models predicted satisfactorily, the RBNN model performed better than SWAT, with R2 and NSE values of 0.92 and 0.92 during training and 0.71 and 0.70 during the validation period, respectively. Comparison of the results of the two models also indicates a wider prediction interval for the results of the SWAT model. The P-factor values of the two models show that the percentage of observed stream flow values bracketed by the 95PPU in the RBNN model (91%) is higher than the P-factor in SWAT (87%). In other words, the RBNN model estimates the stream flow values more accurately and with less uncertainty. It could be stated that the RBNN model, based on simple inputs, could be used for estimating monthly stream flow, filling missing data, and testing the accuracy and performance of other models. Keywords: SWAT, RBNN, SUFI 2, bootstrap technique, stream flow, simulation
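A radial basis neural network of the kind used here can be sketched in Python as K-Means centres feeding Gaussian basis functions with a linear output layer; the input columns, number of centres, and spread parameter below are assumptions, not the study's calibrated RBNN.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

# Monthly rainfall/temperature inputs and observed stream flow (assumed column names).
df = pd.read_csv("damodar_barakar_monthly.csv")
X = StandardScaler().fit_transform(df[["rainfall", "temperature"]])
y = df["streamflow"].values

# Hidden layer: Gaussian radial basis functions centred on K-Means centroids.
centres = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
gamma = 1.0  # assumed spread parameter
phi = np.exp(-gamma * ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2))

# Output layer: linear weights fitted by ridge regression.
model = Ridge(alpha=1e-3).fit(phi, y)
pred = model.predict(phi)
nse = 1 - ((y - pred) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"Nash-Sutcliffe efficiency (training): {nse:.2f}")
```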
Procedia PDF Downloads 370
25148 Urban Metis Women’s Identity and Experiences with Health Services in Toronto, Ontario
Authors: Renee Monchalin
Abstract:
Métis peoples, while comprising over a third of the total Indigenous population in Canada, experience major gaps in health services that accommodate their cultural identities. This is problematic given Métis peoples experience severe disparities in health determinants and outcomes compared to the non-Indigenous Canadian population. At the same time, Métis are unlikely to engage in health services that do not value their cultural identities, often utilizing mainstream options. Given these contexts, this research aims to fill the culturally-safe health care gap for Métis peoples in Canada. It does this by engaging 56 urban Métis women who participated in a longitudinal cohort study, Our Health Counts (OHC) Toronto. Traditionally, Métis women were central to the health and well-being of their communities. However, due to decades of colonial legislation and forced land displacement, female narratives have been silenced, and Métis identities have been fractured. This has resulted in having direct implications on Métis people’s current health and access to health services. Solutions to filling the Métis health service gap may lie in the all too often unacknowledged or missing voices of Métis women. Through a conversational method, this research will explore urban Métis women’s perspectives on identity and their experiences with health services in Toronto. The goal of this research is to learn from urban Métis women on steps towards filling the health service gap. This research is currently in the data collection stage. Preliminary findings from the conversations will be disseminated. Policy recommendations for health service providers will be provided to better accommodate Métis people.Keywords: indigenous health, Metis health, urban, health service access, identity
Procedia PDF Downloads 216
25147 Hypertension and Its Association with Oral Health Status in Adults: A Pilot Study in Padusunan Adults Community
Authors: Murniwati, Nurul Khairiyah, Putri Ovieza Maizar
Abstract:
The association between general and oral health is clearly important, particularly in adults with medical conditions. Many systemic medical conditions are either caused or aggravated by poor oral hygiene, and vice versa. Hypertension is a common systemic medical problem that has been a public health concern worldwide due to its known consequences. Those consequences may be related to oral health status as well, whether by causing or worsening oral health conditions. The objective of this study was to find out the association between hypertension and oral health status in adults. This was an analytical observational study using a cross-sectional method. A total of 42 adults, both male and female, in Padusunan Village, Pariaman, West Sumatra, Indonesia, were selected as subjects by purposive sampling. A manual sphygmomanometer was used to measure blood pressure, and a dental examination was performed to calculate decayed, missing, and filled teeth (DMFT) scores in order to represent oral health status. The data obtained were analyzed statistically using one-way ANOVA to determine the association between hypertension and oral health status in adults. The results showed that the majority of subjects were aged 51-70 years (40.5%). Based on the blood pressure examination, 57.1% of subjects were classified as prehypertensive. Overall, the differences in mean DMFT scores among the normal, prehypertension, and hypertension groups were not statistically significant. There was no significant association (p > 0.05) between hypertension and oral health status in adults. Keywords: blood pressure, hypertension, DMFT, oral health status
Procedia PDF Downloads 327
25146 A Type-2 Fuzzy Model for Link Prediction in Social Network
Authors: Mansoureh Naderipour, Susan Bastani, Mohammad Fazel Zarandi
Abstract:
Predicting links that may occur in the future, as well as missing links, in social networks is an attractive problem in social network analysis. Granular computing can help us model the relationships between human-based systems and the social sciences in this field. In this paper, we present a model based on a granular computing approach and Type-2 fuzzy logic to predict links based on nodes' activity and the relationship between two nodes. Our model is tested on collaboration networks. It is found that the accuracy of prediction is significantly higher than that of the Type-1 fuzzy and crisp approaches. Keywords: social network, link prediction, granular computing, type-2 fuzzy sets
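A toy Python sketch of the general idea — an interval (type-2-style) membership for node activity combined with a common-neighbour relationship score, then type-reduced to a crisp link score — on a stand-in collaboration network; all membership shapes and thresholds are invented for illustration and do not reproduce the authors' model.

```python
import networkx as nx

G = nx.karate_club_graph()  # stand-in for a collaboration network
max_deg = max(d for _, d in G.degree())

def activity_interval(node, blur=0.1):
    """Interval membership for 'active node' (a crude type-2-style footprint of uncertainty)."""
    m = G.degree(node) / max_deg
    return max(0.0, m - blur), min(1.0, m + blur)

def link_score(u, v):
    # Relationship between the two nodes: normalised common-neighbour count.
    cn = len(list(nx.common_neighbors(G, u, v))) / max(1, min(G.degree(u), G.degree(v)))
    lo_u, hi_u = activity_interval(u)
    lo_v, hi_v = activity_interval(v)
    lo, hi = cn * min(lo_u, lo_v), cn * min(hi_u, hi_v)  # interval-valued score
    return (lo + hi) / 2                                 # type-reduce to a crisp score

candidates = [(u, v) for u in G for v in G if u < v and not G.has_edge(u, v)]
top5 = sorted(candidates, key=lambda uv: link_score(*uv), reverse=True)[:5]
print(top5)
```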
Procedia PDF Downloads 325