Search results for: data reduction
27422 Investigation on Performance of Change Point Algorithm in Time Series Dynamical Regimes and Effect of Data Characteristics
Authors: Farhad Asadi, Mohammad Javad Mollakazemi
Abstract:
In this paper, Bayesian online inference in models of data series is constructed with a change-point algorithm, which separates the observed time series into independent segments and studies changes in the regime of the data through the related statistical characteristics. Variation in the statistical characteristics of time series data often represents distinct phenomena in the same dynamical system, such as a change in brain state reflected in EEG signal measurements or a change in an important regime of the data in many other dynamical systems. In this paper, a prediction algorithm for locating change points in time series data is simulated. It is verified that the pattern of the proposed data distribution is an important factor in obtaining a simpler and smoother fluctuation of the hazard rate parameter and a better identification of change-point locations. Finally, the conditions under which the time series distribution affects the factors in this approach are explained and validated with different time series databases for several dynamical systems.
Keywords: time series, fluctuation in statistical characteristics, optimal learning, change-point algorithm
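The style of algorithm described, online run-length inference driven by a hazard rate, can be sketched compactly. A minimal illustration, assuming a Gaussian observation model with a conjugate Normal-Inverse-Gamma prior and a constant hazard rate (both assumptions, since the abstract does not specify the paper's distributions):

```python
import numpy as np
from scipy import stats

def bocpd(data, hazard=1.0 / 250, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Run-length posterior for Gaussian data with a Normal-Inverse-Gamma prior."""
    T = len(data)
    R = np.zeros((T + 1, T + 1))          # R[t, r] = P(run length r after t points)
    R[0, 0] = 1.0
    mu = np.array([mu0]); kappa = np.array([kappa0])
    alpha = np.array([alpha0]); beta = np.array([beta0])
    for t, x in enumerate(data):
        # Student-t posterior predictive for every current run length
        scale = np.sqrt(beta * (kappa + 1) / (alpha * kappa))
        pred = stats.t.pdf(x, df=2 * alpha, loc=mu, scale=scale)
        R[t + 1, 1:t + 2] = R[t, :t + 1] * pred * (1 - hazard)  # run grows
        R[t + 1, 0] = np.sum(R[t, :t + 1] * pred * hazard)      # change point
        R[t + 1, :t + 2] /= R[t + 1, :t + 2].sum()
        # Conjugate updates; index 0 restarts from the prior
        mu, kappa, alpha, beta = (
            np.concatenate(([mu0], (kappa * mu + x) / (kappa + 1))),
            np.concatenate(([kappa0], kappa + 1)),
            np.concatenate(([alpha0], alpha + 0.5)),
            np.concatenate(([beta0], beta + kappa * (x - mu) ** 2 / (2 * (kappa + 1)))),
        )
    return R

# A regime shift halfway through; the run-length posterior collapses at the change
R = bocpd(np.r_[np.random.randn(100), 3 + np.random.randn(100)])
```

A smoother hazard rate, as the abstract discusses, corresponds to a change-point probability that varies less across run lengths, which simplifies the growth/reset recursion above.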
Procedia PDF Downloads 428
27421 Technological Innovations and African Export Performances
Authors: Lukman Oyelami
Abstract:
Studies have identified trade as a veritable tool for inclusive economic growth and poverty reduction in developing countries. However, contrary to the overwhelming evidence of the Asian tigers as a success story of beneficial trade, many African countries still experience poverty unabatedly despite active engagement in trade. Consequently, this study investigates the contributory effect of technological innovation on the total export performance, and specifically the manufacturing exports, of African countries, with a view to exploring manufacturing exports as a viable option for diversification. For the empirical investigation, the system Generalized Method of Moments (sys-GMM) estimation technique was adopted based on the econometric realities inherent in the data, while the static panel Fixed Effects (FE) model was used for baseline analysis and robustness checks. The conclusion from this study is that innovation generally has a positive impact on the export performance of African countries; however, manufacturing exports show more sensitivity to innovation than total exports. This provides a clear pathway to export diversification for the many African countries that run resource-based economies.
Keywords: innovation, export, GMM, Africa
Procedia PDF Downloads 222
27420 Determination of the Risks of Heart Attack at the First Stage as Well as Their Control and Resource Planning with the Method of Data Mining
Authors: İbrahim Kara, Seher Arslankaya
Abstract:
Frequently preferred in the field of engineering in particular, data mining has now begun to be used in the field of health as well, since the data in the health sector have reached great dimensions. Data mining aims to reveal models from large amounts of raw data in agreement with a given purpose and to search for the rules and relationships that enable predictions about the future from large data sets. It helps the decision-maker find the relationships among the data that emerge at the decision-making stage. In this study, the aim is to determine the risk of heart attack at the first stage, to control it, and to plan the resources for it with the method of data mining. Through the early and correct diagnosis of heart attacks, the study aims to reveal the factors that affect the disease, to protect health and choose the right treatment methods, to reduce health expenditures, and to shorten the duration of patients' hospital stays. In this way, the diagnosis and treatment costs of a heart attack will be scrutinized, which will be useful for determining the risk of the disease at the first stage, controlling it, and planning its resources.
Keywords: data mining, decision support systems, heart attack, health sector
Procedia PDF Downloads 358
27419 Determinants of Intensity of Greenhouse Gas Emission in Lithuanian Agriculture
Authors: D. Makuteniene
Abstract:
Agriculture, as one of the human activities, emits a significant amount of greenhouse gases and undoubtedly has an impact on climate change. The main gaseous products of agricultural greenhouse gas emission are carbon dioxide, methane, and nitrous oxide. The sources and emission of these gases depend on land use, soil, crops, manure, livestock, and energy consumption. One of the indicators showing the agricultural impact on climate change is the intensity of GHG emission and its dynamics. This study analyzed the determinants of the intensity of greenhouse gas emission in Lithuanian agriculture using data decomposition. The research revealed that, although greenhouse gas emission increased during the research period, agricultural net value added grew more rapidly, which contributed to a reduction in the intensity of greenhouse gas emission in Lithuania between 2000 and 2015. It was identified that during the research period the intensity of greenhouse gas emission was mostly increased by the change in the use of nitrogen in agriculture relative to the change in the area of agricultural land, and by the change in the number of full-time employees relative to the change in net value added. Conversely, the change in energy consumption in agriculture relative to the change in the use of nitrogen had a bigger impact in decreasing the intensity of greenhouse gas emission.
Keywords: agriculture, determinants of intensity, greenhouse gas emission, intensity
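The data decomposition referred to can be written as a multiplicative identity over the factors the abstract names (nitrogen use N, agricultural land area A, energy consumption E, full-time employees L, and net value added VA). The ordering of the ratios below is an illustrative assumption, not the paper's exact decomposition:

```latex
\frac{GHG}{VA}
  = \frac{GHG}{E}\cdot\frac{E}{N}\cdot\frac{N}{A}\cdot\frac{A}{L}\cdot\frac{L}{VA},
\qquad
\Delta\ln\frac{GHG}{VA}
  = \Delta\ln\frac{GHG}{E}+\Delta\ln\frac{E}{N}+\Delta\ln\frac{N}{A}
  + \Delta\ln\frac{A}{L}+\Delta\ln\frac{L}{VA}
```

Taking logs and differencing between 2000 and 2015 attributes the intensity change additively to each ratio, which is how statements such as "nitrogen use relative to land area increased intensity" can be read.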
Procedia PDF Downloads 186
27418 Bayesian Borrowing Methods for Count Data: Analysis of Incontinence Episodes in Patients with Overactive Bladder
Authors: Akalu Banbeta, Emmanuel Lesaffre, Reynaldo Martina, Joost Van Rosmalen
Abstract:
Including data from previous studies (historical data) in the analysis of the current study may reduce the sample size requirement and/or increase the power of the analysis. The most common example is incorporating historical control data in the analysis of a current clinical trial. However, this only applies when the historical control data are similar enough to the current control data. Recently, several Bayesian approaches for incorporating historical data have been proposed, such as the meta-analytic-predictive (MAP) prior and the modified power prior (MPP), both for a single control arm as well as for multiple historical control arms. Here, we examine the performance of the MAP and MPP approaches for the analysis of (over-dispersed) count data. To this end, we propose a computational method for the MPP approach for the Poisson and the negative binomial models. We conducted an extensive simulation study to assess the performance of the Bayesian approaches. Additionally, we illustrate our approaches on an overactive bladder data set. For similar data across the control arms, the MPP approach outperformed the MAP approach with respect to statistical power. When the means across the control arms are different, the MPP yielded a slightly inflated type I error (TIE) rate, whereas the MAP did not. In contrast, when the dispersion parameters are different, the MAP gave an inflated TIE rate, whereas the MPP did not. We conclude that the MPP approach is more promising than the MAP approach for incorporating historical count data.
Keywords: count data, meta-analytic prior, negative binomial, Poisson
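For the Poisson model, the power-prior construction underlying the MPP has a closed conjugate form. A sketch assuming a Gamma(a, b) initial prior (the conjugate choice is an illustrative assumption; the paper's negative binomial computation is not reproduced here):

```latex
\pi(\lambda \mid y_0, \delta) \propto L(\lambda; y_0)^{\delta}\,\pi_0(\lambda),
\quad 0 \le \delta \le 1,
\qquad
\lambda \mid y_0, y, \delta \sim
\mathrm{Gamma}\Big(a + \delta\textstyle\sum_i y_{0i} + \sum_j y_j,\;
b + \delta n_0 + n\Big)
```

Here y0 are the n0 historical counts, y the n current counts, and the discounting parameter delta downweights the historical likelihood; the modified power prior additionally treats delta as random with a correctly normalized prior, which is what drives the type I error behaviour compared in the abstract.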
Procedia PDF Downloads 121
27417 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis
Authors: John Gaber
Abstract:
Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice for planners is their use of data from local experts and stakeholders (known as "etic" data or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at who planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the "community's view." The paper concludes with how planners can chart out a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.
Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)
Procedia PDF Downloads 485
27416 Leveraging Digital Technologies for Smart Waste Management in CE: A Literature Review
Authors: Anne-Marie Tuomala
Abstract:
The study is a literature review of leveraging digital technologies such as the Internet of Things (IoT), big data analytics (BDA), and artificial intelligence (AI) to optimize waste collection, sorting, and recycling processes, thus promoting a circular economy (CE). The purpose of the study is to show how smart waste management (SWM) systems boost the field compared with traditional waste management. The 27 reviewed articles highlight the tangible benefits of digitalization, but addressing barriers to adoption is essential for realizing the full potential of SWM technologies. The results show how digital technologies can be used to monitor and optimize waste collection and resource allocation, improve efficiency, and reduce contamination rates. In conclusion, this literature review underscores the transformative potential of digital technologies in advancing SWM systems and promoting the CE. Future applications should focus strategically on the 9R or other R strategies to speed up the transformation. Future research should focus especially on addressing challenges and identifying innovative strategies to accelerate the transition toward a more sustainable and circular waste management ecosystem.
Keywords: circular economy, digital technologies, smart waste management, waste management strategies
Procedia PDF Downloads 5
27415 Data Augmentation for Automatic Graphical User Interface Generation Based on Generative Adversarial Network
Authors: Xulu Yao, Moi Hoon Yap, Yanlong Zhang
Abstract:
As a branch of artificial neural network research, deep learning is widely used in the field of image recognition, but the lack of datasets leads to imperfect model learning. By analysing the data scale requirements of deep learning and aiming at the application in GUI generation, it is found that the collection of a GUI dataset is a time-consuming and labor-intensive project, which makes it difficult to meet the needs of current deep learning networks. To solve this problem, this paper proposes a semi-supervised deep learning model that relies on the original small-scale dataset to produce a large number of reliable data. By combining a recurrent neural network with a generative adversarial network, the recurrent neural network can learn the sequence relationships and characteristics of the data, enabling the generative adversarial network to generate reasonable data and thereby expand the Rico dataset. Relying on this network structure, the characteristics of the collected data can be well analysed, and a large number of reasonable data can be generated according to these characteristics. After data processing, a reliable dataset for model training can be formed, which alleviates the problem of dataset shortage in deep learning.
Keywords: GUI, deep learning, GAN, data augmentation
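A minimal sketch of the adversarial half of such a model: a generator and a discriminator trained against each other so that synthetic samples can enlarge a small dataset (PyTorch, with toy MLPs on flattened feature vectors; the paper's RNN-plus-GAN architecture on the Rico dataset is not reproduced):

```python
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, 256), nn.Tanh())
D = nn.Sequential(nn.Linear(256, 128), nn.LeakyReLU(0.2), nn.Linear(128, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real_data = torch.rand(512, 256) * 2 - 1        # stand-in for encoded GUI layouts
for step in range(1000):
    real = real_data[torch.randint(0, 512, (32,))]
    fake = G(torch.randn(32, 64))
    # Discriminator: push real towards 1, generated towards 0
    loss_d = bce(D(real), torch.ones(32, 1)) + bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()
    # Generator: fool the discriminator
    loss_g = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

augmented = G(torch.randn(1000, 64)).detach()   # synthetic samples to enlarge the dataset
```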
Procedia PDF Downloads 186
27414 Modelling Rainfall-Induced Shallow Landslides in the Northern New South Wales
Authors: S. Ravindran, Y. Liu, I. Gratchev, D. Jeng
Abstract:
Rainfall-induced shallow landslides are common in northern New South Wales (NSW), Australia. From 2009 to 2017, around 105 rainfall-induced landslides occurred along the road corridors and caused temporary road closures in northern NSW. The rainfall causing shallow landslides follows different temporal distributions, varying from uniform and normal to decreasing and increasing intensity, with durations ranging from one day to 18 days according to historical data. The objective of this research is to analyse the slope instability of some of the sites in northern NSW by varying cumulative rainfall using SLOPE/W and SEEP/W and to compare the results with field data on rainfall causing shallow landslides. The rainfall and topographical data from public authorities and soil data obtained from laboratory tests will be used for this modelling. In accordance with the field data, shallow landslides are likely when the cumulative rainfall is between 100 mm and 400 mm.
Keywords: landslides, modelling, rainfall, suction
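For shallow translational failures of this kind, the standard infinite-slope factor of safety shows how cumulative rainfall destabilizes a slope by raising pore pressure and destroying suction (a textbook relation, not a formula quoted from the paper):

```latex
FS = \frac{c' + (\gamma z \cos^2\beta - u)\tan\phi'}{\gamma z \sin\beta \cos\beta}
```

with effective cohesion c', friction angle phi', unit weight gamma, failure-plane depth z, slope angle beta, and pore pressure u; infiltration drives u upward (or suction toward zero), pushing FS toward unity.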
Procedia PDF Downloads 185
27413 Machine Learning-Enabled Classification of Climbing Using Small Data
Authors: Nicholas Milburn, Yu Liang, Dalei Wu
Abstract:
Athlete performance scoring within the climbing domain presents interesting challenges, as the sport does not have an objective way to assign skill. Assessing skill levels within any sport is valuable, as it can be used to mark progress while training, and it can help an athlete choose appropriate climbs to attempt. Machine learning-based methods are popular for complex problems like this. The available dataset was composed of dynamic force data recorded during climbing; however, it came with challenges such as data scarcity and imbalance, and it was temporally heterogeneous. Investigated solutions to these challenges include data augmentation, temporal normalization, conversion of the time series to the spectral domain, and cross-validation strategies. The investigated solutions to the classification problem included the lightweight machine learning classifiers KNN and SVM as well as deep learning with a CNN. The best performing model had an 80% accuracy. In conclusion, there seems to be enough information within climbing force data to accurately categorize climbers by skill.
Keywords: classification, climbing, data imbalance, data scarcity, machine learning, time sequence
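Two of the investigated solutions, spectral-domain conversion of the force recordings and cross-validated classification with a lightweight classifier, can be sketched as below (scikit-learn; synthetic data stands in for the climbing force dataset):

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_time = rng.standard_normal((60, 512))          # 60 recordings, 512 samples each
y = rng.integers(0, 3, size=60)                  # 3 skill classes

# Temporal normalization + spectral conversion: magnitude of the real FFT
X_spec = np.abs(np.fft.rfft(X_time - X_time.mean(axis=1, keepdims=True), axis=1))

# class_weight="balanced" is one simple answer to the class imbalance noted above
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
scores = cross_val_score(clf, X_spec, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))
print(f"mean accuracy: {scores.mean():.2f}")
```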
Procedia PDF Downloads 145
27412 Supplier Carbon Footprint Methodology Development for Automotive Original Equipment Manufacturers
Authors: Nur A. Özdemir, Sude Erkin, Hatice K. Güney, Cemre S. Atılgan, Enes Huylu, Hüseyin Y. Altıntaş, Aysemin Top, Özak Durmuş
Abstract:
Carbon emissions produced during a product's life cycle, from the extraction of raw materials up to waste disposal and market consumption activities, are major contributors to global warming. In the light of the science-based targets (SBT) leading the way to a zero-carbon economy for the sustainable growth of companies, carbon footprint reporting of purchased goods has become critical for identifying hotspots and best practices for emission reduction opportunities. In line with Ford Otosan's corporate sustainability strategy, research was conducted to evaluate the carbon footprint of purchased products in accordance with Scope 3 of the Greenhouse Gas Protocol (GHG). The purpose of this paper is to develop a systematic and transparent methodology to calculate the carbon footprint of products produced by automotive OEMs (Original Equipment Manufacturers) within the context of automotive supply chain management. To begin with, primary material data were collected through IMDS (International Material Data System) for the company's three distinct types of vehicles: Light Commercial Vehicle (Courier), Medium Commercial Vehicle (Transit and Transit Custom), and Heavy Commercial Vehicle (F-MAX). The obtained material data were classified as metals, plastics, liquids, electronics, and others to get insights into the overall material distribution of the produced vehicles, and were matched to the SimaPro Ecoinvent 3 database, one of the most extensive databases for modelling material data related to the product life cycle. The product life cycle analysis was calculated within the framework of the ISO 14040-14044 standards by addressing their requirements and procedures. A comprehensive literature review and cooperation with suppliers were undertaken to identify the production methods of the parts used in the vehicles and to find out the amount of scrap generated during part production. The cumulative weight and material information, with the related production processes belonging to the components, were listed and multiplied by current sales figures. The results of the study present a key model of the carbon footprint of products and processes based on a scientific approach to drive sustainable growth by setting straightforward, science-based emission reduction targets. Hence, this study aims to identify the hotspots and, correspondingly, to provide broad ideas about how to integrate carbon footprint estimates into the company's supply chain management by defining convenient actions in line with climate science. According to the emission values arising from the production phase, including raw material extraction and material processing, for the Ford Otosan vehicles subjected to this study, GHG emissions from the production of the metals used for the HCV, MCV and LCV account for more than half of the carbon footprint of each vehicle's production. Correspondingly, aluminum and steel have the largest share among all material types, and achieving carbon neutrality in the steel and aluminum industry is of great significance to the world, with an immense impact on the automobile industry. A strategic product sustainability plan, which includes the use of secondary materials, conversion to green energy, and low-energy process design, is required to reduce the emissions of steel, aluminum, and plastics, given the projected increase in total volume by 2030.
Keywords: automotive, carbon footprint, IMDS, scope 3, SimaPro, sustainability
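The core Scope 3 arithmetic described, cumulative part masses per material class multiplied by cradle-to-gate emission factors and sales volumes, reduces to the sketch below. All masses, factors, and volumes are illustrative placeholders, not Ford Otosan or Ecoinvent values:

```python
# kg of material per vehicle, by vehicle type (hypothetical figures)
bill_of_materials = {
    "LCV": {"steel": 900.0, "aluminum": 60.0, "plastics": 150.0},
    "MCV": {"steel": 1400.0, "aluminum": 90.0, "plastics": 220.0},
    "HCV": {"steel": 5200.0, "aluminum": 350.0, "plastics": 400.0},
}
# cradle-to-gate emission factors, kg CO2e per kg material (illustrative)
emission_factor = {"steel": 2.1, "aluminum": 8.5, "plastics": 3.0}
annual_sales = {"LCV": 40000, "MCV": 55000, "HCV": 12000}  # hypothetical volumes

for vehicle, bom in bill_of_materials.items():
    per_vehicle = sum(mass * emission_factor[m] for m, mass in bom.items())
    fleet_total = per_vehicle * annual_sales[vehicle]      # kg CO2e across sales
    print(f"{vehicle}: {per_vehicle:,.0f} kg CO2e/vehicle, "
          f"{fleet_total / 1e6:,.1f} kt CO2e across sales")
```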
Procedia PDF Downloads 110
27411 Analysis of Expression Data Using Unsupervised Techniques
Authors: M. A. I. Perera, C. R. Wijesinghe, A. R. Weerasinghe
Abstract:
This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps in improving the efficacy and reducing the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high dimensional, with a large number of features compared to samples. Hierarchical clustering and K-Means are often used in the analysis of gene expression data. There are several cluster validation techniques used in validating the clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and visual analysis of the classes.
Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation
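The pipeline the review describes (variance-based feature selection, K-Means and hierarchical clustering, and internal cluster validation) can be sketched with scikit-learn on a synthetic expression matrix of samples by genes:

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
expr = rng.standard_normal((80, 5000))           # 80 tumor samples, 5000 genes
expr[:40, :200] += 2.0                           # implant one subtype signal

# Feature selection: keep the most variable genes (high-dimensional, few samples)
top = np.argsort(expr.var(axis=0))[-500:]
X = expr[:, top]

for name, model in [("kmeans", KMeans(n_clusters=2, n_init=10, random_state=1)),
                    ("hierarchical", AgglomerativeClustering(n_clusters=2))]:
    labels = model.fit_predict(X)
    print(name, "silhouette:", round(silhouette_score(X, labels), 3))
```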
Procedia PDF Downloads 150
27410 Influence of Atmospheric Pollutants on Child Respiratory Disease in Cartagena De Indias, Colombia
Authors: Jose A. Alvarez Aldegunde, Adrian Fernandez Sanchez, Matthew D. Menden, Bernardo Vila Rodriguez
Abstract:
Up to five statistical pre-processing steps were carried out on the pollutant records of the monitoring stations present in Cartagena de Indias, Colombia, also taking into account the childhood asthma incidence surveys conducted in the city's hospitals by the Health Ministry of Colombia. These pre-processing steps consisted of techniques such as assessing the quality of data collection and of the registration network, identifying and debugging errors in data collection, completing missing data, and improving the time scale of the records. The characterization of data quality was conducted by means of a density analysis of the pollutant registration stations using ArcGIS software and through mass balance techniques, making it possible to determine inconsistencies in the records by relating the registration data between stations via linear regression. The results of this process highlighted the good quality of the pollutant registration process. The debugging of errors allowed us to identify certain data as statistically non-significant in the incidence and contamination series. These data, together with certain missing records in the series recorded by the measuring stations, were completed by statistical imputation equations. Following these prior processes, the basic series of incidence data for respiratory disease and the pollutant records allowed the characterization of the influence of pollutants on respiratory diseases such as, for example, childhood asthma. This characterization was carried out using statistical correlation methods, including visual correlation, simple linear regression, and spectral analysis with PAST software, which identifies maximum and minimum periodicity cycles using the Lomb periodogram. Among the results obtained, up to eleven maxima and minima considered contemporary between the incidence records and the particulate records were identified by visual comparison. The spectral analyses performed on the incidence and PM2.5 series returned similar maximum periods in both registers, with a maximum at a period of one year and another every 25 days (0.9 and 0.07 years). The bivariate analysis ranked the variable 'Daily Vehicular Flow' ninth in importance out of a total of 55 variables. However, the statistical correlation did not obtain a favorable result, yielding a low value of the R² coefficient. The series of analyses conducted demonstrated the importance of the influence of pollutants such as PM2.5 on the development of childhood asthma in Cartagena. The quantification of the influence of the variables determined a 56% probability of dependence between PM2.5 and childhood respiratory asthma in Cartagena. On this basis, the study could be completed through the application of the BenMap software, producing spatial results of interpolated pollutant records that exceeded the established legal limits (represented by homogeneous units down to the neighborhood level) and results of the impact on the exacerbation of pediatric asthma. As a final result, an economic estimate (in Colombian pesos) of the monthly and individual savings derived from the percentage reduction of the influence of pollutants, in terms of visits to the hospital emergency room due to asthma exacerbation in pediatric patients, has been provided.
Keywords: asthma incidence, BenMap, PM2.5, statistical analysis
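The periodicity search described (maxima at about one year and 25 days) uses the Lomb periodogram, which PAST implements; the same analysis can be sketched with SciPy on a synthetic, irregularly sampled daily series (the hospital data themselves are not reproduced):

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
# Three years of daily observations with gaps (irregular sampling)
t = np.sort(rng.choice(np.arange(3 * 365), size=600, replace=False)).astype(float)
y = (np.sin(2 * np.pi * t / 365) + 0.5 * np.sin(2 * np.pi * t / 25)
     + 0.3 * rng.standard_normal(t.size))

periods = np.linspace(5, 500, 2000)               # candidate periods, in days
freqs = 2 * np.pi / periods                       # lombscargle expects angular frequencies
power = lombscargle(t, y - y.mean(), freqs)
print("dominant periods (days):", periods[np.argsort(power)[-2:]])
```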
Procedia PDF Downloads 118
27409 Learning Analytics in a HiFlex Learning Environment
Authors: Matthew Montebello
Abstract:
Student engagement within a virtual learning environment generates masses of data points that can significantly contribute to the learning analytics that lead to decision support. Ideally, similar data are collected during student interaction with a physical learning space, and as a consequence, data are present at a large scale even in relatively small classes. In this paper, we report on such an occurrence during classes held in a HiFlex modality as we investigate the advantages of adopting such a methodology. We plan to take full advantage of the learner-generated data in an attempt to further enhance the effectiveness of the adopted learning environment. This could shed crucial light on the operating modalities that higher education institutions around the world will switch to in a post-COVID era.
Keywords: HiFlex, big data in higher education, learning analytics, virtual learning environment
Procedia PDF Downloads 202
27408 The Feasibility of Anaerobic Digestion at 45°C
Authors: Nuruol S. Mohd, Safia Ahmed, Rumana Riffat, Baoqiang Li
Abstract:
Anaerobic digestion at mesophilic and thermophilic temperatures has been widely studied and evaluated by numerous researchers. Little extensive research has been conducted on anaerobic digestion in the intermediate zone of 45°C, mainly due to the notion that limited microbial activity occurs within this zone. The objectives of this research were to evaluate the performance and the capability of anaerobic digestion at 45°C in producing Class A biosolids, in comparison to mesophilic and thermophilic anaerobic digestion systems operated at 35°C and 55°C, respectively, and to investigate the possible inhibition factors affecting the performance of digestion at this temperature. The 45°C anaerobic digestion systems were not able to achieve methane yield and effluent quality comparable to the mesophilic system, even though they produced biogas with about 62-67% methane. The 45°C digesters suffered from high acetate accumulation, but sufficient buffering capacity was observed, as the pH, alkalinity, and volatile fatty acids (VFA)-to-alkalinity ratio were within recommended values. The acetate accumulation observed in the 45°C systems was presumably due to the high temperature, which contributed to a high hydrolysis rate. Consequently, a large amount of toxic salts was produced that combined with the substrate, making it not readily available for consumption by methanogens. The acetate accumulation, even though it contributed to a 52-71% reduction in the acetate degradation process, could not be considered completely inhibitory. Additionally, at 45°C, no ammonia inhibition was observed, and the digesters were able to achieve a volatile solids (VS) reduction of 47.94±4.17%. The pathogen counts were less than 1,000 MPN/g total solids, thus producing Class A biosolids.
Keywords: 45°C anaerobic digestion, acetate accumulation, class A biosolids, salt toxicity
Procedia PDF Downloads 306
27407 Li-Fi Technology: Data Transmission through Visible Light
Authors: Shahzad Hassan, Kamran Saeed
Abstract:
People are always in search of Wi-Fi hotspots because Internet access is a major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what we now know as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. Basically, the data are transmitted through light-emitting diodes, which can vary the intensity of light very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology; in other words, it is a 5G technology which uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings a lot of data-related qualities such as efficiency, security, and large throughputs to the table of wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon where the presence of light will mean access to the Internet as well as speedy data transfer.
Keywords: communication, LED, Li-Fi, Wi-Fi
Procedia PDF Downloads 348
27406 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques
Authors: John Onyima, Ikechukwu Ezepue
Abstract:
Virtualization and cloud computing are among the fast-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of the organization's infrastructure, its improvement of efficient resource utilization, and its cost reduction. However, this technology brings new security threats and challenges regarding safety, reliability, and data confidentiality. Evidently, no single security technique can guarantee security or protection against malicious attacks on a cloud computing network; hence, an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using the snort algorithm. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection
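The proposed integration amounts to a two-stage check per event: a signature match against known attack patterns plus an anomaly score from a local-density model. In the sketch below, scikit-learn's Local Outlier Factor stands in for the LDFGB algorithm (a deliberate substitution, since LDFGB is not publicly packaged), and the snort-style signatures are reduced to toy regexes:

```python
import re
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

SIGNATURES = [re.compile(p) for p in (r"\.\./\.\.", r"(?i)union\s+select", r"cmd\.exe")]

# Train the anomaly model on "normal" traffic feature vectors, e.g., packet
# size, duration, port entropy (synthetic here)
normal = np.random.default_rng(3).normal(size=(2000, 4))
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(normal)

def inspect(payload: str, features: np.ndarray) -> str:
    if any(sig.search(payload) for sig in SIGNATURES):
        return "BLOCK: signature match"              # prevention action
    if lof.predict(features.reshape(1, -1))[0] == -1:
        return "ALERT: anomalous behaviour"          # detection action
    return "pass"

print(inspect("GET /../../etc/passwd", np.zeros(4)))
print(inspect("GET /index.html", np.array([8.0, 8.0, 8.0, 8.0])))
```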
Procedia PDF Downloads 308
27405 An Analysis of Humanitarian Data Management of Polish Non-Governmental Organizations in Ukraine Since February 2022 and Its Relevance for Ukrainian Humanitarian Data Ecosystem
Authors: Renata Kurpiewska-Korbut
Abstract:
On the assumption that the use and sharing of data generated in humanitarian action constitute a core function of humanitarian organizations, the paper analyzes the position of the largest Polish humanitarian non-governmental organizations in the humanitarian data ecosystem in Ukraine and their approach to non-personal and personal data management since February 2022. The study draws on expert interviews and document analysis of non-profit organizations providing a direct response in the Ukrainian crisis context, i.e., Polish Humanitarian Action, Caritas, the Polish Medical Mission, the Polish Red Cross, and the Polish Center for International Aid, and applies the theoretical perspective of contingency theory, whose central point is that the context, or a specific set of conditions, determines the way of behavior and the choice of methods of action. This framework helps to examine the significance of data complexity and of an adaptive approach to data management by relief organizations in the humanitarian supply chain network. The purpose of this study is to determine how well-established and accurate internal procedures and good practices for using and sharing data (including safeguards for sensitive data) by the surveyed organizations, which have comparable human and technological capabilities, are implemented and adjusted to Ukrainian humanitarian settings and data infrastructure. The study also poses the fundamental question of whether this crisis experience will have a determining effect on their future performance. The findings obtained indicate that Polish humanitarian organizations in Ukraine, which have their own unique codes of conduct and effective managerial data practices determined by contingencies, have limited influence on improving the situational awareness of other assistance providers in the data ecosystem, despite their attempts to undertake interagency work in the area of data sharing.
Keywords: humanitarian data ecosystem, humanitarian data management, Polish NGOs, Ukraine
Procedia PDF Downloads 93
27404 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases
Authors: Daniel C. Bonzo
Abstract:
Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another estimand. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is increasingly being accepted by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, to a different but related indication, or to a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equal weights and weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
Keywords: clustered data, estimand, extrapolation, mixed model
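The two weighting schemes can be made concrete for a simple cluster-level mean. The sketch below is one elementary construction with a between-cluster variance interval, not necessarily the paper's mixed-model estimator:

```python
import numpy as np

def weighted_cluster_mean(clusters, weights):
    """Point estimate and rough 95% CI from per-subject clusters of observations."""
    means = np.array([np.mean(c) for c in clusters])
    w = weights / weights.sum()
    est = np.sum(w * means)
    # variance of a weighted mean of independent cluster means
    se = np.sqrt(np.sum(w**2 * (means - est) ** 2) * len(means) / (len(means) - 1))
    return est, (est - 1.96 * se, est + 1.96 * se)

rng = np.random.default_rng(4)
clusters = [rng.poisson(3.0, size=n) for n in (2, 3, 5, 8, 12)]  # episodes per subject
sizes = np.array([len(c) for c in clusters], dtype=float)

print("equal weights:", weighted_cluster_mean(clusters, np.ones(len(clusters))))
print("size-proportional:", weighted_cluster_mean(clusters, sizes))
```

Equal weights treat every subject alike regardless of follow-up; size-proportional weights favour subjects contributing more episodes, which matters when cluster sizes vary as widely as they do under extrapolation.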
Procedia PDF Downloads 139
27403 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System
Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu
Abstract:
Uninterrupted and continuous satellite communication throughout the whole orbit period is becoming more indispensable every day. Data relay systems are developed and built for various high/low data rate information exchanges, like the TDRSS of the USA and the EDRSS of Europe. In these missions, a couple of task-dedicated communication satellites exist. In this regard, a data relay system is proposed for Turkey for exchanging low data rate information (i.e., TTC) with Earth-observing LEO satellites by appointing commercial GEO communication satellites all over the world. First, a justification of this attempt is given, demonstrating the duration enhancements in the link, together with a discussion of the preference for RF communication instead of laser communication. Then, the preferred communication GEOs, including TURKSAT4A, which already belongs to Turkey, are given, together with the coverage enhancements obtained through STK simulations and the corresponding link budget. Also, a block diagram of the communication system on the LEO satellite is given.
Keywords: communication, GEO satellite, data relay system, coverage
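The duration and coverage enhancements rest on a standard RF link budget between the LEO satellite and the relaying GEO. In the usual decibel form (textbook relations, not figures from the paper):

```latex
\frac{C}{N_0} = \mathrm{EIRP} - \mathrm{FSPL} + \frac{G}{T} - 10\log_{10}k,
\qquad
\mathrm{FSPL} = 20\log_{10}d_{\mathrm{km}} + 20\log_{10}f_{\mathrm{GHz}} + 92.45\ \mathrm{dB}
```

where EIRP is the LEO transmit power plus antenna gain, G/T the GEO receiver figure of merit, and k Boltzmann's constant. The LEO-to-GEO slant range d of roughly 36,000-44,000 km, depending on geometry, is what keeps low-rate TTC links feasible while making high-rate links demanding.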
Procedia PDF Downloads 443
27402 Immunohistochemical Evaluation of Moringa oleifera Seed Oil in Cadmium Induced Frontal Cortex Damage in Wistar Rats
Authors: Olusegun D. Omotoso
Abstract:
The use of Moringa oleifera seed oil in the prevention and cure of many ailments, particularly neurodegenerative diseases, has been on an increasing trend in Nigeria. The study was aimed at investigating the ameliorative or reversal effects of intervention with Moringa oleifera seed oil on the damage done to the frontal cortex of Wistar rats by cadmium. Twenty-eight Wistar rats of both sexes, weighing between 73 g and 151 g, were used. The animals were acclimatized and fed on rat chow and water ad libitum. The rats were randomly divided into four groups A, B, C and D of 7 rats each. Group A served as control and received 2.5 mg/kgbw phosphate buffer intra-peritoneally, while group D served as Moringa-treated control and received an oral administration of 2.0 mg/kgbw Moringa oleifera oil. Groups B and C were injected intra-peritoneally with a single dose of 3.5 mg/kgbw CdSO₄.8H₂O. Group C additionally received an oral administration of 2.0 mg/kgbw Moringa oleifera oil. The intervention lasted for four weeks, after which the animals were sacrificed by cervical dislocation and the tissues processed histologically. The immuno-histoarchitecture of the frontal cortex in group B rats was characterized by pyknosis of nuclei as well as activation of astrocytes, while the animals in group C showed an ameliorative effect, evident in a reduction in the number of pyknotic nuclei and of activated astrocytes compared with control group A and Moringa-treated group D. It can be deduced that Moringa oleifera seed oil has natural antioxidant constituents that might have ameliorated the immuno-histoarchitectural damage caused by cadmium.
Keywords: cadmium, immuno-histoarchitecture, Moringa oleifera, pyknotic nuclei
Procedia PDF Downloads 212
27401 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product
Authors: Tanawat Hongthai, Dusit Thanapatay
Abstract:
This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag to study the feasibility of implementing genuine product authentication. We organize the research on encryption and genuine-product checking into four major categories: concept, infrastructure, development, and applications. The results show that a passive NFC-Forum Type 2 tag can be configured to be compatible with the NFC data exchange format (NDEF) and can be automatically, partially data-updated whenever an NFC field is present.
Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC
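A sketch of the encryption side: encrypt a product identifier with an authenticated cipher and wrap the ciphertext in an NDEF record under a custom MIME type. The short-record header below follows the NDEF layout in simplified form, and the key handling is deliberately naive; both are illustrative assumptions, not the paper's scheme:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def make_ndef_record(mime: bytes, payload: bytes) -> bytes:
    # Short NDEF record: MB|ME|SR flag bits set, TNF=0x02 (MIME media type)
    header = bytes([0b11010010, len(mime), len(payload)])
    return header + mime + payload

key = AESGCM.generate_key(bit_length=128)   # in practice provisioned per product line
nonce = os.urandom(12)
product_id = b"SN-2016-000451"              # hypothetical serial number

ciphertext = AESGCM(key).encrypt(nonce, product_id, None)
record = make_ndef_record(b"application/x-auth", nonce + ciphertext)
print(len(record), "bytes to write into the Type 2 tag")

# Verifier side: decryption succeeds (no exception) only for a genuine tag
assert AESGCM(key).decrypt(nonce, ciphertext, None) == product_id
```

AES-GCM gives both confidentiality and authentication, so a cloned tag with a tampered payload fails verification with an exception rather than yielding a plausible identifier.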
Procedia PDF Downloads 281
27400 Quantification of Effects of Structure-Soil-Structure Interactions on Urban Environment under Rayleigh Wave Loading
Authors: Neeraj Kumar, J. P. Narayan
Abstract:
The effects of multiple structure-soil-structure interactions (SSSI) on the seismic wave-field are generally disregarded by earthquake engineers, particularly those of the surface waves, which cause more damage to buildings. Closely built high-rise buildings exchange substantial seismic energy with each other and act as a fully coupled dynamic system. In this paper, the SSI effects on the building responses and the free-field motion due to a small city consisting of 25 homogeneous building blocks of 10 storeys are quantified. The rocking and translational behavior of a building under Rayleigh wave loading is studied for different building dimensions. The obtained dynamic parameters of the buildings revealed a reduction in building roof drift with an increase in the number of buildings ahead of the considered building. The strain developed by the vertical component of the Rayleigh wave may cause tension in the structural components of a building. A matching of the fundamental frequency of a building for the horizontal component of the Rayleigh wave with that for a vertically incident SV-wave is obtained. Further, the fundamental frequency of a building for vertical vibration is approximately twice that for horizontal vibration. The city insulation caused a reduction of the Rayleigh wave amplitude of up to 19.3% and 21.6% in the horizontal and vertical components, respectively, just outside the city. Further, the insulating effect of the city was very large at the fundamental frequency of the buildings for both the horizontal and vertical components. Therefore, it is recommended to consider the insulating effects of a city falling in the path of Rayleigh wave propagation in seismic hazard assessment for an area.
Keywords: structure-soil-structure interactions, Rayleigh wave propagation, finite difference simulation, dynamic response of buildings
Procedia PDF Downloads 220
27399 Data Hiding by Vector Quantization in Color Image
Authors: Yung Gi Wu
Abstract:
With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so that security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data. Embedding the watermark will certainly influence the quality. In this paper, vector quantization (VQ) is used to embed the watermark into the image to fulfill the goal of data hiding. This kind of watermarking is invisible, which means that users will not be conscious of the existence of the embedded watermark even though the embedded image has a tiny difference compared to the original image. Meanwhile, VQ imposes a heavy computational burden, so we adopt a fast VQ encoding scheme using partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks we hide in the image can be gray-level, bi-level, or color images, and text can also be regarded as a watermark to embed. In order to test the robustness of the system, we use Photoshop to perform sharpening, cropping, and alteration to check whether the extracted watermark is still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
Keywords: data hiding, vector quantization, watermark, color image
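The partial distortion search speed-up is simple to state: while accumulating the squared error against a codeword dimension by dimension, abandon that codeword as soon as the running sum exceeds the best distortion found so far. A sketch (plain Python for clarity; the mean approximation pre-filter mentioned above is omitted):

```python
import numpy as np

def pds_nearest_codeword(block, codebook):
    """Full-search-equivalent VQ encoding with partial distortion early exit."""
    best_idx, best_dist = 0, float("inf")
    for idx, codeword in enumerate(codebook):
        dist = 0.0
        for a, b in zip(block, codeword):
            dist += (a - b) ** 2
            if dist >= best_dist:        # partial sum already too large: abandon
                break
        else:
            best_idx, best_dist = idx, dist   # completed the loop: new best match
    return best_idx

rng = np.random.default_rng(5)
codebook = rng.integers(0, 256, size=(256, 16)).astype(float)  # 4x4 image blocks
block = rng.integers(0, 256, size=16).astype(float)
print("nearest codeword:", pds_nearest_codeword(block, codebook))
```

The result is identical to an exhaustive search, because a codeword is only discarded once its partial distortion provably exceeds the current minimum.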
Procedia PDF Downloads 367
27398 Device for Reversible Hydrogen Isotope Storage with Aluminum Oxide Ceramic Case
Authors: Igor P. Maximkin, Arkady A. Yukhimchuk, Victor V. Baluev, Igor L. Malkov, Rafael K. Musyaev, Damir T. Sitdikov, Alexey V. Buchirin, Vasily V. Tikhonov
Abstract:
Minimization of tritium diffusion leakage when developing devices handling tritium-containing media is a key problem whose solution will allow at least an essential enhancement of radiation safety and the minimization of diffusion losses of expensive tritium. One of the ways to solve this problem is to use high-strength non-porous Al₂O₃ ceramic as the structural material of the bed body. This alumina ceramic offers high strength characteristics, but its main advantages are low hydrogen permeability (as against the usual structural materials) and high dielectric properties. The latter enables direct induction heating of the hydride-forming metal without essential heating of the pressure and containment vessel. The use of alumina ceramic and induction heating allows: - essential reduction of tritium extraction time; - several orders of magnitude reduction of tritium diffusion leakage; - more complete extraction of tritium from metal hydrides due to higher heating, up to melting, in the event of final disposal of the device. The paper presents computational and experimental results for a tritium bed designed to absorb 6 liters of tritium, with titanium used as the hydrogen isotope sorbent. Results of hydrogen release kinetics from the hydride-forming metal, as well as strength and cyclic service life tests, are reported. Recommendations are also provided for the practical use of the given bed type.
Keywords: aluminum oxide ceramic, hydrogen pressure, hydrogen isotope storage, titanium hydride
Procedia PDF Downloads 409
27397 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The performance of the model is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses only one autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
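One per-sensor building block can be sketched as follows: an LSTM autoencoder trained to reconstruct normal windows, whose reconstruction-error signal then feeds the downstream random forest (PyTorch; the window length and layer sizes are placeholders, not the paper's configuration):

```python
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, hidden=32, window=60):
        super().__init__()
        self.window = window
        self.encoder = nn.LSTM(1, hidden, batch_first=True)
        self.decoder = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                       # x: (batch, window, 1)
        _, (h, _) = self.encoder(x)
        z = h[-1].unsqueeze(1).repeat(1, self.window, 1)  # repeat latent per step
        dec, _ = self.decoder(z)
        return self.out(dec)

model = LSTMAutoencoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
normal = torch.randn(256, 60, 1)                # windows of one sensor's normal data
for epoch in range(5):
    recon = model(normal)
    loss = nn.functional.mse_loss(recon, normal)
    opt.zero_grad(); loss.backward(); opt.step()

# Per-window reconstruction error: the difference-signal feature fed downstream
errors = (model(normal) - normal).pow(2).mean(dim=(1, 2))
```

One such model per sensor yields one error stream per sensor; concatenating these streams per time window gives the feature vector the random forest classifies.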
Procedia PDF Downloads 195
27396 Economic Valuation of Emissions from Mobile Sources in the Urban Environment of Bogotá
Authors: Dayron Camilo Bermudez Mendoza
Abstract:
Road transportation is a significant source of externalities, notably in terms of environmental degradation and the emission of pollutants. These emissions adversely affect public health, attributable to criteria pollutants like particulate matter (PM2.5 and PM10) and carbon monoxide (CO), and also contribute to climate change through the release of greenhouse gases such as carbon dioxide (CO2). It is, therefore, crucial to quantify the emissions from mobile sources and develop a methodological framework for their economic valuation, aiding in the assessment of associated costs and informing policy decisions. The forthcoming congress will shed light on the externalities of transportation in Bogotá, showcasing methodologies and findings from the construction of emission inventories and their spatial analysis within the city. This research focuses on the economic valuation of emissions from mobile sources in Bogotá, employing methods like hedonic pricing and contingent valuation. Conducted within the urban confines of Bogotá, the study leverages demographic, transportation, and emission data sourced from the Mobility Survey, official emission inventories, and tailored estimates and measurements. The use of hedonic pricing and contingent valuation methodologies facilitates the estimation of the influence of transportation emissions on real estate values and gauges the willingness of Bogotá's residents to pay for reducing these emissions. The findings are anticipated to be instrumental in the formulation and execution of public policies aimed at emission reduction and air quality enhancement. In compiling the emission inventory, innovative data sources were identified to determine activity factors, including information from automotive diagnostic centers and used vehicle sales websites. The COPERT model was utilized to ascertain emission factors, requiring diverse inputs such as data from the national transit registry (RUNT), OpenStreetMap road network details, climatological data from the IDEAM portal, and the Google API for speed analysis. Spatial disaggregation employed GIS tools and publicly available official spatial data. The development of the valuation methodology involved an exhaustive systematic review, using platforms like the EVRI (Environmental Valuation Reference Inventory) portal and other relevant sources. The contingent valuation method was implemented via referendum-style surveys of a sample of 400 residents in various public settings across the city. For the hedonic price valuation, an extensive database was developed, integrating data from several official sources and basing the analyses on the per-square-meter property values in each city block. The upcoming conference presentation and publication of these results embody a multidisciplinary integration of knowledge, culminating in a master's thesis.
Keywords: economic valuation, transport economics, pollutant emissions, urban transportation, sustainable mobility
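The hedonic-pricing step reduces to regressing (log) property value on pollution exposure plus controls, with the PM2.5 coefficient pricing the emissions. A sketch with statsmodels on synthetic block-level data (all variables and magnitudes are illustrative, not the Bogotá database):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n = 500
pm25 = rng.uniform(10, 45, n)                   # ug/m3 at the block
rooms = rng.integers(1, 6, n).astype(float)
dist_cbd = rng.uniform(1, 20, n)                # km to the city center
# data-generating assumption: 0.6% price loss per ug/m3 of PM2.5
log_price = (14 + 0.08 * rooms - 0.02 * dist_cbd - 0.006 * pm25
             + 0.1 * rng.standard_normal(n))

X = sm.add_constant(np.column_stack([pm25, rooms, dist_cbd]))
fit = sm.OLS(log_price, X).fit(cov_type="HC1")
print(fit.params[1])   # implicit price of air quality: % price change per ug/m3
```

The recovered PM2.5 coefficient is the marginal willingness to pay for cleaner air capitalized in housing, which is what the valuation framework monetizes against emission reductions.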
Procedia PDF Downloads 60
27395 A Study on the Effect of COD to Sulphate Ratio on Performance of Lab Scale Upflow Anaerobic Sludge Blanket Reactor
Authors: Neeraj Sahu, Ahmad Saadiq
Abstract:
Anaerobic sulphate reduction has the potential to be effective and economically viable compared with conventional treatment methods for the treatment of sulphate-rich wastewater. However, a major challenge in anaerobic sulphate reduction is the diversion of a fraction of the organic carbon towards methane production, along with some minor problems such as odour, corrosion, and an increase in effluent chemical oxygen demand. High-rate anaerobic technology has encouraged researchers to extend its application to the treatment of complex wastewaters at relatively low cost and energy consumption compared to physicochemical methods. Therefore, the aim of this study was to investigate the effects of the COD/SO₄²⁻ ratio on the performance of a lab-scale UASB reactor. A lab-scale upflow anaerobic sludge blanket (UASB) reactor was operated for 170 days: the first 60 days for successful start-up with acclimation under methanogenesis and sulphidogenesis at a COD/SO₄²⁻ ratio of 18, after which it was operated at COD/SO₄²⁻ ratios of 12, 8, 4 and 1 to evaluate the effects of the presence of sulphate on the reactor performance. The reactor achieved its maximum COD removal efficiency and biogas evolution at the end of acclimation (control). This phase lasted 53 days with 89.5% efficiency, and the biogas production was 0.6 L/d at an organic loading rate (OLR) of 1.0 kg COD/m³d when treating synthetic wastewater, with an effective reactor volume of 2.8 L. When the COD/SO₄²⁻ ratio changed from 12 to 1, a slight decrease in COD removal efficiencies (76.8-87.4%) was observed and biogas production decreased from 0.58 to 0.32 L/d, while the sulphate removal efficiency increased from 42.5% to 72.7%.
Keywords: anaerobic, chemical oxygen demand, organic loading rate, sulphate, up-flow anaerobic sludge blanket reactor
Procedia PDF Downloads 219
27394 Conflicts and Epidemiology of HIV/AIDS: Gender Dimension in Rain Forest Zone of Nigeria
Authors: K. K. Bolarinwa, A. F. O. Ayinde, B. B. Abiona, O. Oyekunle
Abstract:
Conflict and HIV/AIDS infection have had a profound impact on Sub-Saharan African societies, individually and collectively. Nigeria has been experiencing several violent conflicts in many communities across the geographical spread of the country. These conflicts, which often lead to loss of lives and properties and loss of livelihoods, are mainly felt by women in terms of increased responsibility towards affected family members, with an attendant decrease in livelihood options. Despite these, conflict issues have not really received enough focal attention from Nigerian academics. It is against this backdrop that this study was undertaken to describe the respondents, the most prevalent conflict repercussions, and the most prevalent STDs in conflict areas. Data were collected using an interview schedule to elicit responses from 122 respondents in Southwest Nigeria through a multi-stage sampling technique involving stratification of respondents into violent conflict areas (VCA) and non-violent conflict areas (NVCA). The data collected were analysed using descriptive statistics and correlation analysis. Results revealed that the majority (86.5% and 70.5%) of the respondents were in the age bracket of 10-39 years in the VCA and NVCA, respectively; 35.5% and 40.2% of the respondents were literate in the VCA and NVCA, respectively; while 76.5% and 55.8% of the respondents were in the lower income groups in the VCA and NVCA, respectively. HIV/AIDS and gonorrhoea were the more predominant STDs (75.2% and 55.6%, respectively) in the VCA, as against 33.2% and 38.3%, respectively, in the NVCA. Further, significant (p<0.05) correlations existed between conflict incidence and the spread of HIV/AIDS, rape and torture, maltreatment of women, and sexual harassment, among others, in both the VCA and NVCA. The study concluded that conflict situations in the study area aggravated the incidence of HIV/AIDS and made women more vulnerable to inhuman treatment such as rape, torture, and harassment, with an attendant reduction in sources of livelihood. The study recommended, among others, that sensitisation on control and preventive measures for HIV/AIDS and other sexually transmitted diseases should be included in programmes designed to mitigate conflicts in the study areas.
Keywords: conflict, gender dimension, HIV/AIDS epidemiology, Nigeria
Procedia PDF Downloads 260
27393 Application of Agile Project Management to Construction Projects: Case Study
Authors: Ran Etgar, Sarit Freund
Abstract:
Agile project management (APM) was originally developed for software development projects. Construction projects have seemed better suited to the traditional waterfall approach than to APM. However, construction projects suffer from problems similar to those that necessitated the invention of APM, mainly the need to break the project structure down into small increments, thus minimizing the needed managerial planning and design. Since the classical structure of APM is not applicable as-is to construction projects, a modified version of APM was devised. This method, nicknamed 'the anchor method', exploits the fundamentals of APM (i.e., iterations, or sprints, of short time frames or timeboxes, cross-functional teams, risk reduction, and adaptation to changes) and adjusts them to the construction world. The projects had to be structured appropriately to adapt to change proactively and quickly. The method aims to encompass human behavior and leans towards adaptivity rather than predictability. To enable smooth application of the method, special project management software was developed to provide solid administrative help and accurate data. The method has been tested on a number of construction projects, and some key performance indicators (KPIs) have been collected. According to preliminary results, the method is indeed advantageous and, with proper assimilation, can radically change the construction project management paradigm.
Keywords: agile project management, construction, information systems, project management
Procedia PDF Downloads 132