Search results for: data source
27150 Energy and Economic Analysis of Heat Recovery from Boiler Exhaust Flue Gas
Authors: Kemal Comakli, Meryem Terhan
Abstract:
In this study, the potential for heat recovery from waste flue gas was examined in the 60 MW district heating system of a university, with the aim of saving fuel by returning the recovered heat to the system. Various scenarios for making use of the waste heat were considered. For this purpose, actual operating data of the system were taken. In addition, heat recovery units consisting of heat exchangers such as flue gas condensers, economizers or air pre-heaters were designed theoretically for each scenario. An energy analysis of the natural gas-fired boiler's exhaust flue gas in the system, and an economic analysis of the heat recovery units to predict payback periods, were performed. According to the calculation results, the waste heat loss from the boiler flue gas averaged 16%. With the heat recovery units, the thermal efficiency of the system can be increased and fuel savings achieved. At the same time, a large amount of greenhouse gas emissions can be avoided by installing the heat recovery units.
Keywords: heat recovery from flue gas, energy analysis of flue gas, economic analysis, payback period
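The payback prediction described above reduces to simple arithmetic once the investment cost and the annual fuel savings of a heat recovery unit are known. A minimal sketch with invented figures (none of these numbers come from the study):

```python
def simple_payback_years(capital_cost, annual_savings):
    """Simple (undiscounted) payback period: investment divided by yearly savings."""
    return capital_cost / annual_savings

# Hypothetical example: a flue gas condenser costing 250,000 (any currency)
# that recovers heat worth 100,000 per year in avoided fuel purchases.
print(simple_payback_years(250_000, 100_000))  # 2.5 (years)
```

A discounted variant would divide each year's savings by (1 + r)^t before accumulating them, which lengthens the predicted payback accordingly.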
Procedia PDF Downloads 288
27149 Image Processing-Based Maize Disease Detection Using Mobile Application
Authors: Nathenal Thomas
Abstract:
In the food chain and in many other agricultural products, corn, also known as maize (scientific name Zea mays subsp. mays), is a widely produced agricultural product. Corn is highly adaptable: it comes in many different types, is employed in many different industrial processes, and tolerates a wide range of agro-climatic conditions. In Ethiopia, maize is among the most widely grown crops, and in developing nations like Ethiopia, small-scale corn farming may be a household's only source of food. These facts show that the country's demand for this crop is very high while, conversely, the crop's productivity is low for a variety of reasons. The most damaging factor contributing to this imbalance between the crop's supply and demand is corn disease, and the failure to diagnose diseases in maize plants until it is too late is one of the most important factors limiting crop output in Ethiopia. This study aids the early detection of such diseases and supports farmers during cultivation, directly affecting the amount of maize produced. Diseases of maize plants such as northern leaf blight and cercospora leaf spot have distinct, visible symptoms. This study aims to detect the most frequent and damaging maize diseases using deep learning, an efficient and widely used subset of machine learning, applied to image processing. Deep learning uses networks that can be trained from unlabeled data without supervision, simulating the processes the human brain goes through when digesting data; its applications include speech recognition, language translation, object classification, and decision-making. The Convolutional Neural Network (CNN), also known as a ConvNet, is a deep learning architecture widely used for image classification, object detection, face recognition, and related problems. This study uses a CNN to detect maize diseases from photographs of maize leaves taken with a mobile phone.
Keywords: CNN, Zea mays subsp., leaf blight, cercospora leaf spot
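The convolution at the heart of the CNN named above can be illustrated in pure Python. This is a toy of the sliding-window operation only, not the study's model; the "leaf image" and kernel values are invented:

```python
def convolve2d_valid(image, kernel):
    """'Valid' 2D convolution (cross-correlation, as in most CNN libraries)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            row.append(sum(image[i + di][j + dj] * kernel[di][dj]
                           for di in range(kh) for dj in range(kw)))
        out.append(row)
    return out

# A vertical-edge kernel responds strongly at the boundary in this tiny
# 3x4 toy "image"; a trained CNN stacks many such learned kernels.
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]
kernel = [[1, -1],
          [1, -1]]
print(convolve2d_valid(image, kernel))  # [[0, -18, 0], [0, -18, 0]]
```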
Procedia PDF Downloads 74
27148 Local Development and Community Participation in Owo Local Government Area of Ondo State, Nigeria
Authors: Tolu Lawal
Abstract:
The genuine development of the grassroots, particularly in developing societies, depends largely on the participation of the rural populace in policy conception and implementation, especially in the area of development policies. Fundamentally, the rural people play a vital and significant role in the economic and political development of the nation, because the bulk of economic produce as well as votes come from these areas. However, the much-needed development has continued to elude the rural communities in spite of the various development policies carried out by successive governments in the state. The exclusion of rural communities from the planning and implementation of facilities meant to benefit them, and the international debate on sustainable rural development, led the Ondo State government to rethink its rural development policy with a view to establishing more effective strategies for rural development. The 31s initiative introduced in 2009 emphasizes the important role of communities in their own development. The paper therefore critically assessed the 31s initiative of the present government in Ondo State with a view to establishing its impact on rural people. The study adopted both primary and secondary data. Interviews were conducted with key informants, and a field survey (visits) was also part of the data collection. Documents, reports and records on the 31s initiative in the selected villages and beyond were also consulted. The paper submits that the 31s initiative has not impacted positively on the lives of rural dwellers in Ondo State, most especially in the areas of infrastructure and integrated development. The findings also suggest that the 31s initiative is not hopeless but needs a different kind of investment, for example introducing measures of accountability, addressing the politicization of the initiative, and exploiting key principles of development and service delivery.
Keywords: development, infrastructure, rural development, participation
Procedia PDF Downloads 306
27147 Dry-Extrusion of Asian Carp, a Sustainable Source of Natural Methionine for Organic Poultry Production
Authors: I. Upadhyaya, K. Arsi, A. M. Donoghue, C. N. Coon, M. Schlumbohm, M. N. Riaz, M. B. Farnell, A. Upadhyay, A. J. Davis, D. J. Donoghue
Abstract:
Methionine, a sulfur-containing amino acid, is essential for healthy poultry production. Synthetic methionine is commonly used as a supplement in conventional poultry, but for organic poultry a natural, cost-effective source of methionine that can replace synthetic methionine is unavailable. Invasive Asian carp (AC) are a potential natural methionine source; however, there is no proven technology to utilize this fish methionine. Commercially available rendering is environmentally challenging due to the offensive smell produced during production. We explored extrusion technology as a potentially cost-effective alternative to fish rendering. We also determined the amino acid composition, digestible amino acids and true metabolizable energy (TMEn) of the extruded AC fish meal. Dry extrusion of AC was carried out by mixing the fish with soybean meal (SBM) in a 1:1 proportion to reduce the high moisture content of the fishmeal, using an Insta Pro Jr. dry extruder, followed by drying and grinding of the product. To determine the digestible amino acids and TMEn of the extruded product, a colony of cecectomized Bovans White roosters was used. Adult roosters (48 weeks of age) were fasted for 30 h and tube-fed 35 g of one of three treatments: (1) extruded AC fish meal, (2) SBM and (3) corn. Excreta from each individual bird were collected for the next 48 h. An additional 10 unfed roosters served as endogenous controls. The gross energy and protein content of the excreta were determined to calculate the TMEn, and excreta samples and treatment feeds were analyzed for amino acid content and percent digestible amino acids. The results suggested that the addition of Asian carp increased the methionine content of SBM from 0.63 to 0.83%. The digestibility of amino acids and the TMEn values were also greater for the AC meal with SBM than for SBM alone. The analysis of the dry-extruded AC meal indicates that the product can replace SBM alone and enhance natural methionine in a standard poultry ration. The results of feed formulation using different concentrations of the AC fish meal suggest a diet that can supply the required methionine content in organic poultry production.
Keywords: Asian carp, extrusion, natural methionine, organic poultry
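The TMEn determination described above credits the endogenous energy losses of the unfed control birds back against the fed birds' excreta energy, in the style of a Sibbald assay. A sketch of that arithmetic with invented numbers (the nitrogen correction is omitted for brevity, so this is an assumption-laden simplification, not the study's exact calculation):

```python
def tme_kcal_per_g(feed_g, ge_feed_kcal_per_g, excreta_kcal, endogenous_kcal):
    """True metabolizable energy per gram of feed (simplified Sibbald assay).

    Energy excreted by unfed control birds (endogenous losses) is credited
    back against the energy excreted by the tube-fed birds.
    """
    ge_intake = feed_g * ge_feed_kcal_per_g
    return (ge_intake - (excreta_kcal - endogenous_kcal)) / feed_g

# Invented values: 35 g of feed at 4.0 kcal/g gross energy, 30 kcal excreted
# by a fed bird over 48 h, 10 kcal excreted by an unfed control.
print(round(tme_kcal_per_g(35, 4.0, 30, 10), 3))  # 3.429
```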
Procedia PDF Downloads 217
27146 A Modular Framework for Enabling Analysis for Educators with Different Levels of Data Mining Skills
Authors: Kyle De Freitas, Margaret Bernard
Abstract:
Enabling data mining analysis among a wider audience of educators is an active area of research within the educational data mining (EDM) community. This paper proposes a framework for developing an environment that caters both for educators who have little technical data mining skill and for more advanced users with some data mining expertise. The framework architecture was developed through a review of the strengths and weaknesses of existing models in the literature. The proposed framework provides a modular architecture that lets future researchers focus on the development of specific areas within the EDM process. Finally, the paper highlights a strategy of enabling analysis through either predefined questions or a guided data mining process, and shows how the questions developed and the analyses conducted can be reused and extended over time.
Keywords: educational data mining, learning management system, learning analytics, EDM framework
Procedia PDF Downloads 326
27145 Establishing a Microbial Co-Culture for Production of Cellulases Using Banana (Musa Paradisiaca) Pseudostem
Authors: Mulanga Luscious Mulaudzi, Ignatious Ncube
Abstract:
In nature, enzymatic degradation of lignocellulose is more efficient than in in vivo bioprocessing. Thus, a co-culture should enable the production of more efficient enzyme preparations that mimic the natural decomposition of lignocellulose. The aim of the study was to establish a microbial co-culture for the production of highly active cellulase preparations. The objectives were to use a variety of culture media to isolate cellulose-degrading microorganisms from decomposing banana pseudostem and to optimize the production of cellulase by co-cultures of microorganisms producing high levels of cellulase. Screening of fungal isolates was done on carboxymethylcellulose agar plates stained with Congo red to reveal the hydrolytic activity of the isolates. Co-cultures and mixed cultures of these microorganisms were grown in Mandels' salts with Avicel as the carbon source. Cultures were incubated at 30 °C with shaking at 200 rpm for 240 h. Enzyme activity assays were performed to determine endoglucanase and β-glucosidase activities. A mixed culture of fungi and dead bacterial cells proved to be the best co-culture/mixed culture for producing high levels of cellulase activity in submerged fermentation (SmF) using Avicel™ as a carbon source. The study concludes that the use of microorganism 5A in co-cultures is highly recommended in order to produce high amounts of β-glucosidases, no matter the combination used.
Keywords: Avicel, co-culture, submerged fermentation, pseudostem
Procedia PDF Downloads 124
27144 Using Audit Tools to Maintain Data Quality for ACC/NCDR PCI Registry Abstraction
Authors: Vikrum Malhotra, Manpreet Kaur, Ayesha Ghotto
Abstract:
Background: Cardiac registries such as the ACC Percutaneous Coronary Intervention (PCI) Registry require high-quality abstracted data, including data elements such as nuclear cardiology, diagnostic coronary angiography, and PCI. Introduction: The audit tool created is used by data abstractors to perform data audits and assess the accuracy and inter-rater reliability of abstraction performed by the abstractors for a health system. This audit tool solution has been developed across 13 registries, including the ACC/NCDR registries, PCI, STS, and Get With The Guidelines. Methodology: The data audit tool was used to audit internal registry abstraction for all data elements, including stress test performed, type of stress test, date of stress test, results of stress test, risk/extent of ischemia, diagnostic catheterization detail, and PCI data elements for the ACC/NCDR PCI registries. It is used across 20 hospital systems internally, providing abstraction and audit services for them. Results: The data audit tool showed data accuracy and inter-rater reliability (IRR) scores greater than 95% for 50 PCI registry cases in 2021. Conclusion: The tool is used internally for surgical societies and across hospital systems. The audit tool enables an abstractor to be assessed by an external abstractor and includes all of the data dictionary fields for each registry.
Keywords: abstraction, cardiac registry, cardiovascular registry, registry, data
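Inter-rater reliability of the kind the audit tool reports can be computed as raw percent agreement, optionally chance-corrected with Cohen's kappa. A minimal sketch; the abstracted field values below are invented, and the abstract does not state which IRR statistic the tool actually uses:

```python
def percent_agreement(primary, auditor):
    """Share of abstracted data elements on which the two abstractors agree."""
    return sum(a == b for a, b in zip(primary, auditor)) / len(primary)

def cohens_kappa(primary, auditor):
    """Chance-corrected agreement between two raters over categorical fields."""
    n = len(primary)
    po = percent_agreement(primary, auditor)
    cats = set(primary) | set(auditor)
    # Expected agreement if both raters labelled independently at their base rates.
    pe = sum((primary.count(c) / n) * (auditor.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# Invented audit of one registry element across 10 cases.
abstractor = ["yes"] * 7 + ["no"] * 3
auditor = ["yes"] * 8 + ["no"] * 2
print(percent_agreement(abstractor, auditor), round(cohens_kappa(abstractor, auditor), 3))
```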
Procedia PDF Downloads 105
27143 Artificial Intelligence Based Comparative Analysis for Supplier Selection in Multi-Echelon Automotive Supply Chains via GEP and ANN Models
Authors: Seyed Esmail Seyedi Bariran, Laysheng Ewe, Amy Ling
Abstract:
Since supplier selection is a vital decision, selecting suppliers in the most accurate way possible is of great importance to enterprises. In this study, a new artificial intelligence approach is applied to address the weaknesses of supplier selection. The paper has three parts. The first is choosing appropriate criteria for assessing the suppliers' performance. The second is collecting a data set based on expert judgment. The data set is then divided into two parts, a training set and a testing set. The training set is used to select the best structures of the gene expression programming (GEP) and artificial neural network (ANN) models, and the testing set is used to evaluate the power of these methods. The results obtained show that the accuracy of GEP is higher than that of ANN. Moreover, unlike ANN, GEP provides an explicit mathematical equation for supplier selection.
Keywords: supplier selection, automotive supply chains, ANN, GEP
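The train/test protocol described above — fit candidate models on part of the expert data set, then compare them on the held-out remainder — can be sketched as follows. The split ratio, seed, and toy models are illustrative stand-ins, not the study's actual GEP or ANN:

```python
import random

def train_test_split(rows, test_fraction=0.3, seed=42):
    """Shuffle records and hold out a fraction of them for testing."""
    rows = rows[:]
    random.Random(seed).shuffle(rows)
    n_test = round(len(rows) * test_fraction)
    return rows[:-n_test], rows[-n_test:]

def mean_abs_error(model, rows):
    """Average absolute error of a fitted model on held-out (x, y) pairs."""
    return sum(abs(model(x) - y) for x, y in rows) / len(rows)

# Toy data: (criteria score, expert supplier rating) pairs with a known rule.
data = [(x, 2 * x + 1) for x in range(10)]
train_rows, test_rows = train_test_split(data)
perfect = lambda x: 2 * x + 1   # stands in for the better fitted model
biased = lambda x: 2 * x        # stands in for a worse candidate
print(mean_abs_error(perfect, test_rows), mean_abs_error(biased, test_rows))  # 0.0 1.0
```

Comparing candidates only on the held-out rows is what makes the accuracy comparison between the two model families honest.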
Procedia PDF Downloads 631
27142 Non-Invasive Imaging of Human Tissue Using NIR Light
Authors: Ashwani Kumar
Abstract:
Using NIR light to image biological tissue and quantify its optical properties is a good alternative to invasive methods. Optical tomography involves two steps. One is the forward problem and the other is the reconstruction problem. The forward problem consists of finding the measurements of light transmitted through the tissue from source to detector, given the spatial distribution of absorption and scattering properties. The second step is the reconstruction problem. In X-ray tomography, there are standard methods for reconstruction, namely the filtered back projection method and the algebraic reconstruction methods. These methods cannot be applied as such in optical tomography, due to the highly scattering nature of biological tissue. A hybrid reconstruction algorithm has been implemented in this work that takes into account the highly scattered paths taken by photons when back-projecting the forward data obtained during Monte Carlo simulation. The reconstructed image suffers from blurring due to the point spread function.
Keywords: NIR light, tissue, blurring, Monte Carlo simulation
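The blurring attributed above to the point spread function already appears in the simplest (unfiltered) back projection, sketched here on a toy 2x2 grid with only row and column projections (a didactic illustration, not the hybrid algorithm of the study):

```python
def back_project(row_sums, col_sums):
    """Smear each projection value back along its line and sum the smears."""
    return [[r + c for c in col_sums] for r in row_sums]

# A single bright point at (0, 0)...
image = [[1, 0],
         [0, 0]]
row_sums = [sum(row) for row in image]                              # [1, 0]
col_sums = [sum(image[i][j] for i in range(2)) for j in range(2)]   # [1, 0]

# ...back-projects to a blurred cross rather than a point. This is why X-ray
# CT filters the projections first, and why scattered photon paths make the
# optical case harder still.
print(back_project(row_sums, col_sums))  # [[2, 1], [1, 0]]
```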
Procedia PDF Downloads 494
27141 Crowdfunding for Saudi Arabia Green Projects
Authors: Saleh Komies, Mona Alharbi, Razan Alhayyani, Mozah Almulhim, Roseanne Khawaja, Ahmed Alradhi
Abstract:
Encouraging sustainable energy consumption across Saudi Arabia through crowdfunding platforms is a proposed solution that faces several challenges. To address these challenges, we need to determine the level of awareness of crowdfunding and green projects, as well as the preferences and willingness of Saudis to utilize crowdfunding as an alternative funding source for green projects in Saudi Arabia. In this study, we aim to determine the influence of environmental awareness and concern on the propensity to crowdfund green projects. The survey is being conducted as part of environmental initiatives to assess public perceptions and opinions on crowdfunding green projects in Saudi Arabia. A total of 450 responses to an online questionnaire distributed via convenience and snowball sampling were utilized for data analysis. The survey reveals that Saudis have a low understanding of crowdfunding concepts and a relatively high understanding of implementing green projects. The public is interested in crowdfunding green projects if there is a return on investment.
Keywords: crowdfunding, green projects, awareness, Saudi Arabia, energy, solar, wind
Procedia PDF Downloads 99
27140 Algorithmic Obligations: Proactive Liability for AI-Generated Content and Copyright Compliance
Authors: Aleksandra Czubek
Abstract:
As AI systems increasingly shape content creation, existing copyright frameworks face significant challenges in determining liability for AI-generated outputs. Current legal discussions largely focus on who bears responsibility for infringing works, be it developers, users, or entities benefiting from AI outputs. This paper introduces a novel concept of algorithmic obligations, proposing that AI developers be subject to proactive duties that ensure their models prevent copyright infringement before it occurs. Building on principles of obligations law traditionally applied to human actors, the paper suggests a shift from reactive enforcement to proactive legal requirements. AI developers would be legally mandated to incorporate copyright-aware mechanisms within their systems, turning optional safeguards into enforceable standards. These obligations could vary in implementation across international, EU, UK, and U.S. legal frameworks, creating a multi-jurisdictional approach to copyright compliance. This paper explores how the EU’s existing copyright framework, exemplified by the Copyright Directive (2019/790), could evolve to impose a duty of foresight on AI developers, compelling them to embed mechanisms that prevent infringing outputs. By drawing parallels to GDPR’s “data protection by design,” a similar principle could be applied to copyright law, where AI models are designed to minimize copyright risks. In the UK, post-Brexit text and data mining exemptions are seen as pro-innovation but pose risks to copyright protections. This paper proposes a balanced approach, introducing algorithmic obligations to complement these exemptions. AI systems benefiting from text and data mining provisions should integrate safeguards that flag potential copyright violations in real time, ensuring both innovation and protection. In the U.S., where copyright law focuses on human-centric works, this paper suggests an evolution toward algorithmic due diligence. 
AI developers would have a duty similar to product liability, ensuring that their systems do not produce infringing outputs, even if the outputs themselves cannot be copyrighted. This framework introduces a shift from post-infringement remedies to preventive legal structures, where developers actively mitigate risks. The paper also breaks new ground by addressing obligations surrounding the training data of large language models (LLMs). Currently, training data is often treated under exceptions such as the EU’s text and data mining provisions or U.S. fair use. However, this paper proposes a proactive framework where developers are obligated to verify and document the legal status of their training data, ensuring it is licensed or otherwise cleared for use. In conclusion, this paper advocates for an obligations-centered model that shifts AI-related copyright law from reactive litigation to proactive design. By holding AI developers to a heightened standard of care, this approach aims to prevent infringement at its source, addressing both the outputs of AI systems and the training processes that underlie them.
Keywords: IP, technology, copyright, data, infringement, comparative analysis
Procedia PDF Downloads 18
27139 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of a 180-Degree Interpolation Method
Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito
Abstract:
In general, dynamic SPECT data acquisition needs a few minutes for one rotation, so the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. In order to effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used for the reconstruction of X-ray CT data. In this study, we applied this 180-degree interpolation method to SPECT and investigated its effectiveness. To briefly describe the method: the 180-degree data from the second half of one rotation are combined with the 180-degree data from the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the two rotations. In both a phantom study and a patient study, the data points from the interpolated images were in good agreement with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA
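The interpolation scheme described above can be sketched directly: represent each rotation as a list of projection views, then assemble an intermediate 360-degree set from adjacent half-rotations (the list-of-views representation is illustrative only):

```python
def interpolate_rotations(rotations):
    """Insert an intermediate 360-degree data set between consecutive rotations.

    Each rotation is a list of projection views. The second half of rotation k
    is combined with the first half of rotation k+1, giving a data set valid
    for the time halfway between the two acquisitions.
    """
    out = []
    for current, following in zip(rotations, rotations[1:]):
        half = len(current) // 2
        out.append(current)
        out.append(current[half:] + following[:half])
    out.append(rotations[-1])
    return out

# Two toy rotations of four views each: the interpolated set reuses the last
# two views of the first rotation and the first two views of the second,
# doubling the temporal sampling of the time-activity curve.
r1 = ["a0", "a1", "a2", "a3"]
r2 = ["b0", "b1", "b2", "b3"]
print(interpolate_rotations([r1, r2]))
# [['a0', 'a1', 'a2', 'a3'], ['a2', 'a3', 'b0', 'b1'], ['b0', 'b1', 'b2', 'b3']]
```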
Procedia PDF Downloads 493
27138 Demographic Factors Influence on Awareness of Islamic Financing among Micro, Small and Medium Enterprises Entrepreneurs in the North East Region of Nigeria
Authors: Bashir Ahmad, Daneji, Hamidu Aminu, Ahmad, Aliyu Mukhtar, Daneji, Haruna Mohammed
Abstract:
It has been established and universally agreed that vibrant micro, small and medium enterprises (MSMEs) play significant roles in economic growth and development. In Nigeria, MSMEs are not playing the expected roles, and notable among the plethora of reasons is the lack of prompt and sufficient finance. Government and other stakeholders have attempted in several ways and at different times to provide the required finance to MSMEs, but the results were not encouraging and, consequently, many failed. In the recent past, Islamic financing emerged worldwide as a promising alternative source of financing; however, awareness of it among MSME entrepreneurs in the north east region of Nigeria remains questionable. This study explored the demographic factors influencing awareness of Islamic financing among MSME entrepreneurs in the north east region of Nigeria. The primary data used in this study were collected through a questionnaire. In analyzing the collected data, the study used frequencies, percentages, Pearson correlation, ANOVA and a test of homogeneity of variance (Levene's test), generated from SPSS (version 15). The findings of the study revealed that entrepreneurs' age, state of origin, religion and educational level influence their awareness of Islamic financing in the north east region of Nigeria. The study recommended that Islamic financing institutions, government and relevant agencies should do more to enhance the awareness of Islamic financing among MSME entrepreneurs in the north east region of Nigeria.
Keywords: awareness, demographic factors, entrepreneurs, Islamic financing
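The homogeneity-of-variance check cited above (Levene's test) is a one-way ANOVA performed on absolute deviations from each group's mean. A minimal sketch with invented awareness scores (in practice the statistic would come straight from SPSS, as in the study):

```python
def levene_w(groups):
    """Levene's W: a one-way ANOVA F statistic computed on the values
    z_ij = |x_ij - mean(group i)| instead of the raw observations."""
    z = [[abs(x - sum(g) / len(g)) for x in g] for g in groups]
    n = sum(len(g) for g in z)
    k = len(z)
    grand = sum(v for g in z for v in g) / n
    means = [sum(g) / len(g) for g in z]
    between = sum(len(g) * (m - grand) ** 2 for g, m in zip(z, means))
    within = sum((v - m) ** 2 for g, m in zip(z, means) for v in g)
    return ((n - k) / (k - 1)) * between / within

# Invented awareness scores for two age groups with visibly different spread;
# a large W (compared against an F distribution) flags unequal variances.
young = [2, 4, 6, 8]
old = [5, 5, 5, 6]
print(round(levene_w([young, old]), 3))  # 7.567
```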
Procedia PDF Downloads 366
27137 Sound Noise Control of a Steam Ejector in a Typical Power Plant: Design, Manufacturing, and Testing a Silencer-Muffler
Authors: Ali Siami, Masoud Asayesh, Asghar Najafi, Amirhosein Hamedanian
Abstract:
Power generation units contain many noise sources that can produce high-level sound noise. Noise reduction methods can therefore assist these industries, especially now that laws related to environmental issues are becoming stricter. In a typical power plant, many machines and devices with high noise levels are arranged beside each other, so identifying the sound sources and reducing the noise level is vital. In this paper, the procedure for designing, manufacturing and testing a silencer-muffler used on a power plant steam vent is described. The unit is placed near a residential area, so reducing the noise emission is very important. For this purpose, in the first step, measurements were taken to identify the sound source and the frequency content of the noise. The overall noise level was very high, at more than 120 dB. Then, an appropriate noise control device was designed according to the measurement results and operational conditions. In the next step, the designed silencer-muffler was manufactured and installed on the steam discharge of the ejector. To validate the effect of the silencer-muffler, the acoustic test was repeated in operating mode. Finally, the measurement results before and after the installation are compared. The results confirm a considerable reduction in noise level resulting from the use of the silencer-muffler in the designed frequency range.
Keywords: silencer-muffler, sound noise control, sound measurement, steam ejector
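Because sound pressure levels are logarithmic, levels like the 120 dB cited above combine on an energy basis rather than adding directly, and the before/after comparison is a simple difference. A sketch of the standard formulas (the example levels are invented, not the study's measurements):

```python
import math

def combine_levels(levels_db):
    """Total SPL of incoherent sources: 10 * log10(sum of 10^(L/10))."""
    return 10 * math.log10(sum(10 ** (level / 10) for level in levels_db))

def insertion_loss(before_db, after_db):
    """Noise reduction achieved by a silencer-muffler, in dB."""
    return before_db - after_db

# Two equal 120 dB sources combine to about 123 dB, not 240 dB.
print(round(combine_levels([120, 120]), 1))  # 123.0
print(insertion_loss(123.0, 85.0))           # 38.0
```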
Procedia PDF Downloads 384
27136 Urban Spatial Metamorphoses: The Case of Kazan City Using GIS Technologies
Authors: Irna Malganova
Abstract:
The paper assessed the effectiveness of urban functional zoning using the method of M.A. Kreimer, taking the city of Kazan (Republic of Tatarstan, Russian Federation) as an example and using geoinformation technologies. On the basis of the data obtained, calculations were carried out to derive population density, the degree to which geographic determinism is overcome, and the effectiveness of the formation of urban frameworks. The authors proposed recommendations for the effectiveness of municipal frameworks in the period from 2018 to 2021: economic, social and environmental. Studying effective territorial planning over a given period makes it possible to display the dynamics of planning changes and to assess changes in the formation of urban frameworks. Based on the input data obtained from the master plan of the municipal formation of Kazan, in the period from 2018 to 2021 the population increased by 13,841 people, or 1.1% of the 2018 value. In addition, the area of Kazan increased by 2,419.6 hectares. In the distribution of functional zone areas, the residential and public-purpose zones of the municipality grew. Changes in functional zoning, as well as territories requiring reorganization, are presented using geoinformation technologies in the open-source software Quantum Geographic Information System (QGIS 3.32). According to calculations based on M.A. Kreimer's method of functional zoning efficiency, the territorial planning structure of Kazan is quite effective. However, in the development of spatial planning concepts, the weak interest of the population in the development of territorial planning documents stands out. Thus, the approach to spatial planning in Kazan differs from foreign methods and approaches, which are based on the joint development of planning directions and territories of municipalities by the developers of the planning structure, business representatives and the population. In those approaches, the population is the target audience at which territorial planning is aimed. It follows that there is a need to take the opinions and demands of the population into account.
Keywords: spatial development, metamorphosis, Kazan city, spatial planning, efficiency, geographic determinism, GIS, QGIS
Procedia PDF Downloads 87
27135 Fine-Scale Modeling the Influencing Factors of Multi-Time Dimensions of Transit Ridership at Station Level: The Study of Guangzhou City
Authors: Dijiang Lyu, Shaoying Li, Zhangzhi Tan, Zhifeng Wu, Feng Gao
Abstract:
China is currently experiencing the world's most rapid urban rail transit expansion. The purpose of this study is to finely model the factors influencing transit ridership at multiple time dimensions within transit stations' pedestrian catchment areas (PCAs) in Guangzhou, China. The study was based on multi-source spatial data, including smart card data, high-spatial-resolution images, points of interest (POIs), online real-estate data and building height data. Eight multiple linear regression models using the backward stepwise method and a Geographic Information System (GIS) were created at the station level. According to the Chinese code for classification of urban land use and planning standards of development land, residential land use was divided into three categories: first-level (e.g. villas), second-level (e.g. communities) and third-level (e.g. urban villages). The study concluded that: (1) four factors (a CBD dummy, the number of feeder bus routes, the number of entrances or exits, and the years of station operation) proved to be positively correlated with transit ridership, while the areas of green and water land use were negatively correlated. (2) The areas of education land use and of second-level and third-level residential land use were found to be strongly connected to the average morning-peak boarding and evening-peak alighting ridership, whereas the area of commercial land use and the average building height were significantly positively associated with the average morning-peak alighting and evening-peak boarding ridership. (3) The area of second-level residential land use was only weakly correlated with ridership in the other regression models: private car ownership is still high in Guangzhou, and while some residents living in the communities around the stations commute by transit at peak times, others are much more willing to drive their own cars at non-peak times. The area of third-level residential land use, such as urban villages, was strongly positively correlated with ridership in all models, indicating that residents of third-level residential land use are the main passenger source of the Guangzhou Metro. (4) Land-use diversity was found to have a significant impact on passenger flow on weekends, but was unrelated to weekday flow. The findings can be useful for station planning, management and policymaking.
Keywords: fine-scale modeling, Guangzhou city, multi-time dimensions, multi-sources spatial data, transit ridership
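The backward stepwise procedure used for the eight regression models above can be sketched generically: start from all candidate predictors and repeatedly drop the one whose removal hurts model fit least, stopping once every removal degrades fit beyond a tolerance. The scoring function below is a toy stand-in for the fit statistic (e.g. adjusted R² or AIC) actually used, and the predictor names are illustrative:

```python
def backward_stepwise(predictors, score, tolerance=0.0):
    """Greedy backward elimination over a list of predictor names.

    `score(subset)` returns a goodness-of-fit value (higher is better);
    a predictor is dropped while some removal costs at most `tolerance`.
    """
    current = list(predictors)
    while len(current) > 1:
        best_drop, best_score = None, None
        for p in current:
            s = score([q for q in current if q != p])
            if best_score is None or s > best_score:
                best_drop, best_score = p, s
        if best_score >= score(current) - tolerance:
            current.remove(best_drop)
        else:
            break
    return current

# Toy scorer: only two predictors carry signal; the rest just add a small
# complexity penalty, so backward elimination strips them away.
def toy_score(subset):
    signal = sum(p in ("feeder_bus", "entrances") for p in subset)
    return signal - 0.01 * len(subset)

print(backward_stepwise(["feeder_bus", "entrances", "cbd_dummy", "green_area"],
                        toy_score))  # ['feeder_bus', 'entrances']
```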
Procedia PDF Downloads 142
27134 Review of Concepts and Tools Applied to Assess Risks Associated with Food Imports
Authors: A. Falenski, A. Kaesbohrer, M. Filter
Abstract:
Introduction: Risk assessments can be performed in various ways and in different degrees of complexity. In order to assess risks associated with imported foods additional information needs to be taken into account compared to a risk assessment on regional products. The present review is an overview on currently available best practise approaches and data sources used for food import risk assessments (IRAs). Methods: A literature review has been performed. PubMed was searched for articles about food IRAs published in the years 2004 to 2014 (English and German texts only, search string “(English [la] OR German [la]) (2004:2014 [dp]) import [ti] risk”). Titles and abstracts were screened for import risks in the context of IRAs. The finally selected publications were analysed according to a predefined questionnaire extracting the following information: risk assessment guidelines followed, modelling methods used, data and software applied, existence of an analysis of uncertainty and variability. IRAs cited in these publications were also included in the analysis. Results: The PubMed search resulted in 49 publications, 17 of which contained information about import risks and risk assessments. Within these 19 cross references were identified to be of interest for the present study. These included original articles, reviews and guidelines. At least one of the guidelines of the World Organisation for Animal Health (OIE) and the Codex Alimentarius Commission were referenced in any of the IRAs, either for import of animals or for imports concerning foods, respectively. Interestingly, also a combination of both was used to assess the risk associated with the import of live animals serving as the source of food. Methods ranged from full quantitative IRAs using probabilistic models and dose-response models to qualitative IRA in which decision trees or severity tables were set up using parameter estimations based on expert opinions. Calculations were done using @Risk, R or Excel. 
Most heterogeneous was the type of data used, ranging from general information on imported goods (food, live animals) to pathogen prevalence in the country of origin. These data were either publicly available in databases or lists (e.g., OIE WAHID and Handystatus II, FAOSTAT, Eurostat, TRACES), accessible at a national level (e.g., herd information) or open only to a small group of people (flight passenger import data at the national airport customs office). In the IRAs, an uncertainty analysis was mentioned in some cases, but uncertainty calculations were actually performed in only a few. Conclusion: The current state of the art in the assessment of risks of imported foods is characterized by great heterogeneity in the general methodology and data used. Often information is gathered on a case-by-case basis and reformatted by hand in order to perform the IRA. This analysis therefore illustrates the need for a flexible, modular framework supporting the connection of existing data sources with data analysis and modelling tools. Such an infrastructure could pave the way to IRA workflows applicable ad hoc, e.g., in a crisis situation.
Keywords: import risk assessment, review, tools, food import
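The fully quantitative IRAs mentioned above combine probabilistic inputs (prevalence, import volume, inspection sensitivity) in a Monte Carlo simulation. A minimal sketch of that style of calculation is shown below; all parameter ranges and the risk threshold are illustrative assumptions of ours, not values from any published IRA.

```python
import random

def simulate_import_risk(n_iter=10_000, seed=42):
    """Monte Carlo sketch of a quantitative import risk assessment.

    Illustrative uncertain inputs: pathogen prevalence in the exporting
    country, annual number of imported consignments, and the probability
    that border inspection detects a contaminated consignment. Returns
    the simulated probability that more than one contaminated
    consignment is expected to enter per year (an assumed threshold).
    """
    rng = random.Random(seed)
    exceed = 0
    for _ in range(n_iter):
        prevalence = rng.uniform(0.001, 0.02)   # uncertain pathogen prevalence
        consignments = rng.randint(500, 2000)   # uncertain annual import volume
        p_missed = 1 - rng.uniform(0.6, 0.9)    # uncertain inspection sensitivity
        expected_entries = prevalence * consignments * p_missed
        if expected_entries > 1:
            exceed += 1
    return exceed / n_iter

p = simulate_import_risk()
print(f"P(expected contaminated entries per year > 1) ≈ {p:.3f}")
```

The same structure is what tools like @Risk or R implement at scale; an uncertainty analysis would additionally separate the variability of the inputs from the uncertainty about their distributions.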
Procedia PDF Downloads 302
27133 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is essential for data-driven enterprises. Master Data Management (MDM) plays a crucial role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. 
These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
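The duplicate-record reduction described above rests on record matching. A minimal, rule-based sketch of that step is given below using string similarity from the standard library; the 0.85 cut-off and the sample records are our assumptions, and a production MDM pipeline would typically use trained matching models rather than a fixed threshold.

```python
from difflib import SequenceMatcher

def is_duplicate(rec_a, rec_b, threshold=0.85):
    """Flag two master-data records as likely duplicates.

    Averages per-field string similarity over the fields of rec_a;
    both records are assumed to share the same keys. The threshold
    is an illustrative choice, not a value from the study.
    """
    score = sum(
        SequenceMatcher(None, str(rec_a[k]).lower(), str(rec_b[k]).lower()).ratio()
        for k in rec_a
    ) / len(rec_a)
    return score >= threshold

a = {"name": "Acme Corp.", "city": "Berlin"}
b = {"name": "ACME Corp", "city": "Berlin"}   # same customer, variant spelling
c = {"name": "Umbrella Ltd", "city": "Oslo"}  # unrelated record
print(is_duplicate(a, b))
print(is_duplicate(a, c))
```

In practice such a matcher would feed a survivorship step that merges the flagged pairs into a single golden record.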
Procedia PDF Downloads 17
27132 Democracy and Security Challenge in Nigeria, 1999, Till Date
Authors: Abdulsalami M. Deji
Abstract:
Prolonged military incursion into Nigerian politics, which favored the oligarchy, brought agitation for democratic rule and exacerbated ethnic tensions, with minorities fearing domination. The advent of democracy ushered in a new breath of life for Nigerians, from the shackles of military oppression to democratic governance. Yet democratic rule became a mirage as a result of prevalent insecurity in Nigeria; efforts to bring lasting peace to all sections of the country have not yielded positive results to date. In the process of the struggle for democracy, ethnic groups in Nigeria instituted various militia groups to defend the interests of their identities, owing to the unequal distribution of wealth by the military junta. When democracy came on board, these militia groups became demons haunting democratic institutions. The quest by successive governments to find a lasting solution has proved abortive. The security of politics which guarantees stability is not visible in Nigeria; what we have now is the politics of security. The unrest in Nigeria today has crippled the socio-political life and economy of the nation; the growth of the economy has favored elites without meaningful impact on the common man. This paper focuses on the effects of democracy on Nigerians and on how security under democratic rule has hindered the dividends of democracy from 1999 till date, and the way forward. The study is based strictly on secondary sources: textbooks, newspapers, the internet, and journals.
Keywords: democracy, interest, militia, security
Procedia PDF Downloads 335
27131 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use. This is due to several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response considerably to these pathologies if we could use these data. Unfortunately, at the present moment, the status of data on the deceased is far from being satisfactorily resolved by the EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on this issue. Consequently, each EU member state offers very different solutions. For instance, Denmark considers the data as personal data of the deceased person for a set period of time while some others, such as Spain, do not consider this data as such, but have introduced some specifically focused regulations on this type of data and their access by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis within data in general because they do not refer to one person but to several. Hence, it is possible to think that all of them can be considered data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain as personal data of his or her biological relatives. Hence, the general regime provided for in the GDPR may apply to them. 
As these are personal data, we could go back to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework on the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although it is true that some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 154
27130 A Stable Method for Determination of the Number of Independent Components
Authors: Yuyan Yi, Jingyi Zheng, Nedret Billor
Abstract:
Independent component analysis (ICA) is one of the most commonly used blind source separation (BSS) techniques for signal pre-processing, such as noise reduction and feature extraction. The main parameter in the ICA method is the number of independent components (ICs). Although there have been several methods for the determination of the number of ICs, this important parameter has not been given sufficient attention. In this study, we review the most used methods for determining the number of ICs and discuss their advantages and disadvantages. Further, we propose an improved version of the column-wise ICAByBlock method for the determination of the number of ICs. To assess the performance of the proposed method, we compare the column-wise ICAByBlock with several existing methods across different ICA algorithms by using simulated and real signal data. Results show that the proposed column-wise ICAByBlock is an effective and stable method for determining the optimal number of components in ICA. The method is simple, and its results can be demonstrated intuitively with good visualizations.
Keywords: independent component analysis, optimal number, column-wise, correlation coefficient, cross-validation, ICAByBlock
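The block-based idea behind this kind of method can be illustrated with a small sketch: run the decomposition on separate data blocks and keep only components that reappear (up to ICA's inherent sign ambiguity) in every block, as measured by correlation. This is our own simplified construction of the stability criterion, not the published ICAByBlock algorithm, and the synthetic components below stand in for actual ICA outputs.

```python
import math
import random

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def stable_component_count(blocks, threshold=0.9):
    """Count components from the first block that every other block
    reproduces with |correlation| above the threshold. Absolute value
    is used because ICA recovers sources only up to sign and order."""
    reference = blocks[0]
    count = 0
    for comp in reference:
        if all(max(abs(pearson(comp, other)) for other in blk) >= threshold
               for blk in blocks[1:]):
            count += 1
    return count

rng = random.Random(0)
t = [i / 100 for i in range(200)]
s1 = [math.sin(8 * x) for x in t]            # true source 1
s2 = [math.sin(3 * x + 1.0) for x in t]      # true source 2
noise = lambda: [rng.gauss(0, 1) for _ in t]  # spurious, block-specific component
block_a = [s1, s2, noise()]
block_b = [[-v for v in s2], s1, noise()]     # permuted and sign-flipped, as ICA allows
print(stable_component_count([block_a, block_b]))  # the two true sources survive
```

Only the two genuine sources correlate across blocks; the noise components are block-specific and are discarded, giving an estimated number of ICs of two.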
Procedia PDF Downloads 99
27129 Implementation of Geo-Crowdsourcing Mobile Applications in e-Government of V4 Countries: A State-of-the-Art Survey
Authors: Barbora Haltofová
Abstract:
In recent years, citizens have become an important source of geographic information and, therefore, geo-crowdsourcing, often known as volunteered geographic information, has provided an interesting alternative to traditional mapping practices which are becoming expensive, resource-intensive and unable to capture the dynamic nature of urban environments. In order to address a gap in research literature, this paper deals with a survey conducted to assess the current state of geo-crowdsourcing, a recent phenomenon popular with people who collect geographic information using their smartphones. This article points out that there is an increasing body of knowledge of geo-crowdsourcing mobile applications in the Visegrad countries marked by the ubiquitous Internet connection and the current massive proliferation of smartphones. This article shows how geo-crowdsourcing can be used as a complement, or in some cases a replacement, to traditionally generated sources of spatial data and information in public management. It discusses the new spaces of citizen participation constructed by these geo-crowdsourcing practices.
Keywords: citizen participation, e-Government, geo-crowdsourcing, participatory mapping, mobile applications
Procedia PDF Downloads 334
27128 Steps towards the Development of National Health Data Standards in Developing Countries
Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray
Abstract:
The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and leading to increasing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps the government needs to undertake towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of different experts in the areas of data exchange, standards and medical informatics in Saudi Arabia and the UK, a list of steps required towards the development of national health data standards was constructed. The main steps are the establishment of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan and a national accreditation body; most important of all is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.
Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia
Procedia PDF Downloads 338
27127 Teaching Behaviours of Effective Secondary Mathematics Teachers: A Study in Dhaka, Bangladesh
Authors: Asadullah Sheikh, Kerry Barnett, Paul Ayres
Abstract:
Despite significant progress in access, equity and public examination success, poor student performance in mathematics in secondary schools has become a major concern in Bangladesh. A substantial body of research has emphasised the important contribution of teaching practices to student achievement. However, this has not been investigated in Bangladesh. Therefore, the study sought to examine the effectiveness of mathematics teaching practices as a means of improving secondary school mathematics in the Dhaka Municipality City (DMC) area, Bangladesh. The purpose of this study was twofold: first, to identify the 20 highest performing secondary schools in mathematics in DMC, and second, to investigate the teaching practices of mathematics teachers in these schools. A two-phase mixed method approach was adopted. In the first phase, secondary source data were obtained from the Board of Intermediate and Secondary Education (BISE), Dhaka, and value-added measures were used to identify the 20 highest performing secondary schools in mathematics. In the second phase, a concurrent mixed method design, in which qualitative methods were embedded within a dominant quantitative approach, was utilised. A purposive sampling strategy was used to select fifteen teachers from the 20 highest performing secondary schools. The main sources of data were classroom teaching observations and teacher interviews. The data from teacher observations were analysed with descriptive and nonparametric statistics. The interview data were analysed qualitatively. The main findings showed that teachers adopt a direct teaching approach incorporating orientation, structuring, modelling, practice, questioning and teacher-student interaction, which creates an individualistic learning environment. The variation in developmental levels of teaching skill indicates that teachers do not necessarily use the qualitative aspects of teaching behaviours (i.e., focus, stage, quality and differentiation) effectively. 
This is the first study to investigate the teaching behaviours of effective secondary mathematics teachers within Dhaka, Bangladesh. It contributes an international dimension to the field of educational effectiveness and raises questions about existing constructivist approaches. Further, it offers important insights about teaching behaviours that can be used to inform the development of evidence-based policy and practice on quality teaching in Bangladesh.
Keywords: effective teaching, mathematics, secondary schools, student achievement, value-added measures
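The value-added measures used in the first phase can be sketched in miniature: fit an expected exam score from prior attainment across all pupils, then score each school by how far its pupils sit above or below that expectation. The records below are illustrative, not drawn from the BISE data, and the study's actual value-added model may differ in its covariates.

```python
from collections import defaultdict

def value_added_by_school(records):
    """records: list of (school, prior_score, exam_score).

    Fits exam = a + b * prior by ordinary least squares across all
    pupils, then returns each school's mean residual: positive means
    its pupils outperform what prior attainment alone predicts.
    """
    xs = [r[1] for r in records]
    ys = [r[2] for r in records]
    n = len(records)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    residuals = defaultdict(list)
    for school, x, y in records:
        residuals[school].append(y - (a + b * x))
    return {s: sum(v) / len(v) for s, v in residuals.items()}

# Illustrative pupils: school B's pupils beat their predicted scores.
data = [("A", 50, 55), ("A", 60, 63), ("A", 70, 72),
        ("B", 50, 62), ("B", 60, 70), ("B", 70, 80)]
scores = value_added_by_school(data)
print(sorted(scores, key=scores.get, reverse=True))  # B ranks above A
```

Ranking schools by mean residual rather than raw pass rates is what lets a value-added approach identify "highest performing" schools net of intake differences.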
Procedia PDF Downloads 238
27126 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-
Authors: Nieto Bernal Wilson, Carmona Suarez Edgar
Abstract:
Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations present some deficiencies. Part of the problem lies in a lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities needed to tackle such projects. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the base repositories for creating models or patterns (behavior of customers, suppliers, products, social networks and genomics) and facilitate corporate decision making and research. This paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP and AUP. It also covers object-relational models, spatial data models, and the baseline of data modeling under UML and Big Data; in this way it seeks to deliver an agile methodology for developing data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization and data mining, particularly for pattern generation and models derived from the structured fact objects.
Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse
Procedia PDF Downloads 409
27125 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in earlier work respond to data mixed with noise and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we can determine the robustness of each model and its applicability in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
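The perturb-and-refit procedure described above can be sketched with the simplest survival model: fit an exponential failure rate to observed failure times, add Gaussian noise to the data, refit, and measure how far the estimate drifts. This is our own minimal construction of the robustness check, not the authors' exact procedure, and the failure times below are illustrative.

```python
import random

def exp_rate_mle(failure_times):
    """Maximum-likelihood failure rate for an exponential survival model."""
    return len(failure_times) / sum(failure_times)

def noise_sensitivity(times, sigma, n_rep=500, seed=1):
    """Perturb observed failure times with Gaussian noise n_rep times
    and report the baseline rate plus the mean absolute drift of the
    refitted rate. A large drift relative to the baseline would flag
    the model as non-robust to noisy records."""
    rng = random.Random(seed)
    base = exp_rate_mle(times)
    drift = 0.0
    for _ in range(n_rep):
        noisy = [max(0.1, t + rng.gauss(0, sigma)) for t in times]  # keep times positive
        drift += abs(exp_rate_mle(noisy) - base)
    return base, drift / n_rep

times = [12.0, 25.0, 31.0, 8.0, 19.0, 40.0, 15.0]  # illustrative years to failure
base, drift = noise_sensitivity(times, sigma=2.0)
print(f"fitted rate {base:.4f}/yr, mean drift under noise {drift:.5f}")
```

The same loop applies unchanged to richer models such as a proportional hazard fit; only the `exp_rate_mle` step would be swapped out.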
Procedia PDF Downloads 743
27124 A Digital Twin Approach to Support Real-time Situational Awareness and Intelligent Cyber-physical Control in Energy Smart Buildings
Authors: Haowen Xu, Xiaobing Liu, Jin Dong, Jianming Lian
Abstract:
Emerging smart buildings often employ cyberinfrastructure, cyber-physical systems, and Internet of Things (IoT) technologies to increase the automation and responsiveness of building operations for better energy efficiency and lower carbon emission. These operations include the control of Heating, Ventilation, and Air Conditioning (HVAC) and lighting systems, which are often considered a major source of energy consumption in both commercial and residential buildings. Developing energy-saving control models for optimizing HVAC operations usually requires the collection of high-quality instrumental data from iterations of in-situ building experiments, which can be time-consuming and labor-intensive. This abstract describes a digital twin approach to automate building energy experiments for optimizing HVAC operations through the design and development of an adaptive web-based platform. The platform is created to enable (a) automated data acquisition from a variety of IoT-connected HVAC instruments, (b) real-time situational awareness through domain-based visualizations, (c) adaption of HVAC optimization algorithms based on experimental data, (d) sharing of experimental data and model predictive controls through web services, and (e) cyber-physical control of individual instruments in the HVAC system using outputs from different optimization algorithms. Through the digital twin approach, we aim to replicate a real-world building and its HVAC systems in an online computing environment to automate the development of building-specific model predictive controls and collaborative experiments in buildings located in different climate zones in the United States. We present two case studies to demonstrate our platform’s capability for real-time situational awareness and cyber-physical control of the HVAC in the flexible research platforms within the Oak Ridge National Laboratory (ORNL) main campus. 
Our platform is developed using adaptive and flexible architecture design, rendering the platform generalizable and extendable to support HVAC optimization experiments in different types of buildings across the nation.
Keywords: energy-saving buildings, digital twins, HVAC, cyber-physical system, BIM
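The cyber-physical control loop the platform automates can be illustrated against a toy digital twin: a first-order thermal model stands in for the building, and a simple bang-bang thermostat stands in for the baseline controller that a model predictive control would be benchmarked against. All coefficients below are illustrative assumptions, not values calibrated to the ORNL research platforms.

```python
def room_step(temp, heater_on, outdoor=5.0, dt=1.0, k_loss=0.1, k_heat=2.0):
    """One time step of a first-order thermal model acting as a stand-in
    digital twin: heat leaks toward the outdoor temperature and the
    heater injects a fixed amount when on (coefficients are assumed)."""
    return temp + dt * (k_loss * (outdoor - temp) + (k_heat if heater_on else 0.0))

def run_controller(setpoint=21.0, hours=24, deadband=0.5):
    """Bang-bang control loop over the twin: switch the heater on below
    the deadband and off above it, tracking the heater duty cycle that
    an energy-saving controller would try to reduce."""
    temp, heater_on, duty = 15.0, False, 0
    for _ in range(hours):
        if temp < setpoint - deadband:
            heater_on = True
        elif temp > setpoint + deadband:
            heater_on = False
        duty += heater_on
        temp = room_step(temp, heater_on)
    return temp, duty / hours

final_temp, duty_cycle = run_controller()
print(f"final temperature {final_temp:.1f} °C, heater duty cycle {duty_cycle:.2f}")
```

In the platform described above, the twin's role is exactly this: candidate control algorithms are exercised against the model before their outputs are pushed to the physical HVAC instruments.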
Procedia PDF Downloads 110
27123 Automated Testing to Detect Instance Data Loss in Android Applications
Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai
Abstract:
The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. However, it is difficult for the programmer to manually test this issue across all activities. The result is data loss: data entered by the user is not saved when an interruption occurs. This issue degrades the user experience, because the user must re-enter the information after each interruption, so automated testing to detect such data loss is important. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with data loss issues. The approach proved highly accurate and reliable in finding apps with this defect and can be used by Android developers to avoid such errors.
Keywords: Android, automated testing, activity, data loss
Procedia PDF Downloads 237
27122 Solar-Thermal-Electric Stirling Engine-Powered System for Residential Units
Authors: Florian Misoc, Cyril Okhio, Joshua Tolbert, Nick Carlin, Thomas Ramey
Abstract:
This project focused on designing a Stirling engine system for a solar-thermal-electrical system that can supply electric power to a single residential unit. Since Stirling engines are heat engines that can operate from any available heat source, they are notable for their ability to generate clean and reliable energy without emissions. Driven by the need to find alternative energy sources, Stirling engines are making a comeback with recent technologies, including thermal energy conservation during the heat transfer process. Recent reviews show mounting evidence and positive test results that Stirling engines can produce a constant energy supply ranging from 5 kW to 20 kW. Solar power is one of the many heat sources for Stirling engines, and using solar energy to operate them is an idea considered by many researchers due to the engine's adaptability. In this project, the Stirling engine developed was designed and tested to operate from a biomass energy source, i.e., a wood pellet stove, during periods of low solar radiation, with good results. An engine efficiency of 20% was estimated and 18% was measured, making it suitable for residential applications. The effort reported here aimed to explore the parameters necessary to design, build and test a 'Solar Powered Stirling Engine (SPSE)' using water (H₂O) as the heat transfer medium, with nitrogen as the working gas, that can reach or exceed an efficiency of 20%. The main objectives of this work consisted of: converting a V-twin cylinder air compressor into an alpha-type Stirling engine; constructing a solar water heater, using an automotive radiator as the high-temperature reservoir for the Stirling engine; and building an array of fixed mirrors that concentrate solar radiation on the automotive radiator/high-temperature reservoir. The low-temperature reservoir is the surrounding air at ambient temperature. 
This work has determined that a low-cost system is sufficiently efficient and reliable. Off-the-shelf components were used, and estimates of the ability of the final engine design to meet the electricity needs of a small residence have been made.
Keywords: stirling engine, solar-thermal, power inverter, alternator
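The measured 18% efficiency can be put in context against the Carnot bound set by the hot and cold reservoirs. The calculation below is a sketch with assumed temperatures (a ~95 °C water loop on the hot side against ~25 °C ambient); the paper does not report these values.

```python
def carnot_limit(t_hot_c, t_cold_c):
    """Carnot efficiency bound from reservoir temperatures given in °C."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return 1 - t_cold / t_hot

# Assumed reservoir temperatures, not measurements from the study:
eta_max = carnot_limit(95.0, 25.0)   # water-heated hot side vs. ambient air
eta_measured = 0.18                  # efficiency reported in the abstract
print(f"Carnot limit {eta_max:.2%}, measured {eta_measured:.0%} "
      f"({eta_measured / eta_max:.0%} of the theoretical maximum)")
```

The bound makes the design trade-off explicit: with a near-boiling water loop the theoretical ceiling is only about 19%, which is why the hot-side temperature of the chosen heat source dominates achievable efficiency.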
Procedia PDF Downloads 278
27121 Big Data: Appearance and Disappearance
Authors: James Moir
Abstract:
The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to account for their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanations. Twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed reality model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.
Keywords: big data, appearance, disappearance, surface, epistemology
Procedia PDF Downloads 421