Search results for: hydraulic data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24836

24386 High Rate Bio-Methane Generation from Petrochemical Wastewater Using Improved CSTR

Authors: Md. Nurul Islam Siddique, A. W. Zularisam

Abstract:

The effect of a gradual increase in organic loading rate (OLR) and temperature on biomethanation during petrochemical wastewater treatment was investigated using a CSTR. Digester performance was measured at hydraulic retention times (HRT) of 4 to 2 d, and the start-up of the reactor was monitored for 60 days via chemical oxygen demand (COD) removal, biogas production, and methane production. Thermophilic conditions were attained by raising the temperature from 30 to 55 °C, and the pH was maintained at 7 ± 0.5 throughout the experiment. The maximum COD removal efficiency was 98 ± 0.5% (r = 0.84) at an OLR of 7.5 g-COD/L d and 4 d HRT. Biogas and methane yields reached maxima of 0.80 L/g-CODremoved d (r = 0.81) and 0.60 L/g-CODremoved d (r = 0.83), respectively, and the mean methane content of the biogas was 65.49%. Full acclimatization was established at 55 °C with high COD removal efficiency and biogas production. An OLR of 7.5 g-COD/L d and an HRT of 4 days were suitable for petrochemical wastewater treatment.
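The arithmetic linking OLR, HRT, and influent COD in a CSTR can be sketched as below. This is a minimal illustration of the standard relations, not the study's code; the influent COD of 30 g/L is an assumed value chosen so that the reported optimum (OLR 7.5 g-COD/L d at 4 d HRT) falls out.

```python
# Standard CSTR relations: OLR = influent COD / HRT, and COD removal
# efficiency as a percentage. All numeric values are illustrative.

def olr(cod_in_g_per_l: float, hrt_d: float) -> float:
    """Organic loading rate (g COD / L / d) = influent COD / HRT."""
    return cod_in_g_per_l / hrt_d

def removal_efficiency(cod_in: float, cod_out: float) -> float:
    """COD removal efficiency in percent."""
    return 100.0 * (cod_in - cod_out) / cod_in

# An assumed influent COD of 30 g/L at the optimal 4 d HRT reproduces
# the reported OLR of 7.5 g-COD/L d:
print(olr(30.0, 4.0))                  # 7.5
print(removal_efficiency(30.0, 0.6))   # 98.0
```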

Keywords: anaerobic digestion, petrochemical wastewater, CSTR, methane

Procedia PDF Downloads 334
24385 The Role of Data Protection Officer in Managing Individual Data: Issues and Challenges

Authors: Nazura Abdul Manap, Siti Nur Farah Atiqah Salleh

Abstract:

For decades, the misuse of personal data has been a critical issue. Malaysia responded by enacting the Personal Data Protection Act 2010 (PDPA 2010) to secure personal data. After more than a decade, this legislation is set to be revised by the current PDPA 2023 Amendment Bill to align with the world's key personal data protection regulations, such as the European Union General Data Protection Regulation (GDPR). Among the suggested adjustments is the appointment by the Data User of a Data Protection Officer (DPO) to ensure the commercial entity's compliance with the PDPA 2010 criteria. The change is expected to be enacted in parliament fairly soon; nevertheless, based on the experience of the Personal Data Protection Department (PDPD) in implementing the Act, it is projected that there will be a slew of additional concerns associated with the DPO mandate. Consequently, the goal of this article is to highlight the issues that the DPO will encounter and how the PDPD should respond to them. The findings were produced using a qualitative approach based on an examination of the current literature. This research reveals that the DPO is likely to face obstacles, and thus there should be a definite, clear guideline in place to aid DPOs in executing their tasks. It is argued that appointing a DPO is a wise measure for ensuring that legal data security requirements are met.

Keywords: guideline, law, data protection officer, personal data

Procedia PDF Downloads 59
24384 Gravitational Water Vortex Power Plant: Experimental-Parametric Design of a Hydraulic Structure Capable of Inducing the Artificial Formation of a Gravitational Water Vortex Appropriate for Hydroelectric Generation

Authors: Henrry Vicente Rojas Asuero, Holger Manuel Benavides Muñoz

Abstract:

Approximately 80% of the energy consumed worldwide is generated from fossil sources, which are responsible for the emission of a large volume of greenhouse gases. For this reason, the current global trend is the widespread use of energy produced from renewable sources. This seeks safety and diversification of energy supply, based on social cohesion, economic feasibility and environmental protection. In this scenario, small hydropower systems (P ≤ 10 MW) stand out due to their high efficiency, economic competitiveness and low environmental impact. Small hydropower systems, along with wind and solar energy, are expected to represent a significant percentage of the world's energy matrix in the near term. Among the various technologies present in the state of the art relating to small hydropower systems is the gravitational water vortex power plant, a recent technology that excels because of its versatility of operation, since it can operate with heads in the range of 0.70 m to 2.00 m and flow rates from 1 m3/s to 20 m3/s. Its operating system is based on the utilization of the rotational energy contained within an artificially induced large water vortex. This paper presents the study and experimental design of an optimal hydraulic structure capable of inducing the artificial formation of a gravitational water vortex through a system of easy application and high efficiency, able to operate under conditions of very low head and minimal flow. The proposed structure consists of a channel with variable base, a vortex inductor, and a tangential flow generator, coupled to a circular tank with a conical-transition bottom hole. In the laboratory tests, the angular velocity of the water vortex was related to the geometric characteristics of the inductor channel, as was the influence of the conical-transition bottom hole on the physical characteristics of the water vortex.
The results show angular velocity values of greater magnitude as a function of depth; in addition, the presence of the conical transition in the bottom hole of the circular tank improves the vortex formation conditions while increasing the angular velocity values. Thus, the proposed system is a sustainable solution for the energy supply of rural areas near watercourses.

Keywords: experimental model, gravitational water vortex power plant, renewable energy, small hydropower

Procedia PDF Downloads 271
24383 Data Collection Based on a Questionnaire Survey on In-Hospital Emergencies

Authors: Nouha Mhimdi, Wahiba Ben Abdessalem Karaa, Henda Ben Ghezala

Abstract:

The methods identified for data collection are diverse: electronic media, focus-group interviews and short-answer questionnaires [1]. The collection of poor-quality data, resulting, for example, from poorly designed questionnaires, the absence of good translators or interpreters, and the incorrect recording of data, allows conclusions to be drawn that are not supported by the data, or restricts attention to only the average effect of a program or policy. There are several solutions to avoid or minimize the most frequent errors, including obtaining expert advice on the design or adaptation of data collection instruments, or using technologies that allow better "anonymity" in the responses [2]. In this context, we opted to collect good-quality data through a sizeable questionnaire-based survey on hospital emergencies, with the aim of improving emergency services and alleviating the problems encountered. In this paper, we present our study and detail the steps followed to collect relevant, consistent and practical data.

Keywords: data collection, survey, questionnaire, database, data analysis, hospital emergencies

Procedia PDF Downloads 87
24382 Federated Learning in Healthcare

Authors: Ananya Gangavarapu

Abstract:

Convolutional Neural Network (CNN) based models are providing diagnostic capabilities on par with medical specialists in many specialty areas. However, collecting medical data for training purposes is very challenging because of increased regulation around data collection and privacy concerns around personal health data. Gathering the data becomes even more difficult if the capture devices are edge-based mobile devices (such as smartphones) with weak wireless connectivity in rural or remote areas. In this paper, we highlight the federated learning approach as a way to mitigate data privacy and security issues.
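The core aggregation step behind federated learning can be sketched as follows: clients train locally and share only model weights, never raw patient data, and the server averages the weights. This is a minimal illustration of the standard FedAvg rule (weighting by local sample counts); the hospitals, weight vectors and sample counts below are invented for the example and are not from the paper.

```python
# Minimal federated-averaging (FedAvg) sketch: the server combines
# per-client weight vectors, weighted by each client's data size.

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client weight vectors."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Three hypothetical hospitals with different amounts of local data:
weights = [[0.2, 1.0], [0.4, 0.8], [0.6, 0.6]]
sizes = [100, 200, 700]
print(fed_avg(weights, sizes))  # the largest client dominates the average
```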

Keywords: deep learning in healthcare, data privacy, federated learning, training in distributed environment

Procedia PDF Downloads 120
24381 Erosion Modeling of Surface Water Systems for Long Term Simulations

Authors: Devika Nair, Sean Bellairs, Ken Evans

Abstract:

Flow and erosion modeling provides an avenue for simulating fine suspended sediment in surface water systems such as streams and creeks. Fine suspended sediment is highly mobile, and many contaminants that may have been released by a catchment disturbance attach themselves to these sediments; knowledge of fine suspended sediment transport is therefore important in assessing contaminant transport. The CAESAR-Lisflood landform evolution model, which includes a hydrologic model (TOPMODEL) and a hydraulic model (Lisflood), is being used to assess sediment movement in tropical streams following a disturbance in the catchment of the creek, and to determine the dynamics of sediment quantity in the creek over the years by running the model for future years. The accuracy of future simulations depends on calibration and validation of the model against past and present events. Calibration and validation involve finding a combination of model parameters which, when applied and simulated, gives model outputs similar to those observed at the real site for the corresponding input data. Calibrating the sediment output of the CAESAR-Lisflood model at the catchment level and using it to study the equilibrium conditions of the landform is an area yet to be explored. Therefore, the aim of the study was to calibrate and then validate the CAESAR-Lisflood model so that it could be run in future simulations to study how the landform evolves over time. To achieve this, the model was run for a rainfall event with a given set of parameters, using discharge and sediment data at the catchment inlet as input, and the model output was compared with the discharge and sediment data observed at the catchment outlet. The model parameters were then adjusted until the model closely approximated the observed values for the catchment.
The model was then validated by running it for a different set of events and checking that it gave results similar to the observed values. The outcomes demonstrated that while the model can be calibrated well for hydrology (discharge output) throughout the year, the sediment output calibration could be slightly improved by allowing parameters to change to account for seasonal vegetation growth at the start and end of the wet season. This study is important for assessing hydrology and sediment movement in seasonal biomes. The understanding of sediment-associated metal dispersion processes in rivers can be used in a practical way to help river basin managers more effectively control and remediate catchments affected by present and historical metal mining.

Keywords: erosion modelling, fine suspended sediments, hydrology, surface water systems

Procedia PDF Downloads 61
24380 The Utilization of Big Data in Knowledge Management Creation

Authors: Daniel Brian Thompson, Subarmaniam Kannan

Abstract:

The volume of knowledge in the world, and within the repositories of organizations, has already reached immense proportions and is constantly increasing. To accommodate these constraints, Big Data implementations and algorithms are utilized to obtain new or enhanced knowledge for decision-making. The transition from data to knowledge provides the transformational changes that deliver tangible benefits to those implementing these practices. Today, organizations derive knowledge from observations and intuitions, and this information is translated into best practices for knowledge acquisition, generation and sharing. Through the widespread usage of Big Data, the main intention is to provide information that has been cleaned and analyzed so as to nurture tangible insights that an organization can apply to its knowledge-creation practices based on facts and figures. The translation of data into knowledge generates value for an organization, allowing it to make decisive decisions as it transitions to best practices. Without a strong foundation of knowledge and Big Data, businesses are unable to grow and remain competitive within their environment.

Keywords: big data, knowledge management, data driven, knowledge creation

Procedia PDF Downloads 90
24379 Survey on Data Security Issues in Cloud Computing Among SMEs in Nairobi County, Kenya

Authors: Masese Chuma Benard, Martin Onsiro Ronald

Abstract:

Businesses have been adopting cloud computing more frequently in recent years because they wish to take advantage of its benefits. However, employing cloud computing also introduces new security concerns, particularly with regard to data security, the potential risks and weaknesses that could be exploited by attackers, and the various tactics and strategies that could be used to lessen these risks. This study examines data security issues in cloud computing among SMEs in Nairobi County, Kenya. The study used a sample size of 48 and a mixed-methods research approach. The findings show that the data owner has no control over the cloud vendor's data management procedures, and there is no way to ensure that data is handled legally; this implies a loss of control over the data stored in the cloud. Data and information stored in the cloud may face a range of availability issues due to internet outages, which can represent a significant risk to data kept in shared clouds. Integrity, availability and confidentiality concerns are all identified.

Keywords: data security, cloud computing, information, information security, small and medium-sized firms (SMEs)

Procedia PDF Downloads 63
24378 Cloud Design for Storing Large Amount of Data

Authors: M. Strémy, P. Závacký, P. Cuninka, M. Juhás

Abstract:

The main goal of this paper is to introduce our design of a private cloud for storing large amounts of data, especially pictures, and to provide a solid technological backend for data analysis based on parallel processing and business intelligence. We have tested hypervisors, cloud management tools, storage for all the data, and Hadoop for analysis of unstructured data. Providing high availability, virtual network management, logical separation of projects, and rapid deployment of physical servers to our environment was also required.

Keywords: cloud, glusterfs, hadoop, juju, kvm, maas, openstack, virtualization

Procedia PDF Downloads 336
24377 Estimation of Missing Values in Aggregate Level Spatial Data

Authors: Amitha Puranik, V. S. Binu, Seena Biju

Abstract:

Missing data is a common problem in spatial analysis, especially at the aggregate level. Missingness can occur in the covariates, in the response variable, or in both at a given location. Many missing data techniques are available to estimate missing values, but not all of these methods can be applied to spatial data, since such data are autocorrelated. Hence there is a need to develop a method that estimates the missing values of both the response variable and the covariates in spatial data by taking account of the spatial autocorrelation. The present study aims to develop a model to estimate missing data points at the aggregate level in spatial data by accounting for (a) spatial autocorrelation of the response variable, (b) spatial autocorrelation of the covariates, and (c) correlation between the covariates and the response variable. Estimating the missing values of spatial data requires a model that explicitly accounts for the spatial autocorrelation. The proposed model not only accounts for spatial autocorrelation but also utilizes the correlations that exist between covariates, within covariates, and between the response variable and covariates. Precise estimation of the missing data points in spatial data will increase the precision of the estimated effects of the independent variables on the response variable in spatial regression analysis.
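One simple baseline that respects spatial autocorrelation can be sketched as a weighted average over neighbouring areas with inverse-distance weights. This is only an illustration of the general idea of spatially informed imputation, not the paper's model, which additionally uses covariate correlations; the values and coordinates are invented.

```python
# Inverse-distance-weighted imputation of a missing areal value:
# nearby observed areas contribute more than distant ones.

def idw_impute(values, coords, missing_idx, power=2.0):
    """Impute values[missing_idx] from observed neighbours."""
    x0, y0 = coords[missing_idx]
    num = den = 0.0
    for i, v in enumerate(values):
        if i == missing_idx or v is None:
            continue  # skip the target and any other missing areas
        d2 = (coords[i][0] - x0) ** 2 + (coords[i][1] - y0) ** 2
        w = 1.0 / (d2 ** (power / 2.0))  # inverse-distance weight
        num += w * v
        den += w
    return num / den

values = [10.0, None, 14.0, 20.0]        # one missing areal value
coords = [(0, 0), (1, 0), (2, 0), (10, 0)]
print(idw_impute(values, coords, 1))     # dominated by the two close areas
```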

Keywords: spatial regression, missing data estimation, spatial autocorrelation, simulation analysis

Procedia PDF Downloads 356
24376 Association Rules Mining and NOSQL Oriented Document in Big Data

Authors: Sarra Senhadji, Imene Benzeguimi, Zohra Yagoub

Abstract:

Big Data refers to recent technologies for manipulating voluminous and unstructured data sets drawn from multiple sources, and NoSQL databases have emerged to handle the problem of unstructured data. Association rules mining is one of the popular data mining techniques for extracting hidden relationships from transactional databases. The algorithm for finding association dependencies maps well onto MapReduce. The goal of our work is to reduce the time required to generate frequent itemsets by using MapReduce together with a document-oriented NoSQL database. A comparative study is given to evaluate the performance of our algorithm against the classical Apriori algorithm.
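The classical Apriori baseline mentioned above can be sketched compactly: count support level by level, keep itemsets meeting the threshold, and build candidates of the next size from the survivors. The transactions and support threshold below are toy values, not the paper's data, and this single-machine version omits the MapReduce distribution.

```python
# Compact Apriori sketch for frequent-itemset mining.

def apriori(transactions, min_support):
    transactions = [frozenset(t) for t in transactions]
    items = {i for t in transactions for i in t}

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    frequent = {}
    k, k_sets = 1, [frozenset([i]) for i in items]
    while k_sets:
        survivors = [s for s in k_sets if support(s) >= min_support]
        frequent.update({s: support(s) for s in survivors})
        k += 1
        # candidate generation: unions of surviving itemsets of size k
        k_sets = list({a | b for a in survivors for b in survivors
                       if len(a | b) == k})
    return frequent

tx = [{"bread", "milk"}, {"bread", "butter"},
      {"bread", "milk", "butter"}, {"milk"}]
freq = apriori(tx, min_support=2)
print(freq[frozenset({"bread", "milk"})])  # 2
```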

Keywords: Apriori, Association rules mining, Big Data, Data Mining, Hadoop, MapReduce, MongoDB, NoSQL

Procedia PDF Downloads 139
24375 Immunization Data Quality in Public Health Facilities in Pastoralist Communities: A Comparative Study with Evidence from the Afar and Somali Regional States, Ethiopia

Authors: Melaku Tsehay

Abstract:

The Consortium of Christian Relief and Development Associations (CCRDA) and the CORE Group Polio Partners (CGPP) Secretariat have been working with the Global Alliance for Vaccines and Immunization (GAVI) to improve immunization data quality in the Afar and Somali Regional States. The main aim of this study was to compare the quality of immunization data before and after the above interventions in health facilities in pastoralist communities in Ethiopia. To this end, a comparative cross-sectional study was conducted on 51 health facilities. The baseline data were collected in May 2019 and the endline data in August 2021. The WHO data quality self-assessment tool (DQS) was used to collect data. A significant improvement was seen in the accuracy of pentavalent vaccine (PT)1 data (p = 0.012) at the health posts (HP), and of PT3 (p = 0.010) and measles (p = 0.020) data at the health centers (HC). Besides, a highly significant improvement was observed in the accuracy of tetanus toxoid (TT)2 data at HP (p < 0.001). The level of over- or under-reporting was found to be < 8% at the HP and < 10% at the HC for PT3. Data completeness also increased from 72.09% to 88.89% at the HC. Nearly 74% of the health facilities reported their respective immunization data on time, which is much better than the baseline (7.1%) (p < 0.001). These findings may provide some hints for policies and programs targeting improved immunization data quality in pastoralist communities.
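The over/under-reporting figures above come from a verification-factor style comparison: doses recounted from source documents divided by doses reported, expressed in percent. The sketch below illustrates that calculation under assumed numbers and an assumed ±10% tolerance band; neither the figures nor the band are from the study.

```python
# Verification-factor sketch: recounted doses vs. reported doses.
# A value near 100% indicates accurate reporting; the tolerance band
# and the dose counts are illustrative assumptions.

def verification_factor(recounted: int, reported: int) -> float:
    return 100.0 * recounted / reported

def reporting_status(vf: float, tolerance: float = 10.0) -> str:
    if vf > 100.0 + tolerance:
        return "under-reporting"  # more doses found than were reported
    if vf < 100.0 - tolerance:
        return "over-reporting"   # fewer doses found than were reported
    return "accurate"

vf = verification_factor(recounted=540, reported=600)
print(vf, reporting_status(vf))  # 90.0 accurate
```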

Keywords: data quality, immunization, verification factor, pastoralist region

Procedia PDF Downloads 72
24374 Identifying Critical Success Factors for Data Quality Management through a Delphi Study

Authors: Maria Paula Santos, Ana Lucas

Abstract:

Organizations support their operations and decision-making with the data they have at their disposal, so the quality of these data is remarkably important, and Data Quality (DQ) is currently a relevant issue; the literature is unanimous in pointing out that poor DQ can result in large costs for organizations. The literature review identified and described 24 Critical Success Factors (CSF) for Data Quality Management (DQM), which were presented to a panel of experts, who ordered them according to their degree of importance using the Delphi method with the Q-sort technique, based on an online questionnaire. The study shows that the five most important CSFs for DQM are: definition of appropriate policies and standards, control of inputs, definition of a strategic plan for DQ, an organizational culture focused on data quality, and obtaining top management commitment and support.
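One common way to aggregate the expert orderings produced in such a Delphi/Q-sort round is by mean rank: each expert ranks the factors (1 = most important) and factors are ordered by their average position. This is a generic sketch of that aggregation step under assumed inputs; the expert names, factor labels and rankings are invented, not the study's data.

```python
# Aggregate expert rankings by mean rank (1 = most important).

from statistics import mean

rankings = {
    "expert_a": ["policies_standards", "input_control", "dq_strategy"],
    "expert_b": ["input_control", "policies_standards", "dq_strategy"],
    "expert_c": ["policies_standards", "dq_strategy", "input_control"],
}

def aggregate(rankings):
    factors = rankings[next(iter(rankings))]
    mean_rank = {
        f: mean(order.index(f) + 1 for order in rankings.values())
        for f in factors
    }
    # sort factors from lowest (best) to highest mean rank
    return sorted(mean_rank, key=mean_rank.get)

print(aggregate(rankings))
```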

Keywords: critical success factors, data quality, data quality management, Delphi, Q-Sort

Procedia PDF Downloads 193
24373 Development of IDF Curves for Precipitation in Western Watershed of Guwahati, Assam

Authors: Rajarshi Sharma, Rashidul Alam, Visavino Seleyi, Yuvila Sangtam

Abstract:

The Intensity-Duration-Frequency (IDF) relationship of rainfall is one of the most commonly used tools in water resources engineering for the planning, design and operation of water resources projects, and for protecting various engineering projects against design floods. The establishment of such relationships was reported as early as 1932 (Bernard). Since then, many sets of relationships have been constructed for several parts of the globe. The objective of this research is to derive the IDF relationship of rainfall for the western watershed of Guwahati, Assam. These relationships are useful in the design of urban drainage works, e.g. storm sewers, culverts and other hydraulic structures. In this study, rainfall depths for ten years (2001 to 2010) were collected from the Regional Meteorological Centre, Borjhar, Guwahati. First, the data were used to construct mass curves for rainfall durations of more than 7 hours, in order to calculate the maximum intensities and form the intensity-duration curves. Gumbel's frequency analysis technique was then used to calculate the probable maximum rainfall intensities for return periods of 2, 5, 10, 50 and 100 years from the maximum intensities. Finally, regression analysis was used to develop the intensity-duration-frequency (IDF) curves, from which the constants 'a', 'b' and 'c' were determined. The value of 'a' for which the sum of squared deviations is minimum was found to be 40, with the corresponding values of 'c' and 'b' being 0.744 and 1981.527, respectively. The results showed that in all cases the correlation coefficient is very high, indicating the goodness of fit of the formulae for estimating IDF curves in the region of interest.
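The Gumbel frequency-analysis step above extrapolates from the mean and standard deviation of the observed annual-maximum intensities to other return periods via the frequency factor K_T. A minimal sketch of that calculation follows; the mean and standard deviation used in the example are assumed values, not the Guwahati data.

```python
# Gumbel frequency analysis: x_T = mean + K_T * std, where K_T is the
# Gumbel frequency factor for return period T (method of moments).

import math

def gumbel_frequency_factor(T: float) -> float:
    """Gumbel frequency factor K_T for return period T (years)."""
    return -(math.sqrt(6.0) / math.pi) * (
        0.5772 + math.log(math.log(T / (T - 1.0)))
    )

def design_intensity(mean_i: float, std_i: float, T: float) -> float:
    """Probable maximum intensity for return period T."""
    return mean_i + gumbel_frequency_factor(T) * std_i

# Illustrative annual-maximum intensity statistics (mm/h) for one duration:
print(round(design_intensity(60.0, 15.0, 50.0), 1))  # 98.9
```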

Keywords: intensity-duration-frequency relationship, mass curve, regression analysis, correlation coefficient

Procedia PDF Downloads 221
24372 Cavitating Flow through a Venturi Using Computational Fluid Dynamics

Authors: Imane Benghalia, Mohammed Zamoum, Rachid Boucetta

Abstract:

Hydrodynamic cavitation is a complex physical phenomenon that appears in hydraulic systems (pumps, turbines, valves, Venturi tubes, etc.) when the fluid pressure decreases below the saturated vapor pressure. The work carried out in this study aimed at a better understanding of cavitating flow phenomena. To this end, we numerically studied a cavitating bubbly flow through a Venturi nozzle. The cavitation model is selected and solved using a commercial computational fluid dynamics (CFD) code. The obtained results show the effect of the Venturi inlet pressure (10, 7, 5 and 2 bar) on the pressure, the velocity of the fluid flow, and the vapor fraction. We found that the inlet pressure of the Venturi strongly affects the evolution of the pressure, velocity, and vapor fraction in the cavitating flow.

Keywords: cavitating flow, CFD, phase change, venturi

Procedia PDF Downloads 67
24371 Strength and Permeability of the Granular Pavement Materials Treated with Polyacrylamide Based Additive

Authors: Romel N. Georgees, Rayya A Hassan, Robert P. Evans, Piratheepan Jegatheesan

Abstract:

Among other traditional and non-traditional additives, polymers have shown efficient performance in the field and improved sustainability. Polyacrylamide (PAM) is one such additive that has demonstrated many advantages, including a reduction in permeability, an increase in durability, and the provision of strength characteristics. However, information about its effect on improved geotechnical characteristics has been largely limited to field performance monitoring. Therefore, a laboratory investigation was carried out to examine the basic and engineering behavior of three types of soils treated with a PAM additive. The results showed an increase in dry density and unconfined compressive strength for all the soils. The results further demonstrated an increase in unsoaked CBR and a reduction in permeability for all stabilized samples.

Keywords: CBR, hydraulic conductivity, PAM, unconfined compressive strength

Procedia PDF Downloads 359
24370 Data Mining in the Medicine Domain Using Decision Trees and Support Vector Machines

Authors: Djamila Benhaddouche, Abdelkader Benyettou

Abstract:

In this paper, we used data mining to extract biomedical knowledge. In general, the complex biomedical data collected in population studies are treated with statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. To that end, we used two supervised learning algorithms: decision trees and support vector machines (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
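The decision-tree side of such a classifier can be reduced to its smallest building block: choosing the split threshold that minimises weighted Gini impurity. The sketch below shows that single step on one feature (a full tree and the SVM are omitted); the "TSH-like" values and labels are toy data, not the thyroid dataset used in the paper.

```python
# Decision-stump sketch: pick the threshold on one numeric feature
# that minimises the weighted Gini impurity of the two resulting groups.

def gini(labels):
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(xs, ys):
    """Return the threshold minimising weighted Gini impurity."""
    best = (None, float("inf"))
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (t, score)
    return best[0]

# Hypothetical TSH-like feature: low values -> healthy, high -> hypothyroid.
xs = [0.5, 1.0, 1.2, 4.5, 6.0, 7.5]
ys = ["healthy"] * 3 + ["hypothyroid"] * 3
print(best_split(xs, ys))  # 1.2 (perfectly separates the two classes)
```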

Keywords: biomedical data, learning, classifier, algorithms decision tree, knowledge extraction

Procedia PDF Downloads 528
24369 Analysis of Different Classification Techniques Using WEKA for Diabetic Disease

Authors: Usama Ahmed

Abstract:

Data mining is the process of analyzing data in order to extract helpful predictive information. It is a field of research that addresses various types of problems. Within data mining, classification is an important technique for categorizing different kinds of data. Diabetes is one of the most common diseases. This paper applies different classification techniques to a diabetes dataset using the Waikato Environment for Knowledge Analysis (WEKA) and identifies which algorithm is most suitable. The best classification algorithm on the diabetes data is Naïve Bayes, with an accuracy of 76.31% and a model build time of 0.06 seconds.
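The Naïve Bayes idea behind WEKA's winning classifier can be sketched by hand for numeric features: estimate a per-class mean and variance for each feature, then pick the class with the highest log posterior. The glucose/BMI-style rows below are invented toy data, not the paper's diabetes dataset, and this is an illustration of the technique rather than WEKA's exact implementation.

```python
# Hand-rolled Gaussian Naive Bayes: per-class mean/variance per feature,
# prediction by maximum log posterior. Toy data, two numeric features.

import math
from collections import defaultdict

def fit(X, y):
    by_class = defaultdict(list)
    for row, label in zip(X, y):
        by_class[label].append(row)
    model = {}
    for label, rows in by_class.items():
        n = len(rows)
        stats = []
        for j in range(len(rows[0])):
            col = [r[j] for r in rows]
            mu = sum(col) / n
            var = sum((v - mu) ** 2 for v in col) / n or 1e-9  # avoid /0
            stats.append((mu, var))
        model[label] = (n / len(X), stats)  # (class prior, feature stats)
    return model

def predict(model, row):
    def log_post(label):
        prior, stats = model[label]
        lp = math.log(prior)
        for x, (mu, var) in zip(row, stats):
            lp += -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)
        return lp
    return max(model, key=log_post)

X = [[85, 22], [90, 24], [160, 33], [150, 35]]   # glucose, BMI (toy)
y = ["negative", "negative", "positive", "positive"]
model = fit(X, y)
print(predict(model, [155, 30]))  # positive
```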

Keywords: data mining, classification, diabetes, WEKA

Procedia PDF Downloads 130
24368 Process Performance and Nitrogen Removal Kinetics in Anammox Hybrid Reactor

Authors: Swati Tomar, Sunil Kumar Gupta

Abstract:

Anammox is a promising and cost-effective alternative to conventional treatment systems, facilitating direct oxidation of ammonium nitrogen under anaerobic conditions with nitrite as the electron acceptor, without the addition of any external carbon source. The present study investigates the process kinetics of a laboratory-scale anammox hybrid reactor (AHR), which combines the dual advantages of attached and suspended growth. The performance and behaviour of the AHR were studied under varying hydraulic retention times (HRT) and nitrogen loading rates (NLR). The experimental unit consisted of four anammox hybrid reactors of 5 L capacity each, inoculated with a mixed seed culture containing anoxic and activated sludge. Pseudo-steady-state (PSS) ammonium and nitrite removal efficiencies of 90.6% and 95.6%, respectively, were achieved during the acclimation phase. After establishment of PSS, the performance of the AHR was monitored at seven HRTs of 3.0, 2.5, 2.0, 1.5, 1.0, 0.5 and 0.25 d, with the NLR increasing from 0.4 to 4.8 kg N/m3 d. The results showed that with increasing NLR and decreasing HRT (3.0 to 0.25 d), the AHR registered an appreciable decline in nitrogen removal efficiency, from 92.9% to 67.4%. An HRT of 2.0 d was considered optimal for achieving substantial nitrogen removal (89%), because on further decrease of the HRT below 1.5 d, a remarkable decline in nitrogen removal efficiency was observed. Analysis of the data indicated that the attached-growth system contributes an additional 15.4% ammonium removal and reduces the sludge washout rate (an additional 29% reduction). This enhanced performance may be attributed to the 25% increase in sludge retention time provided by the attached-growth media. Three kinetic models, namely first-order, Monod, and the modified Stover-Kincannon model, were applied to assess the substrate removal kinetics of nitrogen removal in the AHR.
The models were validated by comparing the experimental data set with the values predicted by the respective models. For substrate removal kinetics, model validation revealed that the modified Stover-Kincannon model is the most precise (R2 = 0.943) and can suitably be applied to predict the kinetics of nitrogen removal in the AHR. The Lawrence and McCarty model described the kinetics of bacterial growth; the predicted values of the yield coefficient and decay constant were in line with the experimentally observed values.
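The modified Stover-Kincannon fit referred to above is usually performed on the linearised form V/(Q(S0 - S)) = (KB/Umax) * V/(Q*S0) + 1/Umax, so a least-squares line recovers Umax (1/intercept) and KB (slope * Umax). The sketch below demonstrates that procedure on synthetic data generated from assumed Umax = 10 and KB = 12 g/L d; none of the numbers are the study's measurements.

```python
# Linearised modified Stover-Kincannon fit:
#   V/(Q*(S0-S)) = (KB/Umax) * V/(Q*S0) + 1/Umax
# where V = reactor volume, Q = flow, S0/S = influent/effluent substrate.

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def stover_kincannon(V, Q, S0, S):
    """Return (Umax, KB) from paired influent/effluent data."""
    xs = [V / (q * s0) for q, s0 in zip(Q, S0)]
    ys = [V / (q * (s0 - s)) for q, s0, s in zip(Q, S0, S)]
    slope, intercept = linfit(xs, ys)
    u_max = 1.0 / intercept
    return u_max, slope * u_max

# Synthetic data generated from assumed Umax = 10, KB = 12 (g/L d):
V = 5.0
Q = [2.5, 2.5, 2.5]
S0 = [2.0, 4.0, 8.0]
S = [0.46154, 1.14286, 3.0]
u_max, kb = stover_kincannon(V, Q, S0, S)
print(round(u_max, 1), round(kb, 1))  # 10.0 12.0
```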

Keywords: anammox, kinetics, modelling, nitrogen removal, sludge wash out rate, AHR

Procedia PDF Downloads 294
24367 Thermo-Hydro-Mechanical-Chemical Coupling in Enhanced Geothermal Systems: Challenges and Opportunities

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Geothermal reservoirs (GTRs) have garnered global recognition as a sustainable energy source. Thermo-Hydro-Mechanical-Chemical (THMC) coupling proves to be a practical and effective method for optimizing production in GTRs. The study outcomes demonstrate that THMC coupling serves as a versatile and valuable tool, offering in-depth insights into GTRs and enhancing their operational efficiency. This is achieved through the analysis of temperature and pressure changes and their impacts on mechanical properties, structural integrity, fracture aperture, permeability, and heat extraction efficiency. Moreover, THMC coupling facilitates the assessment of the potential benefits and risks associated with different geothermal technologies, considering the complex thermal, hydraulic, mechanical, and chemical interactions within the reservoirs. However, the utilization of THMC coupling in GTRs presents a multitude of challenges. These include accurately modeling and predicting behavior given the interconnected nature of the processes; limited data availability, leading to uncertainties; the risks that induced seismic events pose to nearby communities; scaling and mineral deposition, which reduce operational efficiency; and the long-term sustainability of the reservoirs. In addition, material degradation, environmental impacts, technical challenges in monitoring and control, accurate assessment of resource potential, and regulatory and social acceptance further complicate geothermal projects. Addressing these multifaceted challenges is crucial for the successful and sustainable utilization of geothermal energy resources. This paper aims to illuminate the challenges and opportunities associated with THMC coupling in enhanced geothermal systems. Practical solutions and strategies for mitigating these challenges are discussed, emphasizing the need for interdisciplinary approaches, improved data collection and modeling techniques, and advanced monitoring and control systems.
Overcoming these challenges is imperative for unlocking the full potential of geothermal energy, making a substantial contribution to the global energy transition and sustainable development.

Keywords: geothermal reservoirs, THMC coupling, interdisciplinary approaches, challenges and opportunities, sustainable utilization

Procedia PDF Downloads 44
24366 Comprehensive Study of Data Science

Authors: Asifa Amara, Prachi Singh, Kanishka, Debargho Pathak, Akshat Kumar, Jayakumar Eravelly

Abstract:

Today's generation is heavily dependent on technology that uses data as its fuel. The present study covers innovations and developments in data science and gives an idea of how to use the available data efficiently, helping the reader understand the core concepts of data science. The concept of artificial intelligence was introduced by Alan Turing, whose main principle was to create an artificial system that can run independently of human-given programs and can function by analyzing data to understand the requirements of its users. Data science comprises business understanding, data analysis, ethical concerns, programming languages, and the various fields and sources of data, among other skills. The usage of data science has evolved over the years. In this review article, we cover one part of data science, namely machine learning, which builds on data science: machines learn through experience, which helps them do their work more efficiently. The article includes a comparative illustration of human understanding versus machine understanding, along with the advantages, applications, and real-time examples of machine learning. Data science has been an important game changer in human life. Since its advent, we have seen its benefits: it leads to a better understanding of people and addresses individual needs, and it has improved business strategies and services, forecasting, and the ability to pursue sustainable development. This study also aims at a better understanding of data science, which will help us to create a better world.

Keywords: data science, machine learning, data analytics, artificial intelligence

Procedia PDF Downloads 56
24365 Simulation of Behaviour Dynamics and Optimization of the Energy System

Authors: Iva Dvornik, Sandro Božić, Žana Božić Brkić

Abstract:

System-dynamics simulation modelling is one of the most appropriate and successful scientific methods for studying complex, non-linear natural, technical, and organizational systems. In recent practice, its methodology has proved efficient in solving problems of control, behaviour, sensitivity, and flexibility in system dynamics of a high degree of complexity, all through computer simulation, i.e. "under laboratory conditions", meaning without any danger to the observed real systems. This essay investigates the dynamic process of a gas turbine and simulates the operating pump units and the transformation of gas energy into hydraulic energy. In addition, the mathematical model of the overall system (gas turbine - centrifugal pumps - pipeline pressure system - storage vessel) has also been studied.
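The stock-and-flow view described above can be illustrated with a short continuous simulation. This is a hypothetical sketch, assuming a single storage vessel filled by a constant pump flow and drained through a level-dependent pressure line; the coefficients are illustrative, not the paper's model.

```python
def simulate(level0=0.0, q_pump=0.05, k_out=0.02, dt=0.1, steps=1000):
    """Euler integration of one stock: vessel level fed by a constant pump
    flow and drained through a level-dependent pressure line."""
    level = level0
    history = [level]
    for _ in range(steps):
        q_out = k_out * level            # outflow rises with the stored level
        level += dt * (q_pump - q_out)   # stock = integral of (inflow - outflow)
        history.append(level)
    return history
```

The level approaches the equilibrium q_pump / k_out, the kind of behaviour-over-time trajectory that system-dynamics tools plot directly.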

Keywords: system dynamics, modelling, centrifugal pump, turbine, gases, continuous and discrete simulation, heuristic optimisation

Procedia PDF Downloads 87
24364 Application of Artificial Neural Network Technique for Diagnosing Asthma

Authors: Azadeh Bashiri

Abstract:

Introduction: Lack of proper diagnosis and inadequate treatment of asthma lead to physical and financial complications. This study aimed to use data mining techniques to create a neural network intelligent system for the diagnosis of asthma. Methods: The study population comprised patients who had visited one of the lung clinics in Tehran. Data were analyzed using the SPSS statistical tool, and Pearson's chi-square test was the basis for ranking the data. The neural network was trained using the back-propagation learning technique. Results: Based on the SPSS analysis, the top 13 effective factors were selected. The data were mixed in various forms to build different models for training and testing the networks, and in all modes the network correctly predicted 100% of the cases. Conclusion: Applying data mining methods before designing the system structure, in order to reduce the data dimension and choose the optimal inputs, leads to a more accurate system. Considering data mining approaches is therefore necessary, given the nature of medical data.
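A back-propagation network of the kind the study describes can be sketched as follows. The binary symptom features and the diagnosis label below are synthetic placeholders, not the Tehran clinic data, and the one-hidden-layer architecture is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 5)).astype(float)  # 5 binary risk factors
y = (X[:, 0] + X[:, 2] > 1).astype(float)            # synthetic diagnosis label

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer; plain full-batch gradient descent (back-propagation).
W1 = rng.normal(0.0, 0.5, (5, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, (8, 1)); b2 = np.zeros(1)
lr = 1.0
for _ in range(5000):
    H = sigmoid(X @ W1 + b1)            # forward pass, hidden activations
    p = sigmoid(H @ W2 + b2).ravel()    # predicted probability
    d2 = ((p - y) / len(y))[:, None]    # cross-entropy gradient at the output
    d1 = (d2 @ W2.T) * H * (1.0 - H)    # error propagated back to hidden layer
    W2 -= lr * (H.T @ d2); b2 -= lr * d2.sum(0)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(0)

p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel()
accuracy = float(((p > 0.5) == (y > 0.5)).mean())
```

On this easily separable toy label the trained network reaches near-perfect training accuracy, mirroring (in miniature) the 100% prediction rate the study reports.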

Keywords: asthma, data mining, Artificial Neural Network, intelligent system

Procedia PDF Downloads 253
24363 Interpreting Privacy Harms from a Non-Economic Perspective

Authors: Christopher Muhawe, Masooda Bashir

Abstract:

With the growth of Internet and Communication Technology (ICT), the virtual world has become the new normal. At the same time, there is an unprecedented collection of massive amounts of data by both private and public entities. Unfortunately, this increase in data collection has gone hand in hand with an increase in data misuse and data breaches. Regrettably, the majority of data breach and data misuse claims have been unsuccessful in United States courts for failure to prove direct injury to physical or economic interests. The requirement to express data privacy harms in economic or physical terms ignores the fact that not all data harms are physical or economic in nature. The challenge is compounded by the fact that data breach harms and risks do not materialize immediately. This research uses a descriptive and normative approach to show that not all data harms can be expressed in economic or physical terms. Expressing privacy harms purely from an economic or physical perspective negates the fact that data insecurity may result in harms that run counter to the functions of privacy in our lives: the promotion of liberty, selfhood, autonomy, and human social relations, and the furtherance of a free society. No economic value can be placed on these functions of privacy. The proposed approach addresses data harms from a psychological and social perspective.

Keywords: data breach and misuse, economic harms, privacy harms, psychological harms

Procedia PDF Downloads 173
24362 Flow Duration Curve Method to Evaluate Environmental Flow: Case Study of Gharasou River, Ardabil, Iran

Authors: Mehdi Fuladipanah, Mehdi Jorabloo

Abstract:

Water flow management is one of the most important parts of river engineering. A non-uniform distribution of rainfall, varying flow demands, and unreasonable flow management will destroy the river ecosystem, so determining the ecosystem flow requirement is essential. In this paper, the flow duration curve indices method, which is hydrologically based, was used to evaluate the environmental flow of the Gharasou River, Ardabil, Iran. Using the flow duration curve, Q90 and Q95 were calculated for different return periods, and their magnitudes were determined for 1-day, 3-day, 7-day, and 30-day durations. According to the second method, the hydraulic alteration indices generally fell in the low and medium ranges. To maintain the river in an acceptable ecological condition, the minimum daily discharge given by the Q95 index is 0.7 m3/s.
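The Q90 and Q95 indices used above can be read directly off a flow duration curve built from a daily discharge record. This is a minimal sketch assuming Weibull plotting positions for the exceedance probabilities; the paper does not state which plotting position it uses.

```python
import numpy as np

def flow_duration_indices(daily_flows):
    """Return (Q90, Q95): the flows exceeded 90% and 95% of the time,
    read off a flow duration curve built with Weibull plotting positions."""
    flows = np.sort(np.asarray(daily_flows, dtype=float))[::-1]  # rank high to low
    n = flows.size
    exceedance = np.arange(1, n + 1) / (n + 1)  # P(flow >= ranked value)
    q90 = np.interp(0.90, exceedance, flows)
    q95 = np.interp(0.95, exceedance, flows)
    return q90, q95
```

For multi-day indices (3-day, 7-day, 30-day), the same routine would be applied to a moving average of the daily series.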

Keywords: Ardabil, environmental flow, flow duration curve, Gharasou River

Procedia PDF Downloads 654
24361 Machine Learning Analysis of Student Success in Introductory Calculus Based Physics I Course

Authors: Chandra Prayaga, Aaron Wade, Lakshmi Prayaga, Gopi Shankar Mallu

Abstract:

This paper presents the use of machine learning algorithms to predict the success of students in an introductory physics course. Data comprising 140 rows, pertaining to the performance of two batches of students, were used. The lack of sufficient data to train robust machine learning models was compensated for by generating synthetic data similar to the real data. CTGAN and CTGAN with Gaussian Copula were used to generate synthetic data, with the real data as input. To check the similarity between the real data and each synthetic dataset, pair plots were made. The synthetic data were used to train machine learning models with the PyCaret package. For the CTGAN data, the Ada Boost Classifier (ADA) was found to be the best-fitting model, whereas the CTGAN with Gaussian Copula yielded Logistic Regression (LR) as the best model. Both models were then tested for accuracy on the real data. ROC-AUC analysis was performed for all ten classes of the target variable (grades A, A-, B+, B, B-, C+, C, C-, D, F). The ADA model with CTGAN data showed a mean AUC score of 0.4377, while the LR model with the Gaussian Copula data showed a mean AUC score of 0.6149. ROC-AUC plots were obtained for each grade value separately. The LR model with Gaussian Copula data showed consistently better AUC scores than the ADA model with CTGAN data, except for two grade values, C- and A-.
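The mean one-vs-rest AUC scores reported above can be computed with a small rank-based routine. This is a generic sketch of one-vs-rest ROC-AUC averaging over class scores, not the authors' PyCaret pipeline.

```python
import numpy as np

def auc_ovr(y_true, scores, classes):
    """Mean one-vs-rest ROC-AUC over the given classes; scores[:, k] is the
    model's score for class k. AUC is the probability that a random positive
    outranks a random negative (ties count half)."""
    y_true = np.asarray(y_true)
    aucs = []
    for k, c in enumerate(classes):
        pos = scores[y_true == c, k]
        neg = scores[y_true != c, k]
        if pos.size == 0 or neg.size == 0:
            continue  # class absent from the test split
        wins = ((pos[:, None] > neg[None, :]).sum()
                + 0.5 * (pos[:, None] == neg[None, :]).sum())
        aucs.append(wins / (pos.size * neg.size))
    return float(np.mean(aucs))
```

A perfect ranker scores 1.0 for every class; the 0.4377 vs 0.6149 comparison in the abstract is this statistic averaged over the ten grade classes.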

Keywords: machine learning, student success, physics course, grades, synthetic data, CTGAN, Gaussian Copula CTGAN

Procedia PDF Downloads 26
24360 Analysis of Various Factors Affecting Hardness and Content of Phases Resulting from 1030 Carbon Steel Heat Treatment Using AC3 Software

Authors: Saeid Shahraki, Mohammad Mahdi Kaekha

Abstract:

1030 steel, a carbon steel used in the homogenized, cold-formed, quenched, and tempered conditions, is generally utilized for small parts resisting medium stress, such as connection foundations, hydraulic cylinders, small gears, pins, clamps, normal automotive forging parts, camshafts, levers, pundits, and nuts. In this study, AC3 software was used to measure the effect of the carbon and manganese percentages, the dimensions and geometry of the pieces, the type of cooling fluid, temperature, and time on the hardness and phase content of 1030 steel. The results are then compared with the analytical values obtained from the lumped capacity method.
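For reference, the lumped capacity method mentioned above reduces to a single exponential cooling law when the Biot number is small. This sketch uses typical handbook property values for a small steel part, not the paper's exact inputs.

```python
import math

def lumped_cooling(T0, T_inf, h, A, rho, V, c, t):
    """Lumped-capacity temperature of a small part at time t (s):
    T(t) = T_inf + (T0 - T_inf) * exp(-t / tau), tau = rho*V*c / (h*A).
    Valid when the Biot number Bi = h*(V/A)/k is well below 0.1."""
    tau = rho * V * c / (h * A)  # thermal time constant, seconds
    return T_inf + (T0 - T_inf) * math.exp(-t / tau)

# Illustrative quench of a small 1030 steel pin in agitated water;
# the geometry and heat transfer coefficient are assumed values.
T_after_10s = lumped_cooling(T0=850.0, T_inf=25.0, h=1500.0, A=2.0e-3,
                             rho=7850.0, V=5.0e-6, c=490.0, t=10.0)
```

The part's temperature decays toward the quenchant temperature with time constant tau, which is the analytical curve the study compares against the AC3 predictions.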

Keywords: 1030 steel, AC3 software, heat treatment, lumped capacity method

Procedia PDF Downloads 262
24359 Recovery of Proteins from EDAM Whey Using Membrane Ultrafiltration

Authors: F. Yelles-Allam, A. A. Nouani

Abstract:

In Algeria, whey is discarded without any treatment, which causes not only a pollution problem but also a loss of the nutritive components of milk. In this paper, the characterization of EDAM whey, which results from a pasteurised mixture of cow's milk and skim milk, and the recovery of whey protein by ultrafiltration/diafiltration were studied. The physico-chemical analysis of the whey emphasized its pollutant and nutritive characteristics: its DBO5 and DCO are 49.33 and 127.71 g of O2/l of whey, respectively. It contains fat (1.90±0.1 g/l), lactose (47.32±1.57 g/l), proteins (8.04±0.2 g/l), ash (5.20±0.15 g/l), calcium (0.48±0.04 g/l), Na (1.104 g/l), K (1.014 g/l), Mg (0.118 g/l), and P (0.482 g/l). Ultrafiltration was carried out with a polyethersulfone membrane with a 10 kDa cut-off. Its intrinsic hydraulic resistance and permeability are 2.041×10^12 m^-1 and 176.32 l/h.m2, respectively, at a transmembrane pressure of 1 bar. The retentate obtained at a concentration factor of 6 (FC6) contains 16.33 g/l of proteins and 70.25 g/l of dry matter. The protein retention rate is 97.7%, and the decreases in DBO5 and DCO are 18.875 g/l and 42.818 g/l, respectively. Diafiltration performed on the protein concentrate allowed the complete removal of lactose and minerals. Ultrafiltration of whey before disposal is thus an alternative for the Algerian dairy industry.
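The intrinsic resistance quoted above follows from Darcy's law, Rm = TMP / (mu × J). A minimal sketch, assuming a water viscosity of 1.0e-3 Pa·s, recovers a value close to the reported 2.041×10^12 m^-1 from the 176.32 l/h.m2 permeability at 1 bar:

```python
def membrane_resistance(tmp_pa, flux_l_per_h_m2, viscosity_pa_s=1.0e-3):
    """Intrinsic membrane resistance Rm = TMP / (mu * J), with the permeate
    flux converted from l/h.m2 to m/s (volumetric flux per unit area)."""
    flux_m_per_s = flux_l_per_h_m2 * 1.0e-3 / 3600.0
    return tmp_pa / (viscosity_pa_s * flux_m_per_s)

# 176.32 l/h.m2 at 1 bar (1.0e5 Pa) gives roughly 2.04e12 m^-1.
Rm = membrane_resistance(1.0e5, 176.32)
```

The exact figure depends slightly on the permeate viscosity assumed, which the abstract does not state.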

Keywords: diafiltration, DBO, DCO, protein, ultrafiltration, whey

Procedia PDF Downloads 236
24358 Data Access, AI Intensity, and Scale Advantages

Authors: Chuping Lo

Abstract:

This paper presents a simple model demonstrating that, ceteris paribus, countries with lower barriers to accessing global data tend to earn higher incomes than other countries. Therefore, large countries that inherently have greater data resources tend to have higher incomes than smaller countries, such that the former may be more hesitant than the latter to liberalize cross-border data flows in order to maintain this advantage. Furthermore, countries with higher artificial intelligence (AI) intensity in production technologies tend to benefit more from economies of scale in data aggregation, leading to higher income and more trade, as they are better able to utilize global data.

Keywords: digital intensity, digital divide, international trade, economies of scale

Procedia PDF Downloads 43
24357 Secured Transmission and Reserving Space in Images Before Encryption to Embed Data

Authors: G. R. Navaneesh, E. Nagarajan, C. H. Rajam Raju

Abstract:

Nowadays, multimedia data are used to store secure information. All previous methods allocate space in the image for data embedding after encryption. In this paper, we propose a novel method that reserves room in the image, surrounded by a boundary, before encryption with a traditional RDH algorithm, which makes it easy for the data hider to reversibly embed data in the encrypted images. The proposed method achieves real-time performance; that is, data extraction and image recovery are free of any error. A secure transmission process is also discussed in this paper, which improves efficiency tenfold compared to the other processes discussed.
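Least-significant-bit embedding, listed among the keywords, can be illustrated with a minimal sketch. This shows plain LSB substitution and extraction only, not the paper's full reserving-room reversible data hiding scheme.

```python
def embed_lsb(pixels, bits):
    """Write one payload bit into the least significant bit of each pixel.
    A real RDH scheme would also keep recovery data so the cover image can
    be restored exactly after extraction."""
    stego = list(pixels)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & ~1) | bit  # clear LSB, then set payload bit
    return stego

def extract_lsb(pixels, n):
    """Read back the first n embedded bits."""
    return [p & 1 for p in pixels[:n]]
```

Because each pixel changes by at most 1 grey level, the payload is visually imperceptible; reversibility is what the reserved-room construction adds on top.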

Keywords: secure communication, reserving room before encryption, least significant bits, image encryption, reversible data hiding

Procedia PDF Downloads 389