Search results for: database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1593

903 CFD Prediction of the Round Elbow Fitting Loss Coefficient

Authors: Ana Paula P. dos Santos, Claudia R. Andrade, Edson L. Zaparoli

Abstract:

Pressure loss in ductworks is an important factor in the design of engineering systems such as power plants, refineries, and HVAC systems, since it directly affects energy costs. Ductwork can be composed of straight ducts and different types of fittings (elbows, transitions, converging and diverging tees and wyes). Duct fittings are significant sources of pressure loss in fluid distribution systems; fitting losses can be even more significant than those of equipment components such as coils, filters, and dampers. In the present work, a conventional 90° round elbow under turbulent incompressible airflow is studied. The mass, momentum, and k-ε turbulence model equations are solved employing the finite volume method, with the SIMPLE algorithm used for the pressure-velocity coupling. In order to validate the numerical tool, the elbow pressure loss coefficient is determined under the same conditions as the ASHRAE database and compared against it. Furthermore, the effect of Reynolds number variation on the elbow pressure loss coefficient is investigated. These results can be useful for better preliminary design of air distribution ductworks in air conditioning systems.
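As a rough illustration of the quantity being validated, the fitting loss coefficient is the measured total pressure drop normalized by the dynamic pressure. A minimal sketch, with illustrative numbers that are not the paper's results:

```python
def elbow_loss_coefficient(dp_total_pa, rho_kg_m3, velocity_m_s):
    """Dimensionless fitting loss coefficient C = dp / (rho * V^2 / 2)."""
    dynamic_pressure = 0.5 * rho_kg_m3 * velocity_m_s ** 2
    return dp_total_pa / dynamic_pressure

# Illustrative values only: air at ~1.2 kg/m^3, 5 m/s, 3 Pa drop across the elbow
C = elbow_loss_coefficient(3.0, 1.2, 5.0)  # dimensionless
```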

Keywords: duct fitting, pressure loss, elbow, thermodynamics

Procedia PDF Downloads 374
902 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects

Authors: Mai Ghazal, Ahmed Hammad

Abstract:

Cost overruns in construction projects are a worldwide challenge, since cost performance is one of the main measures of success along with schedule performance. To address this problem, studies have investigated the factors behind cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge. This research studies and analyzes the effect of selected cost-overrun factors using historical data from completed construction projects, then uses these factors to estimate the probability that a cost overrun will occur and to predict its percentage for future projects. First, an intensive literature review was conducted on the factors that cause cost overruns in construction projects, followed by a review of previous research on data mining approaches to cost overruns. Second, a data warehouse was designed that organizations can use to store future project data in a well-organized way so that it can easily be analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
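The quantities being estimated, overrun percentage and probability of occurrence, can be sketched as follows; the project figures are hypothetical, not data from the study:

```python
def cost_overrun_pct(budgeted, actual):
    """Cost overrun expressed as a percentage of the budgeted cost."""
    return 100.0 * (actual - budgeted) / budgeted

# Hypothetical completed projects: (budgeted, actual) cost
history = [(10.0, 11.5), (8.0, 7.9), (15.0, 18.0)]
overruns = [cost_overrun_pct(b, a) for b, a in history]

# Empirical probability of overrun occurrence across the historical projects
p_overrun = sum(o > 0 for o in overruns) / len(overruns)
```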

Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management

Procedia PDF Downloads 348
901 A Comparative Analysis on QRS Peak Detection Using BIOPAC and MATLAB Software

Authors: Chandra Mukherjee

Abstract:

This paper presents work in the field of ECG signal analysis using the MATLAB 7.1 platform. An accurate and simple ECG feature extraction algorithm is presented, and the developed algorithm is validated using BIOPAC software. To detect the QRS peak, the ECG signal is processed through the following stages: first derivative, second derivative, and squaring of the second derivative. The efficiency of the developed algorithm is tested on ECG samples from different databases and on real-time ECG signals acquired using a BIOPAC system. First, a lead-wise threshold value is specified; samples above that value are marked, and the points in the original signal where these marked samples undergo a change of slope are identified as R-peaks. Changes of slope on the left and right sides of the R-peak are then identified as the Q and S peaks, respectively. Next, the built-in detection algorithm of the BIOPAC software is run on the same samples and the two outputs are compared. ECG baseline modulation correction is performed after detecting the characteristic points. The efficiency of the algorithm is assessed using validation parameters such as sensitivity and positive predictivity, and satisfactory values of these parameters were obtained.
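The derivative-and-squaring stages described above can be sketched as below. This is a deliberately simplified stand-in for the paper's algorithm, with a synthetic impulse in place of a real ECG lead:

```python
import numpy as np

def qrs_candidates(ecg, thresh_ratio=0.5):
    """Simplified sketch of the detection stages: first derivative,
    second derivative, squaring, then a lead-specific threshold."""
    d1 = np.diff(ecg)          # first derivative
    d2 = np.diff(d1)           # second derivative
    feature = d2 ** 2          # squaring emphasises the steep QRS slopes
    threshold = thresh_ratio * feature.max()
    return np.where(feature > threshold)[0]

# Synthetic impulse standing in for an R-peak at sample 100
ecg = np.zeros(200)
ecg[100] = 1.0
peaks = qrs_candidates(ecg)
```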

Keywords: first derivative, variable threshold, slope reversal, baseline modulation correction

Procedia PDF Downloads 394
900 The Use of Technology in Mathematics Learning (1995-2024): A Bibliometric Analysis

Authors: Rahma Adinda Sartika

Abstract:

The use of technology in learning mathematics has received a positive response from both students and teachers, so many researchers have conducted research on this theme. Based on the findings of this study, 807 documents relevant to this theme were published in Scopus from 1995 to 2024. After the stages of identification, screening, eligibility, and inclusion, 227 documents met the criteria. These documents were then analyzed using the bibliometric method, showing that the largest number of documents in the Scopus database was published in 2020, with 38 documents, and the lowest from 1996 to 2000 and 2004 to 2007, when no documents were published. The highest number of citations belongs to documents published in 2018, with a total of 349 citations, giving that year a higher h-index than the others. The country that published the most documents relevant to this theme is Indonesia, with a total of 91 documents; the second largest is the United States, with 28 published documents, and the third largest is China, with 15 documents. Indonesia and the United States have the most international working relationships compared to other countries. The research foci related to this theme are 1) mathematics learning, 2) learning systems, 3) engineering education, 4) technology, and 5) mathematical concepts.

Keywords: technology, bibliometric, mathematics learning, mathematical concepts

Procedia PDF Downloads 2
899 Digitization and Morphometric Characterization of Botanical Collection of Indian Arid Zones as Informatics Initiatives Addressing Conservation Issues in Climate Change Scenario

Authors: Dipankar Saha, J. P. Singh, C. B. Pandey

Abstract:

The Indian Thar desert, the seventh largest in the world and the main hot sand desert of India, occupies nearly 385,000 km² (about 9% of the area of the country) and harbours a flora of 682 species (including 63 introduced species) belonging to 352 genera and 87 families. The degree of endemism of plant species in the Thar desert is 6.4 percent, relatively higher than that of the Sahara desert, which is very significant for conservationists. The advent and development of computer technology for digitization and database management, coupled with the rapidly increasing importance of biodiversity conservation, led to the emergence of biodiversity informatics as a discipline of basic science with multiple applications. Aichi Target 19, an outcome of the Convention on Biological Diversity (CBD), specifically mandates the development of an advanced and shared biodiversity knowledge base. Information on species distributions in space is the crux of effective management of biodiversity in a rapidly changing world, and the efficiency of biodiversity management is being increased rapidly by stakeholders such as researchers, policymakers, and funding agencies through the knowledge and application of biodiversity informatics. Herbarium specimens are a vital repository for biodiversity conservation, especially in a climate change scenario; the digitization process aims to improve access and to preserve delicate specimens, and in doing so creates large sets of images as part of the existing repository, an arid plant information facility, for long-term future use. Leaf characters are important for describing taxa and distinguishing between them, and they can be measured from herbarium specimens as well.
As a part of this activity, laminar characterization (leaves being among the most important characters in assessing climate change impact) initially resulted in the classification of more than a thousand collections belonging to ten families: Acanthaceae, Aizoaceae, Amaranthaceae, Asclepiadaceae, Anacardiaceae, Apocynaceae, Asteraceae, Aristolochiaceae, Burseraceae, and Bignoniaceae. Taxonomic diversity indices have also been worked out, this being one of the important domains of biodiversity informatics approaches. The digitization process also encompasses workflows that incorporate automated systems, enabling the digitisation process to be expanded and sped up. The digitisation workflows are based on a modular system with the potential to be scaled up; they are being developed with a geo-referencing tool and additional quality control elements, finally placing specimen images and data into a fully searchable, web-accessible database. Our effort in this paper is to elucidate the role of biodiversity informatics and the present effort of database development for the existing botanical collection of the institute repository. This effort is expected to form part of various global initiatives towards an effective biodiversity information facility. This will enable access to plant biodiversity data that are fit for use by scientists and decision makers working on biodiversity conservation and sustainable development in the region and in iso-climatic situations of the world.

Keywords: biodiversity informatics, climate change, digitization, herbarium, laminar characters, web accessible interface

Procedia PDF Downloads 207
898 Improved Dynamic Bayesian Networks Applied to Arabic On Line Characters Recognition

Authors: Redouane Tlemsani, Abdelkader Benyettou

Abstract:

This work concerns online Arabic character recognition, the principal motivation being to study Arabic manuscript with online technology. The system is Markovian and can be viewed as a Dynamic Bayesian Network (DBN). One of the major interests of these systems resides in training complete models (topology and parameters) from training data. Our approach is based on the dynamic Bayesian network formalism; DBN theory is a generalization of Bayesian networks to dynamic processes. Among our objectives is finding better parameters to represent the links (dependences) between dynamic network variables. In pattern recognition applications, the structure is fixed, which obliges us to adopt some strong assumptions (for example, independence between some variables). Our application concerns online recognition of isolated Arabic characters using our laboratory database, NOUN. A neural tester is proposed for external optimization of the DBN. The DBN scores and mixed DBN achieve 70.24% and 62.50% respectively, which suggests room for further development; other approaches taking time into account were considered and implemented, ultimately reaching a significant recognition rate of 94.79%.

Keywords: Arabic on line character recognition, dynamic Bayesian network, pattern recognition, computer vision

Procedia PDF Downloads 413
897 The Use of Modern Technologies and Computers in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar MehrAfarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology spans diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. This data was then inputted into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, statistics generation, graphic designs, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.

Keywords: Iran, sistan, archaeological surveys, computer use, modern technologies

Procedia PDF Downloads 55
896 Spare Part Inventory Optimization Policy: A Study Literature

Authors: Zukhrof Romadhon, Nani Kurniati

Abstract:

The availability of spare parts is critical to support maintenance tasks and the production system. Managing spare part inventory involves several parameters and objective functions, as well as the tradeoff between inventory costs and spare parts availability. Several mathematical models and methods have been developed to optimize spare part policy, and the many optimization models proposed by researchers need to be examined to identify other potential models. This work presents a review of pertinent literature on spare part inventory optimization and analyzes the gaps for future research. An initial investigation of scholarly search engines and journal databases under specific keywords related to spare parts found about 17K papers. Filtering was conducted based on five main aspects, i.e., replenishment policy, objective function, echelon network, lead time, and model solving, plus the additional aspect of part classification. Future topics could be identified based on the number of papers that have not addressed specific aspects, including joint optimization of spare part inventory and maintenance.

Keywords: spare part, spare part inventory, inventory model, optimization, maintenance

Procedia PDF Downloads 39
895 Utilizing Google Earth for Internet GIS

Authors: Alireza Derambakhsh

Abstract:

The objective of this study is to explore the potential of using Google Earth for Internet GIS applications. The study particularly analyzes the use of vector and attribute data and the potential for displaying and processing these data in new ways on the Google Earth platform. It has increasingly been recognized that future developments in GIS will centre on Internet GIS, in three major areas: GIS data access, spatial information dissemination, and GIS modelling/processing. Google Earth is one of a family of geobrowsers that offer a free and easy-to-use service enabling data with a spatial component to be overlain on top of a 3-D model of the Earth. This study develops a methodological framework to achieve its objective, consisting of three major parts: a database level, an application level, and a client level. As proof of concept, a web prototype has been produced, which incorporates a diverse range of datasets and lets users direct queries and create visualizations of this custom data. The results revealed that both vector and attribute data can be effectively represented and visualized using Google Earth. In addition, the functionality to query custom data and visualize results has been added on top of the Google Earth platform.
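The kind of vector overlay a geobrowser consumes can be sketched as a minimal KML document; the place name and coordinates below are arbitrary examples, not data from the study:

```python
def placemark_kml(name, lon, lat, description=""):
    """Minimal KML Placemark -- the unit of vector data a geobrowser
    such as Google Earth overlays on its 3-D globe."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>{description}</description>"
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

# Arbitrary example coordinates (London)
kml_doc = (
    '<?xml version="1.0" encoding="UTF-8"?>'
    '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>'
    + placemark_kml("Sample site", -0.1276, 51.5072)
    + "</Document></kml>"
)
```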

Keywords: Google Earth, internet GIS, vector data, attribute data

Procedia PDF Downloads 288
894 Validation of Existing Index Properties-Based Correlations for Estimating the Soil–Water Characteristic Curve of Fine-Grained Soils

Authors: Karim Kootahi, Seyed Abolhasan Naeini

Abstract:

The soil-water characteristic curve (SWCC), which represents the relationship between suction and water content (or degree of saturation), is an important property of unsaturated soils. The conventional method for determining SWCC is through specialized testing procedures. Since these procedures require specialized unsaturated soil testing apparatus and lengthy testing programs, several index properties-based correlations have been developed for estimating the SWCC of fine-grained soils. There are, however, considerable inconsistencies among the published correlations and there is no validation study on the predictive ability of existing correlations. In the present study, all existing index properties-based correlations are evaluated using a high quality worldwide database. The performances of existing correlations are assessed both graphically and quantitatively using statistical measures. The results of the validation indicate that most of the existing correlations provide unacceptable estimates of degree of saturation but the most recent model appears to be promising.
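The quantitative assessment of a correlation's predictive ability typically rests on measures such as RMSE and mean bias; a minimal sketch with hypothetical degree-of-saturation values, not the study's database:

```python
import math

def rmse(measured, predicted):
    """Root-mean-square error between measured and estimated values."""
    n = len(measured)
    return math.sqrt(sum((m - p) ** 2 for m, p in zip(measured, predicted)) / n)

def mean_bias(measured, predicted):
    """Positive bias means the correlation over-predicts on average."""
    return sum(p - m for m, p in zip(measured, predicted)) / len(measured)

# Hypothetical degrees of saturation (%) at matched suction values
measured = [95.0, 80.0, 60.0, 40.0]
predicted = [90.0, 78.0, 65.0, 45.0]
err = rmse(measured, predicted)
bias = mean_bias(measured, predicted)
```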

Keywords: SWCC, correlations, index properties, validation

Procedia PDF Downloads 155
893 Rheological Characteristics of Ice Slurries Based on Propylene- and Ethylene-Glycol at High Ice Fractions

Authors: Senda Trabelsi, Sébastien Poncet, Michel Poirier

Abstract:

Ice slurries are considered a promising phase-changing secondary fluid for air-conditioning, packaging, or cooling industrial processes. An experimental study has been carried out here to measure the rheological characteristics of ice slurries. Ice slurries consist of a solid phase (flake ice crystals) and a liquid phase; the latter is composed of a mixture of liquid water and an additive, here either (1) propylene glycol (PG) or (2) ethylene glycol (EG), used to lower the freezing point of water. Concentrations of 5%, 14%, and 24% of both additives are investigated, with ice mass fractions ranging from 5% to 85%. The rheological measurements are carried out using a Discovery HR-2 vane-in-concentric-cylinder geometry with four full-length blades. The experimental results show that the behavior of ice slurries is generally non-Newtonian, with shear-thinning or shear-thickening behavior depending on the experimental conditions. In order to determine the consistency and the flow index, the Herschel-Bulkley model is used to describe the behavior of ice slurries. The present results are finally validated against an experimental database found in the literature and the predictions of an artificial neural network model.
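The Herschel-Bulkley model from which the consistency and flow index are determined is the standard three-parameter law below; the parameter values are illustrative, not fitted values from the study:

```python
import numpy as np

def herschel_bulkley(shear_rate, tau0, k, n):
    """Herschel-Bulkley model: tau = tau0 + K * (shear rate)**n,
    with yield stress tau0, consistency K and flow index n."""
    return tau0 + k * np.asarray(shear_rate) ** n

# Illustrative parameters; n < 1 corresponds to shear-thinning,
# n > 1 to shear-thickening behavior
gamma_dot = np.array([1.0, 10.0, 100.0])   # shear rate, 1/s
tau = herschel_bulkley(gamma_dot, tau0=2.0, k=0.5, n=0.8)
```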

Keywords: ice slurry, propylene-glycol, ethylene-glycol, rheology

Procedia PDF Downloads 245
892 Cloud Computing in Data Mining: A Technical Survey

Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham

Abstract:

Cloud computing poses a diversity of challenges for data mining, arising from the dynamic structure of data distribution as opposed to the typical database scenarios of conventional architectures. Due to the immense number of users seeking data on a daily basis, there are serious security concerns for cloud providers as well as for the data providers who put their data in the cloud computing environment. Big data analytics uses compute-intensive data mining algorithms (hidden Markov models, MapReduce parallel programming, the Mahout project, the Hadoop distributed file system, K-Means and K-Medoids, Apriori) that require efficient high-performance processors to produce timely results, and these algorithms must solve for or optimize the model parameters. The challenges such operations encounter are establishing successful transactions with the existing virtual machine environment and keeping the databases under control. Several factors have led to distributed data mining in place of normal, centralized mining. The approach is offered as SaaS, using multi-agent systems to implement the different tasks of the system. There remain open problems in data mining based on cloud computing, including the design and selection of data mining algorithms.

Keywords: cloud computing, data mining, computing models, cloud services

Procedia PDF Downloads 456
891 Morphological Study of Sesamoid Bones of Thumb in South Indians

Authors: B. V. Murlimanju, R. Abisshek Balaji, Apoorva Aggarwal, Mangala M. Pai

Abstract:

Background: Since the literature on the sesamoid bones of the thumb in the South Indian population is scarce, the present study was undertaken. Its objective was to determine which muscles of the thumb contain these sesamoid bones. Methods: The study included 25 cadaveric thumbs obtained from the anatomy laboratory of our institution. The thumbs were studied for the prevalence of sesamoid bones at the metacarpophalangeal and interphalangeal joints, and the muscles containing these sesamoid bones were identified. Results: In 92% of the thumbs there were two sesamoid bones at the metacarpophalangeal joint, one each at its medial and lateral aspect. The medial sesamoid bone was found inside the adductor pollicis muscle, and the lateral one in either the flexor pollicis brevis or the abductor pollicis brevis muscle. The remaining 2 thumbs (8%) had a solitary sesamoid bone. The interphalangeal joint of the thumb exhibited only one sesamoid bone, in the median plane. Conclusion: The morphological data of the present study on South Indians can be used as a database, enlightening for the operating hand surgeon and radiologist.

Keywords: morphology, muscles, sesamoid bones, thumb

Procedia PDF Downloads 189
890 Quantum Modelling of AgHMoO4, CsHMoO4 and AgCsMoO4 Chemistry in the Field of Nuclear Power Plant Safety

Authors: Mohamad Saab, Sidi Souvi

Abstract:

In a major nuclear accident, the released fission products (FPs) and structural materials are likely to influence the transport of iodine in the reactor coolant system (RCS) of a pressurized water reactor (PWR). So far, the thermodynamic data on cesium and silver species used to estimate the magnitude of FP release show some discrepancies; data are scarce and not reliable. For this reason, it is crucial to review the thermodynamic values for cesium and silver materials. To this end, we have used state-of-the-art quantum chemical methods to compute the formation enthalpies and entropies of AgHMoO₄, CsHMoO₄, and AgCsMoO₄ in the gas phase. Different quantum chemical methods have been investigated (DFT and CCSD(T)) in order to predict the geometrical parameters and the energetics, including the correlation energy. The geometries were optimized with the TPSSh-5%HF method, followed by a single-point calculation of the total electronic energies using the CCSD(T) wave function method. We thus propose, with a final uncertainty of about 2 kJ·mol⁻¹, standard enthalpies of formation of AgHMoO₄, CsHMoO₄, and AgCsMoO₄.

Keywords: nuclear accident, ASTEC code, thermochemical database, quantum chemical methods

Procedia PDF Downloads 174
889 Analysis of Production Forecasting in Unconventional Gas Resources Development Using Machine Learning and Data-Driven Approach

Authors: Dongkwon Han, Sangho Kim, Sunil Kwon

Abstract:

Unconventional gas resources have dramatically changed the future energy landscape. Unlike conventional gas resources, the key challenge in unconventional gas has been the need for advanced approaches to production forecasting, owing to the uncertainty and complexity of fluid flow. In this study, an artificial neural network (ANN) model integrating machine learning and a data-driven approach was developed to predict productivity in shale gas. A database of 129 wells in the Eagle Ford shale basin was used for testing and training the ANN model. Input data related to hydraulic fracturing, well completion, and shale gas productivity were selected, and the output is cumulative production. The performance of the ANN using all data sets, clustering, and variable importance (VI) models was compared using the mean absolute percentage error (MAPE). The MAPE of the ANN model using all data sets, clustering, and VI was 44.22%, 10.08% (cluster 1), 5.26% (cluster 2), 6.35% (cluster 3), and 32.23% (ANN VI), 23.19% (SVM VI), respectively. The results showed that the pre-trained ANN model provides more accurate results than the ANN model using all data sets.
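The MAPE used above to compare the model variants has a simple definition; a minimal sketch with hypothetical production values, not data from the study:

```python
def mape(actual, predicted):
    """Mean absolute percentage error between actual and predicted values."""
    n = len(actual)
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n

# Hypothetical cumulative-production values
actual = [100.0, 200.0, 400.0]
predicted = [110.0, 180.0, 420.0]
error = mape(actual, predicted)  # percent
```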

Keywords: unconventional gas, artificial neural network, machine learning, clustering, variables importance

Procedia PDF Downloads 178
888 Emotional Intelligence in the Modern World: A Quantitative and Qualitative Study of the UMCS Students

Authors: Anna Dabrowska

Abstract:

Taking Daniel Goleman's (1994) belief that success in life depends 20% on IQ and 80% on emotional intelligence, and that emotional intelligence is worth considering as an important factor in human performance and development potential, the aim of the paper is to explore the range of emotions experienced by university students who represent Society 5.0. This quantitative and qualitative study explores not only the emotions most and least experienced by the students, but also the main reasons behind these feelings. The database of the study consists of 115 respondents out of the 129 students of the 1st and 5th years of Applied Linguistics at Maria Curie-Skłodowska University, which constitutes 89% of those surveyed. The data is extracted from an anonymous questionnaire comprising the young people's answers and discourse concerning the causes of their most experienced emotions. Following Robert Plutchik's theory of eight primary emotions, i.e. anger, fear, sadness, disgust, surprise, anticipation, trust, and joy, we adopt his argument for the primacy of these emotions by showing each to be the trigger of behaviour with high survival value; all other emotions are mixed or derivative states that occur as combinations, mixtures, or compounds of the primary emotions. Accordingly, the eight primary emotions and their mixed states are examined in the students.

Keywords: emotions, intelligence, students, discourse study, emotional intelligence

Procedia PDF Downloads 15
887 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create a mobile tool to analyze rice diseases quickly and easily. Object-oriented software engineering principles and the Objective-C language were used as the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease, or the color appearing on the rice leaves, and view the recognition analysis results on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level, 57%. The plant experts commented that adding more disease symptoms to the database would yield more precise analysis results. For further research, it is suggested that an image processing system be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
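A decision tree maps the user-selected leaf features to a disease label through nested rules. The toy tree below illustrates the technique only; the branching conditions and disease names are assumptions, not the paper's actual model:

```python
def classify_leaf(lesion_color, lesion_shape):
    """Toy decision tree over two leaf features; rules and labels
    are illustrative, not the paper's trained tree."""
    if lesion_color == "brown":
        if lesion_shape == "oval":
            return "brown spot"
        return "rice blast"
    if lesion_color == "yellow":
        return "bacterial leaf blight"
    return "unknown"
```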

Keywords: rice disease, data analysis system, mobile application, iOS operating system

Procedia PDF Downloads 269
886 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using Machine Learning System

Authors: J. K. Adedeji, M. O. Oyekanmi

Abstract:

This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing identity-related crimes, hence the need for a system that incorporates biometric characteristics such as DNA and recognition of patterns of variation in facial expressions. The facial model is based on the OpenCV library, which relies on certain physiological features; a Raspberry Pi 3 module runs the OpenCV code, which extracts the faces detected through the camera and stores them in the datasets directory. The model is trained with 50 epochs on the database and recognized by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is back propagation, coded in Python, with 200 epochs run to identify specific resemblance in the exclusive OR (XOR) output neurons. The research confirmed that physiological parameters are more effective measures for curbing identity-related crimes.
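The LBPH recogniser mentioned above builds histograms of local binary pattern codes. The sketch below reimplements only the core 3×3 LBP-histogram step in NumPy as an illustration; OpenCV's actual recogniser adds grid subdivision, configurable radii, and histogram matching on top:

```python
import numpy as np

def lbp_histogram(gray):
    """Normalised histogram of 3x3 Local Binary Pattern codes.
    Each pixel is encoded by comparing its 8 neighbours to the centre."""
    h, w = gray.shape
    center = gray[1:-1, 1:-1]
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = gray[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    hist, _ = np.histogram(codes, bins=256, range=(0, 256))
    return hist / hist.sum()

# A flat patch: every neighbour ties the centre, so every code is 255
hist = lbp_histogram(np.zeros((6, 6), dtype=np.uint8))
```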

Keywords: biometric characters, facial recognition, neural network, OpenCV

Procedia PDF Downloads 235
885 Assessment of Naturally Occurring Radionuclides of the Surface Water in Vaal River, South Africa

Authors: Kgantsi B. T., Ochwelwang A. R., Mathuthu M., Jegede O. A.

Abstract:

Anthropogenic activities near water bodies contribute to poor water quality, which degrades the condition of the biota and elevates the risk to human health. The Vaal River is essential in supplying Gauteng and neighboring regions of South Africa with potable water for a variety of consumers and industries. Consequently, it is necessary to monitor and assess the radioactive risk in relation to the river's water quality. This study used an inductively coupled plasma mass spectrometer (ICP-MS) to analyze the radionuclide activity concentrations in the Vaal River, South Africa. Along with thorium and potassium, the total uranium concentration was calculated using the isotopic content of uranium. The elemental concentrations of ²³⁸U, ²³⁵U, ²³⁴U, ²³²Th, and ⁴⁰K were translated into activity concentrations. To assess the water safety for all users and consumers, all values were compared to the world average activity concentrations of 35, 30, and 400 Bq·kg⁻¹ for ²³⁸U, ²³²Th, and ⁴⁰K, respectively, according to the UNSCEAR report. The results will serve as a database for further monitoring and evaluation of radionuclides from the river, taking cognisance of potential health hazards.
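Translating an elemental (mass) concentration into an activity concentration rests on A = λN. A minimal sketch for the specific activity of pure ²³⁸U; the half-life and molar mass are standard reference values, not figures from the study:

```python
import math

AVOGADRO = 6.02214076e23   # atoms per mole
YEAR_S = 3.1557e7          # seconds per Julian year

def specific_activity_bq_per_g(half_life_years, molar_mass_g_mol):
    """A = lambda * N: decay constant times the number of atoms in one gram."""
    lam = math.log(2) / (half_life_years * YEAR_S)
    n_atoms = AVOGADRO / molar_mass_g_mol
    return lam * n_atoms

# Pure U-238 (half-life ~4.468e9 y, molar mass ~238.05 g/mol),
# which comes out around 1.24e4 Bq per gram
a_u238 = specific_activity_bq_per_g(4.468e9, 238.05)
```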

Keywords: Vaal River, ICPMS, uranium, risks

Procedia PDF Downloads 149
884 CT Image-Based Dense Facial Soft Tissue Thickness Measurement by Open-Source Tools in Chinese Population

Authors: Ye Xue, Zhenhua Deng

Abstract:

Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks located manually on the face and skull. However, with the development of computer-assisted imaging technologies, automated measurement over dense points on 3D facial and skull models using open-source software has become a viable option. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Therefore, establishing a comprehensive, detailed, and densely calculated FSTT database is crucial to enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, 170 of whom were born and reside in northern China and 80 in southern China. The age of the participants ranged from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Additionally, samples were divided into three categories based on BMI. The 3D Slicer software was utilized to segment bone and soft tissue based on different Hounsfield unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were performed using MeshLab: the face models were converted into hollowed, cropped surface models, and the Hausdorff distance (taken as the FSTT) between the skull and face models was measured automatically. Hausdorff point clouds were colorized based on depth value and exported as PLY files. A histogram of the depth distribution could be viewed and subdivided into smaller increments. All PLY files were visualized with the Hausdorff distance value of each vertex. Basic descriptive statistics (mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analyzed with respect to sex, age, BMI, and birthplace.
Statistical methods employed included multiple regression analysis, ANOVA, and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the results of the PCA. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in regions such as the forehead, orbit, mandible, and zygoma. Specifically, there are distribution variances in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead than southern males, while females show fewer distribution differences between north and south, except in the zygoma region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the distribution of FSTT in Chinese individuals and suggests that open-source tools function well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
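The per-vertex face-to-skull measurement described above can be sketched on point-sampled meshes. This is a simplified stand-in for MeshLab's Hausdorff distance filter: it measures vertex-to-nearest-vertex rather than vertex-to-surface distances, and each per-vertex value plays the role of the soft tissue thickness at that point.

```python
import math

def nearest_distances(face_pts, skull_pts):
    """For each face vertex, the distance to its nearest skull vertex.

    Each value is the dense FSTT estimate at that face vertex.
    """
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return [min(dist(f, s) for s in skull_pts) for f in face_pts]

def directed_hausdorff(face_pts, skull_pts):
    """One-sided Hausdorff distance: the largest of the per-vertex minima."""
    return max(nearest_distances(face_pts, skull_pts))
```

In practice the per-vertex distances, not only the maximum, are exported (here they would be written per vertex to a PLY file) so their histogram and descriptive statistics can be analyzed as in the study.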

Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool

Procedia PDF Downloads 47
883 Aggregate Fluctuations and the Global Network of Input-Output Linkages

Authors: Alexander Hempfing

Abstract:

The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from both an empirical and a theoretical perspective. This paper empirically analyses how the production structure of the global economy and of several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow the identification of structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others while serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
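The eigenvector centrality used above to rank sectors can be computed by power iteration over an input-output adjacency matrix. A minimal sketch, assuming a small toy matrix rather than the World Input-Output Database:

```python
def eigenvector_centrality(adj, iters=200):
    """Eigenvector centrality of an adjacency matrix via power iteration.

    adj[i][j] is the linkage weight from sector i to sector j; the dominant
    eigenvector scores each sector by the centrality of its neighbours.
    Scores are normalised so the most central sector has value 1.
    """
    n = len(adj)
    x = [1.0] * n
    for _ in range(iters):
        # One multiplication by the adjacency matrix...
        y = [sum(adj[i][j] * x[j] for j in range(n)) for i in range(n)]
        # ...then renormalise so the scores stay bounded.
        norm = max(y) or 1.0
        x = [v / norm for v in y]
    return x
```

On the toy graph below (a triangle of sectors 0-1-2 with sector 3 linked only to sector 0), sector 0 comes out most central and the peripheral sector 3 least central, which is the kind of hub identification the paper performs at scale.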

Keywords: economic integration, industrial organization, input-output economics, network economics, production networks

Procedia PDF Downloads 252
882 Computing Continuous Skyline Queries without Discriminating between Static and Dynamic Attributes

Authors: Ibrahim Gomaa, Hoda M. O. Mokhtar

Abstract:

Although most existing skyline query algorithms have focused on querying static points in static databases, the demand for continuous skyline queries has increased with the expanding number of sensors, wireless communications and mobile applications. Unlike traditional skyline queries, which only consider static attributes, continuous skyline queries include dynamic attributes as well as static ones. However, as skyline query computation is based on checking the dominance of skyline points over all dimensions, both the static and the dynamic attributes must be considered without separation. In this paper, we present an efficient algorithm for computing continuous skyline queries without discriminating between static and dynamic attributes. In brief, our algorithm proceeds as follows: First, it excludes the points which cannot be in the initial skyline result; this pruning phase reduces the required number of comparisons. Second, the association between the spatial positions of data points is examined; this phase gives an idea of where changes in the result might occur and consequently enables us to efficiently update the skyline result (continuous update) rather than computing the skyline from scratch. Finally, an experimental evaluation is provided which demonstrates the accuracy, performance and efficiency of our algorithm over other existing approaches.
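The dominance check at the heart of skyline computation can be sketched as follows. This is a basic block-nested-loop skyline over minimisation attributes, with static and dynamic attributes treated uniformly as dimensions of each point; the paper's pruning and continuous-update phases are omitted.

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly
    better in at least one (here: smaller is better)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Block-nested-loop skyline: keep the points no other point dominates."""
    result = []
    for p in points:
        if any(dominates(q, p) for q in points):
            continue
        result.append(p)
    return result
```

For a continuous query, a dynamic attribute (e.g. distance to a moving query point) would simply be recomputed per timestamp and placed in its dimension slot before the same dominance test is applied.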

Keywords: continuous query processing, dynamic database, moving object, skyline queries

Procedia PDF Downloads 196
881 Bibliometrics of 'Community Garden' and Associated Keywords

Authors: Guilherme Reis Ranieri, Guilherme Leite Gaudereto, Michele Toledo, Luis Fernando Amato-Lourenco, Thais Mauad

Abstract:

Given the importance of urban sustainability and the growing relevance of the term ‘community garden’, this paper conducts a bibliometric analysis of the term. Using SCOPUS as the database, we analyzed 105 articles that contained the keywords ‘community garden’ and conducted a cluster analysis of the associated keywords. As results, we found 205 articles and 404 different keywords. Among the keywords, 334 appear only once, 44 appear twice and 9 appear three times. The most frequent keywords are: community food systems (74), urban activism (14), communities of practice (6), food production (6) and public rhetoric (5). The areas containing the most articles are social sciences (74), environmental science (29) and agricultural and biological sciences (24). The three countries that concentrate the most papers are the United States (54), Canada (15) and Australia (12). The main journal with these keywords is Local Environment (10). The first publication was in 1999, and the period up to 2010 concentrated 30.5% of the publications; the other 69.5% occurred from 2010 to 2015, indicating an increase in frequency. We can conclude that the papers, based on the distribution of the keywords, are still scattered across various research topics and present high variability between subjects.

Keywords: bibliometrics, community garden, metrics, urban agriculture

Procedia PDF Downloads 341
880 Use of Computer and Peripherals in the Archaeological Surveys of Sistan in Eastern Iran

Authors: Mahyar Mehrafarin, Reza Mehrafarin

Abstract:

The Sistan region in eastern Iran is a significant archaeological area in Iran and the Middle East, encompassing 10,000 square kilometers. Previous archaeological field surveys have identified 1662 ancient sites dating from prehistoric periods to the Islamic period. Research Aim: This article aims to explore the utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, and the benefits derived from their implementation. Methodology: The research employs a descriptive-analytical approach combined with field methods. New technologies and software, such as GPS, drones, magnetometers, equipped cameras, satellite images, and software programs like GIS, MapSource, and Excel, were utilized to collect information and analyze data. Findings: The use of modern technologies and computers in archaeological field surveys proved to be essential. Traditional archaeological activities, such as excavation and field surveys, are time-consuming and costly. Employing modern technologies helps in preserving ancient sites, accurately recording archaeological data, reducing errors and mistakes, and facilitating correct and accurate analysis. Creating a comprehensive and accessible database, generating statistics, and producing graphic designs and diagrams are additional advantages derived from the use of efficient technologies in archaeology. Theoretical Importance: The integration of computers and modern technologies in archaeology contributes to interdisciplinary collaborations and facilitates the involvement of specialists from various fields, such as geography, history, art history, anthropology, laboratory sciences, and computer engineering. The utilization of computers in archaeology has spanned diverse areas, including database creation, statistical analysis, graphics implementation, laboratory and engineering applications, and even artificial intelligence, which remains an unexplored area in Iranian archaeology.
Data Collection and Analysis Procedures: Information was collected using modern technologies and software, capturing geographic coordinates, aerial images, archaeogeophysical data, and satellite images. These data were then entered into various software programs for analysis, including GIS, MapSource, and Excel. The research employed both descriptive and analytical methods to present the findings effectively. Question Addressed: The primary question addressed in this research is how the use of modern technologies and computers in archaeological field surveys in Sistan, Iran, can enhance archaeological data collection, preservation, analysis, and accessibility. Conclusion: The utilization of modern technologies and computers in archaeological field surveys in Sistan, Iran, has proven to be necessary and beneficial. These technologies aid in preserving ancient sites, accurately recording archaeological data, reducing errors, and facilitating comprehensive analysis. The creation of accessible databases, the generation of statistics, graphic design, and interdisciplinary collaborations are further advantages observed. It is recommended to explore the potential of artificial intelligence in Iranian archaeology as an unexplored area. The research has implications for cultural heritage organizations, archaeology students, and universities involved in archaeological field surveys in Sistan and Baluchistan province. Additionally, it contributes to enhancing the understanding and preservation of Iran's archaeological heritage.
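The comprehensive, accessible site database described above can be illustrated with a minimal sketch. The schema, field names, and site records here are hypothetical stand-ins, not the actual Sistan survey data; the point is how a structured store supports the statistics generation the survey relies on.

```python
import sqlite3

# In-memory stand-in for a survey database; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sites (
    name TEXT, period TEXT, lat REAL, lon REAL)""")
conn.executemany(
    "INSERT INTO sites VALUES (?, ?, ?, ?)",
    [("Site A", "Prehistoric", 30.95, 61.50),
     ("Site B", "Islamic", 31.02, 61.48),
     ("Site C", "Islamic", 30.88, 61.60)])

# Summary statistics of the kind the survey reports: site counts per period.
counts = dict(conn.execute(
    "SELECT period, COUNT(*) FROM sites GROUP BY period ORDER BY period"))
```

Coordinates recorded by GPS in the field drop directly into such a table, after which period counts, density maps, and exports to GIS software become simple queries.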

Keywords: archaeological surveys, computer use, Iran, modern technologies, Sistan

Procedia PDF Downloads 62
879 European Food Safety Authority (EFSA) Safety Assessment of Food Additives: Data and Methodology Used for the Assessment of Dietary Exposure for Different European Countries and Population Groups

Authors: Petra Gergelova, Sofia Ioannidou, Davide Arcella, Alexandra Tard, Polly E. Boon, Oliver Lindtner, Christina Tlustos, Jean-Charles Leblanc

Abstract:

Objectives: To assess chronic dietary exposure to food additives in different European countries and population groups. Method and Design: The European Food Safety Authority’s (EFSA) Panel on Food Additives and Nutrient Sources added to Food (ANS) estimates chronic dietary exposure to food additives with the purpose of re-evaluating food additives that were previously authorized in Europe. For this, EFSA uses concentration values (usage and/or analytical occurrence data) reported through regular public calls for data by the food industry and European countries. These are combined, at the individual level, with national food consumption data from the EFSA Comprehensive European Food Consumption Database, which includes data from 33 dietary surveys in 19 European countries and considers six different population groups (infants, toddlers, children, adolescents, adults and the elderly). The EFSA ANS Panel estimates dietary exposure for each individual in the EFSA Comprehensive Database by combining the occurrence levels per food group with the corresponding consumption amount per kg body weight. An individual average exposure per day is calculated, resulting in distributions of individual exposures per survey and population group. Based on these distributions, the average and the 95th percentile of exposure are calculated per survey and per population group. Dietary exposure is assessed based on two different sets of data: (a) maximum permitted levels (MPLs) of use set down in the EU legislation (defined as the regulatory maximum level exposure assessment scenario) and (b) usage levels and/or analytical occurrence data (defined as the refined exposure assessment scenario). The refined exposure assessment scenario is sub-divided into the brand-loyal consumer scenario and the non-brand-loyal consumer scenario.
For the brand-loyal consumer scenario, the consumer is considered to be exposed on a long-term basis to the highest reported usage/analytical level for one food group, and at the mean level for the remaining food groups. For the non-brand-loyal consumer scenario, the consumer is considered to be exposed on a long-term basis to the mean reported usage/analytical level for all food groups. Additional exposure from sources other than the direct addition of food additives (i.e. natural presence, contaminants, and carriers of food additives) is also estimated, as appropriate. Results: Since 2014, this methodology has been applied in about 30 food additive exposure assessments conducted as part of scientific opinions of the EFSA ANS Panel. For example, under the non-brand-loyal scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers, at up to 5.9 and 8.7 mg/kg body weight/day, respectively. The same estimates under the brand-loyal scenario in toddlers resulted in exposures of 8.1 and 20.7 mg/kg body weight/day, respectively. For the regulatory maximum level exposure assessment scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers, at up to 11.9 and 30.3 mg/kg body weight/day, respectively. Conclusions: Detailed and up-to-date information on food additive concentration values (usage and/or analytical occurrence data) and food consumption data enables the assessment of chronic dietary exposure to food additives at more realistic levels.
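The per-individual calculation described above, combining occurrence levels per food group with consumption per kg body weight and then taking distribution percentiles, can be sketched as follows. The food groups and levels are illustrative, and the nearest-rank percentile rule is a simple convention; EFSA's exact estimator may differ.

```python
import math

def exposure_per_individual(diary, levels):
    """Chronic exposure for one individual: sum over food groups of mean
    daily consumption (g per kg body weight) times the additive's
    occurrence level (mg per g of food), giving mg/kg bw/day."""
    return sum(grams_per_kg_bw * levels.get(group, 0.0)
               for group, grams_per_kg_bw in diary.items())

def p95(values):
    """95th percentile of a distribution by the nearest-rank rule."""
    ordered = sorted(values)
    rank = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[rank]
```

Applying `exposure_per_individual` to every subject in a survey and then `p95` to the resulting list reproduces, in miniature, the per-survey 95th-percentile estimates the Panel reports; the brand-loyal vs non-brand-loyal scenarios differ only in which occurrence `levels` dictionary is passed in.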

Keywords: α-tocopherol, ammonium phosphatides, dietary exposure assessment, European Food Safety Authority, food additives, food consumption data

Procedia PDF Downloads 297
878 Brain Computer Interface Implementation for Affective Computing Sensing: Classifiers Comparison

Authors: Ramón Aparicio-García, Gustavo Juárez Gracia, Jesús Álvarez Cedillo

Abstract:

One research line of computer science involves the study of Human-Computer Interaction (HCI), which seeks to recognize and interpret user intent through the storage and subsequent analysis of the electrical signals of the brain, in order to use them in the control of electronic devices. Affective computing research, in turn, applies human emotions to the HCI process, helping to reduce user frustration. This paper shows the results obtained during the hardware and software development of a Brain-Computer Interface (BCI) capable of recognizing human emotions through the association of brain electrical activity patterns. The hardware comprises the sensing stage and analog-to-digital conversion. The interface software comprises algorithms for pre-processing of the signal, analysis in the time and frequency domains, and the classification of patterns associated with electrical brain activity. The methods used for the analysis and classification of the signal were tested separately using a publicly accessible database, and a comparison among classifiers was performed in order to identify the best-performing one.
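The classifier-comparison step can be illustrated with a toy pipeline: a crude band-power feature followed by two simple classifiers whose accuracies are compared. This is a sketch only; the study's actual features and classifiers are not specified here, and the labels are hypothetical emotional states.

```python
def band_power(signal):
    """Mean squared amplitude: a crude stand-in for the spectral band-power
    features a real pipeline would extract from a filtered EEG epoch."""
    return sum(s * s for s in signal) / len(signal)

def nearest_centroid_fit(features, labels):
    """Return a classifier that assigns the label of the closest class mean."""
    groups = {}
    for f, y in zip(features, labels):
        groups.setdefault(y, []).append(f)
    centroids = {y: sum(v) / len(v) for y, v in groups.items()}
    return lambda x: min(centroids, key=lambda y: abs(x - centroids[y]))

def one_nn_fit(features, labels):
    """Return a 1-nearest-neighbour classifier over the training set."""
    return lambda x: min(zip(features, labels), key=lambda fy: abs(x - fy[0]))[1]

def accuracy(clf, features, labels):
    """Fraction of test examples the classifier labels correctly."""
    return sum(clf(x) == y for x, y in zip(features, labels)) / len(labels)
```

Fitting both classifiers on the same training features and scoring them with `accuracy` on held-out epochs mirrors the comparison the paper reports, just at toy scale.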

Keywords: affective computing, interface, brain, intelligent interaction

Procedia PDF Downloads 366
877 Maternal Smoking and Risk of Childhood Overweight and Obesity: A Meta-Analysis

Authors: Martina Kanciruk, Jac J. W. Andrews, Tyrone Donnon

Abstract:

The purpose of this study was to determine the significance of maternal smoking for the development of childhood overweight and/or obesity. Accordingly, a systematic literature review of English-language studies published from 1980 to 2012 was conducted using the following databases: MEDLINE, PsychINFO, Cochrane Database of Systematic Reviews, and Dissertation Abstracts International. The following terms were used in the search: pregnancy, overweight, obesity, smoking, parents, childhood, risk factors. Eighteen studies of maternal smoking during pregnancy and obesity conducted in Europe, Asia, North America, and South America met the inclusion criteria. A meta-analysis of these studies indicated that maternal smoking during pregnancy is a significant risk factor for overweight and obesity; children of mothers who smoked during pregnancy are at a greater risk of developing obesity or overweight; and the quantity of cigarettes consumed by the mother during pregnancy influenced the odds of offspring overweight and/or obesity. In addition, the results from the moderator analyses suggest that part of the heterogeneity discovered between the studies can be explained by the region of the world in which the study occurred and the age of the child at the time of weight assessment.
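The pooling step of such a meta-analysis can be sketched as fixed-effect inverse-variance weighting of log odds ratios. This is a generic illustration of the pooling technique, not the authors' exact model, which also included moderator analyses; the study values below are made up.

```python
import math

def pooled_odds_ratio(odds_ratios, std_errs):
    """Fixed-effect inverse-variance pooling of study odds ratios.

    Works on the log scale, weighting each study's log odds ratio by
    1 / SE^2, then exponentiates the weighted mean back to an OR.
    """
    weights = [1.0 / se ** 2 for se in std_errs]
    log_pooled = sum(w * math.log(o)
                     for w, o in zip(weights, odds_ratios)) / sum(weights)
    return math.exp(log_pooled)
```

Moderator analyses of the kind reported (region, child age) would then compare pooled estimates across subgroups of studies to explain between-study heterogeneity.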

Keywords: childhood obesity, overweight, smoking, parents, risk factors

Procedia PDF Downloads 504
876 Content-Based Mammograms Retrieval Based on Breast Density Criteria Using Bidimensional Empirical Mode Decomposition

Authors: Sourour Khouaja, Hejer Jlassi, Nadia Feddaoui, Kamel Hamrouni

Abstract:

Most medical images, and especially mammograms, are now stored in large databases. Retrieving a desired image is considered of great importance in order to find diagnoses of previous similar cases. Our method is implemented to assist radiologists in retrieving mammographic images containing breasts with a density aspect similar to that seen on the query mammogram. This is a challenge given the importance of density criteria in cancer prediction and their effect on segmentation issues. We used the BEMD (Bidimensional Empirical Mode Decomposition) to characterize the content of images and the Euclidean distance to measure the similarity between images. Through the experiments on the MIAS mammography image database, we confirm that the results are promising. The performance was evaluated using precision and recall curves comparing query and retrieved images. Computing recall-precision proved the effectiveness of applying CBIR in large mammographic image databases. We found a precision of 91.2% for mammography with a recall of 86.8%.
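The retrieval and evaluation steps can be sketched as follows: rank database images by the Euclidean distance between feature vectors (standing in for the BEMD-derived descriptors, whose extraction is not shown) and score the result with precision and recall. The feature values and record structure are illustrative.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve(query_feat, database, k):
    """Return the k database entries whose feature vectors are closest
    to the query feature vector."""
    ranked = sorted(database, key=lambda item: euclidean(query_feat, item["feat"]))
    return ranked[:k]

def precision_recall(retrieved, relevant_ids):
    """Precision and recall of a retrieved set against the ground truth."""
    hits = sum(1 for item in retrieved if item["id"] in relevant_ids)
    return hits / len(retrieved), hits / len(relevant_ids)
```

Sweeping `k` and plotting the resulting (recall, precision) pairs yields the precision-recall curves used in the evaluation above.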

Keywords: BEMD, breast density, content-based, image retrieval, mammography

Procedia PDF Downloads 215
875 Research Trends on Magnetic Graphene for Water Treatment: A Bibliometric Analysis

Authors: J. C. M. Santos, J. C. A. Sousa, A. J. Rubio, L. S. Soletti, F. Gasparotto, N. U. Yamaguchi

Abstract:

Magnetic graphene has received widespread attention for its capability in water and wastewater treatment, which has attracted many researchers to this field. A bibliometric analysis based on the Web of Science database was employed to analyze the global scientific output on magnetic graphene for water treatment up to the present time (2012 to 2017), to improve the understanding of the research trends. The publication year, place of publication, institutes, funding agencies, journals, most cited articles, distribution of outputs in thematic categories and applications were analyzed. Three major aspects analyzed, including the type of pollutant, the treatment process and the composite composition, have further contributed to revealing the research trends. The most relevant research aspects of the main technologies using magnetic graphene for water treatment are summarized in this paper. The results showed that research on magnetic graphene for water treatment is going through a period of decline, which might be related to a saturated field and a lack of bibliometric studies. Thus, the results of the present work will help researchers establish future directions for further studies using magnetic graphene for water treatment.

Keywords: composite, graphene oxide, nanomaterials, scientometrics

Procedia PDF Downloads 231
874 The Role of Pulmonary Resection in Complicated Primary Pediatric Pulmonary Tuberculosis: An Evidence-Based Case Report

Authors: Hendra Wibowo, Suprayitno Wardoyo, Dhama Shinta

Abstract:

Introduction: The incidence of pediatric pulmonary tuberculosis (TB) is increasing, with many cases undetected. In complicated TB, treatment should aim at restoring pulmonary function, preventing further complications, and eliminating bacteria. The management of complicated TB is still controversial, and surgery is one of the treatments whose role in treating complicated TB should be evaluated. Method: This study was an evidence-based case report. The databases used for the literature search were Cochrane, Medline, ProQuest, and ScienceDirect. Keywords for the search were ‘primary pulmonary tuberculosis’, ‘surgery’, ‘lung resection’, and ‘children’. Inclusion criteria were studies in English or Indonesian, with children under 18 years old as subjects, and full-text articles available. The assessment was done according to the Oxford Centre for Evidence-Based Medicine 2011 criteria. Results: Six cohort studies were analyzed. Surgery was indicated for patients with complicated TB who were unresponsive to treatment. It should be noted that the studies were conducted before the standard WHO antituberculosis therapy was applied; thus, the results may differ from current practice. Conclusion: There is currently no guideline on pulmonary resection. However, surgery yielded better mortality and morbidity outcomes in children with complicated pulmonary TB.

Keywords: pediatric, pulmonary, surgery, therapy, tuberculosis

Procedia PDF Downloads 93