Search results for: “User acceptance of computer technology:A comparison of two theoretical models ”
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 23406

18756 Integrating Blockchain and Internet of Things Platforms: An Empirical Study on Immunization Cold Chain

Authors: Fawzia Abujalala, Asma Elmangoush, Majdi Ashibani

Abstract:

The adoption of Blockchain technology makes it possible to decentralize cold chain systems, making them more efficient, accessible, and verifiable, and improving data security. Additionally, the Internet of Things (IoT) concept adds value to various application domains, cargo tracking and cold chains among them. However, the security of IoT transactions and integrated devices remains one of the key challenges to the success of IoT applications. Consequently, Blockchain technology and its consensus protocols have been used to solve many information security problems. In this paper, the researchers discuss the advantages of integrating Blockchain technology into an IoT platform to improve security, provide an overview of the existing literature on integrating Blockchain and IoT platforms, and then present an immunization cold chain solution, based on integrating the Hyperledger Fabric platform with an IoT platform, as a use case that could apply to any critical goods.
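The tamper-evidence property the abstract relies on can be illustrated with a minimal hash chain of cold-chain sensor readings. This is a sketch only: real Hyperledger Fabric deployments use signed, consensus-ordered transactions, and the `temp_c` readings below are invented.

```python
import hashlib
import json

def make_block(reading, prev_hash):
    """Create a tamper-evident record of one cold-chain sensor reading."""
    payload = json.dumps({"reading": reading, "prev": prev_hash}, sort_keys=True)
    return {"reading": reading, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def build_chain(readings):
    """Chain readings so any later edit invalidates all subsequent hashes."""
    chain, prev = [], "0" * 64
    for r in readings:
        block = make_block(r, prev)
        chain.append(block)
        prev = block["hash"]
    return chain

def verify_chain(chain):
    """Recompute every hash; return False if any block was altered."""
    prev = "0" * 64
    for block in chain:
        expected = make_block(block["reading"], prev)["hash"]
        if block["hash"] != expected or block["prev"] != prev:
            return False
        prev = block["hash"]
    return True

# Invented fridge temperature log for an immunization shipment.
chain = build_chain([{"temp_c": 4.1}, {"temp_c": 4.3}, {"temp_c": 9.8}])
```

Altering any stored reading after the fact makes `verify_chain` fail, which is the traceability guarantee the cold-chain use case needs.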

Keywords: blockchain, hyperledger fabric, internet of things, security, traceability

Procedia PDF Downloads 134
18755 On the Use of Analytical Performance Models to Design a High-Performance Active Queue Management Scheme

Authors: Shahram Jamali, Samira Hamed

Abstract:

One of the open issues in the Random Early Detection (RED) algorithm is how to set its parameters to achieve high performance under the dynamic conditions of the network. Although the original RED uses fixed values for its parameters, this paper follows a model-based approach to improve the performance of the RED algorithm. It models the router's queue behavior using a Markov model and uses this model to predict future conditions of the queue. This prediction helps the proposed algorithm tune RED's parameters and provide better efficiency and performance. Extensive packet-level simulations confirm that the proposed algorithm, called Markov-RED, outperforms RED and FARED in terms of queue stability, bottleneck utilization, and dropped-packet count.
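For context, the fixed-parameter RED behavior that Markov-RED tunes can be sketched as follows. This is the classic RED drop rule and EWMA queue average, not the paper's Markov predictor, and the parameter values are illustrative.

```python
def red_drop_probability(avg_q, min_th, max_th, max_p):
    """Classic RED: drop probability grows linearly between the thresholds."""
    if avg_q < min_th:
        return 0.0
    if avg_q >= max_th:
        return 1.0
    return max_p * (avg_q - min_th) / (max_th - min_th)

def ewma_queue(avg, sample, weight=0.002):
    """Exponentially weighted moving average of the instantaneous queue length."""
    return (1 - weight) * avg + weight * sample
```

Markov-RED's contribution is to adjust `min_th`, `max_th`, and `max_p` dynamically from predicted queue conditions instead of leaving them fixed.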

Keywords: active queue management, RED, Markov model, random early detection algorithm

Procedia PDF Downloads 527
18754 Using Historical Data for Stock Prediction

Authors: Sofia Stoica

Abstract:

In this paper, we use historical data to predict the stock price of a tech company. To this end, we use a dataset consisting of the stock prices over the past five years of ten major tech companies: Adobe, Amazon, Apple, Facebook, Google, Microsoft, Netflix, Oracle, Salesforce, and Tesla. We experimented with a variety of models (a linear regression model, k-nearest neighbors (KNN), and a sequential neural network) and algorithms (Multiplicative Weight Update and AdaBoost). We found that the sequential neural network performed best, with a testing error of 0.18%. Interestingly, the linear model performed second best, with a testing error of 0.73%. These results show that historical data alone is enough to obtain high accuracy, and that a simple algorithm like linear regression performs similarly to more sophisticated models while taking less time and fewer resources to implement.
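The claim that plain linear regression is competitive can be illustrated with a closed-form least-squares fit on a price series. The five prices below are invented, not the paper's dataset.

```python
def fit_linear(xs, ys):
    """Ordinary least squares fit y = a*x + b, in closed form."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical closing prices over five trading days.
days = [0, 1, 2, 3, 4]
prices = [100.0, 101.5, 103.0, 104.5, 106.0]
slope, intercept = fit_linear(days, prices)
next_day_forecast = slope * 5 + intercept
```

On a trending series like this, the one-line extrapolation already captures the signal, which is the spirit of the paper's finding.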

Keywords: finance, machine learning, opening price, stock market

Procedia PDF Downloads 167
18753 Open Forging of Cylindrical Blanks Subjected to Lateral Instability

Authors: A. H. Elkholy, D. M. Almutairi

Abstract:

The successful and efficient execution of a forging process depends on the correct analysis of the loading and metal flow of blanks. This paper investigates the Upper Bound Technique (UBT) and its application to the analysis of the open forging process when blank bulging is possible. The UBT is one of the energy-rate minimization methods for solving metal forming processes, based on the upper bound theorem. In this regard, the kinematically admissible velocity field is obtained by minimizing the total forging energy rate. A computer program was developed in this research to implement the UBT. The significant advantages of this method are its speed of execution, a fairly high degree of accuracy, and a wide prediction capability. The information from this analysis is useful for the design of forging processes and dies. Results for the prediction of forging loads and stresses, metal flow, and surface profiles, with assured benefits in terms of press selection and blank preform design, are outlined in some detail. The obtained predictions are ready for comparison with both laboratory and industrial results.
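The core of the UBT, minimizing a total energy rate over a family of kinematically admissible velocity fields, can be sketched with a toy one-parameter functional. The functional form and friction factor below are purely illustrative, not the paper's formulation.

```python
def total_energy_rate(alpha, friction=0.3):
    """Toy energy-rate functional: a deformation term plus a friction term.
    The admissible velocity field is parameterized by a single bulge
    parameter alpha (illustrative only, not the paper's functional)."""
    deformation = (alpha - 1.0) ** 2 + 1.0
    friction_loss = friction * alpha ** 2
    return deformation + friction_loss

def minimize_scalar(f, lo, hi, steps=10000):
    """Brute-force scan, standing in for the paper's minimization routine."""
    best_x, best_y = lo, f(lo)
    for i in range(1, steps + 1):
        x = lo + (hi - lo) * i / steps
        y = f(x)
        if y < best_y:
            best_x, best_y = x, y
    return best_x, best_y

alpha_opt, e_min = minimize_scalar(total_energy_rate, 0.0, 2.0)
```

The minimizer of the toy functional (analytically 2/2.6 ≈ 0.769) plays the role of the optimal velocity-field parameter; the minimized energy rate gives the upper bound on the forging load.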

Keywords: forging, upper bound technique, metal forming, forging energy, forging die/platen

Procedia PDF Downloads 279
18752 Interconnections of Circular Economy, Circularity, and Sustainability: A Systematic Review and Conceptual Framework

Authors: Anteneh Dagnachew Sewenet, Paola Pisano

Abstract:

The concepts of circular economy, circularity, and sustainability are interconnected and together promote a more sustainable future. However, previous studies have mainly focused on each concept individually, neglecting their relationships and leaving gaps in the existing literature. This study integrates and links these concepts to expand the theoretical and practical methods available to scholars and professionals in pursuit of sustainability. The aim of this systematic literature review is to comprehensively analyze and summarize the interconnections between circular economy, circularity, and sustainability, and to develop a conceptual framework that can guide practitioners and serve as a basis for future research. The review employed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) protocol. A total of 78 articles from the Scopus and Web of Science databases were analyzed; articles were selected according to predefined inclusion and exclusion criteria, and their content was summarized and systematized following the PRISMA protocol. The analysis addressed how circularity is conceptualized and how it relates to both the circular economy and long-term sustainability. The review identified key themes, theoretical frameworks, empirical findings, and conceptual gaps in the literature, and, through a rigorous analysis of scholarly articles, highlighted the importance of integrating these concepts for a more sustainable future. This study contributes to the existing literature by expanding the theoretical understanding of how these concepts relate to each other and by providing a conceptual framework that can guide practitioners in implementing circular economy strategies and serve as a basis for future research in this field. The findings emphasize the need for a holistic approach to achieving sustainability goals and highlight conceptual gaps that can be addressed in future studies.

Keywords: circularity, circular economy, sustainability, innovation

Procedia PDF Downloads 83
18751 Development of an Artificial Neural Network to Measure Science Literacy Leveraging Neuroscience

Authors: Amanda Kavner, Richard Lamb

Abstract:

Faster growth in science and technology in other nations may make staying globally competitive more difficult without a shift in focus on how science is taught in US classrooms. An integral part of learning science involves visual and spatial thinking, since complex, real-world phenomena are often expressed in visual, symbolic, and concrete modes. The primary barrier to spatial thinking and visual literacy in Science, Technology, Engineering, and Math (STEM) fields is representational competence, which includes the ability to generate, transform, analyze, and explain representations, as opposed to generic spatial ability. Although the relationship between foundational visual literacy and domain-specific science literacy is known, science literacy as a function of science learning is still not well understood. Moreover, a more reliable measure is necessary to design resources that enhance the fundamental visuospatial cognitive processes behind scientific literacy. To support the improvement of students' representational competence, the visualization skills necessary to process science representations first needed to be identified, which necessitates the development of an instrument to quantitatively measure visual literacy. With such a measure, schools, teachers, and curriculum designers can target the individual skills necessary to improve students' visual literacy, thereby increasing science achievement. This project details the development of an artificial neural network capable of measuring science literacy using functional near-infrared spectroscopy (fNIR) data. This data was previously collected by Project LENS (Leveraging Expertise in Neurotechnologies), a Science of Learning Collaborative Network (SL-CN) of STEM education scholars from three US universities (NSF award 1540888), using mental rotation tasks to assess student visual literacy.
Hemodynamic response data from fNIRSoft was exported as an Excel file, with 80 each of the 2D Wedge and Dash models (dash) and 3D Stick and Ball models (BL). Complexity data were in an Excel workbook separated by participant (ID), containing information for both types of tasks. After converting strings to numbers for analysis, spreadsheets with measurement data and complexity data were uploaded to RapidMiner's TurboPrep and merged. Using RapidMiner Studio, a Gradient Boosted Trees artificial neural network (ANN) consisting of 140 trees with a maximum depth of 7 branches was developed, and 99.7% of the ANN's predictions were accurate. The ANN determined that the biggest predictors of a successful mental rotation are the individual problem number, the response time, and fNIR optode #16, located along the right prefrontal cortex, which is important in processing visuospatial working memory and episodic memory retrieval, both vital for science literacy. With an unbiased measurement of science literacy provided by psychophysiological measurements and an ANN for analysis, educators and curriculum designers will be able to create targeted classroom resources to help improve student visuospatial literacy, and therefore science literacy.
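A gradient-boosted-trees model of the kind built in RapidMiner can be sketched with depth-1 trees (stumps) and squared loss on a single feature. The data and hyperparameters below are illustrative, not the study's fNIR features.

```python
def fit_stump(xs, residuals):
    """Best single-split regression stump on one feature."""
    best = None
    for threshold in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= threshold]
        right = [r for x, r in zip(xs, residuals) if x > threshold]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        err = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, threshold, lmean, rmean)
    return best[1:]

def boost(xs, ys, n_trees=50, lr=0.1):
    """Gradient boosting for squared loss: each stump fits the residuals."""
    pred = [sum(ys) / len(ys)] * len(ys)
    stumps = []
    for _ in range(n_trees):
        residuals = [y - p for y, p in zip(ys, pred)]
        t, lmean, rmean = fit_stump(xs, residuals)
        stumps.append((t, lmean, rmean))
        pred = [p + lr * (lmean if x <= t else rmean)
                for x, p in zip(xs, pred)]
    return pred, stumps

# Invented toy data: one feature, a binary-like target.
pred, stumps = boost([1.0, 2.0, 3.0, 4.0], [0.0, 0.0, 1.0, 1.0])
```

Each added tree corrects the residual error of the ensemble so far, which is the mechanism behind the 140-tree model described above.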

Keywords: artificial intelligence, artificial neural network, machine learning, science literacy, neuroscience

Procedia PDF Downloads 110
18750 Exploring How Online Applications Help Students to Learn Music Virtually: A Study in an Australian Music Academy

Authors: Ali Shah

Abstract:

This paper outlines a case study of using a variety of online strategies in an Australian music academy during COVID times. The study aimed to explore how online applications help students learn music virtually, specifically playing musical instruments, composing songs, and performing. To explore this, music teachers' perceptions and experiences regarding online learning, the teaching strategies they implemented, and the challenges they faced were examined. For the purpose of this study, a qualitative research design was adopted using three data collection tools: pre- and post-research individual interviews with teachers and students, analysis of their lesson plans, and virtual classroom observations of the teachers, followed by the researcher's own reflections, post-observation discussions, and teachers' reflective journals. The findings revealed that teachers had a theoretical understanding of virtual learning, of recent musical applications such as Flowkey, Skoove, and Piano Marvel, and of the benefits of e-learning. While teachers faced challenges in implementing strategies to teach keyboard/piano online, overall both students and teachers felt the positive impact of online applications and strategies on learning and felt that modern technology makes it possible for anyone to take music lessons at home.

Keywords: music, keyboard, piano, online learning, virtual learning

Procedia PDF Downloads 64
18749 A Theoretical Modelling and Simulation of a Surface Plasmon Resonance Biosensor for the Detection of Glucose Concentration in Blood and Urine

Authors: Natasha Mandal, Rakesh Singh Moirangthem

Abstract:

The present work reports a theoretical model for developing a plasmonic biosensor for the detection of glucose concentrations in human blood and urine, as abnormal glucose levels are the major cause of diabetes, which has become a life-threatening disease worldwide. This study is based on surface plasmon resonance (SPR) sensing, a well-established, highly sensitive, label-free, and rapid optical sensing tool. Here we have introduced a sandwich assay of two dielectric spacer layers of MgF2 and BaTiO3, which gives better performance than the commonly used SiO2 and TiO2 dielectric spacers due to their lower dielectric loss and higher refractive index. The sensitivity of our proposed sensor was found to be approximately 3242 nm/RIU, with an excellent linear response of 0.958, which is higher than that of the conventional single-layer Au SPR sensor. Further, the sensitivity enhancement is also optimized by coating a few layers of two-dimensional (2D) nanomaterials (e.g., graphene, h-BN, MXene, MoS2, WS2) on the sensor chip. Hence, our proposed SPR sensor has the potential to detect glucose concentrations in blood and urine with enhanced sensitivity and high affinity and could serve as a reliable platform for optical biosensing in the field of medical diagnosis.
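The reported figure of merit can be sketched directly: bulk sensitivity is the resonance-wavelength shift per refractive-index unit. The glucose refractive-index increment used below is an assumed illustrative value, not taken from the paper.

```python
def sensitivity_nm_per_riu(delta_lambda_nm, delta_n):
    """Bulk refractive-index sensitivity S = delta_lambda / delta_n (nm/RIU)."""
    return delta_lambda_nm / delta_n

def glucose_shift(concentration_mg_dl, sensitivity=3242.0, dn_per_mg_dl=2.5e-5):
    """Predicted resonance shift for a glucose solution.
    dn_per_mg_dl is an assumed refractive-index increment, for illustration;
    the 3242 nm/RIU default is the sensitivity reported in the abstract."""
    return sensitivity * dn_per_mg_dl * concentration_mg_dl
```

At the reported sensitivity, even small index changes from physiological glucose levels translate into resolvable nanometer-scale shifts, which is why the higher-index spacer stack matters.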

Keywords: biosensor, surface plasmon resonance, dielectric spacer, 2D nanomaterials

Procedia PDF Downloads 89
18748 Downscaling Grace Gravity Models Using Spectral Combination Techniques for Terrestrial Water Storage and Groundwater Storage Estimation

Authors: Farzam Fatolazadeh, Kalifa Goita, Mehdi Eshagh, Shusen Wang

Abstract:

The Gravity Recovery and Climate Experiment (GRACE) is a satellite mission with twin satellites for the precise determination of spatial and temporal variations in the Earth's gravity field. The products of this mission are monthly global gravity models containing the spherical harmonic coefficients and their errors. These GRACE models can be used for estimating terrestrial water storage (TWS) variations across the globe at large scales, thereby offering an opportunity for surface and groundwater storage (GWS) assessments. Yet, the ability of GRACE to monitor changes at smaller scales is too limited for local water management authorities. This is largely due to the low spatial and temporal resolutions of its models (~200,000 km2 and one month, respectively). High-resolution GRACE data products would substantially enrich the information needed by local-scale decision-makers while providing data for regions that lack adequate in situ monitoring networks, including northern parts of Canada. Such products could eventually be obtained through downscaling. In this study, we extended spectral combination theory to simultaneously downscale GRACE spatially, from its coarse 3° resolution to 0.25°, and temporally, from monthly to daily resolution. This method combines the monthly gravity field solutions of GRACE with daily hydrological model products, in the form of both low- and high-frequency signals, to produce high spatiotemporal resolution terrestrial water storage anomaly (TWSA) and groundwater storage anomaly (GWSA) products. The main contribution and originality of this study is to comprehensively and simultaneously consider GRACE and hydrological variables, together with their uncertainties, in forming the estimator in the spectral domain. The downscaled products are therefore expected to reach an acceptable accuracy.
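The idea of combining two estimates coefficient by coefficient in the spectral domain, weighted by their uncertainties, can be sketched as a minimum-variance combination. This toy estimator is illustrative only and omits the spherical-harmonic machinery of the actual method.

```python
def spectral_combine(coarse_coeffs, fine_coeffs, coarse_var, fine_var):
    """Minimum-variance combination of two spectral estimates, coefficient
    by coefficient (a toy stand-in for the spectral combination estimator):
    each coefficient is weighted inversely to its error variance."""
    combined = []
    for c, f, vc, vf in zip(coarse_coeffs, fine_coeffs, coarse_var, fine_var):
        wc = vf / (vc + vf)  # weight on the coarse (GRACE-like) estimate
        combined.append(wc * c + (1 - wc) * f)
    return combined
```

A coefficient where the GRACE-like estimate is exact (zero variance) is taken entirely from it, while noisy coefficients lean on the hydrological-model estimate; that is the uncertainty-aware behavior the abstract describes.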

Keywords: GRACE satellite, groundwater storage, spectral combination, terrestrial water storage

Procedia PDF Downloads 71
18747 Modeling of a Small Unmanned Aerial Vehicle

Authors: Ahmed Elsayed Ahmed, Ashraf Hafez, A. N. Ouda, Hossam Eldin Hussein Ahmed, Hala Mohamed ABD-Elkader

Abstract:

Unmanned Aircraft Systems (UAS) are playing increasingly prominent roles in defense programs and defense strategies around the world. Technology advancements have enabled their development for many tasks, such as reconnaissance, surveillance, battle fighting, and communications relay. Simulating the dynamics of a small unmanned aerial vehicle (SUAV) and analyzing its behavior at the preflight stage is very important and more efficient. The first step in UAV design is the mathematical modeling of the nonlinear equations of motion. In this paper, a survey with a standard method to obtain the full nonlinear equations of motion is utilized, and then the linearization of the equations about a steady-state flight condition (trimming) is derived. This modeling technique is applied to an Ultrastick-25e fixed-wing UAV to obtain the linear longitudinal and lateral models. Finally, the model is checked by matching the behavior of the states of the nonlinear UAV against the resulting linear model under doublet inputs at the control surfaces.
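Trim-point linearization can be sketched numerically: a central-difference Jacobian of the state derivatives about a trim state yields the linear model's A matrix. The two-state dynamics below are invented for illustration, not the Ultrastick-25e model.

```python
def jacobian(f, x0, eps=1e-6):
    """Central-difference Jacobian of f at x0: A[i][j] = d f_i / d x_j.
    A numerical stand-in for the analytic trim-point linearization."""
    n = len(x0)
    m = len(f(x0))
    A = [[0.0] * n for _ in range(m)]
    for j in range(n):
        xp, xm = list(x0), list(x0)
        xp[j] += eps
        xm[j] -= eps
        fp, fm = f(xp), f(xm)
        for i in range(m):
            A[i][j] = (fp[i] - fm[i]) / (2 * eps)
    return A

# Toy longitudinal dynamics xdot = f(x), x = [u, w] (illustrative only).
def dynamics(x):
    u, w = x
    return [-0.1 * u + 0.2 * w, -0.5 * w]

A = jacobian(dynamics, [0.0, 0.0])  # linear model about the trim state
```

With the trim state chosen so that xdot = 0, the resulting A matrix governs small perturbations, which is exactly what the doublet-response comparison in the abstract validates.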

Keywords: UAV, equations of motion, modeling, linearization

Procedia PDF Downloads 723
18746 Comparative Analysis of the Computer Methods' Usage for Calculation of Hydrocarbon Reserves in the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Nowadays, the depletion of hydrocarbon deposits on land in the Kaliningrad region is leading to active geological exploration and development of oil and natural gas reserves in the southeastern part of the Baltic Sea. LLC 'Lukoil-Kaliningradmorneft' is implementing a comprehensive program for the development of the region's shelf in 2014-2023. Due to the heterogeneity of reservoir rocks in various open fields, as well as ambiguous conclusions on the contours of deposits, additional geological prospecting and refinement of the recoverable oil reserves are carried out. The key element is the use of an effective technique of computer reserve modeling at the first stage of processing the received data. The following step uses this information for cluster analysis, which makes it possible to optimize field development approaches. The article analyzes the effectiveness of various methods for calculating reserves and of computer modeling methods for offshore hydrocarbon fields. Cluster analysis makes it possible to measure the influence of the obtained data on the development of a technical and economic model for mining deposits. A relationship is observed between the accuracy of the calculation of recoverable reserves and the need to modernize existing mining infrastructure, as well as to optimize the scheme for opening and developing oil deposits.
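The cluster-analysis step can be sketched with plain k-means on two field attributes. The (porosity, reserves) pairs below are invented, not the Baltic Sea data.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points (e.g. porosity vs. reserve estimate)."""
    centers = points[:k]  # simple initialization: first k points
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: (p[0] - centers[i][0]) ** 2
                                    + (p[1] - centers[i][1]) ** 2)
            groups[idx].append(p)
        centers = [(sum(p[0] for p in g) / len(g),
                    sum(p[1] for p in g) / len(g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Hypothetical (porosity %, recoverable reserves, Mt) pairs for six fields.
data = [(12, 1.0), (13, 1.2), (11, 0.9), (25, 4.0), (26, 4.2), (24, 3.8)]
centers, groups = kmeans(data, 2)
```

Grouping fields with similar reservoir properties in this way is what lets development approaches (infrastructure, opening schemes) be chosen per cluster rather than per field.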

Keywords: cluster analysis, computer modelling of deposits, correction of the feasibility study, offshore hydrocarbon fields

Procedia PDF Downloads 156
18745 Epic Consciousness: New Possibilities for Epic Expression in Post-War American Literature During the Age of Late Capitalism

Authors: Safwa Yargui

Abstract:

This research examines the quest for a post-war American epic poem in the age of late capitalism. It explores the possibility of an epic poem in the context of post-war late-capitalist America, despite the prevailing scholarly skepticism about the existence of epic poetry after Milton's Paradise Lost. The aim of this paper is to demonstrate the possibility of a post-war American epic through the argument of epic consciousness. Epic consciousness adds a significant nuance to the reading of the post-war American epic by focusing on the epic's responsiveness to late capitalism via various language forms, cultural manifestations, and conscious distortions of late-capitalist media-related language, in addition to the epic's conscious inclusion of the process of writing a post-war epic, which requires direct engagement with American-based materials. By focusing on interdisciplinary theoretical approaches, this paper includes both socio-cultural literary theories and the literary and epic approaches developed by scholars in critical texts that contextualize, respectively, the late-capitalist situation and the question of post-war American epic poetry. The major finding of this research is a new theoretical approach to the question of post-war American epic poetry. In examining the role of consciousness, this paper suggests a re-thinking of the post-war American epic as capable of self-commitment, for the purpose of achieving a new sense of epic poetry in post-war late-capitalist America.

Keywords: american epic, epic consciousness, late capitalism, post-war poetry

Procedia PDF Downloads 84
18744 Early Prediction of Diseases in a Cow for Cattle Industry

Authors: Ghufran Ahmed, Muhammad Osama Siddiqui, Shahbaz Siddiqui, Rauf Ahmad Shams Malick, Faisal Khan, Mubashir Khan

Abstract:

In this paper, a machine learning-based approach for the early prediction of diseases in cows is proposed, in which different ML algorithms are applied to extract useful patterns from the available dataset. Technology has changed today's world in every aspect of life, and advanced technologies have been developed in livestock and dairy farming to monitor dairy cows in various ways. Dairy cattle monitoring is crucial, as it plays a significant role in milk production around the globe. Moreover, it has become necessary for farmers to adopt the latest early-prediction technologies as food demand increases with population growth, which highlights the importance of state-of-the-art technologies in analyzing dairy cows' activities. It is not easy to predict the activities of a large number of cows on a farm, so the system makes it very convenient for farmers by providing all the solutions under one roof. The cattle industry's productivity is boosted because any disease on a cattle farm is diagnosed early, based on the machine learning output received, and hence treated early. The learning models are already set up to interpret the data collected in a centralized system. Essentially, we run different algorithms on the received dataset to analyze milk quality and to track cows' health, location, and safety. The learning algorithm draws patterns from the data, which makes it easier for farmers to study any animal's behavioral changes. With the emergence of machine learning algorithms and the Internet of Things, accurate tracking of animals is possible as the rate of error is minimized, and as a result, milk productivity is increased. IoT with ML capability has given a new phase to the cattle farming industry by increasing yield in the most cost-effective and time-saving manner.
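A minimal instance of the proposed ML classification can be sketched with a nearest-centroid classifier on two sensor features. The readings and labels below are invented for illustration; the paper does not specify its algorithms or features.

```python
def nearest_centroid_fit(samples, labels):
    """Compute the per-class mean of the feature vectors."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        sums.setdefault(y, [0.0] * len(x))
        counts[y] = counts.get(y, 0) + 1
        sums[y] = [s + v for s, v in zip(sums[y], x)]
    return {y: [s / counts[y] for s in sums[y]] for y in sums}

def nearest_centroid_predict(centroids, x):
    """Assign a reading to the class with the closest centroid."""
    return min(centroids,
               key=lambda y: sum((a - b) ** 2
                                 for a, b in zip(x, centroids[y])))

# Hypothetical (body temperature in C, daily activity in thousands of steps).
X = [(38.5, 5.0), (38.6, 5.2), (40.1, 2.0), (40.3, 1.8)]
y = ["healthy", "sick-risk"][0:1] * 2 + ["sick-risk"] * 2
model = nearest_centroid_fit(X, ["healthy", "healthy", "sick-risk", "sick-risk"])
```

A new reading with elevated temperature and reduced activity lands near the sick-risk centroid, which is the kind of early flag the centralized farm system would raise.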

Keywords: IoT, machine learning, health care, dairy cows

Procedia PDF Downloads 49
18743 DURAFILE: A Collaborative Tool for Preserving Digital Media Files

Authors: Santiago Macho, Miquel Montaner, Raivo Ruusalepp, Ferran Candela, Xavier Tarres, Rando Rostok

Abstract:

During our lives, we generate a lot of personal information, such as photos, music, text documents, and videos, that links us with our past. This data, which used to be tangible, is now digital information stored on our computers, which implies a software dependence to keep it accessible in the future. Technology, however, constantly evolves and goes through regular shifts, quickly rendering various file formats obsolete. The need to access data in the future affects not only personal users but also organizations. In a digital environment, a reliable preservation plan and the ability to adapt to fast-changing technology are essential for maintaining data collections in the long term. We present in this paper the European FP7 project DURAFILE, which provides the technology to preserve media files for personal users and organizations while maintaining their quality.

Keywords: artificial intelligence, digital preservation, social search, digital preservation plans

Procedia PDF Downloads 432
18742 The Solid-Phase Sensor Systems for Fluorescent and SERS-Recognition of Neurotransmitters for Their Visualization and Determination in Biomaterials

Authors: Irina Veselova, Maria Makedonskaya, Olga Eremina, Alexandr Sidorov, Eugene Goodilin, Tatyana Shekhovtsova

Abstract:

Catecholamines such as dopamine, norepinephrine, and epinephrine are the principal neurotransmitters in the sympathetic nervous system. Catecholamines and their metabolites are considered important markers of socially significant diseases such as atherosclerosis, diabetes, coronary heart disease, carcinogenesis, and Alzheimer's and Parkinson's diseases. Currently, neurotransmitters can be studied via electrochemical and chromatographic techniques that allow their characterization and quantification, although these techniques can only provide crude spatial information. Besides, the difficulty of catecholamine determination in biological materials is associated with their low normal concentrations (~1 nM) in biomaterials, which may become even one order of magnitude lower because of some disorders. In addition, in blood they are rapidly oxidized by monoamine oxidases from thrombocytes, and for this reason the determination of neurotransmitter metabolism indicators in an organism should be very rapid (15-30 min), especially in critical states. Unfortunately, modern instrumental analysis does not offer a comprehensive solution to this problem: despite its high sensitivity and selectivity, HPLC-MS cannot provide sufficiently rapid analysis, while enzymatic biosensors and immunoassays for the determination of these analytes lack sufficient sensitivity and reproducibility. Fluorescent and SERS sensors remain a compelling technology for approaching the general problem of selective neurotransmitter detection. In recent years, a number of catecholamine sensors have been reported, including RNA aptamers, fluorescent ribonucleopeptide (RNP) complexes, and boronic acid based synthetic receptors operated in a turn-off mode.

In this work, we present fluorescent and SERS turn-on sensor systems based on bio- or chemo-recognizing nanostructured films {chitosan/collagen-Tb/Eu/Cu-nanoparticles-indicator reagents} that provide the selective recognition, visualization, and sensing of the above-mentioned catecholamines at nanomolar concentrations in biomaterials (cell cultures, tissues, etc.). We have (1) developed optically transparent porous films and gels of chitosan/collagen; (2) functionalized the surface with 'recognizer' molecules (by impregnation and immobilization of components of the indicator systems: biorecognizing and auxiliary reagents); and (3) performed computer simulations for the theoretical prediction and interpretation of some properties of the developed materials and of the analytical signals obtained in biomaterials. We are grateful for the financial support of this research from the Russian Foundation for Basic Research (grants no. 15-03-05064 a and 15-29-01330 ofi_m).

Keywords: biomaterials, fluorescent and SERS-recognition, neurotransmitters, solid-phase turn-on sensor system

Procedia PDF Downloads 394
18741 Identifying Game Variables from Students’ Surveys for Prototyping Games for Learning

Authors: N. Ismail, O. Thammajinda, U. Thongpanya

Abstract:

Games-based learning (GBL) has become increasingly important in teaching and learning. This paper explains the first two phases (analysis and design) of a GBL development project, ending with a prototype design based on students' and teachers' perceptions. The two phases are part of a full-cycle GBL project aiming to help secondary school students in Thailand in their study of Comprehensive Sex Education (CSE). In the course of the study, we invited 1,152 students to complete questionnaires and interviewed 12 secondary school teachers in focus groups. This paper found that GBL can serve students in their learning about CSE, enabling them to gain an understanding of their sexuality, develop skills including critical thinking, and interact with others (peers, teachers, etc.) in a safe environment. The objectives of this paper are to outline the development of GBL variables from the research question(s) into the developers' flow chart, to be responsive to the GBL beneficiaries' preferences and expectations, and to help answer the research questions. This paper details the steps applied to generate GBL variables that can feed into a game flow chart to develop a GBL prototype. In our approach, we detailed two models: (1) the Game Elements Model (GEM) and (2) the Game Object Model (GOM). There are three outcomes of this research. First, to achieve the objectives and benefits of GBL in learning, game design has to start with the research question(s) and the challenges to be resolved as research outcomes. Second, aligning the educational aims with engaging GBL end users (students) during the data collection phase, to inform the game prototype with the game variables, is essential to address the answer/solution to the research question(s). Third, for efficient GBL to bridge the gap between pedagogy and technology, and in order to answer the research questions via technology (i.e., GBL) and to minimize the isolation between pedagogists ('P') and technologists ('T'), several meetings and discussions need to take place within the team.

Keywords: games-based learning, engagement, pedagogy, preferences, prototype

Procedia PDF Downloads 158
18740 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment

Authors: Kim Byung-Kon, Kim Young-Jin

Abstract:

As project opportunities for the Architecture, Engineering and Construction (AEC) industry have grown more complex and larger, the utilization of BIM (Building Information Modeling) technologies for 3D design and simulation practices has been increasing significantly; typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management technology in the AEC industry for virtual design and construction. As of now, commercial BIM software operates in a single-user environment, which is why the initial costs of its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use the services and resources provided with BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward saving the costs of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addressed how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure so that BIM services may grow spontaneously, considering the demand for cloud resources. To this end, the authors surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of the foregoing survey and analysis, the authors propose herein how to optimally develop the usage metering functions of cloud BIM software.
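A usage metering function of the kind proposed can be sketched as session-based aggregation and billing. The record format and hourly rate below are assumptions for illustration; the paper does not specify its metering scheme.

```python
def metered_cost(sessions, rate_per_hour):
    """Bill total usage across sessions given as (start, end) epoch seconds."""
    total_seconds = sum(end - start for start, end in sessions)
    return total_seconds / 3600.0 * rate_per_hour

def usage_by_user(log):
    """Aggregate seconds of cloud BIM use per user from (user, start, end)
    records, the raw input a pay-per-use revenue model would need."""
    totals = {}
    for user, start, end in log:
        totals[user] = totals.get(user, 0) + (end - start)
    return totals
```

Metering per session rather than per license seat is what lets SMEs pay only for the cloud resources they actually consume.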

Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM

Procedia PDF Downloads 489
18739 Assessment of Pre-Processing Influence on Near-Infrared Spectra for Predicting the Mechanical Properties of Wood

Authors: Aasheesh Raturi, Vimal Kothiyal, P. D. Semalty

Abstract:

We studied the mechanical properties of Eucalyptus tereticornis using FT-NIR spectroscopy. Spectra were first pre-processed to eliminate uninformative variation, and a prediction model was then constructed by partial least squares (PLS) regression. To study the influence of pre-processing on the prediction of mechanical properties from NIR spectra of wood samples, we applied various pre-treatments: straight-line subtraction, constant offset elimination, vector normalization, min-max normalization, multiplicative scattering correction, first derivative, second derivative, and combinations such as first derivative + straight-line subtraction, first derivative + vector normalization, and first derivative + multiplicative scattering correction. For each combination of pre-processing method and NIR region, RMSECV, RMSEP, and the optimum number of factors (rank) were obtained during model development; more than 350 combinations were evaluated in the optimization process. Although more than one pre-processing method gave good calibration/cross-validation and prediction/test models, only the best models are reported here. The results show that the NIR region between 4000 and 7500 cm⁻¹, combined with straight-line subtraction, constant offset elimination, first derivative, or second derivative pre-processing, is most appropriate for model development.
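To illustrate what a few of these pre-treatments do, here is a minimal sketch on a synthetic spectrum (the function names and toy data are ours, not the authors' code):

```python
import numpy as np

def vector_normalize(spectrum):
    """Vector (unit-length) normalization of one spectrum."""
    return spectrum / np.linalg.norm(spectrum)

def straight_line_subtraction(spectrum):
    """Remove a linear baseline fitted by least squares."""
    x = np.arange(len(spectrum))
    slope, intercept = np.polyfit(x, spectrum, 1)
    return spectrum - (slope * x + intercept)

def first_derivative(spectrum):
    """Simple finite-difference first derivative."""
    return np.gradient(spectrum)

# toy spectrum: a linear baseline plus a single absorption-like peak
x = np.arange(100)
spectrum = 0.01 * x + np.exp(-((x - 50) ** 2) / 20.0)

detrended = straight_line_subtraction(spectrum)
# a combined treatment, "first derivative + vector normalization"
combo = first_derivative(vector_normalize(spectrum))
```

After `straight_line_subtraction`, refitting a line to the result gives a slope of essentially zero, which is exactly the baseline removal the treatment is meant to achieve.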

Keywords: FT-NIR, mechanical properties, pre-processing, PLS

Procedia PDF Downloads 336
18738 Non-Reacting Numerical Simulation of Axisymmetric Trapped Vortex Combustor

Authors: Heval Serhat Uluk, Sam M. Dakka, Kuldeep Singh, Richard Jefferson-Loveday

Abstract:

This paper focuses on the suitability of a trapped vortex combustor as a candidate for gas turbine combustors, with the objectives of minimizing the pressure drop across the combustor and investigating its aerodynamic performance. Non-reacting simulations of axisymmetric cavity trapped vortex combustors were carried out for cavity aspect ratios of 0.3, 0.6, and 1 and inlet air velocities of 14 m/s, 28 m/s, and 42 m/s, using both two-dimensional and three-dimensional computational domains. A comparison between the Reynolds-Averaged Navier-Stokes (RANS) k-ε Realizable model with enhanced wall treatment and the RANS k-ω Shear Stress Transport (SST) model was conducted to find the most suitable turbulence model; the k-ω SST model gave results closest to the experimental outcomes, and the numerical results showed good agreement with the experimental data. The pressure drop rises with increasing air velocity, and the lowest pressure drop was observed at a cavity aspect ratio of 0.6 for all velocities tested, which agrees with the experimental outcome. A mixing enhancement study showed that 30-degree air injectors provide improved fuel-air mixing.

Keywords: aerodynamic, computational fluid dynamics, propulsion, trapped vortex combustor

Procedia PDF Downloads 74
18737 Opportunities in Self-care Abortion and Telemedicine: Findings from a Study in Colombia

Authors: Paola Montenegro, Maria de los Angeles Balaguera Villa

Abstract:

In February 2022, Colombia achieved a historic milestone for abortion rights with ruling C-055 of 2022, which decriminalised abortion up to 24 weeks of gestation. In the context of this triumph and the expansion of telemedicine services in the wake of the COVID-19 pandemic, this research studied the acceptability of self-care abortion among young people (13-28 years) through a telemedicine service and explored the primary needs that such care should address. The results support a more comprehensive understanding of the opportunities and challenges of teleabortion in a context that combines generally high access to technology with low access to reliable information about safe abortion, as well as stigma and scarcity felt especially by transnational migrants, racialised people, trans men, and non-binary people. Through a mixed-methods approach, this study collected 5,736 responses to a virtual survey disseminated nationwide in Colombia and conducted 47 in-person interviews (24 with people who were assigned female at birth and 21 with local key stakeholders in the abortion ecosystem). Quantitative data were analyzed using Stata SE Version 16.0, and qualitative data were analyzed thematically in NVivo. Key findings suggest that self-care abortion is a practice with growing acceptability among young people, but important adjustments must be made to meet users' expectations of quality of care. Quick responses from providers, lower costs, and accessible information were decisive factors in choosing an abortion service provider. Participants' narratives about quality care centred on the promotion of autonomy and the provision of accompaniment and care practices, which they perceived as transformative and absent from most health care services.
The most striking findings concern the barriers young people still face in abortion contexts even after the legal barriers have been lifted: high rates of scepticism and distrust associated with the pitfalls of telehealth, and structural challenges stemming from lacking communications infrastructure, among others. Participants also identified lack of privacy and confidentiality (especially in rural areas of the country), difficulties accessing reliable information, the high cost of procedures together with travel expenses or lost income, waiting times, and stigma as primary barriers to safe self-care abortion. Especially in a scenario marked by the unprecedented social, political, and economic disruptions of the COVID-19 pandemic, the commitment to design better care services, adapted to the identities, experiences, social contexts, and possibilities of users, is more necessary than ever. In this sense, the possibility of expanding access to services through telemedicine offers an opportunity to rethink how health care models can support individuals and communities in making autonomous, safe, and informed decisions about their own health and well-being.

Keywords: contraception, family planning, premarital fertility, unplanned pregnancy

Procedia PDF Downloads 60
18736 The Use of Artificial Intelligence in Diagnosis of Mastitis in Cows

Authors: Djeddi Khaled, Houssou Hind, Miloudi Abdellatif, Rabah Siham

Abstract:

In the field of veterinary medicine, there is a growing application of artificial intelligence (AI) for diagnosing bovine mastitis, a prevalent inflammatory disease in dairy cattle. AI technologies, such as automated milking systems, have streamlined the assessment of key metrics crucial for managing cow health during milking and identifying prevalent diseases, including mastitis. These automated milking systems empower farmers to implement automatic mastitis detection by analyzing indicators like milk yield, electrical conductivity, fat, protein, lactose, blood content in the milk, and milk flow rate. Furthermore, reports highlight the integration of somatic cell count (SCC), thermal infrared thermography, and diverse systems utilizing statistical models and machine learning techniques, including artificial neural networks, to enhance the overall efficiency and accuracy of mastitis detection. According to a review of 15 publications, machine learning technology can predict the risk and detect mastitis in cattle with an accuracy ranging from 87.62% to 98.10% and sensitivity and specificity ranging from 84.62% to 99.4% and 81.25% to 98.8%, respectively. Additionally, machine learning algorithms and microarray meta-analysis are utilized to identify mastitis genes in dairy cattle, providing insights into the underlying functional modules of mastitis disease. Moreover, AI applications can assist in developing predictive models that anticipate the likelihood of mastitis outbreaks based on factors such as environmental conditions, herd management practices, and animal health history. This proactive approach supports farmers in implementing preventive measures and optimizing herd health. By harnessing the power of artificial intelligence, the diagnosis of bovine mastitis can be significantly improved, enabling more effective management strategies and ultimately enhancing the health and productivity of dairy cattle. 
The integration of artificial intelligence presents valuable opportunities for the precise and early detection of mastitis, providing substantial benefits to the dairy industry.
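As a toy illustration of the threshold-style detection logic an automated milking system might apply to the indicators listed above, consider the following sketch; every threshold below is an invented placeholder for illustration, not a value from the reviewed studies:

```python
# Hypothetical rule-based mastitis alert combining per-milking indicators.
# Thresholds are illustrative placeholders only.
def mastitis_alert(conductivity_mS, scc_cells_per_ml, milk_yield_kg, baseline_yield_kg):
    """Flag a cow for mastitis follow-up when several signals co-occur."""
    score = 0
    if conductivity_mS > 6.5:                     # elevated electrical conductivity
        score += 1
    if scc_cells_per_ml > 200_000:                # somatic cell count above warning level
        score += 1
    if milk_yield_kg < 0.8 * baseline_yield_kg:   # notable drop in milk yield
        score += 1
    return score >= 2                             # require two concurrent signals

print(mastitis_alert(7.1, 350_000, 9.0, 12.0))   # True
print(mastitis_alert(5.8, 120_000, 11.5, 12.0))  # False
```

Real systems replace such fixed rules with trained models (the machine learning methods surveyed in the abstract), but the input features are of the same kind.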

Keywords: artificial intelligence, automatic milking system, cattle, machine learning, mastitis

Procedia PDF Downloads 45
18735 Nonlinear Porous Diffusion Modeling of Ionic Agrochemicals in Astomatous Plant Cuticle Aqueous Pores: A Mechanistic Approach

Authors: Eloise C. Tredenick, Troy W. Farrell, W. Alison Forster, Steven T. P. Psaltis

Abstract:

The agriculture industry requires improved efficacy of sprays applied to crops, as more efficacious sprays provide many environmental and financial benefits. The plant leaf cuticle is known to be the main barrier to the diffusion of agrochemicals into the leaf. A mathematical model that simulates the uptake of agrochemicals through plant cuticles is valuable because the results of each uptake experiment are specific to one formulation of active ingredient and one plant species. In this work, we develop a mathematical model and numerical simulation of the uptake of ionic agrochemicals through aqueous pores in plant cuticles. We propose a nonlinear porous diffusion model for ionic agrochemicals in isolated cuticles, which extends simple diffusion by incorporating parameters that capture variation between plant species, evaporation of the surface droplet solution, and swelling of the aqueous pores with water. The model could feasibly be adapted to other ionic active ingredients diffusing through the cuticles of other plant species. We validate our theoretical results against appropriate experimental data, discuss the key sensitivities in the model, and relate theoretical predictions to the relevant physical mechanisms.
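A minimal numerical sketch of a nonlinear porous diffusion model of this kind, assuming for illustration a diffusivity that grows linearly with concentration to mimic pore swelling (this simplification is ours, not the authors' exact formulation):

```python
import numpy as np

def simulate_uptake(nx=50, nt=2000, L=1.0, dt=1e-4, D0=0.1, beta=2.0):
    """Explicit finite-difference solution of dc/dt = d/dx( D(c) dc/dx )
    with concentration-dependent diffusivity D(c) = D0 * (1 + beta*c),
    a crude stand-in for aqueous pores that swell with hydration.
    Boundary conditions: c = 1 at the cuticle surface (x = 0),
    zero flux at the inner boundary (x = L)."""
    dx = L / (nx - 1)
    c = np.zeros(nx)
    c[0] = 1.0
    for _ in range(nt):
        D = D0 * (1.0 + beta * c)
        Dface = 0.5 * (D[1:] + D[:-1])        # diffusivity at cell faces
        flux = Dface * np.diff(c) / dx        # F_{i+1/2} = D dc/dx
        c[1:-1] += dt / dx * np.diff(flux)    # interior nodes
        c[-1] += dt / dx * (0.0 - flux[-1])   # zero-flux inner boundary
        c[0] = 1.0                            # surface held at unit concentration
    return c

c = simulate_uptake()
```

The time step satisfies the explicit stability bound (dt·D_max/dx² ≈ 0.07 < 0.5), and the resulting profile decreases monotonically from the droplet surface into the cuticle, as expected for diffusion from a fixed-concentration boundary.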

Keywords: aqueous pores, ionic active ingredient, mathematical model, plant cuticle, porous diffusion

Procedia PDF Downloads 250
18734 Economic Development Impacts of Connected and Automated Vehicles (CAV)

Authors: Rimon Rafiah

Abstract:

This paper presents a combination of two seemingly unrelated models: one for estimating the economic development impacts of transportation investment, and one for increasing CAV penetration in order to reduce congestion. Measuring the economic development impacts of transportation investments is increasingly recognized around the world; examples include the UK's Wider Economic Benefits (WEB) model, Economic Impact Assessments in the USA, and various input-output models. The economic impact model here is based on WEB and on the following premise: investment in transportation reduces the cost of personal travel, enabling firms to be more competitive, creating additional throughput (the same road allows more people to travel), and reducing workers' cost of travel to a new workplace. This reduction in travel costs is estimated in out-of-pocket terms for a given localized area and then translated into additional employment using the regional labor supply elasticity. The additional employment is conservatively assumed to be at minimum-wage levels, translated into GDP terms, and from there into direct taxation (i.e., an increase in tax revenue to the government). The CAV model is based on economic principles of CAV usage, supply, and demand. CAVs can increase capacity by a variety of means: higher levels of automation (Levels 1 through 4) and increased penetration, which several forecasts predict will reach 50% by 2030, with possible full conversion by 2045-2050. Several countries have passed policies or legislation ending sales of new gasoline-powered vehicles in 2030 or later. Supply is measured as the increased capacity of given infrastructure as a function of both CAV penetration and the technologies implemented.
The CAV model, as implemented in the USA, has shown significant savings in travel time and in vehicle operating costs, which can be translated into economic development impacts in terms of job creation, GDP growth, and salaries. The models have policy implications and could be adapted for use in Japan.
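The WEB-style chain of reasoning described above can be sketched as simple arithmetic; every number below is an invented placeholder for illustration, not a result from the study:

```python
# Hypothetical back-of-envelope chain from travel-cost savings to fiscal
# impact, mirroring the WEB-style logic: cost saving -> jobs (via labor
# supply elasticity) -> GDP (at minimum wage) -> direct taxation.
def web_style_impact(cost_saving_pct, base_employment, labor_supply_elasticity,
                     min_wage_annual, tax_rate):
    """Translate a fractional reduction in travel cost into jobs, GDP, and tax."""
    added_jobs = base_employment * labor_supply_elasticity * cost_saving_pct
    added_gdp = added_jobs * min_wage_annual   # conservative: minimum-wage jobs
    added_tax = added_gdp * tax_rate
    return added_jobs, added_gdp, added_tax

jobs, gdp, tax = web_style_impact(
    cost_saving_pct=0.05,          # 5% lower out-of-pocket travel cost
    base_employment=100_000,       # workers in the localized area
    labor_supply_elasticity=0.1,
    min_wage_annual=18_000,
    tax_rate=0.25,
)
print(round(jobs), round(gdp), round(tax))  # 500 9000000 2250000
```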

Keywords: CAV, economic development, WEB, transport economics

Procedia PDF Downloads 62
18733 Quantum Sieving for Hydrogen Isotope Separation

Authors: Hyunchul Oh

Abstract:

One of the challenges in modern separation science and technology is the separation of hydrogen isotope mixtures, since D2 and H2 have almost identical size, shape, and thermodynamic properties. Recently, quantum sieving of isotopes by confinement in narrow spaces has been proposed as an alternative technique, but despite many theoretical suggestions, a feasible microporous material has so far been difficult to identify. Among porous materials, the novel class of microporous framework materials (COFs, ZIFs, and MOFs) is considered promising for isotope sieving due to its ultra-high porosity and uniform, tailorable pore size. Hence, we investigate experimentally the fundamental correlation between the D2/H2 molar ratio and pore size at optimized operating conditions, using different ultramicroporous frameworks. The D2/H2 molar ratio depends strongly on pore size, pressure, and temperature. The experimentally determined optimum pore diameter for quantum sieving lies between 3.0 and 3.4 Å, an important guideline for designing and developing feasible microporous frameworks for isotope separation. We then report a novel strategy for efficient hydrogen isotope separation at technologically relevant operating pressures through quantum sieving exploited by pore-aperture engineering, in which flexible components installed in the pores of the framework tune the pore surface.

Keywords: gas adsorption, hydrogen isotope, metal organic frameworks(MOFs), quantum sieving

Procedia PDF Downloads 256
18732 Logistics and Supply Chain Management Using Smart Contracts on Blockchain

Authors: Armen Grigoryan, Milena Arakelyan

Abstract:

The idea of smart logistics is still quite complicated. It can be used to market products to a large number of customers or to acquire raw materials of the highest quality at the lowest cost across geographically dispersed areas. The use of smart contracts in logistics and supply chain management has the potential to revolutionize the way goods are tracked, transported, and managed. Smart contracts are computer programs, written in a blockchain programming language (Solidity, Rust, Vyper), that are capable of self-execution once predetermined conditions are met. They can automate and streamline many of the manual processes currently used in logistics and supply chain management, including the tracking and movement of goods, inventory management, and the facilitation of payments and settlements between the parties in a supply chain. Logistics, the core activity of transporting products between parties, operates at a scale that can lead to delays and failures in the delivery of goods, among other issues. Moreover, large distributors require many workers to meet the needs of their stores, which can contribute to long order-processing times and increase the likelihood of losing orders. In an attempt to overcome these problems, companies have automated their procedures, contributing to significant growth in the number of businesses and distributors in the logistics sector. Blockchain technology and smart contracted legal agreements are therefore suitable concepts for redesigning and optimizing collaborative business processes and supply chains. The main purpose of this paper is to examine the scope of blockchain technology and smart contracts in the field of logistics and supply chain management.
This study addresses the question of how, and to what extent, smart contracts and blockchain technology can facilitate and improve the implementation of collaborative business structures for sustainable entrepreneurial activities in smart supply chains. The intention is to provide a comprehensive overview of existing research on the use of smart contracts in logistics and supply chain management and to identify gaps or limitations in the current knowledge. The review summarizes and evaluates the key findings and themes that emerge from the research and suggests directions for future work on the topic.
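A minimal Python simulation of the escrow pattern such a logistics smart contract might implement (an on-chain version would be written in Solidity, Rust, or Vyper, as the abstract notes; this sketch only illustrates the self-executing conditional-release logic, and all names are hypothetical):

```python
class DeliveryEscrow:
    """Toy model of a smart-contract escrow: the buyer funds the contract,
    and payment to the carrier releases automatically once delivery is
    confirmed — the "self-execution on predetermined conditions" idea."""

    def __init__(self, buyer, carrier, amount):
        self.buyer, self.carrier, self.amount = buyer, carrier, amount
        self.funded = False
        self.delivered = False
        self.paid = False

    def fund(self, sender, value):
        assert sender == self.buyer and value == self.amount, "wrong funding"
        self.funded = True
        self._try_settle()

    def confirm_delivery(self, sender, tracking_scan_ok):
        # in a real deployment this condition might come from an IoT oracle
        assert sender == self.buyer, "only the buyer confirms delivery"
        self.delivered = bool(tracking_scan_ok)
        self._try_settle()

    def _try_settle(self):
        # "self-execution": payment releases as soon as all conditions hold
        if self.funded and self.delivered and not self.paid:
            self.paid = True

escrow = DeliveryEscrow("buyer-A", "carrier-B", 1000)
escrow.fund("buyer-A", 1000)
escrow.confirm_delivery("buyer-A", tracking_scan_ok=True)
print(escrow.paid)  # True
```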

Keywords: smart contracts, smart logistics, smart supply chain management, blockchain and smart contracts in logistics, smart contracts for controlling supply chain management

Procedia PDF Downloads 78
18731 The Role of Digital Technology in Crime Prevention: a Case Study of Cellular Forensics Unit, Capital City Police Peshawar-Pakistan

Authors: Muhammad Ashfaq

Abstract:

Main theme: The prime focus of this study is the role of digital technology in crime prevention, with special attention to the Cellular Forensics Unit of Capital City Police Peshawar, Khyber Pakhtunkhwa, Pakistan. Objective(s) of the study: The prime objective is to provide statistics, strategies, and the patterns of analysis used for crime prevention in the Cellular Forensics Unit of Capital City Police Peshawar. Research method and procedure: A qualitative research method was used, obtaining secondary data from the research wing and Information Technology (IT) section of Peshawar police; content analysis was the method used to conduct the study. The study is delimited to the Capital City Police and Cellular Forensics Unit, Peshawar, KP, Pakistan. Major finding(s): It is evident that the old traditional approach will not provide solutions for better crime control. The best way to control crime and promote proactive policing is to adopt new technologies. The study reveals that technology has made the police more effective and vigilant than traditional policing. Heinous crimes such as abduction, missing persons, snatching, burglaries, and blind murder cases are now traceable with the help of technology. Recommendation(s): The analysis suggests that IT experts should be recruited, along with research analysts, to assist and facilitate the operational and investigation units of the police in a timely manner; that a mobile locator should be provided to the Cellular Forensics Unit to apprehend criminals promptly; and that the latest digital analysis software should be provided to equip the unit.

Keywords: crime prevention, cellular forensics unit, digital technology, criminology, Pakistan

Procedia PDF Downloads 70
18730 AutoML: Comprehensive Review and Application to Engineering Datasets

Authors: Parsa Mahdavi, M. Amin Hariri-Ardebili

Abstract:

The development of accurate machine learning and deep learning models traditionally demands hands-on expertise and a solid background to fine-tune hyperparameters. With the continuous expansion of datasets in various scientific and engineering domains, researchers increasingly turn to machine learning methods to unveil hidden insights that may elude classic regression techniques. This surge in adoption raises concerns about the adequacy of the resultant meta-models and, consequently, the interpretation of the findings. In response to these challenges, automated machine learning (AutoML) emerges as a promising solution, aiming to construct machine learning models with minimal intervention or guidance from human experts. AutoML encompasses crucial stages such as data preparation, feature engineering, hyperparameter optimization, and neural architecture search. This paper provides a comprehensive overview of the principles underpinning AutoML, surveying several widely-used AutoML platforms. Additionally, the paper offers a glimpse into the application of AutoML on various engineering datasets. By comparing these results with those obtained through classical machine learning methods, the paper quantifies the uncertainties inherent in the application of a single ML model versus the holistic approach provided by AutoML. These examples showcase the efficacy of AutoML in extracting meaningful patterns and insights, emphasizing its potential to revolutionize the way we approach and analyze complex datasets.
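As a stripped-down sketch of the search loop at the heart of AutoML, hyperparameter optimization can be illustrated by a random search over model configurations on a toy regression dataset; this is our simplification for illustration, not the workflow of any specific AutoML platform:

```python
import numpy as np

rng = np.random.default_rng(0)

# toy engineering-style dataset: noisy nonlinear response
X = rng.uniform(-1, 1, size=200)
y = np.sin(2.5 * X) + 0.1 * rng.standard_normal(200)

def fit_predict(x_tr, y_tr, x_te, degree, alpha):
    """Ridge-regularized polynomial fit via the normal equations."""
    P_tr = np.vander(x_tr, degree + 1)
    P_te = np.vander(x_te, degree + 1)
    w = np.linalg.solve(P_tr.T @ P_tr + alpha * np.eye(degree + 1), P_tr.T @ y_tr)
    return P_te @ w

# the configuration space an AutoML system would explore automatically:
# model complexity (polynomial degree) and regularization strength
split = 150
best = None
for trial in range(30):
    degree = int(rng.integers(1, 10))
    alpha = 10.0 ** rng.uniform(-6, 1)
    pred = fit_predict(X[:split], y[:split], X[split:], degree, alpha)
    rmse = float(np.sqrt(np.mean((pred - y[split:]) ** 2)))
    if best is None or rmse < best[0]:
        best = (rmse, degree, alpha)

print(best)
```

Full AutoML systems add data preparation, feature engineering, and neural architecture search on top of this basic evaluate-and-keep-the-best loop.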

Keywords: automated machine learning, uncertainty, engineering dataset, regression

Procedia PDF Downloads 49
18729 Multiobjective Optimization of Wastwater Treatment by Electrochemical Process

Authors: Malek Bendjaballah, Hacina Saidi, Sarra Hamidoud

Abstract:

The aim of this study is to model and optimize the performance of a new electrocoagulation (EC) process for the treatment of wastewater, together with its energy consumption, in order to extrapolate it to the industrial scale. Through judicious application of a design of experiments (DOE), it was possible to evaluate the individual effects and interactions that significantly influence both objective functions (maximizing removal efficiency and minimizing energy consumption), using aluminum electrodes as sacrificial anodes. Preliminary experiments showed that the pH of the medium, the applied potential, and the EC treatment time are the main parameters. A 3³ factorial design was adopted to model removal efficiency and energy consumption. Under optimal conditions, the pollution reduction efficiency is 93%, combined with a minimum energy consumption of 2.60×10⁻³ kWh/mg COD. The applied potential (or current), the treatment time, and their interaction were the most influential parameters in the mathematical models obtained, and the model predictions correlated well with the experimental results. The results offer promising opportunities for developing a clean and inexpensive technology to eliminate or reduce wastewater pollution.
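A 3³ full factorial design of the kind used here enumerates every combination of three factors at three levels; the sketch below uses illustrative placeholder levels, not the study's actual values:

```python
from itertools import product

# Hypothetical levels for the three factors identified above
# (pH, applied potential, treatment time); values are placeholders.
levels = {
    "pH":          [4.0, 7.0, 10.0],
    "potential_V": [5.0, 10.0, 15.0],
    "time_min":    [10, 20, 30],
}

# full factorial: 3 levels ^ 3 factors = 27 experimental runs
runs = [dict(zip(levels, combo)) for combo in product(*levels.values())]
print(len(runs))  # 27
```

Each of the 27 runs would be carried out (typically in randomized order, with replicates), and the measured responses fitted to a polynomial model to estimate main effects and interactions.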

Keywords: electrocoagulation, green process, experimental design, optimization

Procedia PDF Downloads 79
18728 The Impact of Bitcoin and Cryptocurrency on the Development of Community

Authors: Felib Ayman Shawky Salem

Abstract:

Nowadays, cryptocurrency has become a global phenomenon known to most people, who use this alternative digital money for transactions in many ways (e.g., online shopping, wealth management, and fundraising). However, this digital asset is also widely used in criminal activities, since it relies on decentralized control, as opposed to centralized electronic money and central banking systems, which can make its users effectively invisible. The high-value exchange of these digital currencies has also been a target of criminal activity. Cryptocurrency crimes have become a challenge for law enforcement to analyze and to prove in evidence. In this paper, our focus is on the Bitcoin cryptocurrency and the possible artifacts that can be obtained from different types of digital wallet, namely software-based and browser-based applications. Process memory and the physical hard disk are examined with the aim of identifying and recovering potential digital evidence. Data acquisition is divided into three states: the initial creation of the wallet; transactions, consisting of transferring and receiving coins; and the state after the wallet has been deleted. Findings from this study suggest that process memory from both software and browser wallets is a valuable source of evidence, and many of the artifacts found in process memory are also available in the application and wallet files on the client computer's storage.
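A simplified sketch of the artifact-carving step described above: scanning a raw memory dump for byte strings shaped like legacy (Base58) and Bech32 Bitcoin addresses. Pattern matching alone does not validate checksums, so hits are only candidates for further verification, and the dump below is a fabricated fragment for illustration:

```python
import re

# Legacy addresses start with 1 or 3 and use the Base58 alphabet
# (no 0, O, I, l); native SegWit addresses start with bc1 and use
# the Bech32 alphabet (no 1, b, i, o after the prefix).
ADDRESS_RE = re.compile(
    rb"\b(?:[13][1-9A-HJ-NP-Za-km-z]{25,34}|bc1[02-9ac-hj-np-z]{11,71})\b"
)

def carve_addresses(dump: bytes):
    """Return candidate Bitcoin addresses found in a raw byte buffer."""
    return [m.group(0).decode("ascii") for m in ADDRESS_RE.finditer(dump)]

# simulated process-memory fragment
dump = (b"\x00\x17wallet.dat\x00addr=1BvBMSEYstWetqTFn5Au4m4GFg7xJaNVN2"
        b"\x00\xffnoise bc1qw508d6qejxtdg4y5r3zarvary0c5xw7kv8f3t4\x00")
print(carve_addresses(dump))
```

In a real examination the same carving would be run over acquired process memory and disk images for each wallet state (created, transacted, deleted), and candidates would be checksum-verified before being reported as evidence.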

Keywords: cryptocurrency, bitcoin, blockchain, digital wallet, digital forensics, money laundering

Procedia PDF Downloads 22
18727 Understanding Mathematics Achievements among U. S. Middle School Students: A Bayesian Multilevel Modeling Analysis with Informative Priors

Authors: Jing Yuan, Hongwei Yang

Abstract:

This paper aims to understand U.S. middle school students' mathematics achievement by examining relevant student- and school-level predictors. Through a variance component analysis, the study first identifies evidence supporting the use of multilevel modeling. A multilevel analysis is then performed under Bayesian statistical inference, with prior information incorporated into the modeling process. During the analysis, independent variables are entered sequentially in order of theoretical importance to create a hierarchy of models. By evaluating each model using Bayesian fit indices, a best-fitting and most parsimonious model is selected, and Bayesian inference on this model is used for interpretation and discussion. The primary dataset for Bayesian modeling is derived from the Programme for International Student Assessment (PISA) 2012; a secondary PISA 2003 dataset, analyzed by traditional ordinary least squares, provides the information needed to specify informative priors for a subset of the model parameters. The dependent variable is a composite measure of mathematics literacy, calculated from an exploratory factor analysis of all five PISA 2012 mathematics achievement plausible values, for which multiple pieces of evidence support unidimensionality. The independent variables include demographic variables and content-specific variables: mathematics efficacy, teacher-student ratio, proportion of girls in the school, etc. The entire analysis is performed using the MCMCpack and MCMCglmm packages in R.
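The variance component step that motivates multilevel modeling can be illustrated on simulated data: if the intraclass correlation (ICC) is non-negligible, observations within schools are not independent, and single-level regression is inappropriate. A method-of-moments sketch on simulated (not PISA) data, with the paper's R-based analysis replaced by Python for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# simulated students nested in schools, with a school-level random effect
n_schools, n_students = 40, 30
school_effect = rng.normal(0, 5, n_schools)       # between-school sd = 5
scores = np.array([
    500 + school_effect[j] + rng.normal(0, 10)    # within-school sd = 10
    for j in range(n_schools) for _ in range(n_students)
]).reshape(n_schools, n_students)

# one-way random-effects variance components (method of moments)
ms_between = n_students * np.var(scores.mean(axis=1), ddof=1)
ms_within = np.mean(np.var(scores, axis=1, ddof=1))
var_between = max((ms_between - ms_within) / n_students, 0.0)
icc = var_between / (var_between + ms_within)
print(round(icc, 3))
```

With these simulation settings the population ICC is 25/(25+100) = 0.2, and the estimate should land near that value; an ICC of this size is exactly the kind of evidence that justifies a multilevel model over ordinary regression.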

Keywords: Bayesian multilevel modeling, mathematics education, PISA

Procedia PDF Downloads 319