Search results for: restructuring building information modeling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13835

10025 Numerical Analysis of Shallow Footing Rested on Geogrid Reinforced Sandy Soil

Authors: Seyed Abolhasan Naeini, Javad Shamsi Soosahab

Abstract:

The use of geosynthetic reinforcement within footing soils is an effective method of avoiding the construction of costly deep foundations. This study investigated the use of geosynthetics for soil improvement through numerical modeling with the FELA software. The pressure-settlement behavior and bearing capacity ratio of a foundation on geogrid-reinforced sand are investigated, and the effects of parameters such as the number of geogrid layers and the vertical distance between them are studied for soils at three different relative densities. The geometrical parameters of the reinforcement layers were varied to determine the optimal values that maximize bearing capacity. The results indicated that the optimum distance ratio between the reinforcement layers lies in the range 0.5 to 0.6, and that beyond four geogrid layers, additional layers have no significant effect on the bearing capacity of footings on geogrid-reinforced sand.

Keywords: geogrid, reinforced sand, FELA software, distance ratio, number of geogrid layers

Procedia PDF Downloads 134
10024 Financial Instruments Disclosure: A Review of the Literature

Authors: Y. Tahat, T. Dunne, S. Fifield, D. Power

Abstract:

Information about a firm’s usage of Financial Instruments (FIs) plays a very important role in determining its financial position and performance. Yet accounting standard-setters have encountered problems when deciding on the FI-related disclosures which firms must make. The primary objective of this paper is to review the extant literature on FI disclosure. This objective is achieved by surveying the literature on: the corporate usage of FIs; the different accounting standards adopted concerning FIs; and empirical studies on FI disclosure. This review concludes that the current research on FI disclosure has generated a number of useful insights. In particular, the paper reports that: FIs are a very important risk management mechanism for ensuring that companies have the cash available to make value-enhancing investments, although without a clear set of risk management objectives, using such instruments can be dangerous; accounting standards concerning FIs have resulted in enhanced transparency about the usage of these instruments; and FI-related information is a key input into investors’ decision-making processes. Finally, the paper provides a number of suggestions for future research in the area.

Keywords: financial instruments, financial reporting, accounting standards, value relevance, corporate disclosure

Procedia PDF Downloads 400
10023 Harnessing Artificial Intelligence for Early Detection and Management of Infectious Disease Outbreaks

Authors: Amarachukwu B. Isiaka, Vivian N. Anakwenze, Chinyere C. Ezemba, Chiamaka R. Ilodinso, Chikodili G. Anaukwu, Chukwuebuka M. Ezeokoli, Ugonna H. Uzoka

Abstract:

Infectious diseases continue to pose significant threats to global public health, necessitating advanced and timely detection methods for effective outbreak management. This study explores the integration of artificial intelligence (AI) in the early detection and management of infectious disease outbreaks. Leveraging vast datasets from diverse sources, including electronic health records, social media, and environmental monitoring, AI-driven algorithms are employed to analyze patterns and anomalies indicative of potential outbreaks. Machine learning models, trained on historical data and continuously updated with real-time information, contribute to the identification of emerging threats. The implementation of AI extends beyond detection, encompassing predictive analytics for disease spread and severity assessment. Furthermore, the paper discusses the role of AI in predictive modeling, enabling public health officials to anticipate the spread of infectious diseases and allocate resources proactively. Machine learning algorithms can analyze historical data, climatic conditions, and human mobility patterns to predict potential hotspots and optimize intervention strategies. The study evaluates the current landscape of AI applications in infectious disease surveillance and proposes a comprehensive framework for their integration into existing public health infrastructures. The implementation of an AI-driven early detection system requires collaboration between public health agencies, healthcare providers, and technology experts. Ethical considerations, privacy protection, and data security are paramount in developing a framework that balances the benefits of AI with the protection of individual rights. The synergistic collaboration between AI technologies and traditional epidemiological methods is emphasized, highlighting the potential to enhance a nation's ability to detect, respond to, and manage infectious disease outbreaks in a proactive and data-driven manner. 
The findings of this research underscore the transformative impact of harnessing AI for early detection and management, offering a promising avenue for strengthening the resilience of public health systems in the face of evolving infectious disease challenges. This paper advocates for the integration of artificial intelligence into the existing public health infrastructure for early detection and management of infectious disease outbreaks. The proposed AI-driven system has the potential to revolutionize the way we approach infectious disease surveillance, providing a more proactive and effective response to safeguard public health.
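As a concrete and purely illustrative example of the kind of anomaly detection the abstract describes, the sketch below flags days whose case counts rise well above an exponentially weighted moving-average baseline. The data, smoothing factor, and threshold are invented; real surveillance systems combine many more data sources and models.

```python
# Toy outbreak flagging: compare each day's count against an EWMA baseline
# and its running variance estimate; flag counts beyond `threshold` std devs.

def ewma_outbreak_flags(counts, alpha=0.3, threshold=3.0):
    """Return indices of counts that exceed baseline + threshold * est. std."""
    flags = []
    mean = counts[0]
    var = 0.0
    for i, c in enumerate(counts[1:], start=1):
        std = var ** 0.5
        if std > 0 and c > mean + threshold * std:
            flags.append(i)  # anomalously high count: possible outbreak signal
        # update the EWMA baseline and variance estimate
        diff = c - mean
        mean += alpha * diff
        var = (1 - alpha) * (var + alpha * diff * diff)
    return flags

cases = [10, 12, 11, 13, 12, 11, 14, 60, 75, 12]
print(ewma_outbreak_flags(cases))
```

The sharp jump on day 7 is flagged; once the baseline absorbs the surge, later elevated days are no longer surprising relative to the inflated variance, which is one reason real systems pair such detectors with epidemiological review.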

Keywords: artificial intelligence, early detection, disease surveillance, infectious diseases, outbreak management

Procedia PDF Downloads 48
10022 Educating Children with the Child-Friendly Smartphone Operation System

Authors: Wildan Maulana Wildan, Siti Annisa Rahmayani Icha

Abstract:

Nowadays, advances in information technology are needed by all the inhabitants of the earth to ease their work, but it is worth considering how technological advances are introduced into the world of children. Before technology grew rapidly, children were busy with a variety of traditional games and had a high degree of socialization. After its arrival, almost all children spend most of their time playing with gadgets, which can affect their education and change their character and personality. However, children cannot be separated from technology, because technology broadens their knowledge. Since neither the world nor children can be separated from technological advances, a child-friendly smartphone operating system should be developed. Such an operating system is able to filter content that is unsuitable for children; it also includes reminders of study time, prayer time, and play time, as well as interactive content that supports the development of children's education and character. Children need technology, and there are appropriate ways to introduce it to them; we must consider the characteristics of children in different environments. In this way, advances in technology can benefit children and their parents, and educators need not worry about them. We should take the best possible advantage of advances in technology.
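A minimal sketch of two of the proposed features, assuming a simple allowlist content filter and a fixed daily schedule; all category names and hours below are hypothetical, not part of the proposed system.

```python
# Hypothetical child-friendly OS features: an age-rating allowlist for
# content, and a daily schedule of study, prayer, and play reminders.

ALLOWED_CATEGORIES = {"education", "kids_games", "drawing"}

SCHEDULE = [  # (start_hour, end_hour, activity)
    (16, 17, "study time"),
    (18, 19, "prayer time"),
    (19, 20, "play time"),
]

def is_content_allowed(category: str) -> bool:
    """Only categories on the allowlist are shown to the child."""
    return category.lower() in ALLOWED_CATEGORIES

def current_reminder(hour: int):
    """Return the scheduled activity for this hour, if any."""
    for start, end, activity in SCHEDULE:
        if start <= hour < end:
            return activity
    return None
```

An allowlist (rather than a blocklist) is the conservative design choice for children: anything not explicitly approved is hidden by default.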

Keywords: information technology, smartphone operating system, education, character

Procedia PDF Downloads 495
10021 Integrated Models of Reading Comprehension: Understanding to Impact Teaching—The Teacher’s Central Role

Authors: Sally A. Brown

Abstract:

Over the last 30 years, researchers have developed models or frameworks to provide a more structured understanding of the reading comprehension process. Cognitive information processing models and social cognitive theories both provide frameworks to inform reading comprehension instruction. The purpose of this paper is to (a) provide an overview of the historical development of reading comprehension theory, (b) review the literature framed by cognitive information processing, social cognitive, and integrated reading comprehension theories, and (c) demonstrate how these frameworks inform instruction. As integrated models of reading can guide the interpretation of various factors related to student learning, an integrated framework designed by the researcher will be presented. Results indicated that features of cognitive processing and social cognitivism theory—represented in the integrated framework—highlight the importance of the role of the teacher. This model can aid teachers in not only improving reading comprehension instruction but in identifying areas of challenge for students.

Keywords: explicit instruction, integrated models of reading comprehension, reading comprehension, teacher’s role

Procedia PDF Downloads 82
10020 Numerical Modeling Analysis for the Double-Layered Asphalt Pavement Structure Behavior with Interface Bonding

Authors: Minh Tu Le, Quang Huy Nguyen, Mai Lan Nguyen

Abstract:

Bonding characteristics between pavement layers have an important influence on the responses of pavement structures. This paper presents an analytical solution for the stresses, strains, and deflections of a double-layered asphalt pavement structure. The solution is based on the layered homogeneous half-space theory developed by Burmister (1943). Partial interaction between the layers is taken into account by considering an interface bonding behavior obtained from push-out shear tests. Numerical applications considering three cases of bonding (unbonded, partially bonded, and fully bonded overlays) are carried out to investigate the influence of interface bonding on the structural behavior of asphalt pavement under static loading. The numerical results indicate that the horizontal shear reaction modulus at the interface (Ks) significantly affects pavement structure behavior.
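The partial-interaction idea can be summarized by a linear interface law; the form below is a common assumption in such analyses and is inferred from the abstract rather than taken from the paper itself.

```latex
% Linear interface bonding law (assumed form):
% \tau  — interface shear stress,
% K_s   — horizontal shear reaction modulus identified from the
%          push-out shear test,
% \Delta u — relative horizontal displacement (slip) between layers.
\tau = K_s \, \Delta u
% Limiting cases: K_s \to \infty gives the fully bonded overlay
% (\Delta u \to 0), while K_s \to 0 gives the unbonded overlay
% (\tau \to 0), matching the three bonding cases analyzed.
```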

Keywords: analytical solution, interface bonding, shear test, double-layered asphalt, shear reaction modulus

Procedia PDF Downloads 216
10019 The Public Law Studies: Relationship Between Accountability, Environmental Education and Smart Cities

Authors: Aline Alves Bandeira, Luís Pedro Lima, Maria Cecília de Paula Silva, Paulo Henrique de Viveiros Tavares

Abstract:

Nowadays, the study of public policies regarding management efficiency is essential. Public policy concerns what governments do or do not do; it is an area that has grown worldwide, contributing knowledge of technologies and methodologies that monitor and evaluate the performance of public administrators. The information published on official government websites needs to provide transparency and demonstrate the responsiveness of managers. Transparency is thus a primordial factor in the exercise of accountability, thereby providing services to citizens through the expansion of transparent, efficient, and democratic information that values administrative eco-efficiency. The ecologically balanced management of a smart city must optimize environmental education, building a fairer society that brings about equality in the use of quality environmental resources. Smart cities add value to public management by enabling interaction between people, enhancing environmental education and the practical applicability of administrative eco-efficiency, fostering economic development, and improving quality of life.

Keywords: accountability, environmental education, new public administration, smart cities

Procedia PDF Downloads 113
10018 Model Averaging in a Multiplicative Heteroscedastic Model

Authors: Alan Wan

Abstract:

In recent years, the body of literature on frequentist model averaging in statistics has grown significantly. Most of this work focuses on models with different mean structures but leaves out the variance consideration. In this paper, we consider a regression model with multiplicative heteroscedasticity and develop a model averaging method that combines maximum likelihood estimators of unknown parameters in both the mean and variance functions of the model. Our weight choice criterion is based on a minimisation of a plug-in estimator of the model average estimator's squared prediction risk. We prove that the new estimator possesses an asymptotic optimality property. Our investigation of finite-sample performance by simulations demonstrates that the new estimator frequently exhibits very favourable properties compared to some existing heteroscedasticity-robust model average estimators. The model averaging method hedges against the selection of very bad models and serves as a remedy to variance function misspecification, which often discourages practitioners from modeling heteroscedasticity altogether. The proposed model average estimator is applied to the analysis of two real data sets.
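A toy sketch of the general model-averaging idea: combine two candidate mean models with a weight chosen to minimize an estimated squared prediction risk. A simple validation split stands in for the paper's plug-in risk criterion, and the data-generating process below is invented; the actual estimator also combines maximum likelihood estimates of a multiplicative variance function, which this sketch omits.

```python
# Model averaging over two candidate linear mean models under
# heteroscedastic noise, with the weight chosen on a validation split.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
# multiplicative heteroscedasticity: noise scale depends on x1
y = 1.0 + 2.0 * x1 + 0.5 * x2 + np.exp(0.5 * x1) * rng.normal(size=n)

train, val = slice(0, 150), slice(150, None)

def fit_predict(X_tr, y_tr, X_va):
    beta, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)
    return X_va @ beta

X_small = np.column_stack([np.ones(n), x1])       # candidate 1: x1 only
X_big = np.column_stack([np.ones(n), x1, x2])     # candidate 2: x1 and x2
p1 = fit_predict(X_small[train], y[train], X_small[val])
p2 = fit_predict(X_big[train], y[train], X_big[val])

# grid search for the weight minimizing estimated squared prediction risk
weights = np.linspace(0.0, 1.0, 101)
risks = [np.mean((y[val] - (w * p1 + (1 - w) * p2)) ** 2) for w in weights]
w_star = weights[int(np.argmin(risks))]
print("chosen weight on the small model:", w_star)
```

Because the weight is continuous rather than a 0/1 selection, the averaged estimator hedges against committing fully to a badly specified candidate, which is the intuition behind the robustness claims in the abstract.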

Keywords: heteroscedasticity-robust, model averaging, multiplicative heteroscedasticity, plug-in, squared prediction risk

Procedia PDF Downloads 360
10017 Calibration of Discrete Element Method Parameters for Modelling DRI Pellets Flow

Authors: A. Hossein Madadi-Najafabadi, Masoud Nasiri

Abstract:

The discrete element method (DEM) is a powerful technique for numerically modeling the flow of granular materials such as direct reduced iron (DRI). It enables the study of processes and equipment related to the production and handling of the material. However, the characteristics and properties of the granules have to be adjusted precisely to achieve reliable results in a DEM simulation. The main properties for DEM simulation are size distribution, density, Young's modulus, Poisson's ratio, and the contact coefficients of restitution, rolling friction, and sliding friction. In the present paper, these properties are determined for DEM simulation of DRI pellets. A reliable DEM simulation would contribute to optimizing the handling system of DRI in an iron-making plant. Among the mentioned properties, Young's modulus is the most important parameter, and it is usually hard to obtain for particulate solids. Here, a special method is utilized to determine this parameter precisely for DRI.

Keywords: discrete element method, direct reduced iron, simulation parameters, granular material

Procedia PDF Downloads 166
10016 Youth Involvement in Cybercrime in Nigeria: A Case Study of Ikeja Local Government Area

Authors: Niyi Adegoke, Saanumi Jimmy Omolou

Abstract:

The prevalence of youth involvement in cybercrime is alarming and calls for concern among the government, parents, NGOs, and religious bodies; hence, this paper examines youth involvement in cybercrime in Nigeria. Achievement motivation theory was used to explain the activities of cyber-criminals in Nigerian society. A descriptive survey method was adopted for the study. The sample consisted of one hundred and fifty (150) respondents randomly selected from the study population. A questionnaire was used to gather information and data from the respondents. Data collected through the questionnaire were analyzed using percentages for the respondents' bio-data, while chi-square tests were employed to test the hypotheses. Findings from the study revealed that parental negligence, unemployment, peer influence, and the quest for materialism were responsible for cyber-crimes in Nigeria. The study concludes with recommendations: creating employment opportunities for youths and ensuring good governance and accountability, among other things, will go a long way toward solving the problem of cybercrime in our society.
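The chi-square hypothesis testing mentioned above can be illustrated on a hypothetical 2x2 table; the counts below are invented for the example and are not the study's data.

```python
# Pearson chi-square test of independence for a 2x2 table, computed by hand
# on hypothetical counts: employment status vs. involvement in cybercrime.

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]], 1 deg. of freedom."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    expected = [row1 * col1 / n, row1 * col2 / n,
                row2 * col1 / n, row2 * col2 / n]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

#                 involved  not involved
# unemployed          45           30
# employed            20           55
stat = chi_square_2x2(45, 30, 20, 55)
print(round(stat, 2))  # ~16.97, well above the 3.84 critical value (alpha=0.05, df=1)
```

With a statistic this far above the critical value, the null hypothesis of independence between employment status and involvement would be rejected for these illustrative counts.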

Keywords: cybercrime, youth, Nigeria, unemployment, information communication technology

Procedia PDF Downloads 207
10015 Application and Aspects of Biometeorology in Inland Open Water Fisheries Management in the Context of Changing Climate: Status and Research Needs

Authors: U.K. Sarkar, G. Karnatak, P. Mishal, Lianthuamluaia, S. Kumari, S.K. Das, B.K. Das

Abstract:

Inland open water fisheries provide food, income, livelihood, and nutritional security to millions of fishers across the globe. However, open water ecosystems and fisheries are threatened by climate change and anthropogenic pressures, which have become more visible over the last six decades, making the resources vulnerable. Understanding the interaction between meteorological parameters and inland fisheries is imperative for developing mitigation and adaptation strategies. As per the IPCC 5th assessment report, the earth has been warming at a faster rate in recent decades. Global mean surface temperature (GMST) for the decade 2006–2015 was 0.87°C higher than the average over the 1850–1900 period. The direct and indirect impacts of climatic parameters on the ecology of fisheries ecosystems have a great bearing on fisheries through alterations in fish physiology. The impact of meteorological factors on ecosystem health and fish food organisms brings about changes in fish diversity, assemblage, reproduction, and natural recruitment. India's average temperature rose by around 0.7°C during 1901–2018. Studies show that the mean air temperature in the Ganga basin increased by 0.20–0.47°C and annual rainfall decreased by 257–580 mm over the last three decades. These studies clearly indicate visible impacts of climatic and environmental factors on inland open water fisheries. In addition, a significant reduction in depth and area (37.20–57.68%), a loss of natural indigenous fish faunal diversity (ranging from 22.85 to 54%) in wetlands, and a progression of trophic state from mesotrophic to eutrophic were recorded. In this communication, different applications of biometeorology in inland fisheries management are discussed, with special reference to the assessment of ecosystem and species vulnerability to climatic variability and change.
Further, the paper discusses the impact of climate anomalies and extreme climatic events on inland fisheries and emphasizes novel modeling approaches for understanding the impact of climatic and environmental factors on reproductive phenology, so that climate-sensitive and climate-resilient fish species can be identified for the adoption of climate-smart fisheries in the future. Adaptation and mitigation strategies to enhance fish production, and the role of culture-based fisheries and enclosure culture in converting sequestered carbon into blue carbon, are also discussed. In general, the type and direction of the influence of meteorological parameters on fish biology in open water fisheries ecosystems are not adequately understood, and the optimum range of meteorological parameters for sustaining inland open water fisheries is yet to be established. Therefore, the application of biometeorology in inland fisheries offers ample scope for understanding these dynamics under a changing climate and would help to develop a database on this little-addressed frontier research area. This would further help to project fisheries scenarios under changing climate regimes and to develop adaptation and mitigation strategies to cope with adverse meteorological factors, sustain fisheries, and conserve aquatic ecosystems and biodiversity.

Keywords: biometeorology, inland fisheries, aquatic ecosystem, modeling, India

Procedia PDF Downloads 181
10014 Digital Nudge, Social Proof Nudge and Trust on Brand Loyalty

Authors: Mirza Amin Ul Haq

Abstract:

Purpose – The purpose of this research is to examine whether the nudge constructs encourage consumer brand loyalty, and whether the power of word of mouth has an effect alongside the independent variables. Design/Methodology/Approach – This study adopted four constructs (digital nudge, social proof nudge, trust, and the mediator word of mouth) and explored their effects on and connections with brand loyalty. A total of 390 respondents completed a self-administered questionnaire to obtain the findings of the research. Findings – The impacts and causal relations between the constructs were examined through structural equation modeling. The findings show a positive impact of the social proof nudge and word of mouth, whereas the digital nudge and trust have a weaker influence on consumer choices with regard to brand loyalty. Originality/Value – The study carries further implications for research and for marketing strategies that build brand loyalty with customers in the clothing industry.

Keywords: nudge, digital nudge, social proof, online buying, brand loyalty, trust, word of mouth

Procedia PDF Downloads 97
10013 Modelling and Simulation of Milk Fouling

Authors: Harche Rima, Laoufi Nadia Aicha

Abstract:

This work focuses on the study and modeling of the fouling phenomenon in a vertical pipe. Milk is a fluid subject to fouling because of the denaturation of its proteins, especially lactoglobulin, the active element of milk, and its ease of use led us to choose it as the fouling fluid for study. The test section of our installation is treated as a tubular heat exchanger operating in counter-current flow in a closed circuit. A simple mathematical model due to Kern and Seaton, based on the kinetics of the fouling resistance, was used to evaluate the influence of the operating parameters (fluid flow velocity and exchange wall temperature) on the fouling resistance. The influence of the variation of the fouling resistance with the operating conditions on the efficiency of the heat exchanger, and the importance of the fouled-state exchange coefficient as an exchange quality control parameter, are discussed and examined. In addition, a scanning electron microscope analysis was performed on the milk deposit in order to obtain its actual image and composition, which allowed us to calculate the thickness of the deposit.
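The Kern and Seaton model referred to above describes the fouling resistance as an asymptotic first-order kinetic, R_f(t) = R_f,inf * (1 - exp(-t/theta)); a small sketch with illustrative parameter values (not measured values from this study):

```python
# Kern & Seaton asymptotic fouling model and its effect on the overall
# heat transfer coefficient. Parameter values are illustrative only.
import math

def fouling_resistance(t, rf_inf=4e-4, theta=3600.0):
    """Fouling resistance (m^2.K/W) after t seconds of operation."""
    return rf_inf * (1.0 - math.exp(-t / theta))

def dirty_heat_transfer_coeff(u_clean, rf):
    """Fouled-state coefficient: 1/U_dirty = 1/U_clean + R_f."""
    return 1.0 / (1.0 / u_clean + rf)

rf_2h = fouling_resistance(2 * 3600)          # resistance after two hours
print(dirty_heat_transfer_coeff(2000.0, rf_2h))
```

The model captures the plateau seen in many fouling experiments: deposition and removal rates balance, so R_f approaches the asymptotic value R_f,inf with time constant theta instead of growing without bound.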

Keywords: fouling, milk, tubular heat exchanger, fouling resistance

Procedia PDF Downloads 30
10012 The Extension of Monomeric Computational Results to Polymeric Measurable Properties: An Introductory Computational Chemistry Experiment

Authors: Jing Zhao, Yongqing Bai, Qiaofang Shi, Huaihao Zhang

Abstract:

Advances in software technology have enabled computational chemistry to be applied widely in various research fields, especially in pedagogy. Thus, in order to expand and improve computational chemistry laboratory instruction for undergraduates, we designed an introductory experiment: research on the molecular structure and physicochemical properties of acrylamide. Initially, students construct molecular models of acrylamide and polyacrylamide in the Gaussian and Materials Studio software, respectively. Then, the infrared spectral data, atomic charges, and molecular orbitals of acrylamide, as well as the solvation effect of polyacrylamide, are calculated to predict their physicochemical performance. Finally, rheological experiments are used to validate these predictions. Through the combination of molecular simulation (performed in Gaussian and Materials Studio) with experimental verification (rheology experiments), learners gain a deep understanding of the chemical nature of acrylamide and polyacrylamide, achieving good learning outcomes.

Keywords: upper-division undergraduate, computer-based learning, laboratory instruction, molecular modeling

Procedia PDF Downloads 126
10011 Stability of Composite Struts Using the Modified Newmark Method

Authors: Seyed Amin Vakili, Sahar Sadat Vakili, Seyed Ehsan Vakili, Nader Abdoli Yazdi

Abstract:

The aim of this paper is to examine the elastic stability behavior of reinforced and composite concrete struts under axial loads. The objective of this study is to verify the ability of the Modified Newmark Method to include geometric non-linearity in addition to the non-linearity due to cracking, and also to show the advantage of the established method in reconsidering a minor parameter usually ignored in mathematical modeling, namely the effect of cracking, through the extra geometric bending moment Ny, on cross-section properties. The purpose of this investigation is not to present new results for the instability of reinforced or composite concrete columns, so no further kinds of non-linearity are considered here; as stated, this work is part of the verification of the newly established method for solving two kinds of non-linearity, the P-δ effect and cracking, simultaneously. The Modified Newmark Method can also be used to treat material non-linearity and the time-dependent behavior of concrete; however, since that is beyond the scope of this article, it is not considered.

Keywords: stability, buckling, modified Newmark method, reinforced concrete

Procedia PDF Downloads 314
10010 Review of Dielectric Permittivity Measurement Techniques

Authors: Ahmad H. Abdelgwad, Galal E. Nadim, Tarek M. Said, Amr M. Gody

Abstract:

The prime objective of this manuscript is to provide an intensive review of the techniques used for permittivity measurement. The measurement technique relevant for a given application depends on the electrical and physical nature of the dielectric material under test, the degree of accuracy required, and the frequency of interest. Although many different types of instruments can be utilized, only measuring devices that provide reliable determinations of the required electrical properties of the unknown material in the frequency range of interest should be considered. The challenge in making precise dielectric property or permittivity measurements lies in designing the material specimen holder for those measurements (in the RF and microwave frequency ranges) and adequately modeling the circuit for reliable computation of the permittivity from the electrical measurements. If RF circuit parameters such as the impedance or admittance are estimated appropriately at a certain frequency, the material's permittivity at that frequency can be estimated through the equations that relate the dielectric properties of the material to the parameters of the circuit.
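As a simple instance of recovering permittivity from circuit parameters, the sketch below assumes an ideal, loss-free parallel-plate specimen holder (a textbook case used for illustration, not one of the specific techniques reviewed here); the holder dimensions and measured values are invented.

```python
# Recovering relative permittivity from a capacitance/admittance measurement
# for an ideal parallel-plate holder: C = eps_0 * eps_r * A / d,
# and for a loss-free capacitive specimen Im(Y) = omega * C.
import math

EPS_0 = 8.8541878128e-12  # vacuum permittivity, F/m

def relative_permittivity(capacitance, area, gap):
    """eps_r of a specimen filling a parallel-plate holder (SI units)."""
    return capacitance * gap / (EPS_0 * area)

def capacitance_from_admittance(y_imag, freq_hz):
    """C = Im(Y) / omega for a loss-free capacitive specimen."""
    return y_imag / (2.0 * math.pi * freq_hz)

# 1 cm^2 plates, 1 mm gap, measured C of 2.3 pF  ->  eps_r
eps_r = relative_permittivity(2.3e-12, 1e-4, 1e-3)
print(round(eps_r, 2))
```

Real measurements must also account for fringing fields, electrode contact effects, and dielectric loss (the imaginary part of the permittivity), which is why holder design dominates the accuracy budget.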

Keywords: dielectric permittivity, free space measurement, waveguide techniques, coaxial probe, cavity resonator

Procedia PDF Downloads 357
10009 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models

Authors: Danielle Shackley, Yetunde Folajimi

Abstract:

As more people turn to the internet seeking health-related information, there is greater risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines, supplemented with health information about COVID-19 collected from social media sources. We started with data preprocessing and tested various vectorization methods such as Count and TFIDF vectorization. We implemented three Naive Bayes classifier models: Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, then reproduced the same models with the sentiment feature added. We evaluated using precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and an unchanged 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we obtained a 1.9% improvement in the precision score with the Complement model. Future expansion of this work could include replicating the experiment and substituting a deep learning neural network model for the Naive Bayes.
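To make the modeling approach concrete, here is a from-scratch multinomial Naive Bayes toy with Laplace smoothing. The study itself used library implementations of the Bernoulli, Multinomial, and Complement variants with Count/TFIDF vectorization; the headlines and labels below are invented for illustration.

```python
# Multinomial Naive Bayes with Laplace (add-one) smoothing, from scratch.
import math
from collections import Counter, defaultdict

def train_nb(docs, labels):
    class_counts = Counter(labels)
    word_counts = defaultdict(Counter)   # class -> word -> count
    vocab = set()
    for doc, label in zip(docs, labels):
        for word in doc.lower().split():
            word_counts[label][word] += 1
            vocab.add(word)
    return class_counts, word_counts, vocab

def predict_nb(model, doc):
    class_counts, word_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_logp = None, -math.inf
    for label, n_docs in class_counts.items():
        logp = math.log(n_docs / total_docs)            # class prior
        total_words = sum(word_counts[label].values())
        for word in doc.lower().split():
            # Laplace-smoothed likelihood P(word | class)
            count = word_counts[label][word]
            logp += math.log((count + 1) / (total_words + len(vocab)))
        if logp > best_logp:
            best_logp, best_label = logp, label
    return best_label

docs = ["miracle cure guarantees instant recovery",
        "new vaccine trial shows promising results",
        "doctors hate this secret miracle remedy",
        "health officials report vaccine efficacy data"]
labels = ["fake", "real", "fake", "real"]
model = train_nb(docs, labels)
print(predict_nb(model, "secret miracle cure"))
```

Adding a sentiment feature, as in the study, amounts to appending one more feature (the headline's polarity label) to each document's representation before training.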

Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model

Procedia PDF Downloads 84
10008 Application of Life Cycle Assessment “LCA” Approach for a Sustainable Building Design under Specific Climate Conditions

Authors: Djeffal Asma, Zemmouri Noureddine

Abstract:

In order for building designers to be able to balance environmental concerns with other performance requirements, they need clear and concise information. For certain decisions during the design process, qualitative guidance such as design checklists or guideline information may not be sufficient for evaluating the environmental benefits of different building materials, products, and designs. In such cases, quantitative information, such as that generated through a life cycle assessment (LCA), provides the most value. LCA provides a systematic approach to evaluating the environmental impacts of a product or system over its entire life. In the case of buildings, the life cycle includes the extraction of raw materials; the manufacturing, transport, and installation of building components or products; and the operation and maintenance of the building. By integrating LCA into the building design process, designers can evaluate the life cycle impacts of the building design, materials, components, and systems and choose the combinations that reduce the building's life cycle environmental impact. This article attempts to give an overview of the integration of the LCA methodology in the context of building design and focuses on the use of this methodology for environmental considerations concerning process design and optimization. A multiple case study was conducted in order to assess the benefits of LCA as a decision-making aid during the first stages of building design under the specific climate conditions of the north-east region of Algeria. It is clear that the LCA methodology can help to assess and reduce the impact of a building's design and components on the environment, even if the implementation process is rather long and complicated and lacks a global approach including human factors. It is also demonstrated that using LCA for multi-objective optimization of the building process will certainly facilitate improvements in design and decision making for both new-design and retrofit projects.

Keywords: life cycle assessment, buildings, sustainability, elementary schools, environmental impacts

Procedia PDF Downloads 533
10007 Lexical Based Method for Opinion Detection on Tripadvisor Collection

Authors: Faiza Belbachir, Thibault Schienhinski

Abstract:

The massive development of online social networks allows users to post and share their opinions on various topics. Given this huge volume of opinions, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommender systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data. The difficulty of this task is to determine an approach which returns opinionated documents. Generally, two approaches are used for opinion detection: lexical-based approaches and machine-learning-based approaches. In lexical-based approaches, a dictionary of sentiment words is used, in which each word is associated with a weight; the opinion score of a document is derived from the occurrences of words from this dictionary. In machine-learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, using features such as n-grams of words, part-of-speech tags, and logical forms. The majority of these works rely on document text to determine an opinion score but do not take into account whether these texts are really trustworthy. Thus, it is interesting to exploit other information to improve opinion detection. In our work, we develop a new way to compute the opinion score by introducing the notion of a trust score. We determine opinionated documents, but also whether these opinions are really trustworthy information in relation to their topics. To that end, we use the SentiWordNet lexicon to calculate opinion and trust scores, and we compute different features about users (number of comments, number of useful comments, average usefulness of their reviews). We then combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection.
Our experimental results report that the combination of opinion score and trust score improves opinion detection.
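A minimal sketch of the score combination described above (the lexicon entries, the user features, and the weighting alpha are illustrative assumptions, not the paper's actual values or its SentiWordNet lookup):

```python
def opinion_score(tokens, lexicon):
    # Sum the absolute sentiment weights of words found in the lexicon,
    # normalized by document length (a SentiWordNet-style lookup).
    hits = [abs(lexicon[t]) for t in tokens if t in lexicon]
    return sum(hits) / len(tokens) if tokens else 0.0

def trust_score(n_comments, n_useful, avg_useful):
    # Simple bounded combination of the user-profile features named
    # in the abstract (the 50/50 weighting is an assumption).
    useful_ratio = n_useful / n_comments if n_comments else 0.0
    return 0.5 * useful_ratio + 0.5 * min(avg_useful, 1.0)

def final_score(tokens, lexicon, user, alpha=0.6):
    # Weighted combination of the opinion and trust scores.
    return alpha * opinion_score(tokens, lexicon) + (1 - alpha) * trust_score(*user)

lexicon = {"great": 0.8, "terrible": -0.9, "nice": 0.5}  # toy stand-in lexicon
review = "the room was great and the staff nice".split()
score = final_score(review, lexicon, (20, 15, 0.8))      # user: 20 comments, 15 useful
```

A review supported by a trustworthy user profile thus outranks an equally opinionated review from an untrusted one.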

Keywords: Tripadvisor, opinion detection, SentiWordNet, trust score

Procedia PDF Downloads 185
10006 Thermodynamic Analyses of Information Dissipation along the Passive Dendritic Trees and Active Action Potential

Authors: Bahar Hazal Yalçınkaya, Bayram Yılmaz, Mustafa Özilgen

Abstract:

Brain information transmission in the neuronal network occurs in the form of electrical signals. Neural tissue transmits information between neurons, or between neurons and target cells, by moving charged particles in a voltage field; a fraction of the energy utilized in this process is dissipated via entropy generation. Exergy loss and entropy generation models demonstrate the inefficiencies of communication along the dendritic trees. In this study, neurons of four different animals were analyzed with a one-dimensional cable model with N = 6 identical dendritic trees and M = 3 orders of symmetrical branching. Each branch bifurcates symmetrically in accordance with the 3/2 power law in an infinitely long cylinder with the usual core conductor assumptions, where membrane potential is conserved at all branching points. In the model, exergy loss and entropy generation rates are calculated for each branch of equivalent cylinders of electrotonic length (L) ranging from 0.1 to 1.5 for four different dendritic branches: the input branch (BI), the sister branch (BS), and two cousin branches (BC-1 and BC-2). Thermodynamic analysis with data from two different cat motoneuron studies shows that in both experiments nearly the same amount of exergy is lost while nearly the same amount of entropy is generated. The guinea pig vagal motoneuron loses twofold more exergy compared to the cat models, and the squid's exergy loss and entropy generation were nearly tenfold those of the guinea pig vagal motoneuron model. The thermodynamic analysis shows that the energy dissipated in the dendritic trees is directly proportional to the electrotonic length, the exergy loss, and the entropy generation. Entropy generation and exergy loss show variability not only between vertebrates and invertebrates but also within the same class.
Concurrently, the single-action-potential Na+ ion load, the metabolic energy utilization, and their thermodynamic aspects were computed for the squid giant axon and a mammalian motoneuron model. Energy is supplied to the neurons in the form of adenosine triphosphate (ATP). Exergy destruction and entropy generation upon ATP hydrolysis are calculated. ATP utilization, exergy destruction, and entropy generation showed differences in each model depending on the variations in ion transport along the channels.
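The 3/2 power-law branching condition invoked above (Rall's rule) states that the parent diameter raised to the 3/2 power equals the sum of the child diameters raised to the 3/2 power, which is what allows a branched tree to be reduced to an equivalent cylinder. A small illustrative check (the diameters are arbitrary examples, not the study's values):

```python
def rall_condition(d_parent, d_children, tol=1e-6):
    # Rall's 3/2 power law: d_parent^(3/2) == sum of d_child^(3/2).
    return abs(d_parent**1.5 - sum(d**1.5 for d in d_children)) < tol

# A symmetric bifurcation satisfies the rule when each child has
# diameter d_parent * (1/2)^(2/3).
d_parent = 2.0
d_child = d_parent * 0.5**(2 / 3)
ok = rall_condition(d_parent, [d_child, d_child])
```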

Keywords: ATP utilization, entropy generation, exergy loss, neuronal information transmittance

Procedia PDF Downloads 379
10005 Mathematical Properties of the Resonance of the Inner Waves in Rotating Stratified Three-Dimensional Fluids

Authors: A. Giniatoulline

Abstract:

We consider the internal oscillations of the ocean caused by the gravity force and the Coriolis force, for different models with variable density, heat transfer, and salinity. Traditionally, the mathematical description of the resonance effect is related to a growing amplitude resulting from input vibrations. We offer a different approach: the study of the relation between the spectrum of the internal oscillations and the properties of the limiting amplitude of the solution for harmonic input vibrations of the external forces. Using results from the spectral theory of self-adjoint operators in Hilbert function spaces, we prove that there exists an explicit relation between the localization of the frequency of the external input vibrations with respect to the essential spectrum of the proper inner oscillations and the non-uniqueness of the limiting amplitude. The results may find application in various problems concerning the mathematical modeling of turbulent flows in the ocean.
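Schematically, the limiting-amplitude setting can be written as follows (a standard formulation given here for orientation; the paper's models include rotation and stratification terms omitted in this sketch):

```latex
\partial_t^2 u + A u = f(x)\, e^{i\omega t},
\qquad
u(x,t) \to w(x)\, e^{i\omega t} \quad (t \to \infty),
\qquad
(A - \omega^2 I)\, w = f,
```

so that the limiting amplitude $w$ formally solves a stationary equation, and its uniqueness hinges on whether $\omega^2$ lies in the essential spectrum of the self-adjoint operator $A$.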

Keywords: computational fluid dynamics, essential spectrum, limiting amplitude, rotating fluid, spectral theory, stratified fluid, the uniqueness of solutions of PDE equations

Procedia PDF Downloads 248
10004 Multiobjective Optimization of Wastewater Treatment by Electrochemical Process

Authors: Malek Bendjaballah, Hacina Saidi, Sarra Hamidoud

Abstract:

The aim of this study is to model and optimize the performance of a new electrocoagulation (EC) process for the treatment of wastewater, as well as its energy consumption, in order to extrapolate it to the industrial scale. Through judicious application of a design of experiments (DOE), it has been possible to evaluate the individual effects and interactions that have a significant influence on both objective functions (maximizing removal efficiency and minimizing energy consumption), using aluminum electrodes as the sacrificial anode. Preliminary experiments showed that the pH of the medium, the applied potential, and the EC treatment time are the main parameters. A 3³ full factorial design was adopted to model removal efficiency and energy consumption. Under optimal conditions, the pollution reduction efficiency is 93%, combined with a minimum energy consumption of 2.60×10⁻³ kWh/mg COD. The applied potential or current, the treatment time, and their interaction were the most influential parameters in the mathematical models obtained. The modeling results also correlated well with the experimental ones. These results offer promising opportunities to develop a clean process and an inexpensive technology to eliminate or reduce wastewater pollution.
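The 3³ factorial modeling step can be sketched as follows. The response values here are synthetic and the model form (main effects plus two-factor interactions, fitted by least squares) is an assumption for illustration; the paper fitted its own response-surface models to measured removal efficiencies:

```python
from itertools import product

import numpy as np

# Coded levels for a 3^3 full factorial: pH, potential, time at -1/0/+1.
levels = [-1, 0, 1]
X_raw = np.array(list(product(levels, repeat=3)), dtype=float)

def design_matrix(X):
    # Intercept, main effects, and two-factor interactions.
    ph, pot, t = X.T
    return np.column_stack([np.ones(len(X)), ph, pot, t,
                            ph * pot, ph * t, pot * t])

# Hypothetical removal-efficiency responses (synthetic, for illustration).
rng = np.random.default_rng(0)
true_beta = np.array([80.0, 2.0, 6.0, 5.0, 0.5, 0.2, 3.0])
y = design_matrix(X_raw) @ true_beta + rng.normal(0, 0.5, len(X_raw))

# Least-squares fit of the factorial model; because the coded factorial
# columns are orthogonal, each coefficient is estimated independently.
beta, *_ = np.linalg.lstsq(design_matrix(X_raw), y, rcond=None)
```

The fitted coefficients then identify which factors and interactions dominate, mirroring the paper's finding that potential, time, and their interaction were most influential.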

Keywords: electrocoagulation, green process, experimental design, optimization

Procedia PDF Downloads 77
10003 Reverse Engineering of a Secondary Structure of a Helicopter: A Study Case

Authors: Jose Daniel Giraldo Arias, Camilo Rojas Gomez, David Villegas Delgado, Gullermo Idarraga Alarcon, Juan Meza Meza

Abstract:

Reverse engineering processes are widely used in industry with the main goal of determining the materials and manufacturing processes used to produce a component. Many characterization techniques and computational tools are used to obtain this information. A case study of reverse engineering applied to a secondary sandwich-hybrid structure used in a helicopter is presented. The methodology used consists of five main steps, which can be applied to any other similar component: collecting information about the service conditions of the part, disassembly and dimensional characterization, functional characterization, material property characterization, and manufacturing process characterization, allowing all the traceability records of the materials and processes of aeronautical products that ensure their airworthiness to be obtained. A detailed explanation of each step is covered. The criticality and functionality of each part, state-of-the-art information, and information obtained from interviews with the technical groups of the helicopter's operators were analyzed; 3D optical scanning, standard and advanced material characterization techniques, and finite element simulation made it possible to obtain all the characteristics of the materials used in the manufacture of the component. It was found that most of the materials are quite common in the aeronautical industry, including Kevlar, carbon and glass fibers, aluminum honeycomb core, epoxy resin, and epoxy adhesive. The stacking sequence and fiber volume fraction are critical for the mechanical behavior; an acid digestion method was used for this purpose. This also helped in determining the manufacturing technique, which in this case was vacuum bagging. Samples of the material were manufactured and submitted to mechanical and environmental tests.
These results were compared with those obtained during reverse engineering, allowing the conclusion that the materials and manufacturing process were correctly determined. Tooling for manufacture was designed and produced according to the geometry and manufacturing process requirements. The part was manufactured, and the required mechanical and environmental tests were performed. Finally, geometric characterization and non-destructive techniques allowed the quality of the part to be verified.

Keywords: reverse engineering, sandwich-structured composite parts, helicopter, mechanical properties, prototype

Procedia PDF Downloads 399
10002 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGS)

Authors: Perpetual Nwakaego Ibe

Abstract:

The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the adverse situations that citizens of third-world countries may find themselves in. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions, and civil society. This paper first looks at political views on the MDGs and relates them to the current electoral situation around the country, underlining the drastic changes over the past few months. The second part of the paper presents ICT in tertiary institutions as one of the solutions for the success of the MDGs. ICT is vital in all phases of the educational process, and the development of cloud connectivity is an added advantage of Information and Communication Technology (ICT) for sharing a common data bank for research purposes among UNICEF, RED CROSS, NPS, INEC, NMIC, and WHO. Finally, the paper concludes with areas that need improvement and with recommendations for tertiary institutions committed to delivering this ambitious set of goals. A combination of observation and documentary materials was employed as the data-gathering methodology for this research.

Keywords: MDG, ICT, data bank, database

Procedia PDF Downloads 187
10001 Machine Learning Approach for Yield Prediction in Semiconductor Production

Authors: Heramb Somthankar, Anujoy Chakraborty

Abstract:

This paper presents a classification study on yield prediction in semiconductor production using machine learning approaches. A complex semiconductor production process is generally monitored continuously by signals acquired from sensors and measurement sites. A monitoring system contains a variety of signals, which carry useful information, irrelevant information, and noise. With each signal considered a feature, feature selection is used to find the most relevant signals. The open-source UCI SECOM dataset provides 1567 such samples, of which 104 fail quality assurance. Feature extraction and selection are performed on the dataset, and the useful signals are retained for further study. Afterward, common machine learning algorithms are employed to predict whether a sample passes or fails quality assurance. The most suitable algorithm is selected based on the accuracy and loss of the ML model.
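The select-then-classify pipeline described above can be sketched with scikit-learn. The data here are synthetic stand-ins (the real SECOM set has 590 signals and missing values, both simplified away); the zero-variance threshold and the logistic-regression classifier are illustrative choices, not the paper's selected model:

```python
import numpy as np
from sklearn.feature_selection import VarianceThreshold
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in for SECOM-style data: 1567 samples, 60 sensor
# signals, the last 20 of which are constant (uninformative).
rng = np.random.default_rng(42)
n, p = 1567, 60
X = rng.normal(size=(n, p))
X[:, 40:] = 0.0                                     # constant "sensors"
w = rng.normal(size=p)
w[40:] = 0.0
y = (X @ w + rng.normal(size=n) > 0).astype(int)    # pass/fail label

# Feature selection: drop zero-variance signals before modeling.
selector = VarianceThreshold(threshold=0.0)
X_sel = selector.fit_transform(X)

# Train a classifier and evaluate held-out accuracy.
X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
```

In practice several classifiers would be compared this way and the best-scoring one retained, as the abstract describes.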

Keywords: deep learning, feature extraction, feature selection, machine learning classification algorithms, semiconductor production monitoring, signal processing, time-series analysis

Procedia PDF Downloads 98
10000 Determinants of Mobile Payment Adoption among Retailers in Ghana

Authors: Ibrahim Masud, Yusheng Kong, Adam Diyawu Rahman

Abstract:

Mobile payment, variously referred to as mobile money, mobile money transfer, and mobile wallet, refers to payment services operated under financial regulation and performed from or via a mobile device. Mobile payment systems have come to augment, and to some extent replace, conventional payment methods such as cash, cheques, and credit cards. This study examines mobile payment adoption factors among retailers in Ghana. A conceptual framework was adapted from the extant literature, using the Technology Acceptance Model and the Theory of Reasoned Action as the theoretical bases. Data for the study were obtained from a sample of 240 respondents through a structured questionnaire. Partial least squares structural equation modeling (PLS-SEM) was used to analyze the data with SPSS v22 and SmartPLS v3. The findings indicate that perceived usefulness, perceived ease of use, perceived security, competitive pressure, and facilitating conditions are the main determinants of mobile payment adoption among retailers in Ghana. The study contributes to the literature on mobile payment adoption from a developing-country context.

Keywords: mobile payment, retailers, structural equation modeling, technology acceptance model

Procedia PDF Downloads 160
9999 An Experimental Study on Some Conventional and Hybrid Models of Fuzzy Clustering

Authors: Jeugert Kujtila, Kristi Hoxhalli, Ramazan Dalipi, Erjon Cota, Ardit Murati, Erind Bedalli

Abstract:

Clustering is a versatile instrument in the analysis of data collections, providing insights into the underlying structures of a dataset and enhancing modeling capabilities. The fuzzy approach to the clustering problem increases flexibility by involving the concept of partial memberships (some value in the continuous interval [0, 1]) of the instances in the clusters. Several fuzzy clustering algorithms have been devised, such as FCM, Gustafson-Kessel, Gath-Geva, kernel-based FCM, and PCM. Each of these algorithms has its own advantages and drawbacks, so no single algorithm performs best on all datasets. In this paper we experimentally compare the FCM, GK, and GG algorithms and a hybrid two-stage fuzzy clustering model combining the FCM and Gath-Geva algorithms. First, we theoretically discuss the advantages and drawbacks of each of these algorithms and describe the hybrid clustering model, which exploits the advantages and diminishes the drawbacks of each algorithm. Second, we experimentally compare the accuracy of the hybrid model by applying it to several benchmark and synthetic datasets.
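A compact sketch of the baseline FCM algorithm referenced above, alternating the standard center and membership updates (the fuzzifier m = 2 and the toy two-blob dataset are illustrative defaults, not the paper's experimental setup):

```python
import numpy as np

def fcm(X, c, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the fuzzy
    membership matrix U of shape (n_samples, c), rows summing to 1."""
    rng = np.random.default_rng(seed)
    n = len(X)
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)            # normalize memberships
    for _ in range(max_iter):
        Um = U ** m
        # Center update: membership-weighted means of the data.
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # Membership update from inverse squared distances.
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.fmax(d, 1e-12)                    # guard against zero distance
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:        # converged
            U = U_new
            break
        U = U_new
    return centers, U

# Two well-separated blobs: memberships should become near-crisp.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(5, 0.3, (50, 2))])
centers, U = fcm(X, c=2)
```

The hybrid model described in the abstract would use such FCM output as the initialization stage for the subsequent Gath-Geva refinement.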

Keywords: fuzzy clustering, fuzzy c-means algorithm (FCM), Gustafson-Kessel algorithm, hybrid clustering model

Procedia PDF Downloads 499
9998 Automated End of Sprint Detection for Force-Velocity-Power Analysis with GPS/GNSS Systems

Authors: Patrick Cormier, Cesar Meylan, Matt Jensen, Dana Agar-Newman, Chloe Werle, Ming-Chang Tsai, Marc Klimstra

Abstract:

Sprint-derived horizontal force-velocity-power (FVP) profiles can be developed with adequate validity and reliability with satellite (GPS/GNSS) systems. However, FVP metrics are sensitive to small nuances in data processing procedures, such that minor differences in defining the onset and end of the sprint could result in different FVP metric outcomes. Furthermore, in team sports there is a requirement for rapid analysis and feedback of results from multiple athletes; therefore, developing standardized and automated methods to improve the speed, efficiency, and reliability of this process is warranted. Thus, the purpose of this study was to compare different methods of sprint end detection for the development of FVP profiles from 10 Hz GPS/GNSS data through goodness-of-fit and inter-trial reliability statistics. Seventeen national-team female soccer players participated in the FVP protocol, which consisted of 2 x 40 m maximal sprints performed towards the end of a soccer-specific warm-up in a training session (1020 hPa, no wind, temperature 30°C) on an open grass field. Each player wore a 10 Hz Catapult unit (Vector S7, Catapult Innovations) inserted in a vest pouch between the scapulae. All data were analyzed following common procedures. The variables computed and assessed were the model parameters, the estimated maximal sprint speed (MSS) and the acceleration constant τ, in addition to horizontal relative force (F₀), velocity at zero force (V₀), and relative mechanical power (Pmax). The onset of the sprints was standardized with an acceleration threshold of 0.1 m/s². The sprint end detection methods were: 1. the time when peak velocity (MSS) was achieved (zero acceleration); 2. the time after peak velocity dropped by 0.4 m/s; 3. the time after peak velocity dropped by 0.6 m/s; and 4. the time when the distance integrated from the GPS/GNSS signal reached 40 m.
The goodness of fit of each sprint end detection method was determined using the residual sum of squares (RSS) to quantify the error of FVP modeling with the sprint data from the GPS/GNSS system. Inter-trial reliability (from two trials) was assessed using intraclass correlation coefficients (ICCs). For goodness of fit, the end detection technique that used the time when peak velocity was achieved (zero acceleration) had the lowest RSS values, followed by the 0.4 and 0.6 m/s velocity-decay methods, while the 40 m end had the highest RSS values. For inter-trial reliability, the end-of-sprint detection techniques defined as the time at (method 1) or shortly after (methods 2 and 3) when MSS was achieved had very large to near-perfect ICCs, and the time at the 40 m integrated distance (method 4) had large to very large ICCs. Peak velocity was reached at 29.52 ± 4.02 m. Therefore, sport scientists should set the end of sprint detection either when peak velocity is reached or shortly after, to improve goodness of fit and achieve reliable between-trial FVP profile metrics, although more robust processing and modeling procedures should be developed in future research to improve sprint model fitting. This protocol was seamlessly integrated into usual training, which shows promise for sprint monitoring in the field with this technology.
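The four end-detection rules can be sketched on a synthetic 10 Hz velocity trace generated from the mono-exponential sprint model v(t) = MSS·(1 − e^(−t/τ)) that underlies FVP profiling. The MSS, τ, and deceleration values below are illustrative assumptions, not the study's data:

```python
import numpy as np

def sprint_end_index(v, t, method="peak", decay=0.4, target_dist=40.0):
    """Locate the end-of-sprint sample in a velocity trace.
    The four methods mirror those compared in the study (illustrative code)."""
    i_peak = int(np.argmax(v))
    if method == "peak":                     # 1. at peak velocity
        return i_peak
    if method == "decay":                    # 2./3. after velocity drops by `decay`
        after = np.nonzero(v[i_peak:] <= v[i_peak] - decay)[0]
        return i_peak + int(after[0]) if after.size else len(v) - 1
    if method == "distance":                 # 4. integrated distance reaches 40 m
        dist = np.concatenate([[0.0], np.cumsum(np.diff(t) * v[1:])])
        over = np.nonzero(dist >= target_dist)[0]
        return int(over[0]) if over.size else len(v) - 1
    raise ValueError(method)

# Synthetic 10 Hz trace: mono-exponential rise, then a linear
# deceleration phase after 6 s.
MSS, tau = 8.5, 1.2
t = np.arange(0, 8.0, 0.1)
v = MSS * (1 - np.exp(-t / tau))
v[t > 6.0] -= 1.5 * (t[t > 6.0] - 6.0)

i1 = sprint_end_index(v, t, "peak")
i2 = sprint_end_index(v, t, "decay", decay=0.4)
```

Each cut point then defines the data window passed to the FVP model fit, which is why the choice shifts the resulting F₀, V₀, and Pmax estimates.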

Keywords: automated, biomechanics, team-sports, sprint

Procedia PDF Downloads 111
9997 Towards Safety-Oriented System Design: Preventing Operator Errors by Scenario-Based Models

Authors: Avi Harel

Abstract:

Most accidents are commonly attributed in hindsight to human errors, yet most safety methodologies focus on technical issues. According to the Black Swan theory, this paradox is due to insufficient data about the ways systems fail. The article presents a study of the sources of errors and proposes a methodology for safety-oriented design, comprising methods for coping with each of the sources identified. Accident analysis indicates that errors typically result from the difficulty of operating in exceptional conditions. Therefore, following STAMP, the focus should be on preventing exceptions. Exception analysis indicates that exceptions typically involve an improper account of the operational scenario, due to deficiencies in system integration. The methodology proposes a model, which is a formal definition of the system operation, as well as principles and guidelines for safety-oriented system integration. The article calls for the development and integration of tools for recording and analyzing system activity during operation, which are required to implement and validate the model.

Keywords: accidents, complexity, errors, exceptions, interaction, modeling, resilience, risks

Procedia PDF Downloads 188
9996 Digitalization, Supply Chain Integration and Financial Performance: Case of Tunisian Agro-industrial Sector

Authors: Rym Ghariani, Younes Boujelbene

Abstract:

Global technological advancements, particularly in digital technology, have emerged as pivotal instruments for enterprises in fostering viable partnerships and forging meaningful alliances with other firms. These digital innovations are poised to transform nearly every facet and operation of corporate entities. The primary objective of this study is to explore the correlation between digitalization, supply chain integration, and the financial performance of the agro-industrial sector in Tunisia. To accomplish this, data collection employed a questionnaire as the primary research instrument. The research questions were then addressed, and the hypotheses examined, by subjecting the gathered data to principal component analysis and linear regression modeling using SPSS 26 software. The findings revealed that digitalization within the supply chain, along with external supply chain integration, had a discernible impact on the financial performance of the organization.
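The PCA-then-regression analysis named above can be sketched as follows. The questionnaire items, loadings, and performance relation below are synthetic assumptions standing in for the survey data; only the two-step analysis pipeline mirrors the abstract:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for questionnaire items: two latent constructs
# (digitalization, supply chain integration) driving eight items.
rng = np.random.default_rng(0)
n = 200
latent = rng.normal(size=(n, 2))
loadings = rng.normal(size=(2, 8))
items = latent @ loadings + rng.normal(0, 0.3, (n, 8))

# Financial performance driven by both constructs (coefficients assumed).
perf = 0.6 * latent[:, 0] + 0.4 * latent[:, 1] + rng.normal(0, 0.2, n)

# Step 1: principal component analysis to extract construct scores.
scores = PCA(n_components=2).fit_transform(items)

# Step 2: linear regression of performance on the component scores.
model = LinearRegression().fit(scores, perf)
r2 = model.score(scores, perf)
```

The regression coefficients and R² then quantify how strongly the extracted constructs explain financial performance, as the study reports.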

Keywords: digitalization, supply chain integration, financial performance, Tunisian agro-industrial sector

Procedia PDF Downloads 25