Search results for: software component and interfaces
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7405

5935 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO

Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu

Abstract:

Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. However, the ability to generate statistics from individual data intercepted across large demographic areas leads to reasoning comparable to that of a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that intercepts car events caused by a driver and positions them in time and space. Connecting the device to the vehicle creates a data source whose analysis can yield psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.

Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO

Procedia PDF Downloads 80
5934 Logistic Regression Based Model for Predicting Students’ Academic Performance in Higher Institutions

Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu

Abstract:

In recent years, there has been a desire to forecast students' academic achievement prior to graduation, to help them improve their grades, particularly individuals with poor performance. The goal of this study is to employ supervised learning techniques to construct a predictive model for student academic achievement. Many academics have already constructed models that predict student academic achievement based on factors such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to name a few. Those features and the models employed may not have correctly classified the students in terms of their academic performance. The model in this study is built using a logistic regression classifier with basic features, namely the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester, to predict whether the student will perform well in future related courses. The model outperformed other classifiers such as Naive Bayes, support vector machine (SVM), decision tree, random forest, and AdaBoost, achieving 96.7% accuracy. The model is available as a desktop application, allowing both instructors and students to benefit from user-friendly interfaces for predicting student academic achievement. As a result, it is recommended that both students and professors use this tool to better forecast outcomes.
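A minimal sketch of the kind of logistic-regression classifier the abstract describes. The feature names follow the abstract, but the synthetic data, the labeling rule, and all parameter values are illustrative assumptions, not the authors' dataset or results.

```python
# Sketch: logistic regression on synthetic student-performance features.
# The labeling rule below is an invented linear threshold, for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.uniform(30, 100, n),   # previous semester's course score
    rng.uniform(40, 100, n),   # class attendance (%)
    rng.uniform(0, 10, n),     # class participation score
    rng.uniform(0, 1, n),      # fraction of course materials covered
])
# "Performs well" when a weighted sum of features is high (synthetic rule).
y = (0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.05 * X[:, 2] + X[:, 3] > 2.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

Because the synthetic labels are linearly separable, the sketch scores well; real student data would of course be noisier.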

Keywords: artificial intelligence, ML, logistic regression, performance, prediction

Procedia PDF Downloads 94
5933 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems

Authors: Lei Chen, Jian Jiao, Tingdi Zhao

Abstract:

Failures of automotive electric/electronic systems, which are universally considered safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in such systems is a major challenge as system complexity increases. Model checking is often employed for formal verification, ensuring that the system model conforms to specified safety properties; the system-level effects of failures are established, and their effects on system behavior are observed through the formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws which may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper presents a concept for using model checking integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
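A toy illustration of the model-checking side of such an approach: exhaustively explore a small state machine and verify that no hazardous state is reachable. The throttle/brake model and the hazard definition are invented for illustration; real tools (e.g. NuSMV, SPIN) check far richer temporal properties.

```python
# Sketch: BFS over a tiny transition system, checking a safety invariant.
from collections import deque

# State: (throttle_engaged, brake_engaged). The controller must release one
# actuator before engaging the other (invented design rule).
transitions = {
    (False, False): [(True, False), (False, True)],
    (True, False):  [(False, False)],
    (False, True):  [(False, False)],
}
hazardous = {(True, True)}  # hazard: both actuators engaged at once

def check_invariant(initial, transitions, bad_states):
    """BFS over the reachable state space; return a counterexample path or None."""
    frontier = deque([[initial]])
    visited = {initial}
    while frontier:
        path = frontier.popleft()
        state = path[-1]
        if state in bad_states:
            return path  # a reachable hazardous state: safety violated
        for nxt in transitions.get(state, []):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(path + [nxt])
    return None  # invariant holds on all reachable states

cex = check_invariant((False, False), transitions, hazardous)
print("safe" if cex is None else f"violation: {cex}")
```

Here the hazardous state is unreachable, so the checker reports the model safe; removing the release rule would yield a counterexample path instead.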

Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system

Procedia PDF Downloads 114
5932 Mathematical Modeling and Simulation of Convective Heat Transfer System in Adjustable Flat Collector Orientation for Commercial Solar Dryers

Authors: Adeaga Ibiyemi Iyabo, Adeaga Oyetunde Adeoye

Abstract:

Mechanical drying methods have played a major role in the commercialization of agriculture and allied sectors. Overall, drying enhances the storability and preservation of agricultural produce, which in turn promotes its producibility, marketability, salability, and profitability. Recent research has shown that solar drying is easier, more affordable, more controllable, and cleaner than other drying methods. It is, therefore, worthwhile to continually appraise solar dryers with a view to improving on their existing advantages. In this paper, mathematical equations were formulated for a solar dryer using the mass conservation law, the material balance law, and the least-cost-savings method. Computer codes were written in Visual Basic.NET. The developed software, which considered Ibadan, a strategic south-western location in Nigeria, was used to investigate the relationship between the orientation angle of the flat-plate collector and the solar energy trapped, the derived monthly heat load, the available energy supplied by solar, and the fraction supplied by solar energy when 50,000 kg/month of produce was dried over a year. At collector tilt angles of 10°, 13°, 15°, 18°, and 20°, the derived monthly heat load, available solar energy, and solar fraction were, respectively: 1211224.63 MJ, 102121.34 MJ, 0.111; 3299274.63 MJ, 10121.34 MJ, 0.132; 5999364.706 MJ, 171222.859 MJ, 0.286; 4211224.63 MJ, 132121.34 MJ, 0.121; and 2200224.63 MJ, 112121.34 MJ, 0.104. These results show that unless the optimum collector angle is reached, the factors needed for efficient, low-cost drying are difficult to attain. The software thus demonstrates that operating at an off-optimum collector angle in commercial solar drying is not worthwhile, underscoring its value in deciding the optimum collector orientation angle.
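A back-of-the-envelope version of the quantities such software computes: a monthly heat load for a drying batch and the fraction supplied by a tilted flat-plate collector. All numbers here (latent heat, collector area, insolation, efficiency, assumed 15° optimum tilt) are illustrative assumptions, not the paper's Ibadan data, and the cosine tilt model is a deliberate simplification.

```python
# Sketch: monthly drying heat load and solar fraction (illustrative values only).
import math

def monthly_heat_load(mass_kg, moisture_removed, latent_heat_mj_per_kg=2.26):
    """Energy (MJ) to evaporate the removed moisture; sensible heat is ignored."""
    return mass_kg * moisture_removed * latent_heat_mj_per_kg

def collector_gain(area_m2, insolation_mj_m2, tilt_deg, optimum_deg=15.0,
                   efficiency=0.45):
    """Crude tilt model: useful gain falls off with the cosine of the off-optimum angle."""
    off = math.radians(tilt_deg - optimum_deg)
    return area_m2 * insolation_mj_m2 * efficiency * math.cos(off)

load = monthly_heat_load(50_000, moisture_removed=0.30)  # 50,000 kg/month batch
gain = collector_gain(area_m2=50, insolation_mj_m2=550, tilt_deg=15)
print(f"heat load = {load:.0f} MJ, solar fraction = {gain / load:.3f}")
```

Sweeping `tilt_deg` over 10°-20° with this toy model reproduces the qualitative point of the abstract: the solar fraction peaks at the optimum angle and falls off on either side.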

Keywords: energy, Ibadan, heat load, Visual Basic.NET

Procedia PDF Downloads 407
5931 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches

Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.

Abstract:

A closed-loop system comprises a controller, a response system, and an actuating system. The controller, which is the system under test here, excites the actuators periodically based on feedback from the sensors. The sensors should provide feedback to the System Under Test (SUT) within a deterministic time after excitation of the actuators. Any delay or miss in the generation of a response or in the acquisition of excitation pulses may lead to control-loop computation errors, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available on the market may be the best solution for such simulations, but they pose limitations, such as the lack of the X Window System, graphical interfaces, and other user tools. In this paper, we present strategies that can be used on a general-purpose operating system (bare Linux kernel) to achieve deterministic deadlines and hence gain the added advantages of a GPOS with real-time features. Techniques are discussed for running the time-critical application at the highest priority without interruption, reducing network latency in a distributed architecture, real-time data acquisition, data storage and retrieval, user interaction, etc.
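One of the techniques such a test bed relies on can be sketched as a periodic loop on an absolute-time schedule, measuring how late each cycle runs versus its deadline. This is only an illustration of the scheduling pattern: on a bare Linux kernel the loop would additionally be pinned to a core and given SCHED_FIFO priority (e.g. via `os.sched_setscheduler`, which requires root and is omitted here), and the 10 ms period is an arbitrary example.

```python
# Sketch: periodic task on an absolute-deadline schedule, measuring lateness.
import time

def run_periodic(period_s, cycles):
    """Run `cycles` iterations against absolute deadlines; return lateness per cycle (s)."""
    jitter = []
    next_deadline = time.monotonic() + period_s
    for _ in range(cycles):
        # ... excite actuators / acquire sensor feedback here ...
        time.sleep(max(0.0, next_deadline - time.monotonic()))
        jitter.append(time.monotonic() - next_deadline)  # lateness vs. deadline
        next_deadline += period_s  # absolute schedule: errors do not accumulate
    return jitter

j = run_periodic(period_s=0.01, cycles=50)
print(f"worst-case lateness: {max(j) * 1e3:.3f} ms")
```

Advancing `next_deadline` by a fixed increment, rather than sleeping a fixed duration each cycle, is what prevents drift from accumulating across cycles.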

Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency

Procedia PDF Downloads 141
5930 Association of Non Synonymous SNP in DC-SIGN Receptor Gene with Tuberculosis (Tb)

Authors: Saima Suleman, Kalsoom Sughra, Naeem Mahmood Ashraf

Abstract:

Tuberculosis, caused by Mycobacterium tuberculosis, is a communicable chronic illness. The disease is highly studied, as it is present, in either active or latent form, in approximately one-third of the world's population. The genetic makeup of a person plays an important part in producing immunity against disease, and one important associated factor is single nucleotide polymorphism (SNP) of the relevant gene. In this study, we examined the association between single nucleotide polymorphisms of the CD209 gene (which encodes the DC-SIGN receptor) and tuberculosis patients. Dry-lab (in silico) and wet-lab (RFLP) analyses were carried out. The GWAS Catalog and the GEO database were searched for previous association data; no association study related to CD209 nsSNPs was found, but the role of CD209 in pulmonary tuberculosis has been addressed in the GEO database. Therefore, CD209 was selected for this study. Databases such as Ensembl and the 1000 Genomes Project were used to retrieve SNP data as VCF files, which were then submitted to different software tools to sort SNPs into benign and deleterious. Selected SNPs were further annotated using 3-D modeling techniques with the I-TASSER online server. Furthermore, the selected nsSNPs were checked in the Gujrat and Faisalabad populations through RFLP analysis. In the study population, two SNPs were found to be associated with tuberculosis, while one nsSNP was not.
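An illustrative version of the case/control association test typically applied to such RFLP genotype data: a chi-square test on a 2x2 allele-carrier by disease-status contingency table. The counts below are invented for illustration, not the Gujrat/Faisalabad data.

```python
# Sketch: chi-square association test for a SNP in a case/control design.
from scipy.stats import chi2_contingency

#                  carriers  non-carriers   (invented counts)
table = [[45, 15],   # TB patients
         [25, 35]]   # healthy controls

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```

A small p-value (conventionally p < 0.05) would suggest the nsSNP is associated with tuberculosis in the sampled population; with these invented counts the test is comfortably significant.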

Keywords: association, CD209, DC-SIGN, tuberculosis

Procedia PDF Downloads 306
5929 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application

Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada

Abstract:

This study links management and computer science to develop the software named “IntelSymb” as a demo application, demonstrating through data analysis that diversification of non-energy fields positively influences the mitigation of countries’ energy dependency. We analyzed 18 years of economic development data (5 sectors) for 13 countries, identifying which patterns mostly prevailed and which may become dominant in the near future. To make the analysis solid and plausible, as future work, we suggest developing a gateway or interface connected to all available online databases (WB, UN, OECD, U.S. EIA) for the analysis of countries by field. The sample data consist of energy statistics (TPES and energy import indicators) and non-energy industry statistics (Main Science and Technology Indicators, Internet user index, and sales and production indicators) from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can help decelerate energy-sector dependency (energy consumption and import dependence on crude oil). These results provide empirical and practical support for diversification policies in energy and non-energy industries, such as promoting the efficiency and management of Information and Communication Technologies (ICTs), services, and innovative technologies, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG emissions, but the share is much higher, around 14 percent, if indirect energy use is included. The ICT sector itself (excluding broadcasting) contributes approximately 2 percent of global GHG emissions, just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq). This can therefore serve as an example and a lesson for countries both dependent on and independent of energy, mainly emerging oil-based economies, and can motivate non-energy industry diversification so that they are prepared for an energy crisis and able to face any economic crisis.
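One simple way to quantify the sectoral diversification such an analysis tracks is a normalized Herfindahl-Hirschman index over sector shares (1.0 = perfectly even across sectors, smaller = more concentrated). The five-sector shares below are illustrative, not the study's data, and this index is only one of several reasonable choices.

```python
# Sketch: normalized inverse-Herfindahl diversification index over sector shares.
def diversification_index(shares):
    """(1/HHI)/n, in (0, 1]; equals 1.0 when all sector shares are equal."""
    hhi = sum(s * s for s in shares)      # Herfindahl-Hirschman index
    return (1.0 / hhi) / len(shares)

concentrated = diversification_index([0.80, 0.05, 0.05, 0.05, 0.05])  # energy-heavy
balanced = diversification_index([0.20] * 5)                          # even economy
print(f"concentrated: {concentrated:.2f}, balanced: {balanced:.2f}")
```

Tracking such an index per country per year against energy-import indicators is one concrete way to test the deceleration effect the abstract reports.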

Keywords: energy policy, energy diversification, “IntelSymb” software, renewable energy

Procedia PDF Downloads 220
5928 Urban Flood Risk Mapping: A Review

Authors: Sherly M. A., Subhankar Karmakar, Terence Chan, Christian Rau

Abstract:

Floods are one of the most frequent natural disasters, causing widespread devastation, economic damage and threat to human lives. Hydrologic impacts of climate change and intensification of urbanization are two root causes of increased flood occurrences, and recent research trends are oriented towards understanding these aspects. Due to rapid urbanization, population of cities across the world has increased exponentially leading to improperly planned developments. Climate change due to natural and anthropogenic activities on our environment has resulted in spatiotemporal changes in rainfall patterns. The combined effect of both aggravates the vulnerability of urban populations to floods. In this context, an efficient and effective flood risk management with its core component as flood risk mapping is essential in prevention and mitigation of flood disasters. Urban flood risk mapping involves zoning of an urban region based on its flood risk, which depicts the spatiotemporal pattern of frequency and severity of hazards, exposure to hazards, and degree of vulnerability of the population in terms of socio-economic, environmental and infrastructural aspects. Although vulnerability is a key component of risk, its assessment and mapping is often less advanced than hazard mapping and quantification. A synergic effort from technical experts and social scientists is vital for the effectiveness of flood risk management programs. Despite an increasing volume of quality research conducted on urban flood risk, a comprehensive multidisciplinary approach towards flood risk mapping still remains neglected due to which many of the input parameters and definitions of flood risk concepts are imprecise. Thus, the objectives of this review are to introduce and precisely define the relevant input parameters, concepts and terms in urban flood risk mapping, along with its methodology, current status and limitations. 
The review also aims at providing thought-provoking insights to potential future researchers and flood management professionals.
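The three-component definition of risk the review builds on can be made concrete as a per-zone composite, risk = hazard x exposure x vulnerability, with each component normalized to [0, 1]. The zone scores below are invented for illustration; real mapping studies derive them from hydraulic models, census data, and socio-economic surveys.

```python
# Sketch: composite flood-risk score and ranking for three illustrative zones.
import numpy as np

hazard        = np.array([0.9, 0.4, 0.2])  # e.g. flood depth/frequency score
exposure      = np.array([0.8, 0.6, 0.3])  # population and assets in the zone
vulnerability = np.array([0.7, 0.5, 0.9])  # socio-economic susceptibility

risk = hazard * exposure * vulnerability   # multiplicative composite per zone
ranking = np.argsort(risk)[::-1]           # zones from highest to lowest risk
print("risk scores:", np.round(risk, 3), "ranking:", ranking)
```

Note how zone 2, despite the highest vulnerability, ranks last here because its hazard and exposure are low, exactly the kind of interplay the review argues a mapping methodology must capture.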

Keywords: flood risk, flood hazard, flood vulnerability, flood modeling, urban flooding, urban flood risk mapping

Procedia PDF Downloads 585
5927 Software Tool Design for Heavy Oil Upgrading by Hydrogen Donor Addition in a Hydrodynamic Cavitation Process

Authors: Munoz A. Tatiana, Solano R. Brandon, Montes C. Juan, Cierco G. Javier

Abstract:

Hydrodynamic cavitation is a process that exploits the energy fluids release during phase changes. From this energy, local temperatures greater than 5000 °C are obtained, at which thermal cracking of the fluid molecules takes place. Applied to heavy oil, the process affects variables such as viscosity, density, and composition, which constitutes an important improvement in the quality of the crude. In this study, a software tool was designed that integrates mathematical models of mixing, cavitation, kinetics, and the reactor to model changes in the density, viscosity, and composition of a heavy crude oil as the fluid passes through a hydrodynamic cavitation reactor. To evaluate the viability of this technique in industry, a heavy oil of 18° API gravity was simulated using naphtha as a hydrogen donor at concentrations of 1, 2, and 5 vol%; the simulation results showed API gravity increases of 0.77, 1.21, and 1.93°, respectively, and viscosity reductions of 9.9, 12.9, and 15.8%. These results give a favorable outlook for this technological development, a clear view of the innovative knowledge generated by this technique, and the technical-economic opportunity it offers the hydrocarbon sector dealing with heavy crude oil, which represents a large share of world oil production.
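The API-gravity bookkeeping behind the reported upgrade can be made explicit: API gravity is defined from specific gravity (SG at 60 °F) as API = 141.5/SG − 131.5, so an uplift in degrees API maps directly to a density change. The 18° API feed and the 1.93° uplift (the 5 vol% naphtha case) come from the abstract; the conversion itself is the standard definition.

```python
# Sketch: standard API-gravity <-> specific-gravity conversion for the reported upgrade.
def api_from_sg(sg):
    """API gravity from specific gravity at 60 deg F."""
    return 141.5 / sg - 131.5

def sg_from_api(api):
    """Specific gravity at 60 deg F from API gravity."""
    return 141.5 / (api + 131.5)

feed_api = 18.0
upgraded_api = feed_api + 1.93  # 5 vol% naphtha case from the abstract
print(f"feed SG = {sg_from_api(feed_api):.4f}, "
      f"upgraded SG = {sg_from_api(upgraded_api):.4f}")
```

The upgraded crude is measurably lighter (lower SG), which is the quality improvement the abstract quantifies alongside the viscosity reduction.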

Keywords: hydrodynamic cavitation, thermal cracking, hydrogen donor, heavy oil upgrading, simulator

Procedia PDF Downloads 147
5926 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have huge environmental and economic impacts, and flood prediction therefore receives a great deal of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, a four-stage method comprising model identification, parameter estimation, diagnostic checking, and forecasting (prediction). The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS, and ULS methods. The diagnostic checking tests (the AIC criterion and the RACF and RPACF graphs) were used to verify the selected model. In this study, the best ARIMA model for the annual maximum discharge (AMD) time series was ARIMA(4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. Forecasting AMD for 10 future years demonstrated the ability of the model to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).

Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river

Procedia PDF Downloads 284
5925 Towards Competence-Based Regulatory Sciences Education in Sub-Saharan Africa: Identification of Competencies

Authors: Abigail Ekeigwe, Bethany McGowan, Loran C. Parker, Stephen Byrn, Kari L. Clase

Abstract:

There are growing calls in the literature to develop and implement competency-based regulatory sciences education (CBRSE) in sub-Saharan Africa to expand and create a pipeline of a competent workforce of regulatory scientists. A defined competence framework is an essential component in developing competency-based education. However, such a competence framework is not available for regulatory scientists in sub-Saharan Africa. The purpose of this research is to identify entry-level competencies for inclusion in a competency framework for regulatory scientists in sub-Saharan Africa as a first step in developing CBRSE. The team systematically reviewed the literature following the PRISMA guidelines for systematic reviews and based on a pre-registered protocol on Open Science Framework (OSF). The protocol has the search strategy and the inclusion and exclusion criteria for publications. All included publications were coded to identify entry-level competencies for regulatory scientists. The team deductively coded the publications included in the study using the 'framework synthesis' model for systematic literature review. The World Health Organization’s conceptualization of competence guided the review and thematic synthesis. Topic and thematic codings were done using NVivo 12™ software. Based on the search strategy in the protocol, 2345 publications were retrieved. Twenty-two (n=22) of the retrieved publications met all the inclusion criteria for the research. Topic and thematic coding of the publications yielded three main domains of competence: knowledge, skills, and enabling behaviors. The knowledge domain has three sub-domains: administrative, regulatory governance/framework, and scientific knowledge. The skills domain has two sub-domains: functional and technical skills. Identification of competencies is the primal step that serves as a bedrock for curriculum development and competency-based education. 
The competencies identified in this research will help policymakers, educators, institutions, and international development partners design and implement competence-based regulatory science education in sub-Saharan Africa, ultimately leading to access to safe, quality, and effective medical products.

Keywords: competence-based regulatory science education, competencies, systematic review, sub-Saharan Africa

Procedia PDF Downloads 192
5924 Analysis of Biomarkers Intractable Epileptogenic Brain Networks with Independent Component Analysis and Deep Learning Algorithms: A Comprehensive Framework for Scalable Seizure Prediction with Unimodal Neuroimaging Data in Pediatric Patients

Authors: Bliss Singhal

Abstract:

Epilepsy is a prevalent neurological disorder affecting approximately 50 million individuals worldwide, including 1.2 million Americans. Millions of pediatric patients have intractable epilepsy, a condition in which seizures fail to come under control. The occurrence of seizures can result in physical injury, disorientation, unconsciousness, and additional symptoms that could impede children's ability to participate in everyday tasks. Predicting seizures can help parents and healthcare providers take precautions, prevent risky situations, and mentally prepare children, minimizing the anxiety and nervousness associated with the uncertainty of a seizure. This research proposes a comprehensive framework to predict seizures in pediatric patients by evaluating machine learning algorithms on unimodal neuroimaging data consisting of electroencephalogram (EEG) signals. Bandpass filtering and independent component analysis proved effective in reducing noise and artifacts in the dataset. The performance of various machine learning algorithms is evaluated on key metrics such as accuracy, precision, specificity, sensitivity, F1 score, and MCC. The results show that deep learning algorithms are more successful in predicting seizures than logistic regression and k-nearest neighbors. The recurrent neural network (RNN) gave the highest precision and F1 score, long short-term memory (LSTM) outperformed the RNN in accuracy, and the convolutional neural network (CNN) achieved the highest specificity. This research has significant implications for healthcare providers in proactively managing seizure occurrence in pediatric patients, potentially transforming clinical practices and improving pediatric care.
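The preprocessing stage the framework relies on, bandpass filtering followed by independent component analysis, can be sketched on a synthetic two-channel mixture. The 0.5-40 Hz band, the filter order, and the toy signals are illustrative assumptions, not the paper's pipeline settings.

```python
# Sketch: bandpass-filter EEG-like channels, then unmix them with FastICA.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

fs = 256.0                                   # sampling rate (Hz), typical for EEG
t = np.arange(0, 4, 1 / fs)                  # 4 s of data -> 1024 samples
s1 = np.sin(2 * np.pi * 10 * t)              # 10 Hz "brain" rhythm
s2 = np.sign(np.sin(2 * np.pi * 1.5 * t))    # square-wave artifact source
mixed = np.column_stack([s1 + 0.6 * s2, 0.4 * s1 + s2])  # two mixed channels

# Bandpass 0.5-40 Hz, a common EEG band of interest.
b, a = butter(4, [0.5 / (fs / 2), 40 / (fs / 2)], btype="band")
filtered = filtfilt(b, a, mixed, axis=0)     # zero-phase filtering

sources = FastICA(n_components=2, random_state=0).fit_transform(filtered)
print("recovered source matrix shape:", sources.shape)
```

After ICA, artifact-dominated components can be identified and dropped before features are fed to the downstream classifiers.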

Keywords: intractable epilepsy, seizure, deep learning, prediction, electroencephalogram channels

Procedia PDF Downloads 78
5923 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs

Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar

Abstract:

The oil and gas industry has been unwilling to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One of the common applications of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process; it finds application whenever one needs to make an estimate, forecast, or decision under significant uncertainty. First, the project performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm was developed in MATLAB; the program performs the simulation by prompting the user for input distributions and the parameters associated with each distribution (i.e., mean, standard deviation, min, max, most likely, etc.), as well as the desired probability for which reserves are to be calculated. The algorithm, developed and tested in MATLAB, was then implemented in Python, where existing statistics and plotting libraries were imported to generate better output. Code for a simple graphical user interface was also written with PyQt Designer. The resulting plot is then validated against the results already available from the U.S. DOE MonteCarlo software.
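A minimal version of the volumetric Monte Carlo the abstract describes: sample the inputs of the standard original-oil-in-place (OOIP) formula and read P90/P50/P10 reserves off the resulting distribution. The distribution shapes and ranges below are illustrative assumptions, not the paper's dataset.

```python
# Sketch: volumetric Monte Carlo reserves estimate (OOIP in stock-tank barrels).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
area  = rng.triangular(400, 600, 900, n)  # drainage area, acres (min, likely, max)
thick = rng.triangular(20, 35, 60, n)     # net pay thickness, ft
poro  = rng.normal(0.22, 0.03, n)         # porosity, fraction
sw    = rng.normal(0.30, 0.05, n)         # water saturation, fraction
bo    = 1.2                               # formation volume factor, rb/stb
rf    = rng.uniform(0.15, 0.35, n)        # recovery factor, fraction

# OOIP (stb) = 7758 * A * h * phi * (1 - Sw) / Bo ; reserves = OOIP * RF
ooip = 7758 * area * thick * poro * (1 - sw) / bo
reserves = ooip * rf
p90, p50, p10 = np.percentile(reserves, [10, 50, 90])
print(f"P90 = {p90:.3e}, P50 = {p50:.3e}, P10 = {p10:.3e} stb")
```

The P90 (a conservative estimate exceeded 90% of the time) is what a prudent reserves booking would quote, while the P90-P10 spread expresses the uncertainty the deterministic approach hides.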

Keywords: simulation, probability, confidence interval, sensitivity analysis

Procedia PDF Downloads 377
5922 An Interactive Platform Displaying Mixed Reality Media

Authors: Alfred Chen, Cheng Chieh Hsu, Yu-Pin Ma, Meng-Jie Lin, Fu Pai Chiu, Yi-Yan Sie

Abstract:

This study attempts to construct a human-computer interactive platform system consisting mainly of augmented hardware, software, a display table, and mixed media. The system provides human-computer interaction services through an interactive platform for the tourism industry. A well-designed interactive platform, integrating augmented reality and mixed media, has the potential to enhance museum display quality and diversity. Besides, it will create a comprehensive and creative display mode for most museums and historical heritage sites. It is therefore essential to let the public understand what the platform is, how it functions, and, most importantly, how one builds an interactive augmented platform. Hence the authors elaborate the construction process of the platform in detail. Three issues are considered: 1) the theory and application of augmented reality, 2) the hardware and software applied, and 3) the mixed media presented. To describe how the platform works, the Courtesy Door of the Tainan Confucius Temple was selected as the case study. As a result, the developed interactive platform is presented showing the physical object along with virtual mixed media such as text, images, animation, and video. The platform provides diversified and effective information to its users.

Keywords: human-computer interaction, mixed reality, mixed media, tourism

Procedia PDF Downloads 485
5921 Experimental Investigation on Shear Behaviour of Fibre Reinforced Concrete Beams Using Steel Fibres

Authors: G. Beulah Gnana Ananthi, A. Jaffer Sathick, M. Abirami

Abstract:

Fibre reinforced concrete (FRC) has been widely used in industrial pavements and non-structural elements such as pipes, culverts, tunnels, and precast elements. The strengthening effect of fibres in the concrete matrix is achieved primarily through the bridging effect of fibres at crack interfaces. The workability of the concrete was reduced by the addition of high percentages of steel fibres, and the optimum addition percentage varies with the fibre aspect ratio. For this study, 1% steel fibre content proved to be the optimum for both hooked and crimped steel fibres and was added to the beam specimens. The fibres efficiently restrain cracks and carry residual stresses beyond cracking; in this sense, diagonal cracks are effectively stitched up by the fibres crossing them. With a sufficient quantity of steel fibres, the failure mode of beams within the shear failure range changed from shear to flexure. Shear strength increased with the addition of steel fibres and exceeded the enhancement obtained with transverse reinforcement; however, the increase is not directly proportional to the quantity of fibres used. Considering all the observations made in the present experimental investigation, it is concluded that 1% of crimped steel fibres with an aspect ratio of 50 is the best option for replacing transverse stirrups in high-strength concrete beams, compared to steel fibres with hooked ends.

Keywords: fibre reinforced concrete, steel fibre, shear strength, crack pattern

Procedia PDF Downloads 139
5920 VaR Estimation Using the Informational Content of Futures Traded Volume

Authors: Amel Oueslati, Olfa Benouda

Abstract:

A new Value at Risk (VaR) estimation approach is proposed and investigated. The well-known two-stage GARCH-EVT approach uses conditional volatility to generate one-step-ahead VaR forecasts. Using daily data for twelve stocks from the Dow Jones Industrial Average (DJIA) index, this paper incorporates trading volume into the first-stage volatility estimation. The forecasting ability of this conditional volatility for VaR estimation is then compared to that of a basic volatility model that does not consider any trading component. The results are significant and bring out the importance of trading volume in the VaR measure.
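A stripped-down stand-in for the first (volatility) stage of such a pipeline: an EWMA conditional-volatility forecast turned into a one-day 99% VaR under a normal assumption. The full approach in the abstract uses GARCH with a volume term and an EVT tail; the returns below are synthetic, and the 0.94 decay is the RiskMetrics convention, all assumptions for illustration.

```python
# Sketch: EWMA volatility forecast -> parametric one-day 99% VaR (illustrative only).
import numpy as np

rng = np.random.default_rng(7)
returns = rng.normal(0, 0.01, 500)            # synthetic daily returns

lam = 0.94                                     # RiskMetrics-style decay factor
var_t = returns[0] ** 2
for r in returns[1:]:
    var_t = lam * var_t + (1 - lam) * r ** 2   # EWMA conditional variance update

z99 = 2.326                                    # 99% standard-normal quantile
one_day_var = z99 * np.sqrt(var_t)             # loss threshold, in return units
print(f"sigma = {np.sqrt(var_t):.4%}, 99% one-day VaR = {one_day_var:.4%}")
```

The paper's contribution sits exactly where this sketch is crudest: replacing the plain variance recursion with a volume-augmented GARCH and the normal quantile with an EVT tail estimate.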

Keywords: Garch-EVT, value at risk, volume, volatility

Procedia PDF Downloads 280
5919 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment

Authors: Seun Mayowa Sunday

Abstract:

Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only within strictly regulated rules, it cannot aid customers beyond those limitations. Loan prediction therefore calls for machine learning (ML) techniques. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required ML libraries, the models are created and evaluated with appropriate metrics. From the findings, random forest performs with the highest accuracy, 80.17%, and was subsequently implemented in the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, one of the four largest cloud computing providers. Hence, to the best of our knowledge, this research is the first academic paper combining model development with the Django framework and deployment to Alibaba Cloud.
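The four-model comparison described above can be sketched on synthetic loan-style data; the features, labels, and resulting scores here are generated for illustration and will not reproduce the paper's 80.17% figure or its ranking.

```python
# Sketch: comparing the four classifiers from the abstract on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "random forest": RandomForestClassifier(random_state=0),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
    "logistic regression": LogisticRegression(max_iter=1000),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
best = max(scores, key=scores.get)
print(f"best model: {best} ({scores[best]:.1%})")
```

The winning model would then be pickled and served behind a Django view, which is the integration step the paper goes on to describe.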

Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud

Procedia PDF Downloads 126
5918 A Coupled Extended-Finite-Discrete Element Method: On the Different Contact Schemes between Continua and Discontinua

Authors: Shervin Khazaeli, Shahab Haj-zamani

Abstract:

Recently, advanced geotechnical engineering problems related to soil movement, particle loss, and the modeling of local failure (i.e., discontinua), as well as the modeling of in-contact structures (i.e., continua), have been of great interest among researchers. The aim of this research is to model these two different domains simultaneously. To this end, a coupled numerical method is introduced based on the Discrete Element Method (DEM) and the eXtended Finite Element Method (X-FEM). In the coupled procedure, DEM is employed to capture the interactions and relative movements of soil particles as discontinua, while X-FEM is utilized to model in-contact structures as continua, which may contain different types of discontinuities. For verification purposes, the new coupled approach is used to examine benchmark problems involving different contacts between and within continua and discontinua. Results are validated by comparison with existing analytical and numerical solutions. This study shows that the extended-finite-discrete element method can robustly analyze not only contact problems but also other types of discontinuities in continua, such as (i) crack formation and propagation, (ii) voids and bimaterial interfaces, and (iii) combinations of the previous cases. In essence, the proposed method can be widely used in advanced soil-structure interaction problems to investigate the micro and macro behaviour of the surrounding soil and the response of the embedded structure containing discontinuities.
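The DEM half of such a coupling reduces, at its simplest, to a normal contact law between particle pairs. Below is a linear spring-dashpot sketch of that law; the stiffness and damping values, particle radii, and velocities are illustrative, and a production DEM would add tangential friction and a contact-detection broadphase.

```python
# Sketch: linear spring-dashpot normal contact force between two DEM particles.
import numpy as np

def normal_contact_force(x1, x2, r1, r2, v_rel, k=1e5, c=50.0):
    """Force on particle 1; v_rel is velocity of particle 2 relative to 1."""
    d = x2 - x1
    dist = np.linalg.norm(d)
    overlap = (r1 + r2) - dist
    if overlap <= 0:
        return np.zeros_like(d)       # particles separated: no contact force
    n = d / dist                       # unit normal from particle 1 toward 2
    vn = np.dot(v_rel, n)              # relative normal velocity (>0: separating)
    f_mag = k * overlap - c * vn       # elastic repulsion plus viscous damping
    return -f_mag * n                  # pushes particle 1 away from particle 2

f = normal_contact_force(np.array([0.0, 0.0]), np.array([0.018, 0.0]),
                         r1=0.01, r2=0.01, v_rel=np.array([0.1, 0.0]))
print("force on particle 1:", f)
```

In the coupled scheme, forces like this one, accumulated over particle-particle and particle-structure contacts, become the traction boundary data exchanged with the X-FEM continuum at each time step.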

Keywords: contact problems, discrete element method, extended-finite element method, soil-structure interaction

Procedia PDF Downloads 501
5917 New Roles of Telomerase and Telomere-Associated Proteins in the Regulation of Telomere Length

Authors: Qin Yang, Fan Zhang, Juan Du, Chongkui Sun, Krishna Kota, Yun-Ling Zheng

Abstract:

Telomeres are specialized structures at chromosome ends consisting of tandem repetitive DNA sequences [(TTAGGG)n in humans] and associated proteins that are necessary for telomere function. Telomere lengths are tightly regulated within a narrow range in normal human somatic cells, which underlies cellular senescence and aging. Previous studies have focused extensively on how short telomeres are extended and have demonstrated that telomerase plays a central role in telomere maintenance by elongating short telomeres. However, the molecular mechanisms regulating excessively long telomeres are unknown. Here, we found that the telomerase enzymatic component hTERT plays a dual role in the regulation of telomere length. Our analysis of single telomere alterations at each chromosomal end led to the discovery that hTERT shortens excessively long telomeres and elongates short telomeres simultaneously, thus maintaining the optimal telomere length at each chromosomal end for efficient protection. The hTERT-mediated telomere shortening removes large segments of telomeric DNA rapidly without inducing telomere dysfunction foci or affecting cell proliferation; it is therefore mechanistically distinct from rapid telomere deletion. We found that expression of hTERT generates telomeric circular DNA, suggesting that telomere homologous recombination may be involved in this shortening process. Moreover, hTERT-mediated telomere shortening requires hTERT's enzymatic activity, but the telomerase RNA component hTR is not involved. Furthermore, the shelterin protein TPP1 interacts with hTERT and recruits it to telomeres to mediate telomere shortening. In addition, the telomere-associated proteins DKC1 and TCAB1 also play roles in this process. This novel hTERT-mediated telomere shortening mechanism exists not only in cancer cells but also in primary human cells. Thus, hTERT-mediated telomere shortening is expected to shift the paradigm of current molecular models of telomere length maintenance, with wide-reaching consequences in the cancer and aging fields.

Keywords: aging, hTERT, telomerase, telomeres, human cells

Procedia PDF Downloads 422
5916 Advances in Fiber Optic Technology for High-Speed Data Transmission

Authors: Salim Yusif

Abstract:

Fiber optic technology has revolutionized telecommunications and data transmission, providing unmatched speed, bandwidth, and reliability. This paper presents the latest advancements in fiber optic technology, focusing on innovations in fiber materials, transmission techniques, and network architectures that enhance the performance of high-speed data transmission systems. Key advancements include the development of ultra-low-loss optical fibers, multi-core fibers, advanced modulation formats, and the integration of fiber optics into next-generation network architectures such as Software-Defined Networking (SDN) and Network Function Virtualization (NFV). Additionally, recent developments in fiber optic sensors are discussed, extending the utility of optical fibers beyond data transmission. Through comprehensive analysis and experimental validation, this research offers valuable insights into the future directions of fiber optic technology, highlighting its potential to drive innovation across various industries.

Keywords: fiber optics, high-speed data transmission, ultra-low-loss optical fibers, multi-core fibers, modulation formats, coherent detection, software-defined networking, network function virtualization, fiber optic sensors

Procedia PDF Downloads 55
5915 Future Education: Changing Paradigms

Authors: Girish Choudhary

Abstract:

Education is in a state of flux. Not only does one need to acquire skills to cope with a fast-changing global world; an explosive growth in technology is, at the same time, providing a new wave of teaching tools: computer-aided video instruction, hypermedia, multimedia, CD-ROMs, Internet connections, and collaborative software environments. The emerging technology incorporates the group qualities of interactive, classroom-based learning while giving individual students the flexibility to participate in an educational programme at their own time and place. Technology that facilitates self-learning also seems to provide a cost-effective solution to the dilemma of delivering education to the masses. Online education is a unique learning domain that provides for many-to-many communication as well; the computer conferencing software defines the boundaries of the virtual classroom. The changing paradigm gives a large proportion of society access to instruction, promises a qualitative improvement in learning, and echoes a new way of thinking in educational theory that promotes active learning and opens new learning approaches. Putting it into practice is challenging and may fundamentally alter the nature of educational institutions. The subsequent part of the paper addresses questions such as 'Do we need to radically re-engineer the curriculum and foster an alternative set of skills in students?' in the onward journey.

Keywords: on-line education, self learning, energy and power engineering, future education

Procedia PDF Downloads 325
5914 The Impact of Information and Communication Technology on the Re-Engineering Process of Small and Medium Enterprises

Authors: Hiba Mezaache

Abstract:

The current study aimed to assess the impact of using information and communication technology (ICT) on the process of re-engineering small and medium enterprises, as the world has witnessed rapid development in this field along with a diversification of its objectives and programs. Re-engineering is important for the growth and development of an institution and for gaining the flexibility to face changes that may occur in the work environment. To assess the impact of ICT on the success of this process, we prepared an electronic questionnaire of 70 items and used the SPSS statistical package to analyze the data obtained. Our conclusion was that there is a positive correlation between the four dimensions of information and communication technology (hardware and equipment, software, communication networks, and databases) and the re-engineering process. In addition, the studied institutions attach great importance to formal communication for the positive advantages it achieves in reducing the time, effort, and cost of performing business. We can also say that communication technology contributes to the process of formulating objectives related to the re-engineering strategy. Finally, we recommend empowering workers to use information and communication technology more in enterprises, integrating them further into the activity of the enterprise by involving them in decision-making, and keeping pace with developments in software, hardware, and technological equipment.
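The reported positive correlations between the four ICT dimensions and the re-engineering process can be checked with Pearson's r; this is a hypothetical sketch on simulated questionnaire scores (not the authors' SPSS data), with dimension names as illustrative labels:

```python
# Simulated Likert-style dimension scores; SPSS would report the same r values.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 70  # simulated respondents (hypothetical; the real questionnaire had 70 items)
reengineering = rng.normal(3.5, 0.6, n)  # re-engineering success score
dimensions = {
    "hardware_equipment": reengineering + rng.normal(0, 0.4, n),
    "software": reengineering + rng.normal(0, 0.5, n),
    "communication_networks": reengineering + rng.normal(0, 0.5, n),
    "databases": reengineering + rng.normal(0, 0.6, n),
}
results = {}
for name, scores in dimensions.items():
    r, p = pearsonr(scores, reengineering)  # correlation with re-engineering score
    results[name] = round(r, 2)
```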

Keywords: information and communication technology, re-engineering, small and medium enterprises, the impact

Procedia PDF Downloads 173
5913 Modal Approach for Decoupling Damage Cost Dependencies in Building Stories

Authors: Haj Najafi Leila, Tehranizadeh Mohsen

Abstract:

Dependencies between the diverse factors involved in probabilistic seismic loss evaluation are recognized to be an imperative issue in acquiring accurate loss estimates. Dependencies among component damage costs can be taken into account by considering two partial, distinct states, independent or perfectly dependent, for the component damage states; however, to the best of our knowledge, there is no available procedure that accounts for loss dependencies at the story level. This paper presents a method called the "modal cost superposition method" for decoupling story damage costs under earthquake ground motions, using closed-form differential equations between damage cost and engineering demand parameters. These equations are solved as a coupled system spanning all stories' cost equations by means of the introduced "substituted matrixes of mass and stiffness". Costs are treated as probabilistic variables with defined statistical parameters (median and standard deviation) and a presumed probability distribution. To supplement the proposed procedure and to demonstrate the straightforwardness of its application, one benchmark study has been conducted. Acceptable compatibility has been shown between the damage costs estimated by the proposed modal approach and by the frequently used stochastic approach for the entire building. At the story level, however, employing a single modification factor to incorporate occurrence-probability dependencies between stories proved insufficient, because the amounts of dependency between the damage costs of different stories are discrepant. More dependency contribution in the occurrence probability of loss can be concluded for higher stories, where loss results are more compatible than in the lower ones, whereas reducing the number of incorporated cost modes still provides an acceptable level of accuracy and avoids the time-consuming calculations involved when a large number of cost modes is included.
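The "substituted matrixes of mass and stiffness" play the role of M and K in a standard modal decomposition. As a minimal, hypothetical illustration of the underlying algebra (a toy 3-story shear building, not the paper's cost formulation), the generalized eigenproblem can be solved with SciPy:

```python
# Solve K @ phi = lam * M @ phi and recover the mode shapes used in
# modal superposition. All numerical values are illustrative only.
import numpy as np
from scipy.linalg import eigh

m = 1.0e3                          # lumped story mass [kg]
k = 2.0e6                          # story stiffness [N/m]
M = np.diag([m, m, m])
K = k * np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])

lam, phi = eigh(K, M)              # generalized eigenproblem, eigenvalues ascending
omega = np.sqrt(lam)               # natural circular frequencies [rad/s]
ortho = phi.T @ M @ phi            # mode shapes come out M-orthonormal (~ identity)
```

Any response quantity, here a cost-like variable, can then be expanded in the mass-orthonormal mode shapes `phi[:, i]`, which is the superposition step the method's name refers to.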

Keywords: dependency, story-cost, cost modes, engineering demand parameter

Procedia PDF Downloads 175
5912 Multidimensional Poverty and Its Correlates among Rural Households in Limpopo Province, South Africa

Authors: Tamunotonye Mayowa Braide, Isaac Oluwatayo

Abstract:

This study investigates multidimensional poverty and its correlates among rural households in the Sekhukhune and Capricorn District Municipalities (SDM and CDM) in Limpopo Province, South Africa. Primary data were collected from 407 rural households selected through purposive and simple random sampling techniques. The analytical techniques employed include descriptive statistics, principal component analysis (PCA), and the Alkire-Foster (A-F) methodology. The descriptive statistics showed that there are more females (66%) than males (34%) in rural areas of Limpopo Province, with about 45% having secondary school education as the highest educational level attained and only about 3% lacking formal education. In the deprivation analysis, eight dimensions of deprivation, constructed from 21 variables, were identified using PCA: type and condition of dwelling; water and sanitation; educational attainment and income; type of fuel for cooking and heating; access to clothing and a cell phone; assets and fuel for lighting; health condition; and crowding and child health. In identifying the poor with a poverty cutoff (0.13) across all indicators, about 75.9% of the rural households are deprived in 25% of the total dimensions, with an adjusted headcount ratio (M0) of 0.19. Multidimensional poverty estimates showed a higher share of poor rural households, 71%, compared with the 29% that fall below the income poverty line. The study decomposed poverty by sub-groups within the area, examining regions and household characteristics: SDM has more multidimensionally poor households than CDM, and the water and sanitation dimension is the largest contributor to the multidimensional poverty index (MPI) in rural areas of Limpopo Province. The findings can therefore assist in the better design of welfare policy, the targeting of poverty alleviation programs, and efficient resource allocation at the provincial and local municipality levels.
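As a hedged sketch of the Alkire-Foster computation behind the reported adjusted headcount ratio, the following uses synthetic deprivation data; equal dimension weights are an assumption here, since the paper's PCA-based weighting is not reproduced:

```python
# Alkire-Foster M0 on a synthetic 0/1 deprivation matrix.
import numpy as np

rng = np.random.default_rng(1)
n, d = 407, 8                                  # households and dimensions, as in the study
g0 = (rng.random((n, d)) < 0.3).astype(float)  # synthetic deprivation matrix (1 = deprived)
w = np.full(d, 1.0 / d)                        # equal dimension weights (an assumption)
k = 0.25                                       # poverty cutoff: deprived in 25% of dimensions

c = g0 @ w                                     # weighted deprivation score per household
poor = c >= k
H = poor.mean()                                # multidimensional headcount ratio
A = c[poor].mean()                             # average deprivation intensity among the poor
M0 = (c * poor).mean()                         # adjusted headcount ratio; M0 = H * A
```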

Keywords: Alkire-Foster methodology, Limpopo province, multidimensional poverty, principal component analysis, South Africa

Procedia PDF Downloads 157
5911 Prediction of Changes in Optical Quality by Tissue Redness after Pterygium Surgery

Authors: Mohd Radzi Hilmi, Mohd Zulfaezal Che Azemin, Khairidzan Mohd Kamal, Azrin Esmady Ariffin, Mohd Izzuddin Mohd Tamrin, Norfazrina Abdul Gaffur, Tengku Mohd Tengku Sembok

Abstract:

Purpose: The purpose of this study is to predict optical quality changes after pterygium surgery using tissue redness grading. Methods: Sixty-eight participants with primary pterygium were selected from patients who visited an ophthalmology clinic. We developed a semi-automated computer program to measure pterygium fibrovascular redness from digital pterygium images. The output of this software is a continuous grading scale from 1 (minimum redness) to 3 (maximum redness). The region of interest (ROI) was selected manually using the software. Reliability was determined by repeat grading of all 68 images, and the grading's association with the contrast sensitivity function (CSF) and visual acuity (VA) was examined. Results: The mean and standard deviation of redness in the pterygium fibrovascular images was 1.88 ± 0.55. Intra- and inter-grader reliability estimates were high, with intraclass correlations ranging from 0.97 to 0.98. The new grading was positively associated with CSF (p<0.01) and VA (p<0.01) and was able to predict 25% and 23% of the variance in CSF and VA, respectively. Conclusions: The new grading of pterygium fibrovascular redness can be reliably measured from digital images and shows a good correlation with CSF and VA. The redness grading can be used in addition to existing pterygium gradings.
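"Predicts 25% of the variance" corresponds to the coefficient of determination R² of a regression of CSF on the redness grade. The following is a synthetic-data sketch of that computation, not the authors' clinical data; the slope and noise level are hypothetical:

```python
# Regress a synthetic CSF outcome on a synthetic redness grade and report R².
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(2)
redness = rng.uniform(1.0, 3.0, 68)              # grading scale 1-3, n=68 as in the study
csf = -0.5 * redness + rng.normal(0.0, 0.4, 68)  # synthetic outcome: redder, worse CSF

model = LinearRegression().fit(redness.reshape(-1, 1), csf)
r2 = r2_score(csf, model.predict(redness.reshape(-1, 1)))  # fraction of variance explained
```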

Keywords: contrast sensitivity, pterygium, redness, visual acuity

Procedia PDF Downloads 508
5910 Getting Out of the Box: Tangible Music Production in the Age of Virtual Technological Abundance

Authors: Tim Nikolsky

Abstract:

This paper seeks to explore the different ways in which music producers choose to embrace various levels of technology based on musical values, objectives, affordability, access, and workflow benefits. The current digital audio production workflow is questioned. Engineers and music producers today are increasingly divorced from the tangibility of music production; making music no longer requires you to reach over and turn a knob. Ideas of authenticity in music production are being redefined. Calculations from the mathematical algorithm with the pretty pictures are increasingly chosen over hardware containing transformers and tubes. Are mouse clicks and movements equivalent or inferior to the master brush strokes we seek to conjure? We make audio production decisions visually, by constantly looking at a screen rather than listening. Have we compromised our musical objectives and values by removing the 'hands-on' nature of music making? DAW interfaces make our musical decisions for us, not necessarily in our best interests. Technological innovation has presented opportunities as well as challenges for education: what do music production students actually need to learn in a formalised education environment, and to what extent do they need to know it? In this brave new world of omnipresent music creation tools, do we still need tangibility in music production? Interviews with prominent Australian music producers working in a variety of fields are featured in this paper; they provide insight into these questions and move towards an understanding of how tangibility can be rediscovered in the next generation of music production.

Keywords: analogue, digital, digital audio workstation, music production, plugins, tangibility, technology, workflow

Procedia PDF Downloads 268
5909 Coding Structures for Seated Row Simulation of an Active Controlled Vibration Isolation and Stabilization System for Astronaut’s Exercise Platform

Authors: Ziraguen O. Williams, Shield B. Lin, Fouad N. Matari, Leslie J. Quiocho

Abstract:

Simulation of the seated row exercise was a continuation of work assisting NASA in analyzing a one-dimensional vibration isolation and stabilization system for an astronaut exercise platform. Feedback delay and signal noise were added to the model, as was previously done in the squat exercise simulation. Simulation runs for this study were conducted in two software tools, Trick and MBDyn, simulation environments used at the NASA Johnson Space Center. The exciter force in the simulation was calculated from motion capture of an exerciser during a seated row exercise. The simulation runs include passive control, active control using a Proportional, Integral, Derivative (PID) controller, and active control using a Piecewise Linear Integral Derivative (PWLID) controller. Output parameters include the displacements of the exercise platform, the exerciser, and the counterweight; the force transmitted to the wall of the spacecraft; and the actuator force on the platform. The simulation results showed excellent force reduction in the actively controlled system compared to the passively controlled system.
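The PID law mentioned above has the standard form u = Kp·e + Ki·∫e dt + Kd·de/dt. The following is a minimal discrete-time sketch on a 1-DOF platform model; all gains and plant parameters are hypothetical, since the paper's Trick/MBDyn models are not reproduced here:

```python
# Discrete PID regulating the displacement of a mass-spring-damper platform
# under a sinusoidal, exerciser-like forcing. Illustrative values only.
import math

m, c, k = 100.0, 50.0, 1.0e4          # platform mass, damping, stiffness
Kp, Ki, Kd = 5.0e4, 1.0e3, 5.0e3      # PID gains (hypothetical)
dt, steps = 1.0e-3, 5000

x = v = integ = prev_err = 0.0
history = []
for i in range(steps):
    f_ext = 300.0 * math.sin(2.0 * math.pi * 1.0 * i * dt)  # 1 Hz forcing
    err = 0.0 - x                     # regulate platform displacement to zero
    integ += err * dt
    u = Kp * err + Ki * integ + Kd * (err - prev_err) / dt  # actuator force
    prev_err = err
    a = (f_ext + u - c * v - k * x) / m   # Newton's second law
    v += a * dt                           # semi-implicit Euler integration
    x += v * dt
    history.append(x)
peak = max(abs(s) for s in history)   # peak platform excursion under active control
```

With the gains above the closed loop is close to critically damped, so the platform excursion stays small; dropping the PID terms (u = 0) recovers the passive case for comparison.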

Keywords: control, counterweight, isolation, vibration

Procedia PDF Downloads 133
5908 Network Analysis and Sex Prediction Based on a Full Human Brain Connectome

Authors: Oleg Vlasovets, Fabian Schaipp, Christian L. Mueller

Abstract:

We conduct a network analysis and predict the sex of 1000 participants based on the "connectome", the pairwise Pearson correlations across 436 brain parcels. We solve a non-smooth convex optimization problem, known as the Graphical Lasso, whose solution includes a low-rank component. With this solution and a machine learning model for sex prediction, we explain the brain parcel-sex connectivity patterns.
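As a hedged sketch, the basic sparse inverse-covariance step of the Graphical Lasso is available in scikit-learn; the paper's formulation, whose solution includes a low-rank component, is more involved and not reproduced here, and the dimensions below are scaled down from 436 parcels:

```python
# Fit a Graphical Lasso on synthetic parcel features and read off the
# sparse precision matrix that defines the connectome graph.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(3)
n_subjects, n_parcels = 200, 20
X = rng.normal(size=(n_subjects, n_parcels))   # stand-in for parcel features

gl = GraphicalLasso(alpha=0.2).fit(X)
precision = gl.precision_                      # sparse estimated inverse covariance
# Nonzero off-diagonal entries define the graph's edges
edges = (np.abs(precision) > 1e-8) & ~np.eye(n_parcels, dtype=bool)
```

The off-diagonal entries (or the raw pairwise correlations) could then feed a classifier, e.g. logistic regression, for the sex-prediction step described in the abstract.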

Keywords: network analysis, neuroscience, machine learning, optimization

Procedia PDF Downloads 144
5907 An Artificially Intelligent Teaching-Agent to Enhance Learning Interactions in Virtual Settings

Authors: Abdulwakeel B. Raji

Abstract:

This paper introduces the concept of an intelligent virtual learning environment involving communication between learners and an artificially intelligent teaching agent, in an attempt to replicate classroom learning interactions. The benefit of this technology over current e-learning practice is that it creates a virtual classroom in which real-time adaptive learning interactions are possible, a move away from the static learning practices currently adopted by e-learning systems. Over the years, artificial intelligence has been applied to various fields, including but not limited to medicine, military applications, psychology, and marketing. The purpose of e-learning applications is to ensure users are able to learn outside the classroom, but a major limitation has been the inability to fully replicate classroom interactions between teacher and students. This study used comparative surveys to gain an understanding of current learning practices in Nigerian universities and how these practices compare with the use of the developed e-learning system. The study was conducted by attending several lectures and noting the interactions between lecturers and tutors; subsequently, software was developed that deploys an artificially intelligent teaching agent alongside an e-learning system to enhance the user learning experience and to create learning interactions similar to those found in classroom and lecture-hall settings. Dialogflow has been used to implement the teaching agent, developed using JSON, which serves as a virtual teacher. Course content has been created using HTML, CSS, PHP, and JavaScript as a web-based application. This technology can run on handheld devices and Google-based home technologies, giving learners access to the teaching agent at any time. It also uses definite clause grammars and natural language processing to match user inputs and requests against defined rules in order to replicate learning interactions. The technology covers familiar classroom scenarios such as answering users' questions, asking 'do you understand?' at regular intervals and handling the subsequent responses, and taking advanced user queries to give feedback at other points. The software uses deep learning techniques to learn user interactions and patterns and thereby enhance the user learning experience. System testing was carried out by undergraduate students in the UK and Nigeria on the course 'Introduction to Database Development'. Test results and user feedback show that the developed software is a significant improvement on existing e-learning systems. Further experiments will be run using the software with different students and more course content.

Keywords: virtual learning, natural language processing, definite clause grammars, deep learning, artificial intelligence

Procedia PDF Downloads 132
5906 Cost Effective and Efficient Feeding: A Way Forward for Sustainable and Profitable Aquaculture

Authors: Pawan Kumar Sharma, J. Stephan Sampath Kumar, S. Anand, Chandana B. L.

Abstract:

Protein is the major component determining success in the culture of shrimp and fishes. However, excess dietary protein is undesirable, as it not only raises the production cost but also leads to water quality deterioration. A field survey was conducted with aqua farmers of Kerala, India, a leading state in coastal aquaculture, to assess how the protein component in feed can be managed efficiently and effectively for sustainable aquaculture. The study showed that an average feed amount of 13.55 ± 2.16 tonnes per hectare was being used by the farmers of Kerala. An average feed cost of Rs. 57.76 ± 13.46 per kg was invested for an average protein level of 36.26 ± 0.08% in the feed, and Rs. 78.95 ± 3.09 per kilogram of feed was being paid by the farmers. The study revealed that replacement of fish meal and fish oil in shrimp aquafeeds with alternative protein and lipid sources can only be achieved if changes are made to basic shrimp culturing practices, such as closed farming systems with water recycling or zero-water exchange, and by maximizing in-situ floc and natural food production within the culture system. The upshot of such production systems is that imports of high-quality feed ingredients and aquafeeds can eventually be eliminated, and the utilization of locally available feed ingredients from agricultural by-products can be greatly improved and maximized. The promotion of closed shrimp production systems would also greatly reduce water use and increase shrimp production per unit area, but would necessitate a continuous supply of electricity for aeration during production. Alternative energy sources such as solar power might be used, and resource-poor farming communities should also explore wind energy. The study concluded that farm-made feed and closed farming systems are essential for the sustainability and profitability of the aquaculture industry.

Keywords: aqua feeds, floc, fish meal, protein, zero-water exchange

Procedia PDF Downloads 139