Search results for: AIMMS mathematical software
5130 Studying on Pile Seismic Operation with Numerical Method by Using FLAC 3D Software
Authors: Hossein Motaghedi, Kaveh Arkani, Siavash Salamatpoor
Abstract:
Piles are important elements in the safe and economical design of tall and heavy structures, so the response of a single pile under dynamic load is of particular interest. The factors that influence single-pile response are the pile geometry, the soil properties, and the applied loads. In this study, the finite difference numerical method, implemented in the FLAC 3D software, is used to evaluate single-pile behavior under the peak ground acceleration (PGA) of the El Centro, California earthquake record (1940). The model results are compared with experimental results of other researchers and are found to agree approximately with the experimental data; for example, the maximum moment and the displacement at the pile top correspond to previously published experimental results. Furthermore, this paper evaluates the interaction properties between soil and pile. The results show that increasing the pile diameter decreases the pile-top displacement, while increasing the pile length increases it. Also, increasing the pile-to-soil stiffness ratio increases the moment produced in the pile body, and taller piles interact more with the soil and have higher inertia. These results can contribute directly to the optimized design of pile dimensions.
Keywords: pile seismic response, interaction between soil and pile, numerical analysis, FLAC 3D
Procedia PDF Downloads 390
5129 Precise Identification of Clustered Regularly Interspaced Short Palindromic Repeats-Induced Mutations via Hidden Markov Model-Based Sequence Alignment
Authors: Jingyuan Hu, Zhandong Liu
Abstract:
CRISPR genome editing technology has transformed molecular biology by accurately targeting and altering an organism’s DNA. Despite the state-of-the-art precision of CRISPR genome editing, imprecise mutation outcomes and off-target effects present considerable risk, potentially leading to unintended genetic changes. Targeted deep sequencing, combined with bioinformatics sequence alignment, can detect such unwanted mutations. Nevertheless, the classical Needleman-Wunsch (NW) algorithm may produce false alignment outcomes, resulting in inaccurate mutation identification. The key to precisely identifying CRISPR-induced mutations lies in determining optimal parameters for the sequence alignment algorithm. Hidden Markov models (HMMs) are ideally suited for this task, offering flexibility across CRISPR systems by leveraging forward-backward algorithms for parameter estimation. In this study, we introduce CRISPR-HMM, a statistical software package to precisely call CRISPR-induced mutations. We demonstrate that the software significantly improves precision in identifying CRISPR-induced mutations compared to NW-based alignment, thereby enhancing the overall understanding of the CRISPR gene-editing process.
Keywords: CRISPR, HMM, sequence alignment, gene editing
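As a rough illustration of the forward pass that underlies the forward-backward parameter estimation mentioned in this abstract, the toy two-state HMM below computes the likelihood of an observation sequence. All transition and emission values here are invented for illustration; they are not the parameters of the CRISPR-HMM software.

```python
import numpy as np

# Toy 2-state HMM (e.g. "match" vs. "indel"); parameters are illustrative.
A = np.array([[0.9, 0.1],    # state transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],    # emission probabilities for symbols 0 and 1
              [0.1, 0.9]])
pi = np.array([0.6, 0.4])    # initial state distribution

def forward(obs):
    """Return P(obs) under the HMM via the forward algorithm."""
    alpha = pi * B[:, obs[0]]          # initialization
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # recursion: propagate and emit
    return alpha.sum()                 # termination: marginalize states

print(forward([0, 1, 0]))
```

In a full Baum-Welch fit, a matching backward pass would be combined with this to re-estimate A, B, and pi from the read data.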
Procedia PDF Downloads 55
5128 Information Technology Competences for Professional Accountants in Thai Small to Medium Accounting Practice
Authors: Manirath Wongsim, Chatchawarn Srimontree, Pornpichit Phosri
Abstract:
Today, information technology is strongly influencing business, and the accepted role of the accountant is evolving with it. Information technology has proven crucial in triggering changes in accountants’ roles. Thus, this study aims to investigate IT competencies among professional accountants as a means to enhance firm performance. The research was conducted with 47 respondents at five organizations in Thailand using a quantitative approach. The results indicate that IT competencies for professional accountants in Thai small to medium accounting practices comprise 18 factors within the organizational issues examined. Specifically, the new factors identified from the findings and the literature as unique to professional accountants include ERP software skills and accounting law and legal skills. The evidence suggests that analytical skills, teamwork skills, and accounting software skills were ranked as much-needed skills to be acquired by accountants, while communication skills were ranked as the most required and delegation skills as the least required. The empirical evidence suggests that organizations should understand how to develop information technology competencies for knowledge employees in general and professional accountants in particular, and should provide assistance in all processes of decision making.
Keywords: IT competencies, IT competences for professional accountants, IT skills for accounting, IT skills in SMEs
Procedia PDF Downloads 234
5127 The Inverse Problem in the Process of Heat and Moisture Transfer in Multilayer Walling
Authors: Bolatbek Rysbaiuly, Nazerke Rysbayeva, Aigerim Rysbayeva
Abstract:
Relevance: Energy saving has been elevated to public policy in almost all developed countries. One avenue for energy efficiency is improving and tightening design standards; state standards accordingly place high demands on the thermal protection of buildings. The constructive arrangement of wall layers should ensure normal operation in which the moisture content of the construction materials does not exceed a certain level. Elevated moisture levels in walls constitute a defective condition, as moisture significantly reduces the physical, mechanical, and thermal properties of materials. The absence, at the design stage, of modelling of the processes occurring in the construction and of prediction of the behavior of structures in real-world service leads to increased heat loss and premature aging of structures. Method: To address this problem, mathematical modelling of heat and mass transfer in materials is widely used. The mathematical model of heat and mass transfer accounts for the coupled equations of the interconnected layers [1]. In winter, the thermal and hydraulic conductivity characteristics of the materials are nonlinear and depend on the temperature and moisture in the material; in this case, experimental determination of the freezing or thawing coefficients of the material becomes much more difficult. Therefore, this paper proposes an approximate method for calculating the thermal conductivity and moisture permeability characteristics of freezing or thawing material.
Questions: The development of methods for solving the inverse problem of mathematical modelling allows us to answer questions closely related to the rational design of enclosing structures: where the condensation zone lies in the body of the multilayer wall; how and where to place insulation rationally; which constructive measures are necessary to provide for the removal of moisture from the structure; what temperature and humidity conditions are required for the normal operation of the enclosing structure; and what the longevity of the structure is in terms of the frost resistance of its component materials. Tasks: The proposed mathematical model solves the following problems: assessing the thermo-physical condition of designed structures under different operating conditions and selecting appropriate material layers; calculating the temperature field in structurally complex multilayer structures; determining, from temperature and moisture measurements at characteristic points, the thermal characteristics of the materials constituting the surveyed construction; significantly reducing laboratory testing time, eliminating the need for a climatic chamber and expensive instrumented experiments; and simulating the real-life situations that arise in multilayer enclosing structures in connection with freezing, thawing, drying, and cooling of any layer of the building material.
Keywords: energy saving, inverse problem, heat transfer, multilayer walling
Procedia PDF Downloads 399
5126 Investigation of Software Integration for Simulations of Buoyancy-Driven Heat Transfer in a Vehicle Underhood during Thermal Soak
Authors: R. Yuan, S. Sivasankaran, N. Dutta, K. Ebrahimi
Abstract:
This paper investigates the software capability and computer-aided engineering (CAE) method for modelling the transient heat transfer processes occurring in the vehicle underhood region during the vehicle thermal soak phase. Heat retained from the soak period benefits the subsequent cold start through reduced friction loss for the second 14°C worldwide harmonized light-duty vehicle test procedure (WLTP) cycle, and therefore provides benefits in both CO₂ emission reduction and fuel economy. When a vehicle undergoes the soak stage, the airflow and the associated convective heat transfer around and inside the engine bay are driven by the buoyancy effect. This effect, along with thermal radiation and conduction, is the key factor in the thermal simulation of the engine bay for obtaining accurate cool-down trajectories of fluid and metal temperatures and for predicting the temperatures at the end of the soak period. In this study, a method was developed for a light-duty passenger vehicle using a coupled aerodynamic-heat transfer transient modelling approach for the full vehicle under 9 hours of thermal soak. The 3D underhood flow dynamics were solved inherently transiently by the Lattice-Boltzmann Method (LBM) using the PowerFlow software, which was further coupled with heat transfer modelling using the PowerTHERM software provided by Exa Corporation. The particle-based LBM is capable of accurately handling extremely complicated transient flow behavior on complex surface geometries, while the detailed thermal modelling, including heat conduction, radiation, and buoyancy-driven heat convection, was solved in an integrated manner by PowerTHERM. The 9-hour cool-down period was simulated and compared with vehicle test data for the key fluid (coolant, oil) and metal temperatures.
The developed CAE method predicted the cool-down behaviour of the key fluids and components in agreement with the experimental data and also visualised the air leakage paths and thermal retention around the engine bay. The cool-down trajectories of the key components obtained for the 9-hour thermal soak period provide vital information and a basis for the further development of reduced-order modelling studies in future work. This allows a fast-running model to be developed and embedded within a holistic study of vehicle energy modelling and thermal management. It was also found that the buoyancy effect plays an important part in the first stage of the 9-hour soak, and that the flow development during this stage is vital for accurately predicting the heat transfer coefficients used in heat retention modelling. The developed method demonstrates software integration for simulating buoyancy-driven heat transfer in a vehicle underhood region during thermal soak with satisfactory accuracy and efficient computing time. It will allow the design of engine encapsulations for improving fuel consumption and reducing CO₂ emissions to be carried out in a timely and robust manner, aiding the development of low-carbon transport technologies.
Keywords: ATCT/WLTC driving cycle, buoyancy-driven heat transfer, CAE method, heat retention, underhood modeling, vehicle thermal soak
Procedia PDF Downloads 155
5125 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO
Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu
Abstract:
Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial intelligence analysis network. Having the ability to generate statistics based on individual data intercepted from large demographic areas, however, leads to reasoning like that produced by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that intercepts car events caused by a driver, positioning them in time and space. The device's connection to the vehicle creates a data source whose analysis can build psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we explain the software infrastructure of the SiaMOTO system, designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.
Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO
Procedia PDF Downloads 94
5124 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems
Authors: Lei Chen, Jian Jiao, Tingdi Zhao
Abstract:
Failures of automotive electric/electronic systems, which are universally considered safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in such systems is a major challenge as system complexity increases. Model checking is often employed for formal verification, ensuring that the system model conforms to specified safety properties: the system-level effects of failures are established, and their effects on system behavior are observed through the formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws which may cause potential failure hazards, including software and system design errors and unsafe interactions among multiple system components. This paper presents a concept for using model checking integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system
Procedia PDF Downloads 122
5123 Association of Non Synonymous SNP in DC-SIGN Receptor Gene with Tuberculosis (Tb)
Authors: Saima Suleman, Kalsoom Sughra, Naeem Mahmood Ashraf
Abstract:
Mycobacterium tuberculosis causes a communicable chronic illness that is a major focus of research, as it is present in approximately one third of the world's population in either active or latent form. A person's genetic makeup plays an important part in immunity against disease, and one important factor in disease association is single nucleotide polymorphism (SNP) in the relevant gene. In this study, we examined the association between single nucleotide polymorphisms of the CD209 gene (which encodes the DC-SIGN receptor) and tuberculosis patients, using both in silico (dry lab) and wet lab (RFLP) analyses. The GWAS catalogue and the GEO database were searched for previous association data. No association study of CD209 nsSNPs was found, but a role for CD209 in pulmonary tuberculosis is documented in the GEO database; CD209 was therefore selected for this study. Databases including Ensembl and the 1000 Genomes Project were used to retrieve SNP data in the form of VCF files, which were submitted to different software tools to sort SNPs into benign and deleterious. Selected SNPs were further annotated using 3D modelling with the I-TASSER online software. The selected nsSNPs were then genotyped in the Gujrat and Faisalabad populations through RFLP analysis. In this study population, two SNPs were found to be associated with tuberculosis, while one nsSNP was not associated with the disease.
Keywords: association, CD209, DC-SIGN, tuberculosis
Procedia PDF Downloads 309
5122 A Multidimensional Genetic Algorithm Applicable for Our VRP Variant Dealing with the Problems of Infrastructure Defaults SVRDP-CMTW: “Safety Vehicle Routing Diagnosis Problem with Control and Modified Time Windows”
Authors: Ben Mansour Mouin, Elloumi Abdelkarim
Abstract:
We discuss the problem of routing a fleet of different vehicles from a central depot to different types of infrastructure defaults with dynamic maintenance requests, modified time windows, and control of the defaults maintained. For this purpose, we propose a modified metaheuristic to solve our mathematical model. SVRDP-CMTW is a VRP variant that produces an optimal vehicle plan facilitating the maintenance of different types of infrastructure defaults. This task is monitored after maintenance, based on its priorities, the degree of danger associated with each default, and the neighborhood of the black spots. In this paper, we present a multidimensional genetic algorithm (MGA), detailing its characteristics, proposed mechanisms, and role in our work. The coding of this algorithm represents the parameters that characterize each infrastructure default, with the objective of minimizing a combination of cost, distance, and maintenance time while satisfying the priority levels of the most urgent defaults. The developed algorithm allows the dynamic integration of newly detected defaults at execution time, and the result is displayed in our interactive system at routing time. The multidimensional genetic algorithm replaces N genetic algorithms for solving P different types of infrastructure-default problems: instead of N algorithms for P problems, one multidimensional algorithm can solve all of these problems at once.
Keywords: mathematical model, VRP, multidimensional genetic algorithm, metaheuristics
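As a sketch of the generic genetic-algorithm loop that underlies such an approach, the minimal toy below evolves a low-cost tour over a handful of points. It is illustrative only: the authors' MGA additionally encodes default priorities, danger degrees, and time windows, all of which are omitted here, and the point coordinates are invented.

```python
import random

random.seed(0)

# Hypothetical depot/default locations (x, y); not from the paper.
POINTS = [(0, 0), (3, 1), (1, 4), (5, 2), (2, 2)]

def cost(route):
    # Total Euclidean length of the closed tour.
    return sum(((POINTS[a][0] - POINTS[b][0]) ** 2 +
                (POINTS[a][1] - POINTS[b][1]) ** 2) ** 0.5
               for a, b in zip(route, route[1:] + route[:1]))

def mutate(route):
    # Swap two positions to produce a neighbouring permutation.
    r = route[:]
    i, j = random.sample(range(len(r)), 2)
    r[i], r[j] = r[j], r[i]
    return r

def evolve(pop_size=30, generations=100):
    pop = [random.sample(range(len(POINTS)), len(POINTS)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)                        # rank by fitness (lower cost)
        survivors = pop[:pop_size // 2]           # truncation selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))
```

A "multidimensional" variant in the abstract's sense would carry several such populations (one per problem type) inside one chromosome structure rather than running separate GAs.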
Procedia PDF Downloads 198
5121 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application
Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada
Abstract:
This study attempts to link management and computer science by developing software named “IntelSymb” as a demo application to support data analysis of non-energy* fields’ diversification, which can positively influence the mitigation of countries’ energy dependency. We analyzed 18 years of economic development (5 sectors) across 13 countries, identifying which patterns mostly prevailed and which may become dominant in the near future. To make the analysis solid and plausible, as future work we suggest developing a gateway or interface connected to all available online databases (WB, UN, OECD, U.S. EIA) for country-by-field analysis. The sample data consist of energy statistics (TPES and energy import indicators) and non-energy industry statistics (Main Science and Technology Indicators, Internet user index, and sales and production indicators) from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can help decelerate energy sector dependency (energy consumption and import dependence on crude oil). These results provide empirical and practical support for diversification policies in energy and non-energy industries, such as promoting the efficiency and management of information and communication technologies (ICTs), services, and innovative technologies, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG emissions, but this figure is much higher (around 14 percent) if indirect energy use is included. The ICT sector itself (excluding broadcasting) contributes approximately 2 percent of global GHG emissions, at just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq).
This can therefore serve as a lesson for both energy-dependent and energy-independent countries, particularly emerging oil-based economies, and can motivate non-energy industry diversification so that countries are prepared for an energy crisis or any economic crisis.
Keywords: energy policy, energy diversification, “IntelSymb” software, renewable energy
Procedia PDF Downloads 225
5120 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
Floods have huge environmental and economic impacts, so flood prediction receives a great deal of attention. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we used the Box-Jenkins approach, a four-stage method comprising model identification, parameter estimation, diagnostic checking, and forecasting (prediction). The main tools used in the ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS, and ULS methods. Diagnostic checking tests, the AIC criterion, and the RACF and RPACF graphs were used to verify the selected model. In this study, the best ARIMA model for the annual maximum discharge (AMD) time series was (4,1,1), with an AIC value of 88.87; the RACF and RPACF showed independence of the residuals. In forecasting AMD for 10 future years, this model demonstrated the ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R²).
Keywords: time series modelling, stochastic processes, ARIMA model, Karkheh river
Procedia PDF Downloads 288
5119 Numerical Board Game for Low-Income Preschoolers
Authors: Gozde Inal Kiziltepe, Ozgun Uyanik
Abstract:
There is growing evidence that socioeconomic status (SES)-related differences in mathematical knowledge begin in the early childhood period. Preschoolers from low-income families are likely to perform substantially worse in mathematical knowledge than their counterparts from middle- and higher-income families. The differences are seen across a wide range of skills: recognizing written numerals, counting, adding and subtracting, and comparing numerical magnitudes. Early differences in numerical knowledge have a lasting effect on children's mathematical knowledge in later grades. Accordingly, the main objective of this study is to analyze the effect of a number board game on the number knowledge of 48-60-month-old children from disadvantaged low-income families. Participants were 71 preschoolers from a childcare center serving low-income urban families. Children were randomly assigned to the number board condition (35 children) or to the color board condition (36 children). Both board games were 50 cm long and 30 cm high, had 'The Great Race' written across the top, and included 11 horizontally arranged, equally sized squares of different colors, with the leftmost square labeled 'Start'. The numerical board had the numbers 1-10 in the rightmost 10 squares; the color board had different colors in those squares. Children selected a rabbit or a bear token and, on each trial, spun a spinner to determine whether the token would move one or two spaces. The number condition spinner had a '1' half and a '2' half; the color condition spinner had colors matching the colors of the squares on the board. Children met one-on-one with an experimenter for four 15- to 20-minute sessions within a 2-week period. In the first and fourth sessions, children were administered identical pretest and posttest measures of numerical knowledge.
All children were presented with three numerical tasks and one subtest, administered in the following order: counting, numerical magnitude comparison, numerical identification, and the Count Objects - Circle Number Probe subtest of the Early Numeracy Assessment. The same numerical tasks and subtest were given as a follow-up test four weeks after the post-test. The findings showed a meaningful difference between the scores of children who played the color board game and the scores of children who played the number board game, in favor of the latter.
Keywords: low income, numerical board game, numerical knowledge, preschool education
Procedia PDF Downloads 355
5118 An Insight into the Probabilistic Assessment of Reserves in Conventional Reservoirs
Authors: Sai Sudarshan, Harsh Vyas, Riddhiman Sherlekar
Abstract:
The oil and gas industry has been reluctant to adopt a stochastic definition of reserves. Nevertheless, Monte Carlo simulation methods have gained acceptance among engineers, geoscientists, and other professionals who want to evaluate prospects or otherwise analyze problems that involve uncertainty. One common application of Monte Carlo simulation is the estimation of recoverable hydrocarbons from a reservoir. Monte Carlo simulation makes use of random samples of parameters or inputs to explore the behavior of a complex system or process; it finds application whenever one needs to make an estimate, forecast, or decision under significant uncertainty. The project first performs Monte Carlo simulation on a given data set using the U.S. Department of Energy's MonteCarlo software, a freeware E&P tool. Further, a simulation algorithm was developed in MATLAB; the program performs the simulation by prompting the user for input distributions, the parameters associated with each distribution (i.e., mean, standard deviation, minimum, maximum, most likely, etc.), and the desired probability level for which reserves are to be calculated. The algorithm developed and tested in MATLAB was then implemented in Python, where existing libraries for statistics and graph plotting were imported to generate better output. Code for a simple graphical user interface was also written with Qt Designer (PyQt). The resulting plots were then validated against results from the U.S. DOE MonteCarlo software.
Keywords: simulation, probability, confidence interval, sensitivity analysis
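The core of such a reserves simulation can be sketched in a few lines of Python: sample each volumetric input from a user-chosen distribution and read off percentiles of the resulting distribution of original oil in place (OOIP). The triangular-distribution parameters below are illustrative assumptions, not the study's data set.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000  # number of Monte Carlo trials

# Illustrative input distributions (min, most likely, max).
area      = rng.triangular(800, 1200, 2000, n)    # drainage area, acres
thickness = rng.triangular(20, 35, 60, n)         # net pay, ft
porosity  = rng.triangular(0.10, 0.18, 0.25, n)   # fraction
sw        = rng.triangular(0.20, 0.30, 0.45, n)   # water saturation, fraction
bo        = rng.triangular(1.10, 1.20, 1.35, n)   # formation volume factor, RB/STB

# Standard volumetric formula: OOIP (STB) = 7758 * A * h * phi * (1 - Sw) / Bo
ooip = 7758 * area * thickness * porosity * (1 - sw) / bo

# P90/P50/P10 in the oil-industry convention (P90 = 90% chance of exceeding).
p90, p50, p10 = np.percentile(ooip, [10, 50, 90])
print(f"P90={p90:,.0f}  P50={p50:,.0f}  P10={p10:,.0f} STB")
```

Reading the 10th percentile as P90 follows the exceedance-probability convention used in reserves reporting; a histogram or cumulative-frequency plot of `ooip` would reproduce the kind of output the described GUI displays.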
Procedia PDF Downloads 383
5117 An Interactive Platform Displaying Mixed Reality Media
Authors: Alfred Chen, Cheng Chieh Hsu, Yu-Pin Ma, Meng-Jie Lin, Fu Pai Chiu, Yi-Yan Sie
Abstract:
This study constructs a human-computer interactive platform system consisting mainly of an augmented hardware system, a software system, a display table, and mixed media. The system provides human-computer interaction services for the tourism industry through an interactive platform. A well-designed interactive platform integrating augmented reality and mixed media has the potential to enhance museum display quality and diversity, and to create a comprehensive and creative display mode for most museums and historical heritage sites. It is therefore essential for the public to understand what the platform is, how it functions, and, most importantly, how to build an interactive augmented platform; hence, the authors elaborate the construction process of the platform in detail. Three issues are considered: 1) the theory and application of augmented reality, 2) the hardware and software applied, and 3) the mixed media presented. To describe how the platform works, the Courtesy Door of the Tainan Confucius Temple was selected as the case study. As a result, the developed interactive platform is presented, showing the physical entity object along with virtual mixed media such as text, images, animation, and video. The platform provides diversified and effective information delivered to its users.
Keywords: human-computer interaction, mixed reality, mixed media, tourism
Procedia PDF Downloads 490
5116 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment
Authors: Seun Mayowa Sunday
Abstract:
Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software; however, because static software operates only under strictly defined rules, it cannot aid customers beyond those limitations. Loan prediction therefore calls for the application of machine learning (ML) techniques. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning libraries, the models are created and evaluated with the appropriate metrics. From the findings, the random forest performs with the highest accuracy, 80.17%, and was subsequently implemented within the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the top four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines model development and the Django framework with deployment to the Alibaba cloud computing platform.
Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud
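The four-model comparison described in this abstract can be sketched with scikit-learn. The data set below is a synthetic stand-in generated by `make_classification`, since the real loan data (and hence the reported 80.17% figure) are not reproducible here; which model wins on such synthetic data may differ from the study's result.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for a loan-applicant data set.
X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

models = {
    "random_forest": RandomForestClassifier(random_state=0),
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "knn": KNeighborsClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

# Fit each model and score held-out accuracy, as in the study's comparison.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

In the deployment step the winning fitted model would typically be serialized (e.g. with `joblib`) and loaded inside a Django view to serve predictions.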
Procedia PDF Downloads 139
5115 Advances in Fiber Optic Technology for High-Speed Data Transmission
Authors: Salim Yusif
Abstract:
Fiber optic technology has revolutionized telecommunications and data transmission, providing unmatched speed, bandwidth, and reliability. This paper presents the latest advancements in fiber optic technology, focusing on innovations in fiber materials, transmission techniques, and network architectures that enhance the performance of high-speed data transmission systems. Key advancements include the development of ultra-low-loss optical fibers, multi-core fibers, advanced modulation formats, and the integration of fiber optics into next-generation network architectures such as Software-Defined Networking (SDN) and Network Function Virtualization (NFV). Additionally, recent developments in fiber optic sensors are discussed, extending the utility of optical fibers beyond data transmission. Through comprehensive analysis and experimental validation, this research offers valuable insights into the future directions of fiber optic technology, highlighting its potential to drive innovation across various industries.
Keywords: fiber optics, high-speed data transmission, ultra-low-loss optical fibers, multi-core fibers, modulation formats, coherent detection, software-defined networking, network function virtualization, fiber optic sensors
Procedia PDF Downloads 63
5114 Future Education: Changing Paradigms
Authors: Girish Choudhary
Abstract:
Education is in a state of flux. Not only does one need to acquire skills to cope with a fast-changing global world; an explosive growth in technology is also providing a new wave of teaching tools: computer-aided video instruction, hypermedia, multimedia, CD-ROMs, Internet connections, and collaborative software environments. The emerging technology incorporates the group qualities of interactive, classroom-based learning while giving individual students the flexibility to participate in an educational programme at their own time and place. Technology that facilitates self-learning also seems to provide a cost-effective solution to the dilemma of delivering education to the masses. Online education is a unique learning domain that provides many-to-many communication as well; the computer conferencing software defines the boundaries of the virtual classroom. The changing paradigm provides access to instruction for a large proportion of society, promises a qualitative change in learning, and echoes a new way of thinking in educational theory that promotes active learning and opens new learning approaches. Putting it into practice is challenging and may fundamentally alter the nature of educational institutions. The latter part of the paper addresses questions such as 'Do we need to radically re-engineer the curriculum and foster an alternative set of skills in students?'
Keywords: on-line education, self learning, energy and power engineering, future education
Procedia PDF Downloads 3325113 The Impact of Information and Communication Technology on the Re-Engineering Process of Small and Medium Enterprises
Authors: Hiba Mezaache
Abstract:
The current study aimed to determine the impact of using information and communication technology (ICT) on the re-engineering process of small and medium enterprises. The world has witnessed rapid development in this field, together with a diversity of objectives and programs, which makes re-engineering important for the growth and development of the institution and for gaining the flexibility to face changes that may occur in the work environment. To determine the impact of ICT on the success of this process, we prepared an electronic questionnaire that included 70 items, and we used the SPSS statistical package to analyze the data obtained. Our study concluded that there is a positive correlation between the four dimensions of ICT (hardware and equipment, software, communication networks, and databases) and the re-engineering process, and that the studied institutions attach great importance to formal communication for the positive advantages it achieves in reducing the time, effort, and costs of performing business. We can also say that communication technology contributes to the process of formulating objectives related to the re-engineering strategy. Finally, we recommend empowering workers to use ICT more in enterprises, integrating them more into the activity of the enterprise by involving them in decision-making, and keeping pace with developments in software, hardware, and technological equipment.Keywords: information and communication technology, re-engineering, small and medium enterprises, the impact
Procedia PDF Downloads 1805112 Prediction of Changes in Optical Quality by Tissue Redness after Pterygium Surgery
Authors: Mohd Radzi Hilmi, Mohd Zulfaezal Che Azemin, Khairidzan Mohd Kamal, Azrin Esmady Ariffin, Mohd Izzuddin Mohd Tamrin, Norfazrina Abdul Gaffur, Tengku Mohd Tengku Sembok
Abstract:
Purpose: The purpose of this study is to predict optical quality changes after pterygium surgery using tissue redness grading. Methods: Sixty-eight primary pterygium participants were selected from patients who visited an ophthalmology clinic. We developed a semi-automated computer program to measure the pterygium fibrovascular redness from digital pterygium images. The outcome of this software is a continuous scale grading of 1 (minimum redness) to 3 (maximum redness). The region of interest (ROI) was selected manually using the software. Reliability was determined by repeat grading of all 68 images and its association with contrast sensitivity function (CSF) and visual acuity (VA) was examined. Results: The mean and standard deviation of redness of the pterygium fibrovascular images was 1.88 ± 0.55. Intra- and inter-grader reliability estimates were high with intraclass correlation ranging from 0.97 to 0.98. The new grading was positively associated with CSF (p<0.01) and VA (p<0.01). The redness grading was able to predict 25% and 23% of the variance in the CSF and the VA respectively. Conclusions: The new grading of pterygium fibrovascular redness can be reliably measured from digital images and show a good correlation with CSF and VA. The redness grading can be used in addition to the existing pterygium grading.Keywords: contrast sensitivity, pterygium, redness, visual acuity
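The redness measurement described above can be sketched in a few lines, assuming (as an illustration only, not the authors' algorithm) that the fibrovascular redness of a selected ROI is summarized by its mean red-to-green ratio and mapped linearly onto the 1-3 grading scale; the calibration endpoints `r_min` and `r_max` are hypothetical:

```python
import numpy as np

def redness_grade(roi_rgb, r_min=0.9, r_max=1.8):
    """Map the mean red-to-green ratio of an ROI onto a 1-3 redness grade.

    roi_rgb: H x W x 3 array of RGB values.
    r_min / r_max: ratios assumed to correspond to grades 1 and 3
    (illustrative values, not the authors' calibration).
    """
    mean = roi_rgb.reshape(-1, 3).mean(axis=0)
    ratio = mean[0] / (mean[1] + 1e-9)          # red / green
    grade = 1 + 2 * (ratio - r_min) / (r_max - r_min)
    return float(np.clip(grade, 1.0, 3.0))

# A synthetic pale ROI vs a strongly red ROI
pale = np.full((10, 10, 3), [120, 110, 100], dtype=float)
red = np.full((10, 10, 3), [180, 90, 80], dtype=float)
print(redness_grade(pale), redness_grade(red))
```

A real implementation would calibrate the endpoints against clinician-graded images before correlating the grade with CSF and VA.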
Procedia PDF Downloads 5165111 Production of Rhamnolipids from Different Resources and Estimating the Kinetic Parameters for Bioreactor Design
Authors: Olfat A. Mohamed
Abstract:
Rhamnolipid biosurfactants have distinct properties that give them importance in many industrial applications, especially promising future applications in the cosmetic and pharmaceutical industries. These applications have encouraged the search for diverse and renewable resources to control the cost of production. The experimental results were then applied to find a suitable mathematical model for obtaining the design criteria of a batch bioreactor. This research aims to produce rhamnolipids from different oily wastewater sources, such as petroleum crude oil (PO) and vegetable oil (VO), by using Pseudomonas aeruginosa ATCC 9027. Different concentrations of PO and VO were added to the culture broth separately, at (0.5, 1, 1.5, 2, and 2.5% v/v) and (2, 4, 6, 8, and 10% v/v), respectively. The effects of the initial concentration of oil residues and of the addition of glycerol and palmitic acid as inducers on rhamnolipid production and the surface tension of the broth were investigated. It was found that 2% PO and 6% VO were the best initial substrate concentrations for the production of rhamnolipids (2.71 and 5.01 g rhamnolipid/L, respectively). Addition of glycerol (10-20% v glycerol/v PO) to the 2% PO fermentation broth increased rhamnolipid production about 1.8- to 2-fold. However, the addition of palmitic acid (5 and 10 g/L) to fermentation broth containing 6% VO barely enhanced the production rate. The experimental data for 2% initial PO were used to estimate the various kinetic parameters. The following results were obtained: maximum reaction rate (Vmax = 0.06417 g/L.h), yield of cell weight per unit weight of substrate utilized (Yx/s = 0.324 g Cx/g Cs), maximum specific growth rate (μmax = 0.05791 hr⁻¹), yield of rhamnolipid weight per unit weight of substrate utilized (Yp/s = 0.2571 g Cp/g Cs), maintenance coefficient (Ms = 0.002419), Michaelis-Menten constant (Km = 6.1237 gmol/l), and endogenous decay coefficient (Kd = 0.002375 hr⁻¹). These parameters and the corresponding mathematical models were applied to estimate the batch bioreactor time. The results were 123.37, 129, and 139.3 hours with respect to microbial biomass, substrate, and product concentration, respectively, compared with the experimental batch time of 120 hours in all cases. The mathematical models are consistent with the laboratory results and can therefore be considered tools for describing the actual system.Keywords: batch bioreactor design, glycerol, kinetic parameters, petroleum crude oil, Pseudomonas aeruginosa, rhamnolipids biosurfactants, vegetable oil
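Using the estimated parameters, the batch time can be reproduced with a simple numerical integration. The sketch below assumes a standard Monod batch structure with endogenous decay, maintenance, and growth-associated product formation; this is a plausible reading of the parameter set above, not necessarily the authors' exact model, and Km is treated as g/L:

```python
# Parameters estimated in the study (2% v/v petroleum crude oil)
mu_max = 0.05791   # 1/h, maximum specific growth rate
Km     = 6.1237    # half-saturation constant (treated as g/L here)
Yxs    = 0.324     # g biomass / g substrate
Yps    = 0.2571    # g rhamnolipid / g substrate
Ms     = 0.002419  # maintenance coefficient, g substrate/(g biomass.h) assumed
Kd     = 0.002375  # 1/h, endogenous decay coefficient

def batch_monod(X0, S0, t_end=200.0, dt=0.01):
    """Euler integration of an assumed Monod batch model:
        dX/dt = (mu - Kd) X
        dS/dt = -(mu/Yxs + Ms) X
        dP/dt = Yps (mu/Yxs) X
    with mu = mu_max S / (Km + S)."""
    X, S, P, t = X0, S0, 0.0, 0.0
    while t < t_end and S > 0:
        mu = mu_max * S / (Km + S)
        X, S, P = (X + (mu - Kd) * X * dt,
                   max(S - (mu / Yxs + Ms) * X * dt, 0.0),
                   P + Yps * (mu / Yxs) * X * dt)
        t += dt
    return t, X, S, P

t, X, S, P = batch_monod(X0=0.1, S0=20.0)
print(f"substrate exhausted near t = {t:.0f} h, X = {X:.2f} g/L, P = {P:.2f} g/L")
```

With an assumed inoculum of 0.1 g/L and 20 g/L substrate, the predicted batch time comes out on the same order as the 120-139 h values reported above.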
Procedia PDF Downloads 1335110 Coding Structures for Seated Row Simulation of an Active Controlled Vibration Isolation and Stabilization System for Astronaut’s Exercise Platform
Authors: Ziraguen O. Williams, Shield B. Lin, Fouad N. Matari, Leslie J. Quiocho
Abstract:
Simulation of the seated row exercise was a continuation of work to assist NASA in analyzing a one-dimensional vibration isolation and stabilization system for an astronaut exercise platform. Feedback delay and signal noise were added to the model, as previously done in the simulation of the squat exercise. Simulation runs for this study were conducted in two simulation tools, Trick and MBDyn, software environments developed at the NASA Johnson Space Center. The exciter force in the simulation was calculated from motion capture of an exerciser during a seated row exercise. The simulation runs include passive control, active control using a Proportional, Integral, Derivative (PID) controller, and active control using a Piecewise Linear Integral Derivative (PWLID) controller. Output parameters include displacements of the exercise platform, the exerciser, and the counterweight; the force transmitted to the wall of the spacecraft; and the actuator force applied to the platform. The simulation results showed excellent force reduction in the actively controlled system compared to the passively controlled system.Keywords: control, counterweight, isolation, vibration
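As an illustration of the active control loop, the sketch below closes a discrete PID controller around a one-degree-of-freedom mass-spring-damper stand-in for the platform; the gains and plant values are invented for the example and are not the study's parameters:

```python
class PID:
    """Minimal discrete PID controller (illustrative, not the NASA code)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a 1-DOF platform (mass-spring-damper) back toward zero displacement
m, k, c, dt = 100.0, 500.0, 50.0, 0.001      # hypothetical plant values
pid = PID(kp=4000.0, ki=100.0, kd=800.0, dt=dt)
x, v = 0.05, 0.0                             # 5 cm initial offset
for _ in range(5000):                        # 5 s of simulated time
    f = pid.update(0.0, x)                   # actuator force from controller
    a = (f - k * x - c * v) / m              # plant acceleration
    v += a * dt
    x += v * dt
print(f"final displacement = {x:.4f} m")
```

The passive case corresponds to running the same plant with `f = 0`, which leaves the platform to ring down on its own damping.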
Procedia PDF Downloads 1445109 Evaluation of Tunnel Stability by Numerical Methods
Authors: Yalemsira Bewuket Gubena
Abstract:
Excavating a tunnel releases a large amount of pre-existing stress, causing the material to deform by an arching or squeezing effect depending on the depth of the tunnel. Shallow tunnels fail by arching, while deep underground tunnels fail by squeezing. Many failures have been recorded around the world; among them, Ethiopia's biggest hydroelectric power station, Gilgel Gibe II, was shut down by a tunnel collapse weeks after its official opening. The country is now constructing a new railway route at Awash-Kombolcha-Haragebeya to connect the towns with the ports of neighboring countries. Tunnel 04, with a maximum overburden of 320 m, is the focus of this study. The stability of the tunnel is analyzed by incorporating a pseudo-static analysis in finite element software, and the most favorable supports are selected. All three numerical analysis packages give nearly the same output results. Using RocSupport, the displacement is found to be 0.017 m, with a strain of 0.35%; since this is less than 1%, the tunnel exhibits few stability problems and no squeezing potential, and can be supported by shotcrete and rock bolts. The analyses from Phase2 and Plaxis 3D show displacements of 0.022 m and 0.0231 m, respectively, after adding 30 cm of shotcrete and 32 mm diameter bolts. From the parametric study, as the value of Young's modulus decreases, the displacement around the tunnel opening increases.Keywords: squeezing, finite element method, deformation, support
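The strain criterion used above can be written down directly. The bands below follow the widely used Hoek-Marinos squeezing classification (an assumption here; the abstract itself only uses the 1% boundary):

```python
def squeezing_class(strain_pct):
    """Classify squeezing potential from tunnel strain, defined as percent
    closure relative to tunnel diameter. Bands follow the commonly used
    Hoek-Marinos scheme; the study only states that < 1% means few
    stability problems."""
    if strain_pct < 1.0:
        return "few support problems"
    elif strain_pct < 2.5:
        return "minor squeezing"
    elif strain_pct < 5.0:
        return "severe squeezing"
    elif strain_pct < 10.0:
        return "very severe squeezing"
    return "extreme squeezing"

print(squeezing_class(0.35))   # the study's reported strain of 0.35%
```
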
Procedia PDF Downloads 115108 An Artificially Intelligent Teaching-Agent to Enhance Learning Interactions in Virtual Settings
Authors: Abdulwakeel B. Raji
Abstract:
This paper introduces the concept of an intelligent virtual learning environment that involves communication between learners and an artificially intelligent teaching agent in an attempt to replicate classroom learning interactions. The benefit of this technology over current e-learning practices is that it creates a virtual classroom where real-time adaptive learning interactions are possible. This is a move away from the static learning practices currently adopted by e-learning systems. Over the years, artificial intelligence has been applied to various fields, including, but not limited to, medicine, military applications, psychology, and marketing. The purpose of e-learning applications is to ensure users are able to learn outside the classroom, but a major limitation has been the inability to fully replicate classroom interactions between teacher and students. This study used comparative surveys to understand current learning practices in Nigerian universities and how those practices compare with the use of the developed e-learning system. The study was conducted by attending several lectures and noting the interactions between lecturers and tutors; subsequently, software was developed that deploys an artificially intelligent teaching agent alongside an e-learning system to enhance the user learning experience and attempt to create learning interactions similar to those found in classroom and lecture hall settings. Dialogflow was used to implement the teaching agent, developed using JSON, which serves as a virtual teacher. Course content was created using HTML, CSS, PHP, and JavaScript as a web-based application. The technology can run on handheld devices and Google-based home technologies to give learners access to the teaching agent at any time. The technology also uses definite clause grammars and natural language processing to match user inputs and requests with defined rules in order to replicate learning interactions. It covers familiar classroom scenarios such as answering users' questions, asking 'Do you understand?' at regular intervals and handling the responses, and taking advanced user queries to give feedback at other times. The software uses deep learning techniques to learn user interactions and patterns and subsequently enhance the user learning experience. System testing was carried out with undergraduate students in the UK and Nigeria on the course 'Introduction to Database Development'. Test results and feedback from users show that the developed software is a significant improvement on existing e-learning systems. Further experiments are to be run using the software with different students and more course content.Keywords: virtual learning, natural language processing, definite clause grammars, deep learning, artificial intelligence
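As a rough illustration of matching user inputs against defined rules (a toy regular-expression stand-in for the definite clause grammars used in the actual system; the rules and responses are hypothetical):

```python
import re

# Hypothetical rule table: a pattern matched against the learner's
# input selects a canned tutoring response.
RULES = [
    (re.compile(r"\bwhat is a (\w+)\b", re.I),
     lambda m: f"A {m.group(1)} is defined in the course notes; "
               f"shall I show an example?"),
    (re.compile(r"\b(yes|i understand)\b", re.I),
     lambda m: "Great, let us move on to the next topic."),
    (re.compile(r"\b(no|i do not understand|don't understand)\b", re.I),
     lambda m: "No problem, let us go over it again."),
]

def reply(user_input, default="Could you rephrase that?"):
    """Return the response of the first matching rule (a toy stand-in
    for the grammar-based matching described above)."""
    for pattern, responder in RULES:
        m = pattern.search(user_input)
        if m:
            return responder(m)
    return default

print(reply("What is a primary key?"))
```

A grammar-based matcher would additionally parse the structure of the request rather than scan for surface patterns, which is what the definite clause grammars provide in the real system.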
Procedia PDF Downloads 1375107 Analyzing the Impact of Migration on HIV and AIDS Incidence Cases in Malaysia
Authors: Ofosuhene O. Apenteng, Noor Azina Ismail
Abstract:
The human immunodeficiency virus (HIV), which causes acquired immune deficiency syndrome (AIDS), remains a global cause of morbidity and mortality and has caused panic since its emergence. The relationship between migration and HIV/AIDS has become complex. In the absence of prospectively designed studies, dynamic mathematical models that take migration movement into account can give very useful information. We have explored the utility of mathematical models in understanding the transmission dynamics of HIV and AIDS and in assessing the magnitude of migration's impact on the disease. The model was calibrated to HIV and AIDS incidence data from the Malaysian Ministry of Health for the period 1986 to 2011, using Bayesian analysis combined with a Markov chain Monte Carlo (MCMC) approach to estimate the model parameters. From the estimated parameters, the basic reproduction number was 22.5812. The rate at which susceptible individuals move to the HIV compartment has the highest sensitivity value, more significant than the remaining parameters; thus, the disease becomes unstable. This is a big concern and not a good indicator from the public health point of view, since the aim is to stabilize the epidemic at the disease-free equilibrium. These results suggest that the government, as a policy maker, should make further efforts to curb illegal activities performed by migrants. Our models are shown to reflect the dynamic behavior of the HIV/AIDS epidemic in Malaysia considerably well and could eventually be used strategically for other countries.Keywords: epidemic model, reproduction number, HIV, MCMC, parameter estimation
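The modelling idea can be illustrated with a toy susceptible-HIV-AIDS system in which migration appears as a constant inflow into the susceptible class. The parameter values below are invented for the illustration and are not the calibrated Malaysian estimates:

```python
def simulate(beta, gamma, mu, m_in, years, dt=0.01,
             S0=0.99, I0=0.01, A0=0.0):
    """Toy susceptible (S) -> HIV (I) -> AIDS (A) model with a constant
    migration inflow m_in into the susceptible class. An illustration of
    the modelling idea only, not the authors' calibrated model."""
    S, I, A = S0, I0, A0
    for _ in range(int(years / dt)):
        N = S + I + A
        new_inf = beta * S * I / N          # standard-incidence infections
        dS = m_in - new_inf - mu * S
        dI = new_inf - gamma * I - mu * I   # gamma: progression to AIDS
        dA = gamma * I - mu * A
        S += dS * dt; I += dI * dt; A += dA * dt
    return S, I, A

# Higher inflow of susceptibles sustains more infection in this toy model
_, I_low, _ = simulate(beta=0.4, gamma=0.1, mu=0.02, m_in=0.00, years=40)
_, I_high, _ = simulate(beta=0.4, gamma=0.1, mu=0.02, m_in=0.02, years=40)
print(I_low, I_high)
```

In this toy run the inflow replenishes susceptibles and sustains infection long after the initial epidemic would otherwise have burned out, which is the qualitative effect the sensitivity result above points to.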
Procedia PDF Downloads 3675106 Requirement Engineering for Intrusion Detection Systems in Wireless Sensor Networks
Authors: Afnan Al-Romi, Iman Al-Momani
Abstract:
Applying Software Engineering (SE) processes is both of vital importance and a key feature in critical, complex, large-scale systems, for example, safety systems, security service systems, and network systems. Inevitably, associated with this are risks, such as system vulnerabilities and security threats. The probability of these risks increases in unsecured environments, such as wireless networks in general and Wireless Sensor Networks (WSNs) in particular. A WSN is a self-organizing network of sensor nodes connected by wireless links. WSNs consist of hundreds to thousands of low-power, low-cost, multi-function sensor nodes that are small in size and communicate over short ranges. The distribution of sensor nodes in an open environment that may be unattended, in addition to resource constraints in terms of processing, storage, and power, places such networks under stringent limitations such as lifetime (i.e., period of operation) and security. The importance of WSN applications, found in many military and civilian domains, has drawn the attention of many researchers to their security. To address this important issue and overcome one of the main challenges of WSNs, researchers have developed security solutions in the form of software-based network Intrusion Detection Systems (IDSs). However, it has been observed that these IDSs are neither secure enough nor accurate enough to detect all malicious attack behaviours. The problem is thus a lack of coverage of all malicious behaviours in proposed IDSs, leading to unpleasant results such as delays in the detection process, low detection accuracy, or, even worse, detection failure, as illustrated in previous studies. Another problem is the energy consumption in WSNs caused by the IDS itself. In other words, not all requirements are implemented and then traced; moreover, not all requirements are identified or satisfied, as some requirements have been compromised. The drawbacks of current IDSs stem from researchers and developers not following structured software development processes when developing them. Consequently, this has resulted in inadequate requirements management, validation, and verification of requirements quality. Unfortunately, the WSN and SE research communities have been mostly impermeable to each other. Integrating SE and WSNs is a real subject that will expand as technology evolves and spreads into industrial applications. Therefore, this paper studies the importance of Requirement Engineering in developing IDSs. It also examines a set of existing IDSs, illustrating the absence of Requirement Engineering and its effect. Conclusions are then drawn regarding the application of requirement engineering to systems so that they deliver the required functionality, with respect to operational constraints, at an acceptable level of performance, accuracy, and reliability.Keywords: software engineering, requirement engineering, Intrusion Detection System, IDS, Wireless Sensor Networks, WSN
Procedia PDF Downloads 3245105 Extending BDI Multiagent Systems with Agent Norms
Authors: Francisco José Plácido da Cunha, Tassio Ferenzini Martins Sirqueira, Marx Leles Viana, Carlos José Pereira de Lucena
Abstract:
Open Multiagent Systems (MASs) are societies in which heterogeneous and independently designed entities (agents) work towards similar or different ends. Software agents are autonomous, and a diversity of interests among different members living in the same society is a fact. In order to deal with this autonomy, these open systems use mechanisms of social control (norms) to ensure a desirable social order. This paper considers the following types of norms: (i) obligation, where agents must accomplish a specific outcome; (ii) permission, where agents may act in a particular way; and (iii) prohibition, where agents must not act in a specific way. All of these mechanisms are meant to encourage the fulfillment of norms through rewards and to discourage norm violation by pointing out the punishments. Since a software agent's priority is the satisfaction of its own desires and goals, each agent must evaluate the effects associated with the fulfillment of one or more norms before choosing which one should be fulfilled; the same applies when agents decide to violate a norm. This paper also introduces a framework for the development of MASs that provides support mechanisms for the agent's decision-making, using norm-based reasoning. The applicability and validity of this approach are demonstrated in a traffic intersection scenario.Keywords: BDI agent, BDI4JADE framework, multiagent systems, normative agents
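The norm-based reasoning step can be sketched as a utility comparison; the norm structure follows the obligation/permission/prohibition types above, while the reward and punishment figures are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Norm:
    """A norm as described above: an obligation, permission or prohibition
    with a reward for fulfilment and a punishment for violation."""
    kind: str          # "obligation" | "permission" | "prohibition"
    action: str
    reward: float
    punishment: float

def net_utility(action, intrinsic_value, norms):
    """Hedged sketch of the reasoning step: the agent adds the reward of
    every fulfilled norm and subtracts the punishment of every violated
    one before committing to an action."""
    u = intrinsic_value
    for n in norms:
        if n.kind == "obligation":
            u += n.reward if action == n.action else -n.punishment
        elif n.kind == "prohibition":
            u += -n.punishment if action == n.action else n.reward
        # permissions neither reward nor punish in this simple sketch
    return u

# Traffic-intersection flavoured example with invented figures
norms = [Norm("obligation", "stop_at_red", reward=2.0, punishment=5.0),
         Norm("prohibition", "run_red_light", reward=1.0, punishment=10.0)]
print(net_utility("stop_at_red", 0.0, norms),
      net_utility("run_red_light", 3.0, norms))
```

Even when violating the norm has intrinsic value to the agent (here 3.0, e.g. saving time), the punishments make compliance the rational choice in this example.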
Procedia PDF Downloads 2355104 Solar Energy Applications in Seawater Distillation
Authors: Yousef Abdulaziz Almolhem
Abstract:
Geographically, most Arab countries lie in arid or semiarid regions. For this reason, most of these countries have adopted seawater desalination as a strategy to overcome water scarcity; for example, the water supply of the UAE, Kuwait, and Saudi Arabia comes almost 100% from seawater desalination plants. Many areas in Saudi Arabia and other countries in the world suffer from a lack of fresh water, which hinders their development, despite the availability of saline water and high solar radiation intensity. Furthermore, most developing countries do not have sufficient meteorological data to evaluate whether the solar radiation is enough for solar desalination. A mathematical model was developed to simulate and predict the thermal behavior of a solar still that uses direct solar energy for the distillation of seawater. Measurements were taken at the Environment and Natural Resources Department, Faculty of Agricultural and Food Sciences, King Faisal University, Saudi Arabia, in order to evaluate the model, and the simulation results were compared with the measured data. The main results showed slight differences between the measured and predicted values of the elements studied, which result from changes in factors treated as constants in the model, such as sky clearness, wind velocity, and the salt concentration of the water in the basin of the solar still. It can be concluded that the present model can be used to estimate the average total solar radiation and the thermal behavior of the solar still in any area, taking the geographical location into consideration.Keywords: mathematical model, sea water, distillation, solar radiation
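A first-order version of the energy balance behind such a model estimates daily distillate from the daily irradiation and an overall still efficiency; the numbers in the example are typical textbook values, not the study's measurements:

```python
H_FG = 2.26e6   # latent heat of vaporization of water, J/kg (approx.)

def daily_distillate(G_daily, area, efficiency):
    """Estimate the daily distillate yield of a basin solar still from the
    daily solar irradiation G_daily (J/m^2), basin area (m^2), and an
    assumed overall thermal efficiency. This is the standard first-order
    energy balance, not the paper's full transient model."""
    return efficiency * G_daily * area / H_FG   # kg/day

# e.g. 20 MJ/m^2/day of irradiation, a 1 m^2 still, 35% efficiency
print(f"{daily_distillate(20e6, 1.0, 0.35):.2f} kg/day")
```

The roughly 3 kg/m²/day this gives is in the range commonly cited for simple basin stills; the transient model in the paper refines it with hourly radiation, ambient temperature, and wind effects.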
Procedia PDF Downloads 2855103 Design of a Thrust Vectoring System for an Underwater ROV
Authors: Isaac Laryea
Abstract:
Underwater remotely operated vehicles (ROVs) are highly useful in aquatic research and underwater operations. Unfortunately, unsteady and unpredictable conditions underwater make it difficult for underwater vehicles to maintain a steady attitude during motion. Existing underwater vehicles make use of multiple thrusters positioned at specific points on their frames to maintain a certain pose. This study proposes an alternative way of maintaining a steady attitude during horizontal motion at low speeds by making use of a thrust vector-controlled propulsion system. The study began with preliminary calculations to establish a suitable shape and form factor. Flow simulations were carried out to ensure that enough thrust could be generated to move the system. Using the Lagrangian approach, a mathematical model was developed for the ROV, and this model was used to design a control system. A PID controller was initially selected; however, after tuning, it was found that a PD controller satisfied the design specifications. The designed control system produced an overshoot of 6.72%, with a settling time of 0.192 s. To achieve the effect of thrust vectoring, an inverse kinematics synthesis was carried out to determine the angles to which the actuators need to move. After building the system, intermittent angular displacements of 10°, 15°, and 20° were applied during bench testing, and the response of the control system as well as the servo motor angle were plotted. The final design was able to move in water but could not handle large angular displacements, as a result of the small-angle approximation used in the mathematical model.Keywords: PID control, thrust vectoring, parallel manipulators, ROV, underwater, attitude control
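Assuming the closed loop behaves as a dominant second-order system, the reported overshoot and settling time pin down a damping ratio and natural frequency via the standard relations:

```python
import math

def damping_from_overshoot(os_pct):
    """Recover the damping ratio zeta from percent overshoot using the
    second-order relation %OS = 100 * exp(-zeta*pi / sqrt(1 - zeta^2))."""
    ln_os = math.log(os_pct / 100.0)
    return -ln_os / math.sqrt(math.pi**2 + ln_os**2)

def natural_frequency(zeta, t_settle):
    """Natural frequency from the 2% settling-time approximation
    t_s ~= 4 / (zeta * wn)."""
    return 4.0 / (zeta * t_settle)

# The reported closed-loop response: 6.72% overshoot, 0.192 s settling time
zeta = damping_from_overshoot(6.72)
wn = natural_frequency(zeta, 0.192)
print(f"zeta = {zeta:.3f}, wn = {wn:.1f} rad/s")
```

This back-of-the-envelope check gives a well-damped response (zeta around 0.65), consistent with a PD loop meeting the stated specifications.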
Procedia PDF Downloads 755102 Review on Crew Scheduling of Bus Transit: A Case Study in Kolkata
Authors: Sapan Tiwari, Namrata Ghosh
Abstract:
In urban mass transit, crew scheduling always plays a significant role. It deals with the formulation of work timetables for staff so that an organization can meet the demand for its products or services. Efficient schedules for a specified timetable have an enormous impact on staff demand, which implies that an urban mass transit company's financial outcomes are strongly associated with how operations are planned in the region. This research aims to present the state of crew scheduling studies and their practical implementation in mass transit businesses in metropolitan areas. First, a short overview of past studies in the field is given. Subsequently, the restrictions and problems of crew scheduling are defined, together with some models that have been developed to solve them and their mathematical formulations. The discussion is completed by a description of the solution opportunities provided by computer-aided scheduling systems for operational use, with experience drawn from urban mass transit organizations. Furthermore, bus scheduling is performed using the Hungarian assignment technique and mathematical modeling. Afterward, the crew scheduling problem is solved: it consists of developing duties from predefined tasks with set start and end times and places, where each duty has to comply with a set line of work. The objective is to minimize a combination of fixed costs (the number of duties) and variable costs. After cost optimization, the outcome of the research is that the same service frequency can be provided with fewer buses and a smaller workforce.Keywords: crew scheduling, duty, optimization of cost, urban mass transit
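The assignment problem solved by the Hungarian technique can be illustrated on a small hypothetical crew/duty cost matrix; brute force is used here for brevity, since it returns the same optimum the Hungarian method computes in O(n^3):

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Optimal one-to-one assignment of crews (rows) to duties (columns).
    The Hungarian method solves this efficiently; for this tiny example,
    brute force over all permutations gives the same optimum."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

# Hypothetical cost of assigning 3 crews to 3 duties (e.g. paid hours)
cost = [[9, 2, 7],
        [6, 4, 3],
        [5, 8, 1]]
assignment, total = min_cost_assignment(cost)
print(assignment, total)   # crew i is assigned to duty assignment[i]
```

The fixed-plus-variable objective described above corresponds to choosing the cost entries so that each duty's entry reflects both its fixed existence cost and its variable operating cost.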
Procedia PDF Downloads 1525101 Paddy/Rice Singulation for Determination of Husking Efficiency and Damage Using Machine Vision
Authors: M. Shaker, S. Minaei, M. H. Khoshtaghaza, A. Banakar, A. Jafari
Abstract:
In this study, a machine vision and singulation system was developed to separate paddy from rice and determine paddy husking and rice breakage percentages. The machine vision system consists of three main components: an imaging chamber, a digital camera, and a computer equipped with image-processing software. The singulation device consists of a kernel-holding surface, a motor with a vacuum fan, and a dimmer. For separation of paddy from rice in the image, it was necessary to set a threshold. Therefore, sample images of paddy and rice were taken, and the RGB values of the images were extracted using MATLAB; the mean and standard deviation of the data were then determined. An image-processing algorithm was developed in MATLAB to determine the paddy/rice separation as well as the rice breakage and paddy husking percentages, using the blue-to-red ratio. Tests showed that a threshold of 0.75 is suitable for separating paddy from rice kernels. Evaluation of the image-processing algorithm showed accuracies of 98.36% and 91.81% for the paddy husking and rice breakage percentages, respectively. Analysis also showed that a suction of 45 mmHg to 50 mmHg, yielding 81.3% separation efficiency, is appropriate for operation of the kernel singulation system.Keywords: breakage, computer vision, husking, rice kernel
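The blue-to-red test can be sketched as follows, using the 0.75 threshold reported above; which side of the threshold corresponds to paddy is an assumption for illustration (husked paddy is browner, hence a lower B/R ratio), and the sample pixel values are invented:

```python
import numpy as np

def classify_kernel(rgb_roi, threshold=0.75):
    """Label a kernel image region as 'paddy' or 'rice' from its mean
    blue-to-red ratio, using the 0.75 threshold reported in the study.
    The direction of the comparison is an assumption here: the brown
    paddy husk is redder, so its B/R ratio is lower than white rice."""
    mean = rgb_roi.reshape(-1, 3).mean(axis=0)
    b_over_r = mean[2] / (mean[0] + 1e-9)
    return "paddy" if b_over_r < threshold else "rice"

husk = np.full((8, 8, 3), [150, 110, 70], dtype=float)    # brownish kernel
white = np.full((8, 8, 3), [210, 205, 195], dtype=float)  # milled rice
print(classify_kernel(husk), classify_kernel(white))
```

In the actual pipeline each singulated kernel would first be segmented from the background before averaging its RGB values.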
Procedia PDF Downloads 384