Search results for: software component and interfaces
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7505

6035 Learning, Teaching and Assessing Students’ ESP Skills via Exe and Hot Potatoes Software Programs

Authors: Naira Poghosyan

Abstract:

In the knowledge society, the content of studies, the methods used, and the requirements for an educator's professionalism regularly undergo change. It follows that in the knowledge society the aim of education is not only to train professionals for a certain field but also to help students become aware of cultural values, form mutual human relationships, collaborate, be open, adapt to new situations, creatively express their ideas, and accept responsibility and challenges. From this viewpoint, the development of communicative language competence requires a thorough, coordinated approach to ensure proper comprehension and memorization of subject-specific words, starting from the high school level. On the other hand, ESP (English for Specific Purposes) teachers and practitioners are increasingly faced with the task of developing and exploiting new ways of assessing their learners' literacy while learning and teaching ESP. The presentation will highlight the latest achievements in this field. The author will present some practical methodological issues and principles associated with learning, teaching, and assessing the ESP skills of learners using the two software programs eXe 2.0 and Hot Potatoes 6. On the one hand, the author will display the advantages of the two programs as self-learning and self-assessment interactive tools in the course of academic study and professional development of CLIL learners; on the other hand, she will comprehensively shed light upon some methodological aspects of working out appropriate ways of selecting, introducing, and consolidating subject-specific materials via eXe 2.0 and Hot Potatoes 6. The author will then distinguish ESP courses by the general nature of the learners' specialty, identifying three large categories: EST (English for Science and Technology), EBE (English for Business and Economics), and ESS (English for the Social Sciences). The cornerstone of the presentation will be the introduction of the subject titled “The Methodology of Teaching ESP in Non-Linguistic Institutions”, where a unique case of teaching ESP in Architecture and Construction via eXe 2.0 and Hot Potatoes 6 will be introduced, exemplifying how introduction, consolidation, and assessment can be used as a basis for feedback to ESP learners in a particular professional field.

Keywords: ESP competences, ESP skill assessment/ self-assessment tool, eXe 2.0 / HotPotatoes software program, ESP teaching strategies and techniques

Procedia PDF Downloads 379
6034 Reliability and Maintainability Optimization for Aircraft’s Repairable Components Based on Cost Modeling Approach

Authors: Adel A. Ghobbar

Abstract:

The airline industry is continuously challenged to safely increase the service life of aircraft with limited maintenance budgets. Operators are looking for the most qualified maintenance providers of aircraft components, offering the finest customer service. Component owners and maintenance providers offer Abacus agreements (aircraft component leasing) to increase the efficiency and productivity of customer service. To improve customer service, the current focus on No Fault Found (NFF) units must shift to Early Failure (EF) units. Since EF units have a significant impact on customer satisfaction, their reliability needs to be increased at minimal cost, which is the goal of this paper. By identifying the reliability of early failure (EF) units relative to No Fault Found (NFF) units, and in particular by performing root cause analysis with an integrated cost analysis of EF units using a failure mode analysis tool and a cost model, a set of EF maintenance improvements is obtained. The data used for the investigation of the EF units are obtained from the Pentagon system, an Enterprise Resource Planning (ERP) system used by Fokker Services. The Pentagon system monitors components that need to be repaired, coming from Fokker aircraft owners, the Abacus exchange pool, and commercial customers. The data are selected on several criteria: time span, failure rate, and cost driver. Once the selected data have been acquired, the failure mode and root cause analysis of EF units is initiated. The failure analysis approach tool was implemented, resulting in the proposed failure solution of EF. This leads to specific EF maintenance improvements, which can be set up to decrease the EF units and, as a result, increase reliability. The EFs investigated over a ten-year period showed a significant reliability impact of 32% on a total of 23,339 unscheduled failures, since the EFs comprise almost one-third of the entire population.
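
As a rough illustration of the data-selection step described above, the sketch below filters repair records on the three stated criteria and computes the EF share of unscheduled failures. The column names, thresholds, and toy records are assumptions for illustration, not actual Pentagon/ERP fields.

```python
# Hedged sketch: select repair records on the abstract's three criteria
# (time span, failure rate, cost driver) and compute the EF share.
# Column names, thresholds, and records are illustrative assumptions.
import pandas as pd

records = pd.DataFrame({
    "component":     ["pump", "valve", "sensor", "pump"],
    "year":          [2006, 2010, 2014, 2015],
    "failure_rate":  [0.8, 2.4, 3.1, 0.5],      # failures per 1000 FH
    "repair_cost":   [12e3, 48e3, 30e3, 9e3],   # EUR
    "early_failure": [False, True, True, False],
})

selected = records[(records["year"].between(2005, 2015)) &
                   (records["failure_rate"] > 1.0) &
                   (records["repair_cost"] > 20e3)]

ef_share = records["early_failure"].mean()      # cf. ~32% in the study
print(selected, f"\nEF share: {ef_share:.0%}")
```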

Keywords: supportability, no fault found, FMEA, early failure, availability, operational reliability, predictive model

Procedia PDF Downloads 132
6033 Determination of the Volatile Organic Compounds, Antioxidant and Antimicrobial Properties of Microwave-Assisted Green Extracted Ficus Carica Linn Leaves

Authors: Pelin Yilmaz, Gizemnur Yildiz Uysal, Elcin Demirhan, Belma Ozbek

Abstract:

The edible fig plant, Ficus carica Linn, belongs to the Moraceae family, and its leaves are mainly considered agricultural waste after harvesting. It has been demonstrated in the literature that fig leaves contain appealing properties such as high vitamin, fiber, amino acid, organic acid, and phenolic or flavonoid content. The extraction of these valuable products has gained importance. Microwave-assisted extraction (MAE) is a method that uses microwave energy to heat the solvent, thereby transferring the bioactive compounds from the sample to the solvent. The main advantage of MAE is the rapid extraction of bioactive compounds. In the present study, MAE was applied to extract the bioactive compounds from Ficus carica L. leaves, and the effects of microwave power (180-900 W), extraction time (60-180 s), and solvent-to-sample ratio (10-30 mL/g) on the antioxidant properties of the leaves were investigated. Then, the volatile organic component profile was determined at the specified extraction point. Additionally, antimicrobial studies were carried out to determine the minimum inhibitory concentration of the microwave-extracted leaves. As a result, according to the data obtained from the experimental studies, the highest antimicrobial properties were obtained under process parameters of 540 W, 180 s, and a 20 mL/g solvent-to-sample ratio. The volatile organic compound profile showed that the main compound was isobergapten, which belongs to the furanocoumarin family and exhibits anticancer, antioxidant, and antimicrobial activity besides promoting bone health. Acknowledgments: This work has been supported by Yildiz Technical University Scientific Research Projects Coordination Unit under project number FBA-2021-4409. The authors would like to acknowledge the financial support from Tubitak 1515 - Frontier R&D Laboratory Support Programme.

Keywords: Ficus carica Linn leaves, volatile organic component, GC-MS, microwave extraction, isobergapten, antimicrobial

Procedia PDF Downloads 85
6032 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest-neighbor technique. For each sample, there are two corresponding nearest proportions of samples: the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information. With these settings, the effect of overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample situations; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
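
As a point of reference for the kernel-based pipeline discussed above, the sketch below runs one of the named baselines, KPCA, followed by a simple classifier on synthetic stand-in data. The KDNP scatter construction itself is not reproduced here; all data and parameters are illustrative assumptions.

```python
# Minimal sketch: kernel-based feature extraction with KPCA (one of
# the baselines named in the abstract) followed by classification.
# Synthetic data stands in for real hyperspectral pixels.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 50))            # 600 pixels, 50 spectral bands
y = (X[:, :5].sum(axis=1) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# The RBF kernel maps pixels to a feature space where nonlinear class
# structure can become linearly separable; keep 10 components.
kpca = KernelPCA(n_components=10, kernel="rbf", gamma=1.0 / X.shape[1])
Z_tr = kpca.fit_transform(X_tr)
Z_te = kpca.transform(X_te)

clf = KNeighborsClassifier(n_neighbors=5).fit(Z_tr, y_tr)
print("accuracy:", clf.score(Z_te, y_te))
```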

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction

Procedia PDF Downloads 353
6031 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator

Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty

Abstract:

Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluation of simulated process models based on verification and validation techniques checks the closeness of each component model (in a simulated network) to the real system/process with respect to dynamic behaviour under steady-state and transient conditions. The process of verification and validation helps qualify the process simulator for its intended purpose, whether that is providing comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all the design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India, with the main participants being engineers/experts from the modeling, process design, and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, the reference documents and standards used, etc. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on experts' comments, final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.
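
A minimal sketch of the validation comparison described above is given below, assuming an RMSE metric and a ±5% acceptance band; neither the metric nor the band is taken from the KALBR-SIM procedure.

```python
# Illustrative sketch (not the KALBR-SIM procedure): validate one
# simulated process parameter against the recorded plant trajectory.
# The RMSE metric and the +/-5% acceptance band are assumptions.
import numpy as np

def validate(sim, plant, rel_tol=0.05):
    """Return RMSE and whether sim stays within a +/-5% band of plant."""
    sim, plant = np.asarray(sim), np.asarray(plant)
    rmse = np.sqrt(np.mean((sim - plant) ** 2))
    within = bool(np.all(np.abs(sim - plant) <= rel_tol * np.abs(plant)))
    return rmse, within

t = np.linspace(0.0, 100.0, 201)                  # time, s
plant = 300.0 + 50.0 * (1.0 - np.exp(-t / 30.0))  # e.g. coolant temperature, K
sim = plant + np.random.default_rng(1).normal(0.0, 1.5, t.size)
print(validate(sim, plant))
```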

Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state

Procedia PDF Downloads 269
6030 Aberrant Consumer Behavior in Seller’s and Consumer’s Eyes: Newly Developed Classification

Authors: Amal Abdelhadi

Abstract:

Consumer misbehavior evaluations can differ markedly based on a number of variables, and from one environment to another. Using three aberrant consumer behavior (ACB) scenarios (shoplifting, stealing from hotel rooms, and software piracy), this study aimed to explore Libyan sellers' and consumers' evaluations of ACB. Data were collected using a multi-method approach (qualitative and quantitative) in two fieldwork phases. In the first phase, qualitative data were collected from 26 Libyan sellers through face-to-face interviews. In the second phase, a consumer survey was used to collect quantitative data from 679 Libyan consumers. This study found that consumers' and sellers' evaluations of ACB are not always consistent. Further, ACB evaluations differed based on the form of ACB. Furthermore, the study found that not all consumer behaviors considered bad behavior in other countries receive the same evaluation in Libya; software piracy, for example. Therefore, this study suggests a newly developed classification of ACB based on marketers' and consumers' views. This classification provides nine ACB types within two dimensions (marketers' and consumers' views) and three degrees of behavior evaluation (good, acceptable, and misbehavior).

Keywords: aberrant consumer behavior, Libya, multi-method approach, planned behavior theory

Procedia PDF Downloads 580
6029 Virtualization and Visualization Based Driver Configuration in Operating System

Authors: Pavan Shah

Abstract:

In embedded systems, virtualization and visualization technology can provide an effective response and measurable work in a software development environment. In addition, virtualization can provide effective resource sharing between real-time hardware applications and the host environment. However, notable work is needed to minimize I/O overhead when utilizing virtualization technology, whether for a software development environment (SDE), the runtime environment of real-time embedded systems (RTMES), or real-time operating systems (RTOS). In this paper, we focus particularly on the virtualization and visualization overheads of the network, which generates the I/O, and on an implementation of standardized I/O (i.e., virtio) that can work as a front-end network driver in a real-time operating system (RTOS) hardware module. Several studies exist on virtualization in general-purpose operating system environments, but this implementation builds on the open-source virtio for a real-time operating system (RTOS). The measurement results show that the implementation can improve the bandwidth and latency of memory management in the real-time operating system environment (RTMES).

Keywords: virtualization, visualization, network driver, operating system

Procedia PDF Downloads 136
6028 Practical Software for Optimum Bore Hole Cleaning Using Drilling Hydraulics Techniques

Authors: Abdulaziz F. Ettir, Ghait Bashir, Tarek S. Duzan

Abstract:

Proper well planning is vital to a successful drilling program: it prevents and overcomes drilling problems and minimizes operating costs. The hydraulic system plays an active role during drilling operations; a well-designed system accelerates the drilling effort and lowers the overall well cost, while an improperly designed hydraulic system can slow the drill rate, fail to clean the hole of cuttings, and cause kicks. In most cases, common sense and commercially available computer programs are the only elements required to design the hydraulic system. Drilling optimization is the logical process of analyzing the effects and interactions of drilling variables through applied drilling and hydraulic equations and mathematical modeling to achieve maximum drilling efficiency at minimum drilling cost. In this paper, practical software is adopted to define drilling optimization models built on four optimum keys, namely Opti-flow, Opti-clean, Opti-slip, and Opti-nozzle, that help achieve high drilling efficiency at lower cost. The data used in this research come from vertical and horizontal wells recently drilled in Waha Oil Company fields. The input data are: formation type, geopressures, hole geometry, bottom hole assembly, and mud rheology. Upon data analysis, the results from all wells show that the proposed program provides higher accuracy than the company's current approach in terms of hole-cleaning efficiency and cost breakdown, taking the actual data as the reference base for all wells. Finally, it is recommended to use the established optimization calculation software at the drilling design stage to achieve correct drilling parameters that provide high drilling efficiency and borehole cleaning, and all other hydraulic parameters that help minimize hole problems and control drilling operation costs.
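
As one concrete example of the applied hydraulic equations mentioned above, the sketch below computes annular velocity, a standard textbook hole-cleaning check in common oilfield units. It is illustrative only and not taken from the paper's software.

```python
# Hedged sketch (not the paper's software): annular velocity, a
# standard hole-cleaning check, from flow rate and hole/pipe geometry.
def annular_velocity(flow_rate_gpm, hole_d_in, pipe_od_in):
    """Annular velocity in ft/min: AV = 24.5 * Q / (Dh^2 - Dp^2),
    with Q in gal/min and diameters in inches."""
    return 24.5 * flow_rate_gpm / (hole_d_in**2 - pipe_od_in**2)

# Example: 600 gpm through an 8.5 in hole around 5 in drill pipe.
av = annular_velocity(600.0, 8.5, 5.0)
print(f"annular velocity: {av:.0f} ft/min")   # ~311 ft/min
```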

Keywords: optimum keys, opti-flow, opti-clean, opti-slip, opti-nozzle

Procedia PDF Downloads 323
6027 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory data, pharmacy data, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties, such as structured, semi-structured, and unstructured data. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for the distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis has been done of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
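
To make the MapReduce model concrete, here is a minimal Hadoop Streaming mapper/reducer pair in Python that counts admissions per diagnosis code. The record layout and field positions are assumptions for illustration, not from the paper.

```python
# mapper.py -- emit (diagnosis_code, 1) per input record.
# Assumed CSV layout: patient_id,age,diagnosis_code (illustrative).
import sys

for line in sys.stdin:
    fields = line.rstrip("\n").split(",")
    if len(fields) >= 3:
        print(f"{fields[2]}\t1")
```

```python
# reducer.py -- Hadoop Streaming sorts mapper output by key, so all
# counts for one diagnosis arrive contiguously; sum them per key.
import sys

current, total = None, 0
for line in sys.stdin:
    key, value = line.rstrip("\n").split("\t")
    if key != current:
        if current is not None:
            print(f"{current}\t{total}")
        current, total = key, 0
    total += int(value)
if current is not None:
    print(f"{current}\t{total}")
```

Such a pair would be launched with the stock streaming jar, e.g. `hadoop jar hadoop-streaming.jar -mapper mapper.py -reducer reducer.py -input /ehr/records -output /ehr/counts` (paths illustrative).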

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 417
6026 On the Homology Modeling, Structural Function Relationship and Binding Site Prediction of Human Alsin Protein

Authors: Y. Ruchi, A. Prerna, S. Deepshikha

Abstract:

Amyotrophic lateral sclerosis (ALS), also known as “Lou Gehrig's disease”, is a neurodegenerative disease associated with degeneration of motor neurons in the cerebral cortex, brain stem, and spinal cord, characterized by distal muscle weakness, atrophy, normal sensation, pyramidal signs, and progressive muscular paralysis. ALS2 is a juvenile autosomal recessive disorder, slowly progressive, that maps to chromosome 2q33 and is associated with mutations in the alsin gene, a putative GTPase regulator. In this paper, we have done homology modeling of the alsin2 protein using multiple templates (3KCI_A, 4LIM_A, 402W_A, 4D9S_A, and 4DNV_A), designed using the Prime program in the Schrödinger software. The modeled structure is further used to identify effective binding sites on the basis of structural and physical properties using the SiteMap program in the Schrödinger software, and structural and functional analysis is done using the PROSITE and ExPASy servers, which give insight into conserved domains and motifs that can be used for protein classification. This paper summarizes the structural, functional, and binding-site properties of the alsin2 protein. These binding sites can be potential drug target sites and can be used for docking studies.

Keywords: ALS, binding site, homology modeling, neuronal degeneration

Procedia PDF Downloads 391
6025 Development of Partial Discharge Defect Recognition and Status Diagnosis System with Adaptive Deep Learning

Authors: Chien-kuo Chang, Bo-wei Wu, Yi-yun Tang, Min-chiu Wu

Abstract:

This paper proposes a power equipment diagnosis system based on partial discharge (PD), characterized by improved readability of experimental data and convenience of operation. The system integrates a variety of analysis programs with different data formats and different programming languages and then establishes a set of interfaces that follow, and can extend, the structure, which also helps subsequent maintenance and innovation. This study shows a case of integrating a Convolutional Neural Network (CNN) with this system, using the designed model architecture to simplify the complex training process. It is expected that the simplified training process can be used to establish an adaptive deep learning experimental structure. By selecting different test data for repeated training, the accuracy of the identification system can be enhanced. On this platform, the measurement status and partial discharge pattern of each piece of equipment can be checked in real time, real-time identification can be enabled, and various trained models can be used to carry out real-time identification of partial-discharge insulation defects and diagnosis of insulation state. When power equipment enters the dangerous period, it can be replaced early to avoid unexpected electrical accidents.
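
A minimal sketch of the kind of CNN the platform integrates is given below, treating phase-resolved PD patterns as small 2-D maps. The input shape, class count, and layer sizes are assumptions, not the paper's architecture.

```python
# Hedged sketch: a small CNN for partial-discharge patterns rendered
# as 64x64 maps; shapes and layers are illustrative assumptions.
from tensorflow.keras import layers, models

def build_pd_cnn(input_shape=(64, 64, 1), n_classes=4):
    return models.Sequential([
        layers.Conv2D(16, 3, activation="relu", input_shape=input_shape),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_classes, activation="softmax"),
    ])

model = build_pd_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```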

Keywords: partial discharge, convolutional neural network, partial discharge analysis platform, adaptive deep learning

Procedia PDF Downloads 82
6024 Development of Lectin-Based Biosensor for Glycoprofiling of Clinical Samples: Focus on Prostate Cancer

Authors: Dominika Pihikova, Stefan Belicky, Tomas Bertok, Roman Sokol, Petra Kubanikova, Jan Tkac

Abstract:

Since aberrant glycosylation frequently accompanies both physiological and pathological processes in the human body (cancer, AIDS, inflammatory diseases, etc.), the analysis of tumor-associated glycan patterns has great potential for the development of novel diagnostic approaches. Moreover, altered glycoforms may serve as a suitable tool for enhancing specificity and sensitivity in early-stage prostate cancer diagnosis. In this paper, we discuss the construction and optimization of an ultrasensitive sandwich biosensor platform employing a lectin as the glycan-binding protein. We focus on immunoassay development, reduction of non-specific interactions, and final glycoprofiling of human serum samples, including both prostate cancer (PCa) patients and healthy controls. The fabricated biosensor was measured by label-free electrochemical impedance spectroscopy (EIS), with further verification by lectin microarray. Furthermore, we analyzed the different biosensor interfaces with atomic force microscopy (AFM) in nanomechanical mapping mode, showing significant differences in height. These preliminary results reveal an elevated content of α-2,3-linked sialic acid in PCa patients compared with healthy controls. All these experiments are an important step towards the development of point-of-care devices and the discovery of novel glyco-biomarkers applicable in cancer diagnosis.

Keywords: biosensor, glycan, lectin, prostate cancer

Procedia PDF Downloads 377
6023 User Selections on Social Network Applications

Authors: C. C. Liang

Abstract:

MSN used to be the most popular application for communicating within social networks, but Facebook Chat is now the most popular. Facebook and MSN have similar characteristics, including usefulness, ease of use, and a similar core function: exchanging information with friends. Facebook outperforms MSN in both of these areas. However, the adoption of Facebook and abandonment of MSN have occurred for other reasons. Functions can be improved, but users' willingness to use an application does not depend on functionality alone. Flow status has been established as crucial to users' adoption of cyber applications and affects users' adoption of software applications. If users experience flow in using a software application, they will enjoy using it frequently, and may even change their preferred application from the old one to the new one. However, no investigation has examined choice behavior related to switching from MSN to Facebook based on a consideration of flow experiences and functions. This investigation discusses the flow experiences and functions of social-networking applications. Flow experience is found to affect perceived ease of use and perceived usefulness; perceived ease of use influences information exchange with friends and perceived usefulness; information exchange influences perceived usefulness, but information exchange has no effect on flow experience.

Keywords: consumer behavior, social media, technology acceptance model, flow experience

Procedia PDF Downloads 359
6022 Domain Driven Design vs Soft Domain Driven Design Frameworks

Authors: Mohammed Salahat, Steve Wade

Abstract:

This paper presents and compares the Systematic Soft Domain Driven Design framework (SSDDD) to the Domain Driven Design framework (DDD) as a soft systems approach to information systems development. The framework uses SSM as a guiding methodology, within which we have embedded a sequence of design tasks based on UML, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects that involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous work; a comparison between SSDDD and DDD is presented in this paper to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.
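
For readers unfamiliar with the kind of object-oriented domain model both frameworks produce, here is a minimal, hypothetical sketch in Python; the paper itself works from UML and the Java-based Naked Objects framework, and all names below are invented for illustration.

```python
# Illustrative sketch only (not the paper's model): a minimal
# domain-driven-design entity with an invariant, plus a repository
# abstraction. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Order:                           # domain entity with identity
    order_id: str
    lines: list = field(default_factory=list)

    def add_line(self, sku: str, qty: int):
        if qty <= 0:                   # enforce a domain invariant
            raise ValueError("quantity must be positive")
        self.lines.append((sku, qty))

class OrderRepository:                 # persistence abstraction
    def __init__(self):
        self._store = {}
    def save(self, order: Order):
        self._store[order.order_id] = order
    def get(self, order_id: str) -> Order:
        return self._store[order_id]

repo = OrderRepository()
order = Order("A-100")
order.add_line("SKU-1", 2)
repo.save(order)
print(repo.get("A-100").lines)
```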

Keywords: domain-driven design, soft domain-driven design, naked objects, soft language

Procedia PDF Downloads 300
6021 Studying on Pile Seismic Operation with Numerical Method by Using FLAC 3D Software

Authors: Hossein Motaghedi, Kaveh Arkani, Siavash Salamatpoor

Abstract:

Piles are important elements for the safe and economical design of tall and heavy structures, so the response of a single pile under dynamic load is of great interest. The factors influencing single-pile response are the pile geometry, the soil properties, and the applied loads. In this study, the finite difference numerical method, implemented in the FLAC 3D software, is used to evaluate single-pile behavior under the peak ground acceleration (PGA) of the El Centro earthquake record in California (1940). The results of these models are compared with the experimental results of other researchers and approximately coincide with the experimental data; for example, the maximum moment and displacement at the top of the pile correspond to the experimental results of previous researchers. Furthermore, this paper evaluates the interaction properties between soil and pile. The results show that increasing the pile diameter decreases the pile-top displacement, while increasing the length of the pile increases the top displacement. Also, increasing the stiffness ratio of pile to soil increases the moment produced in the pile body; longer piles interact more with the soil and have higher inertia. These results can directly help optimize the design of pile dimensions.

Keywords: pile seismic response, interaction between soil and pile, numerical analysis, FLAC 3D

Procedia PDF Downloads 393
6020 Precise Identification of Clustered Regularly Interspaced Short Palindromic Repeats-Induced Mutations via Hidden Markov Model-Based Sequence Alignment

Authors: Jingyuan Hu, Zhandong Liu

Abstract:

CRISPR genome editing technology has transformed molecular biology by accurately targeting and altering an organism's DNA. Despite the state-of-the-art precision of CRISPR genome editing, imprecise mutation outcomes and off-target effects present considerable risk, potentially leading to unintended genetic changes. Targeted deep sequencing, combined with bioinformatics sequence alignment, can detect such unwanted mutations. Nevertheless, the classical Needleman-Wunsch (NW) algorithm may produce false alignment outcomes, resulting in inaccurate mutation identification. The key to precisely identifying CRISPR-induced mutations lies in determining optimal parameters for the sequence alignment algorithm. Hidden Markov models (HMMs) are ideally suited for this task, offering flexibility across CRISPR systems by leveraging forward-backward algorithms for parameter estimation. In this study, we introduce CRISPR-HMM, statistical software to precisely call CRISPR-induced mutations. We demonstrate that the software significantly improves precision in identifying CRISPR-induced mutations compared to NW-based alignment, thereby enhancing the overall understanding of the CRISPR gene-editing process.
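
For context, below is a minimal sketch of the classical Needleman-Wunsch scoring recursion that CRISPR-HMM is compared against. The fixed match/mismatch/gap parameters are illustrative; it is exactly such fixed parameters that the HMM approach replaces with estimated ones.

```python
# Sketch of classical Needleman-Wunsch global alignment scoring,
# the baseline named in the abstract. Parameters are illustrative.
import numpy as np

def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    F = np.zeros((n + 1, m + 1))
    F[:, 0] = gap * np.arange(n + 1)          # leading gaps in b
    F[0, :] = gap * np.arange(m + 1)          # leading gaps in a
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            s = match if a[i - 1] == b[j - 1] else mismatch
            F[i, j] = max(F[i - 1, j - 1] + s,  # align a[i] with b[j]
                          F[i - 1, j] + gap,    # gap in b
                          F[i, j - 1] + gap)    # gap in a
    return F[n, m]

# Reference amplicon vs. a read carrying a 1-bp deletion at the cut site.
print(needleman_wunsch("ACGTTGCA", "ACGTGCA"))
```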

Keywords: CRISPR, HMM, sequence alignment, gene editing

Procedia PDF Downloads 59
6019 Information Technology Competences for Professional Accountants in Thai Small to Medium Accounting Practice

Authors: Manirath Wongsim, Chatchawarn Srimontree, Pornpichit Phosri

Abstract:

Today, information technology is heavily influencing business, and the accepted role of the accountant is evolving. Information technology has proven crucial in triggering changes in accountants' roles. Thus, this study aims to investigate IT competencies among professional accountants to enhance firm performance. This research was conducted with 47 respondents at five organizations in Thailand and used quantitative research. The results indicate that IT competencies for professional accountants in Thai small to medium accounting practices, within the organizational issues examined, comprise 18 factors. Specifically, the new factors unique to IT competencies for professional accountants, based on the research findings and the literature, include ERP software skills and accounting law and legal skills. The evidence in this study suggests that analytical skills, teamwork skills, and accounting software skills were ranked as much-needed skills to be acquired by accountants, while communication skills were ranked as the most required skills and delegation skills as the least required. The empirical evidence of this research suggests that organizations should understand how to appropriately develop information technology competencies for knowledge employees in general and professional accountants in particular, and should provide assistance in all processes of decision making.

Keywords: IT competencies, IT competences for professional accountants, IT skills for accounting, IT skills in SMEs

Procedia PDF Downloads 235
6018 Investigation of Software Integration for Simulations of Buoyancy-Driven Heat Transfer in a Vehicle Underhood during Thermal Soak

Authors: R. Yuan, S. Sivasankaran, N. Dutta, K. Ebrahimi

Abstract:

This paper investigates the software capability and computer-aided engineering (CAE) method for modelling the transient heat transfer processes occurring in the vehicle underhood region during the vehicle thermal soak phase. The heat retained from the soak period benefits the cold start, with reduced friction loss, for the second 14°C worldwide harmonized light-duty vehicle test procedure (WLTP) cycle, and therefore benefits both CO₂ emission reduction and fuel economy. When a vehicle undergoes the soak stage, the airflow and the associated convective heat transfer around and inside the engine bay are driven by the buoyancy effect. This effect, along with thermal radiation and conduction, is the key factor in the thermal simulation of the engine bay for obtaining accurate fluid and metal temperature cool-down trajectories and for predicting the temperatures at the end of the soak period. Method development has been investigated in this study on a light-duty passenger vehicle using a coupled aerodynamic-heat transfer transient modelling method for the full vehicle under 9 hours of thermal soak. The 3D underhood flow dynamics were solved as inherently transient by the Lattice-Boltzmann Method (LBM) using the PowerFlow software. This was further coupled with heat transfer modelling using the PowerTHERM software provided by Exa Corporation. The particle-based LBM is capable of accurately handling extremely complicated transient flow behavior on complex surface geometries. The detailed thermal modelling, including heat conduction, radiation, and buoyancy-driven heat convection, was solved in an integrated manner by PowerTHERM. The 9-hour cool-down period was simulated and compared with vehicle test data for the key fluid (coolant, oil) and metal temperatures. The developed CAE method was able to predict the cool-down behaviour of the key fluids and components in agreement with the experimental data and also visualised the air leakage paths and thermal retention around the engine bay. The cool-down trajectories of the key components obtained for the 9-hour thermal soak period provide vital information and a basis for the further development of reduced-order modelling studies in future work. This allows a fast-running model to be developed and further embedded in the holistic study of vehicle energy modelling and thermal management. It is also found that the buoyancy effect plays an important part in the first stage of the 9-hour soak, and the flow development during this stage is vital for accurately predicting the heat transfer coefficients for the heat retention modelling. The developed method has demonstrated software integration for simulating buoyancy-driven heat transfer in a vehicle underhood region during thermal soak with satisfying accuracy and efficient computing time. The CAE method developed will allow the design of engine encapsulations for improving fuel consumption and reducing CO₂ emissions to be integrated in a timely and robust manner, aiding the development of low-carbon transport technologies.

Keywords: ATCT/WLTC driving cycle, buoyancy-driven heat transfer, CAE method, heat retention, underhood modeling, vehicle thermal soak

Procedia PDF Downloads 159
6017 AI Software Algorithms for Drivers Monitoring within Vehicles Traffic - SiaMOTO

Authors: Ioan Corneliu Salisteanu, Valentin Dogaru Ulieru, Mihaita Nicolae Ardeleanu, Alin Pohoata, Bogdan Salisteanu, Stefan Broscareanu

Abstract:

Creating a personalized statistic for an individual within the population using IT systems, based on the searches and intercepted spheres of interest they manifest, is just one 'atom' of the artificial-intelligence analysis network. However, the ability to generate statistics based on individual data intercepted from large demographic areas leads to reasoning like that issued by a human mind with global strategic ambitions. The DiaMOTO device is a technical sensory system that intercepts car events caused by a driver, positioning them in time and space. The device's connection to the vehicle creates a data source whose analysis can build psychological and behavioural profiles of the drivers involved. The SiaMOTO system collects data from many vehicles equipped with DiaMOTO, driven by many different drivers, each with a unique fingerprint in their approach to driving. In this paper, we aim to explain the software infrastructure of the SiaMOTO system, a system designed to monitor and improve driver driving behaviour, as well as the criteria and algorithms underlying the intelligent analysis process.

Keywords: artificial intelligence, data processing, driver behaviour, driver monitoring, SiaMOTO

Procedia PDF Downloads 96
6016 Urban Flood Risk Mapping: A Review

Authors: Sherly M. A., Subhankar Karmakar, Terence Chan, Christian Rau

Abstract:

Floods are one of the most frequent natural disasters, causing widespread devastation, economic damage, and threat to human lives. The hydrologic impacts of climate change and the intensification of urbanization are two root causes of increased flood occurrences, and recent research trends are oriented towards understanding these aspects. Due to rapid urbanization, the population of cities across the world has increased exponentially, leading to improperly planned developments. Climate change due to natural and anthropogenic activities on our environment has resulted in spatiotemporal changes in rainfall patterns. The combined effect of both aggravates the vulnerability of urban populations to floods. In this context, efficient and effective flood risk management, with flood risk mapping as its core component, is essential in the prevention and mitigation of flood disasters. Urban flood risk mapping involves zoning an urban region based on its flood risk, depicting the spatiotemporal pattern of frequency and severity of hazards, exposure to hazards, and the degree of vulnerability of the population in terms of socio-economic, environmental, and infrastructural aspects. Although vulnerability is a key component of risk, its assessment and mapping are often less advanced than hazard mapping and quantification. A synergic effort from technical experts and social scientists is vital for the effectiveness of flood risk management programs. Despite an increasing volume of quality research conducted on urban flood risk, a comprehensive multidisciplinary approach towards flood risk mapping remains neglected, due to which many of the input parameters and definitions of flood risk concepts are imprecise. Thus, the objectives of this review are to introduce and precisely define the relevant input parameters, concepts, and terms in urban flood risk mapping, along with its methodology, current status, and limitations. The review also aims at providing thought-provoking insights to potential future researchers and flood management professionals.

Keywords: flood risk, flood hazard, flood vulnerability, flood modeling, urban flooding, urban flood risk mapping

Procedia PDF Downloads 595
6015 Analysis of Biomarkers Intractable Epileptogenic Brain Networks with Independent Component Analysis and Deep Learning Algorithms: A Comprehensive Framework for Scalable Seizure Prediction with Unimodal Neuroimaging Data in Pediatric Patients

Authors: Bliss Singhal

Abstract:

Epilepsy is a prevalent neurological disorder affecting approximately 50 million individuals worldwide and 1.2 million Americans. There are millions of pediatric patients with intractable epilepsy, a condition in which seizures fail to come under control. The occurrence of seizures can result in physical injury, disorientation, unconsciousness, and additional symptoms that could impede children's ability to participate in everyday tasks. Predicting seizures can help parents and healthcare providers take precautions, prevent risky situations, and mentally prepare children to minimize the anxiety and nervousness associated with the uncertainty of a seizure. This research proposes a comprehensive framework to predict seizures in pediatric patients by evaluating machine learning algorithms on unimodal neuroimaging data consisting of electroencephalogram signals. Bandpass filtering and independent component analysis proved to be effective in reducing the noise and artifacts in the dataset. The performance of various machine learning algorithms is evaluated on important metrics such as accuracy, precision, specificity, sensitivity, F1 score, and MCC. The results show that the deep learning algorithms are more successful in predicting seizures than logistic regression and k-nearest neighbors. The recurrent neural network (RNN) gave the highest precision and F1 score, long short-term memory (LSTM) outperformed RNN in accuracy, and the convolutional neural network (CNN) resulted in the highest specificity. This research has significant implications for healthcare providers in proactively managing seizure occurrence in pediatric patients, potentially transforming clinical practices and improving pediatric care.
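
A minimal sketch of the preprocessing the abstract describes is given below: bandpass-filter the EEG channels, then unmix with ICA and drop artifact components. The band edges, sampling rate, channel count, and synthetic signal are illustrative assumptions.

```python
# Hedged sketch: bandpass filtering + ICA artifact removal for EEG.
# Band edges, sampling rate, and data are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import FastICA

fs = 256.0                                    # sampling rate, Hz
rng = np.random.default_rng(0)
eeg = rng.normal(size=(8, 10 * int(fs)))      # 8 channels, 10 s

def bandpass(x, lo=0.5, hi=40.0, fs=fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

filtered = bandpass(eeg)

# Unmix into independent components; artifact components (e.g. eye
# blinks) would be identified and zeroed before reconstruction.
ica = FastICA(n_components=8, random_state=0)
sources = ica.fit_transform(filtered.T)       # (samples, components)
sources[:, 0] = 0.0                           # drop one assumed artifact
clean = ica.inverse_transform(sources).T
print(clean.shape)
```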

Keywords: intractable epilepsy, seizure, deep learning, prediction, electroencephalogram channels

Procedia PDF Downloads 89
6014 Logistic Regression Based Model for Predicting Students’ Academic Performance in Higher Institutions

Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu

Abstract:

In recent years, there has been a desire to forecast student academic achievement prior to graduation, to help students improve their grades, particularly those with poor performance. The goal of this study is to employ supervised learning techniques to construct a predictive model for student academic achievement. Many academics have already constructed models that predict student academic achievement based on factors such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to name a few. These features and the models employed may not have correctly classified students in terms of their academic performance. The model in this study is built using a logistic regression classifier with basic features such as the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester, as a basis to predict whether the student will perform well in future related courses. The model outperformed other classifiers such as Naive Bayes, support vector machine (SVM), decision tree, random forest, and AdaBoost, returning 96.7% accuracy. The model is available as a desktop application, allowing both instructors and students to benefit from user-friendly interfaces for predicting student academic achievement. As a result, it is recommended that both students and professors use this tool to better forecast outcomes.
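
A minimal sketch of a logistic regression classifier over the four features the abstract names is shown below; the synthetic data and label rule are stand-ins for the real student records, not the paper's dataset.

```python
# Hedged sketch: logistic regression on the abstract's four features.
# Synthetic data and the label rule are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 500
X = np.column_stack([
    rng.uniform(0, 100, n),     # previous semester course score
    rng.uniform(0, 1, n),       # class attendance rate
    rng.uniform(0, 1, n),       # class participation rate
    rng.integers(0, 30, n),     # course materials covered per semester
])
# "Performs well" if a weighted blend of the features is high.
y = (0.5 * X[:, 0] / 100 + 0.2 * X[:, 1] + 0.1 * X[:, 2]
     + 0.2 * X[:, 3] / 30 + rng.normal(0, 0.05, n) > 0.5).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = make_pipeline(StandardScaler(), LogisticRegression())
clf.fit(X_tr, y_tr)
print("accuracy:", clf.score(X_te, y_te))
```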

Keywords: artificial intelligence, ML, logistic regression, performance, prediction

Procedia PDF Downloads 100
6013 VaR Estimation Using the Informational Content of Futures Traded Volume

Authors: Amel Oueslati, Olfa Benouda

Abstract:

A new Value at Risk (VaR) estimation approach is proposed and investigated. The well-known two-stage GARCH-EVT approach uses conditional volatility to generate one-step-ahead forecasts of VaR. Using daily data for twelve stocks that compose the Dow Jones Industrial Average (DJIA) index, this paper incorporates trading volume in the first-stage volatility estimation. Afterwards, the forecasting ability of this conditional volatility for VaR estimation is compared to that of a basic volatility model that does not consider any trading component. The results are significant and bring out the importance of trading volume in the VaR measure.
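
The sketch below walks through the plain two-stage GARCH-EVT baseline described above on synthetic returns, using the `arch` and `scipy` packages; the paper's volume-augmented first stage is its contribution and is not reproduced here, and all thresholds are illustrative.

```python
# Hedged sketch of the two-stage GARCH-EVT baseline on toy returns.
import numpy as np
from arch import arch_model
from scipy.stats import genpareto

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=2000)          # toy daily returns, %

# Stage 1: GARCH(1,1) conditional volatility, standardized residuals.
res = arch_model(returns, vol="Garch", p=1, q=1).fit(disp="off")
z = np.asarray(res.resid) / np.asarray(res.conditional_volatility)

# Stage 2: generalized Pareto tail for standardized losses above a
# high threshold (peaks-over-threshold), then its extreme quantile.
losses = -z
u = np.quantile(losses, 0.95)
excess = losses[losses > u] - u
xi, _, beta = genpareto.fit(excess, floc=0.0)

alpha = 0.99                                       # 99% one-day VaR
q_z = u + (beta / xi) * ((len(losses) / len(excess)
                          * (1 - alpha)) ** (-xi) - 1)
sigma_f = np.sqrt(res.forecast(horizon=1).variance.values[-1, 0])
print(f"one-step 99% VaR (loss, %): {sigma_f * q_z:.2f}")
```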

Keywords: Garch-EVT, value at risk, volume, volatility

Procedia PDF Downloads 288
6012 Failure Analysis and Verification Using an Integrated Method for Automotive Electric/Electronic Systems

Authors: Lei Chen, Jian Jiao, Tingdi Zhao

Abstract:

Failures of automotive electric/electronic systems, which are universally considered safety-critical and software-intensive, may cause catastrophic accidents. Analysis and verification of failures in these kinds of systems is a big challenge as system complexity increases. Model checking is often employed to enable formal verification by ensuring that the system model conforms to specified safety properties. The system-level effects of failures are established, and the effects on system behavior are observed through formal verification. A hazard analysis technique called Systems-Theoretic Process Analysis is capable of identifying design flaws which may cause potential hazards, including software and system design errors and unsafe interactions among multiple system components. This paper provides a concept for using model checking integrated with Systems-Theoretic Process Analysis to perform failure analysis and verification of automotive electric/electronic systems. As a result, safety requirements are optimized, and failure propagation paths are found. Finally, an automotive electric/electronic system case study is used to verify the effectiveness and practicability of the method.
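
To illustrate the exhaustive state exploration at the heart of model checking, here is a deliberately tiny toy in Python; real work of this kind uses a dedicated model checker, and the states, transitions, and safety property below are invented for the sketch.

```python
# Toy illustration only: breadth-first exploration of a small state
# space, searching for a reachable violation of a safety property.
from collections import deque

# State: (relay, brake, faulted). Safety property: never have the
# relay "on" with the brake "released" while a fault is present.
def transitions(state):
    relay, brake, faulted = state
    return {
        ("on" if relay == "off" else "off", brake, faulted),
        (relay, "applied" if brake == "released" else "released", faulted),
        (relay, brake, True),          # a fault can occur at any time
    }

def violates(state):
    relay, brake, faulted = state
    return faulted and relay == "on" and brake == "released"

def check(initial):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if violates(s):
            return s                   # counterexample state found
        for t in transitions(s) - seen:
            seen.add(t)
            frontier.append(t)
    return None                        # property holds everywhere

print(check(("off", "applied", False)))  # prints a reachable violation
```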

Keywords: failure analysis and verification, model checking, system-theoretic process analysis, automotive electric/electronic system

Procedia PDF Downloads 126
6011 Mathematical Modeling and Simulation of Convective Heat Transfer System in Adjustable Flat Collector Orientation for Commercial Solar Dryers

Authors: Adeaga Ibiyemi Iyabo, Adeaga Oyetunde Adeoye

Abstract:

Interestingly, mechanical drying methods have played a major role in the commercialization of the agricultural and agriculture-allied sectors. Overall, drying enhances the storability and preservation of agricultural produce, which in turn promotes its producibility, marketability, salability, and profitability. Recent research has shown that solar drying is easier, more affordable, more controllable, and of course cleaner and purer than other drying methods. It is, therefore, needful to persistently appraise solar dryers with a view to improving on their existing advantages. In this paper, mathematical equations were formulated for a solar dryer using the mass conservation law, the material balance law, and the least-cost-savings method. Computer codes were written in Visual Basic.NET. The developed computer software, which considered Ibadan, a strategic south-western geographical location in Nigeria, was used to investigate the effect of a variable flat-plate collector orientation angle on the solar energy trapped, the derived monthly heat load, the available energy supplied by solar, and the fraction supplied by solar energy when 50,000 kg/month of produce was dried over a year. At collector tilt angles of 10°, 13°, 15°, 18°, and 20°, the derived monthly heat load, the available energy supplied by solar, and the solar fraction were 1211224.63 MJ, 102121.34 MJ, 0.111; 3299274.63 MJ, 10121.34 MJ, 0.132; 5999364.706 MJ, 171222.859 MJ, 0.286; 4211224.63 MJ, 132121.34 MJ, 0.121; and 2200224.63 MJ, 112121.34 MJ, 0.104, respectively. These results show that if the optimum collector angle is not reached, the factors needed for efficient, cost-reducing drying will be difficult to attain. Therefore, this software has revealed that an off-optimum collector angle in commercial solar drying is not worthwhile, hence the importance of the software in decision making as to the optimum collector orientation angle.
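
The solar fraction reported above is, in essence, the share of the monthly heat load met by the collected solar energy. The sketch below computes it for a purely hypothetical month; the abstract's own figures are quoted as reported and not re-derived here.

```python
# Hypothetical illustration of the solar fraction: the portion of the
# monthly heat load supplied by solar energy. Numbers are invented.
def solar_fraction(solar_energy_mj: float, heat_load_mj: float) -> float:
    return solar_energy_mj / heat_load_mj

print(solar_fraction(30000.0, 100000.0))   # hypothetical month -> 0.3
```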

Keywords: energy, Ibadan, heat load, Visual Basic.NET

Procedia PDF Downloads 414
6010 New Roles of Telomerase and Telomere-Associated Proteins in the Regulation of Telomere Length

Authors: Qin Yang, Fan Zhang, Juan Du, Chongkui Sun, Krishna Kota, Yun-Ling Zheng

Abstract:

Telomeres are specialized structures at chromosome ends consisting of tandem repetitive DNA sequences [(TTAGGG)n in humans] and associated proteins, which are necessary for telomere function. Telomere lengths are tightly regulated within a narrow range in normal human somatic cells, the basis of cellular senescence and aging. Previous studies have extensively focused on how short telomeres are extended and have demonstrated that telomerase plays a central role in telomere maintenance by elongating short telomeres. However, the molecular mechanisms regulating excessively long telomeres are unknown. Here, we found that the telomerase enzymatic component hTERT plays a dual role in the regulation of telomere length. Analysis of single telomere alterations at each chromosomal end led to the discovery that hTERT shortens excessively long telomeres and elongates short telomeres simultaneously, thus maintaining the optimal telomere length at each chromosomal end for efficient protection. The hTERT-mediated telomere shortening removes large segments of telomere DNA rapidly without inducing telomere dysfunction foci or affecting cell proliferation; thus, it is mechanistically distinct from rapid telomere deletion. We found that expression of hTERT generates telomeric circular DNA, suggesting that telomere homologous recombination may be involved in this telomere shortening process. Moreover, hTERT-mediated telomere shortening requires its enzymatic activity, but the telomerase RNA component hTR is not involved. Furthermore, the shelterin protein TPP1 interacts with hTERT and recruits it to telomeres to mediate telomere shortening. In addition, the telomere-associated proteins DKC1 and TCAB1 also play roles in this process. This novel hTERT-mediated telomere shortening mechanism exists not only in cancer cells but also in primary human cells. Thus, hTERT-mediated telomere shortening is expected to shift the paradigm of current molecular models of telomere length maintenance, with wide-reaching consequences in the cancer and aging fields.

Keywords: aging, hTERT, telomerase, telomeres, human cells

Procedia PDF Downloads 428
6009 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches

Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.

Abstract:

A closed-loop system comprises a controller, a response system, and an actuating system. The controller, which is the system under test here, excites the actuators based on feedback from the sensors in a periodic manner. The sensors should provide feedback to the System Under Test (SUT) within a deterministic time after excitation of the actuators. Any delay or miss in the generation of responses or the acquisition of excitation pulses may lead to control-loop computation errors, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available on the market may be the best solutions for such simulations, but they pose limitations such as the lack of the X Window System, graphical interfaces, and other user tools. In this paper, we present strategies that can be used on a general-purpose operating system (bare Linux kernel) to achieve deterministic deadlines and hence gain the added advantages of a GPOS with real-time features. Techniques are discussed for making the time-critical application run at the highest priority in an uninterrupted manner, reducing network latency for a distributed architecture, real-time data acquisition, data storage and retrieval, user interaction, etc.
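
A minimal sketch of one such strategy on Linux is shown below: pin the time-critical loop to the SCHED_FIFO real-time class and lock its memory so page faults cannot break the deadline. It requires root (or the appropriate capabilities and memlock limits), and the priority and loop rate are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch: run a periodic control loop under SCHED_FIFO with
# locked memory on a Linux GPOS. Requires root; values illustrative.
import ctypes
import os
import time

MCL_CURRENT, MCL_FUTURE = 1, 2
libc = ctypes.CDLL("libc.so.6", use_errno=True)
if libc.mlockall(MCL_CURRENT | MCL_FUTURE) != 0:   # avoid paging stalls
    raise OSError(ctypes.get_errno(), "mlockall failed")

# Real-time FIFO scheduling at a high (illustrative) priority.
os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(80))

period_s = 0.001                       # 1 kHz control loop
next_t = time.monotonic()
for _ in range(1000):
    # read sensors -> compute control law -> excite actuators here
    next_t += period_s
    delay = next_t - time.monotonic()
    if delay > 0:
        time.sleep(delay)              # absolute-time pacing
```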

Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency

Procedia PDF Downloads 151
6008 Towards Competence-Based Regulatory Sciences Education in Sub-Saharan Africa: Identification of Competencies

Authors: Abigail Ekeigwe, Bethany McGowan, Loran C. Parker, Stephen Byrn, Kari L. Clase

Abstract:

There are growing calls in the literature to develop and implement competency-based regulatory sciences education (CBRSE) in sub-Saharan Africa to expand and create a pipeline of a competent workforce of regulatory scientists. A defined competence framework is an essential component in developing competency-based education; however, such a framework is not available for regulatory scientists in sub-Saharan Africa. The purpose of this research is to identify entry-level competencies for inclusion in a competence framework for regulatory scientists in sub-Saharan Africa as a first step in developing CBRSE. The team systematically reviewed the literature following the PRISMA guidelines for systematic reviews, based on a protocol pre-registered on the Open Science Framework (OSF). The protocol contains the search strategy and the inclusion and exclusion criteria for publications. All included publications were coded to identify entry-level competencies for regulatory scientists. The team deductively coded the publications included in the study using the 'framework synthesis' model for systematic literature review. The World Health Organization's conceptualization of competence guided the review and thematic synthesis. Topic and thematic coding were done using NVivo 12™ software. Based on the search strategy in the protocol, 2,345 publications were retrieved, and twenty-two (n=22) of them met all the inclusion criteria for the research. Topic and thematic coding of the publications yielded three main domains of competence: knowledge, skills, and enabling behaviors. The knowledge domain has three sub-domains: administrative, regulatory governance/framework, and scientific knowledge. The skills domain has two sub-domains: functional and technical skills. Identification of competencies is the first step and serves as the bedrock for curriculum development and competency-based education. The competencies identified in this research will help policymakers, educators, institutions, and international development partners design and implement competence-based regulatory science education in sub-Saharan Africa, ultimately leading to access to safe, quality, and effective medical products.

Keywords: competence-based regulatory science education, competencies, systematic review, sub-Saharan Africa

Procedia PDF Downloads 200
6007 Association of Non Synonymous SNP in DC-SIGN Receptor Gene with Tuberculosis (Tb)

Authors: Saima Suleman, Kalsoom Sughra, Naeem Mahmood Ashraf

Abstract:

Tuberculosis, caused by Mycobacterium tuberculosis, is a communicable chronic illness. This disease is highly focused on by researchers, as it is present in approximately one-third of the world population in either active or latent form. The genetic makeup of a person plays an important part in producing immunity against disease, and one important associated factor is single nucleotide polymorphism (SNP) of the relevant gene. In this study, we examined the association between single nucleotide polymorphisms of the CD209 gene (which encodes the DC-SIGN receptor) and tuberculosis patients. Dry-lab (in silico) and wet-lab (RFLP) analyses have been carried out. The GWAS Catalog and the GEO database were searched for previous association data. No association study related to CD209 nsSNPs was found, but the role of CD209 in pulmonary tuberculosis has been addressed in the GEO database; therefore, CD209 was selected for this study. Databases such as Ensembl and the 1000 Genomes Project were used to retrieve SNP data in the form of VCF files, which were further submitted to different software tools to sort SNPs into benign and deleterious. Selected nsSNPs were further annotated using 3-D modeling techniques with the I-TASSER online server. Furthermore, the selected nsSNPs were checked in the Gujrat and Faisalabad populations through RFLP analysis. In this study population, two nsSNPs were found to be associated with tuberculosis, while one nsSNP was not found to be associated with the disease.

Keywords: association, CD209, DC-SIGN, tuberculosis

Procedia PDF Downloads 311
6006 Impact of the Non-Energy Sectors Diversification on the Energy Dependency Mitigation: Visualization by the “IntelSymb” Software Application

Authors: Ilaha Rzayeva, Emin Alasgarov, Orkhan Karim-Zada

Abstract:

This study attempts to consider the linkage between management and computer sciences in order to develop the software named “IntelSymb” as a demo application to demonstrate data analysis of non-energy fields' diversification, which will positively influence energy dependency mitigation in countries. We then analyzed 18 years of economic development across five sectors in 13 countries, identifying which patterns mostly prevailed and which can be dominant in the near future. To make our analysis solid and plausible, as future work we suggest developing a gateway or interface connected to all available online databases (WB, UN, OECD, U.S. EIA) for the analysis of countries by field. The sample data consist of energy statistics (TPES and energy import indicators) and non-energy industries' statistics (Main Science and Technology Indicators, Internet user index, and sales and production indicators) from 13 OECD countries over 18 years (1995-2012). Our results show that the diversification of non-energy industries can have a positive effect in decelerating energy-sector dependency (energy consumption and import dependence on crude oil). These results can provide empirical and practical support for energy and non-energy industries' diversification policies, such as promoting the efficiency and management of Information and Communication Technologies (ICTs), services, and innovative technologies, in other OECD and non-OECD member states with similar energy utilization patterns and policies. Industries, including the ICT sector, generate around 4 percent of total GHG, but this is much higher (around 14 percent) if indirect energy use is included. The ICT sector itself (excluding the broadcasting sector) contributes approximately 2 percent of global GHG emissions, at just under 1 gigatonne of carbon dioxide equivalent (GtCO2eq). Hence, this can be a good example and lesson for countries that are dependent and independent of energy, mainly emerging oil-based economies, and can motivate the diversification of non-energy industries in order to be ready for an energy crisis and able to face any economic crisis.

Keywords: energy policy, energy diversification, “IntelSymb” software, renewable energy

Procedia PDF Downloads 229