Search results for: computer based instruction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30045

27915 Project Management Tools within SAP S/4 Hana Program Environment

Authors: Jagoda Bruni, Jan Müller-Lucanus, Gernot Stöger-Knes

Abstract:

The purpose of this article is to demonstrate modern project management approaches in an SAP S/4 Hana program environment composed of multiple focus-diversified projects. We propose innovative, goal-oriented management standards based on the specificity of SAP transformations and customer-driven expectations. Through the regular application of sprint-based controlling and management tools, it has been shown with data that extensive analysis of employees' productive hours, together with a thorough review of project progress (per GAP, per business process, and per Lot) across the whole program, can have a positive impact on customer satisfaction and on the projects' budgets. This has been a collaborative study based on real-life experience and measurements gathered in collaboration with our customers.

Keywords: project management, program management, SAP, controlling

Procedia PDF Downloads 97
27914 Application of Data Driven Based Models as Early Warning Tools of High Stream Flow Events and Floods

Authors: Mohammed Seyam, Faridah Othman, Ahmed El-Shafie

Abstract:

The early warning of high stream flow events (HSF) and floods is an important aspect of the management of surface water and river systems. This process can be performed using either process-based models or data-driven models such as artificial intelligence (AI) techniques. The main goal of this study is to develop an efficient AI-based model for predicting the real-time hourly stream flow (Q) and to apply it as an early warning tool for HSF and floods in the downstream area of the Selangor River basin, taken here as a paradigm of humid tropical rivers in Southeast Asia. The performance of the AI-based models has been improved through the integration of lag time (Lt) estimation into the modelling process. A total of 8753 patterns of hourly Q, water level, and rainfall (RF) records representing a one-year period (2011) were utilized in the modelling process. Six hydrological scenarios were arranged through hypothetical cases of input variables to investigate how changes in RF intensity at upstream stations can lead to the formation of floods. The initial stream flow was changed for each scenario in order to include a wide range of hydrological situations in this study. The performance evaluation of the developed AI-based model shows that a high correlation coefficient (R) between the observed and predicted Q is achieved. The AI-based model has been successfully employed in early warning through the advance detection of the hydrological conditions that could lead to the formation of floods and HSF, represented by three levels of severity (i.e., alert, warning, and danger). Based on the results of the scenarios, reaching the danger level in the downstream area required high RF intensity in at least two upstream areas. According to the results of the applications, it can be concluded that AI-based models are beneficial tools for local authorities for flood control and awareness.
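The lag-time integration described above can be sketched in a few lines. This is a hedged, illustrative sketch only: the study's actual AI model and data are not public, so a least-squares linear model on synthetic hourly records stands in for it, and the variable names (rf, wl, q), the lag time, and the severity thresholds are assumptions.

```python
import numpy as np

# Synthetic one-year-style hourly records; all relationships are invented.
rng = np.random.default_rng(1)
n, lt = 500, 3                        # hourly records; assumed lag time Lt (hours)
rf = rng.random(n)                                                     # rainfall
wl = 0.5 * np.convolve(rf, np.ones(5), "same") + 0.1 * rng.random(n)   # water level
q = 2.0 * np.roll(rf, lt) + 1.5 * wl + 0.05 * rng.random(n)            # stream flow Q

# Integrate the lag time: rainfall enters the model shifted back by Lt hours.
X = np.column_stack([np.roll(rf, lt), wl, np.ones(n)])[lt:]
y = q[lt:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
q_hat = X @ coef

# Map predicted flow onto the three severity levels named in the abstract.
alert, warning, danger = np.quantile(q_hat, [0.80, 0.90, 0.97])
level = np.select([q_hat >= danger, q_hat >= warning, q_hat >= alert],
                  ["danger", "warning", "alert"], default="normal")
print(round(float(np.corrcoef(y, q_hat)[0, 1]), 3))
```

In the paper the predictor is an AI technique rather than a linear fit; the lag-shifted design matrix is the part this sketch is meant to illustrate.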

Keywords: floods, stream flow, hydrological modelling, hydrology, artificial intelligence

Procedia PDF Downloads 250
27913 Propagation of DEM Varying Accuracy into Terrain-Based Analysis

Authors: Wassim Katerji, Mercedes Farjas, Carmen Morillo

Abstract:

Terrain-based analysis results in products derived from an input DEM, and these products are needed to perform various analyses. To use these products efficiently in decision-making, their accuracies must be estimated systematically. This paper proposes a procedure to assess the accuracy of these derived products by calculating the accuracy of the slope dataset and its significance, taking as input the accuracy of the DEM. Based on the output of previously published research on modeling the relative accuracy of a DEM, specifically the ASTER and SRTM DEMs with Lebanon coverage as the area of study, the analyses have shown that ASTER has low significance over the majority of the area, where only 2% of the modeled terrain has 50% or more significance. On the other hand, SRTM showed better significance, where 37% of the modeled terrain has 50% or more significance. Statistical analysis deduced that the accuracy of the slope dataset, calculated on a cell-by-cell basis, is highly correlated with the accuracy of the input DEM. However, this correlation becomes lower between the slope accuracy and the slope significance, whereas it becomes much higher between the modeled slope and the slope significance.
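A cell-by-cell propagation of DEM accuracy into slope accuracy can be sketched as follows. The paper's own propagation model is not reproduced here; this is a first-order finite-difference estimate under the assumption of independent per-cell elevation errors, and the 30 m cell size and 6 m accuracy figure are illustrative.

```python
import numpy as np

dx = 30.0                                            # cell size (m)
z = 2.0 * np.add.outer(np.arange(5), np.arange(5))   # toy DEM (elevations, m)
sigma_z = np.full_like(z, 6.0)                       # per-cell DEM accuracy (m)

gy, gx = np.gradient(z, dx)                          # central-difference gradients
g = np.hypot(gx, gy)
slope = np.degrees(np.arctan(g))                     # slope dataset (degrees)

# Each central difference (z[i+1] - z[i-1]) / (2*dx) has standard deviation
# sqrt(2) * sigma_z / (2*dx) when cell errors are independent.
sigma_grad = np.sqrt(2.0) * sigma_z / (2.0 * dx)
# Combine both gradient components and apply d(arctan g)/dg = 1 / (1 + g**2).
sigma_slope = np.degrees(np.sqrt(2.0) * sigma_grad / (1.0 + g**2))
print(slope[2, 2], sigma_slope[2, 2])
```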

Keywords: terrain-based analysis, slope, accuracy assessment, Digital Elevation Model (DEM)

Procedia PDF Downloads 450
27912 Cognitive STAP for Airborne Radar Based on Slow-Time Coding

Authors: Fanqiang Kong, Jindong Zhang, Daiyin Zhu

Abstract:

Space-time adaptive processing (STAP) techniques have emerged as a key enabling technology for advanced airborne radar applications. In this paper, the notion of cognitive radar is extended to the STAP technique, and cognitive STAP is discussed. The principle for improving the signal-to-clutter-plus-noise ratio (SCNR) based on slow-time coding is given, and the corresponding optimization algorithm based on cyclic and power-like algorithms is presented. Numerical examples show the effectiveness of the proposed method.
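The abstract's slow-time code optimization (cyclic / power-like algorithms) is not reproduced here; shown instead is a sketch of the textbook STAP step it builds on, the SCNR-optimal adapted weight w = R⁻¹s. The steering vectors, clutter power, and problem size are illustrative assumptions.

```python
import numpy as np

n = 16                                                    # space-time degrees of freedom
s = np.exp(2j * np.pi * 0.2 * np.arange(n)) / np.sqrt(n)  # target steering vector
c = np.exp(2j * np.pi * 0.05 * np.arange(n))              # clutter steering vector
R = 100.0 * np.outer(c, c.conj()) + np.eye(n)             # clutter + noise covariance

w = np.linalg.solve(R, s)                                 # adapted weight (up to scale)

def scnr(w):
    """Output signal-to-clutter-plus-noise ratio for weight vector w."""
    return np.abs(np.vdot(w, s)) ** 2 / np.vdot(w, R @ w).real

print(scnr(s), scnr(w))  # the adapted weight gives a much higher SCNR
```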

Keywords: space-time adaptive processing (STAP), airborne radar, signal-to-clutter ratio, slow-time coding

Procedia PDF Downloads 277
27911 Short and Long Term Effects of an Attachment-Based Intervention on Child Behaviors

Authors: Claire Baudry, Jessica Pearson, Laura-Emilie Savage, George Tarbulsy

Abstract:

Over the last fifty years, maternal sensitivity and child development among vulnerable families have been a priority for researchers. For this reason, attachment-based interventions have been implemented and shown to be effective in enhancing child development. Most of the time, child outcomes are measured shortly after the intervention. Objectives: The goal of the study was to investigate the effects of an attachment-based intervention on child development shortly after the intervention ended and one year post-intervention. Methods: Of the seventy-two mother-child dyads referred by Child Protective Services in the province of Québec, Canada, forty-two were included in this study: 24 dyads who received 6 to 8 intervention sessions and 18 dyads who did not. Intervention and non-intervention dyads were matched on the following variables: duration of child protective services, reason for involvement with child protection, age, sex, and family status. Internalizing and externalizing behaviors were measured 3 and 12 months after the end of the intervention, when the average age of the children was respectively 45 and 54 months. Findings: Independent-sample t-tests were conducted to compare scores between the two groups at the two data collection times. In general, of the differences observed between the two groups three months after the intervention ended, only a few were still present nine months later. Conclusions: This first set of analyses suggests that most of the effects of the attachment-based intervention observed three months after the intervention do not last. These results point to the importance of considering the possibility of offering more attachment-based intervention sessions to these highly vulnerable families.

Keywords: attachment-based intervention, child behaviors, child protective services, highly vulnerable families

Procedia PDF Downloads 140
27910 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example

Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang

Abstract:

Background: Recent advances in high-throughput research technologies such as next-generation sequencing and multi-dimensional liquid chromatography make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyze these "big" data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform for users with only limited knowledge of bio-computing to study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system: C-eXpress takes a simple text file that contains standard NCBI gene or protein IDs and expression levels (rpkm or fold) as an input file to generate a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows users to filter the datasets based on various gene features. A dynamic summary chart is generated automatically after each filtering process. Results: We implemented an integrated database that contains pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase, and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
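The described input format, an ID column plus an expression column, can be sketched as below. This is a hedged illustration: the bucketing rule, the expression cap, and the example protein IDs are invented for the sketch and are not C-eXpress's actual scheme.

```python
import io
import math

# A tiny stand-in for the "simple text file" input the platform accepts.
raw = io.StringIO(
    "gene_id\trpkm\n"
    "NP_000509\t812.4\n"
    "NP_001333\t3.2\n"
    "NP_002106\t0.0\n"
)

def color_bucket(rpkm, n_buckets=5, cap=1000.0):
    """Map an expression level onto a heatmap color-gradient index."""
    if rpkm <= 0:
        return 0
    frac = min(math.log10(rpkm + 1) / math.log10(cap + 1), 1.0)
    return min(int(frac * n_buckets), n_buckets - 1)

raw.readline()                      # skip the header row
table = []
for line in raw:
    gene_id, rpkm = line.rstrip("\n").split("\t")
    table.append((gene_id, float(rpkm), color_bucket(float(rpkm))))

for row in table:
    print(row)
```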

Keywords: cancer, visualization, database, functional annotation

Procedia PDF Downloads 623
27909 Enhancing Plant Throughput in Mineral Processing Through Multimodal Artificial Intelligence

Authors: Muhammad Bilal Shaikh

Abstract:

Mineral processing plants play a pivotal role in extracting valuable minerals from raw ores, contributing significantly to various industries. However, the optimization of plant throughput remains a complex challenge, necessitating innovative approaches for increased efficiency and productivity. This research paper investigates the application of Multimodal Artificial Intelligence (MAI) techniques to address this challenge, aiming to improve overall plant throughput in mineral processing operations. The integration of multimodal AI leverages a combination of diverse data sources, including sensor data, images, and textual information, to provide a holistic understanding of the complex processes involved in mineral extraction. The paper explores the synergies between various AI modalities, such as machine learning, computer vision, and natural language processing, to create a comprehensive and adaptive system for optimizing mineral processing plants. The primary focus of the research is on developing advanced predictive models that can accurately forecast various parameters affecting plant throughput. Utilizing historical process data, machine learning algorithms are trained to identify patterns, correlations, and dependencies within the intricate network of mineral processing operations. This enables real-time decision-making and process optimization, ultimately leading to enhanced plant throughput. Incorporating computer vision into the multimodal AI framework allows for the analysis of visual data from sensors and cameras positioned throughout the plant. This visual input aids in monitoring equipment conditions, identifying anomalies, and optimizing the flow of raw materials. The combination of machine learning and computer vision enables the creation of predictive maintenance strategies, reducing downtime and improving the overall reliability of mineral processing plants. 
Furthermore, the integration of natural language processing facilitates the extraction of valuable insights from unstructured textual data, such as maintenance logs, research papers, and operator reports. By understanding and analyzing this textual information, the multimodal AI system can identify trends, potential bottlenecks, and areas for improvement in plant operations. This comprehensive approach enables a more nuanced understanding of the factors influencing throughput and allows for targeted interventions. The research also explores the challenges associated with implementing multimodal AI in mineral processing plants, including data integration, model interpretability, and scalability. Addressing these challenges is crucial for the successful deployment of AI solutions in real-world industrial settings. To validate the effectiveness of the proposed multimodal AI framework, the research conducts case studies in collaboration with mineral processing plants. The results demonstrate tangible improvements in plant throughput, efficiency, and cost-effectiveness. The paper concludes with insights into the broader implications of implementing multimodal AI in mineral processing and its potential to revolutionize the industry by providing a robust, adaptive, and data-driven approach to optimizing plant operations. In summary, this research contributes to the evolving field of mineral processing by showcasing the transformative potential of multimodal artificial intelligence in enhancing plant throughput. The proposed framework offers a holistic solution that integrates machine learning, computer vision, and natural language processing to address the intricacies of mineral extraction processes, paving the way for a more efficient and sustainable future in the mineral processing industry.
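One building block named above, monitoring sensor streams for anomalies so maintenance can be scheduled before downtime, can be sketched as follows. The sensor, baseline window length, simulated fault, and 4-sigma threshold are all illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(7)
vibration = rng.normal(1.0, 0.05, 200)    # mill vibration sensor (arbitrary units)
vibration[150:] += 0.4                    # simulated developing bearing fault

window = 50
baseline = vibration[:window]             # assumed-healthy operating period
mu, sigma = baseline.mean(), baseline.std()
z = (vibration - mu) / sigma              # z-score against the baseline
alarms = np.flatnonzero(np.abs(z) > 4.0)  # flag strong deviations

print(alarms.size)  # number of flagged readings
```

In the multimodal setting the paper describes, alarms like these would be fused with image-based equipment checks and text mined from maintenance logs before triggering an intervention.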

Keywords: multimodal AI, computer vision, NLP, mineral processing, mining

Procedia PDF Downloads 74
27908 Influence of Cucurbitacin-Containing Phytonematicides on Growth of Rough Lemon (Citrus jambhiri)

Authors: Raisibe V. Mathabatha, Phatu W. Mashela, Nehemiah M. Mokgalong

Abstract:

Occasional incidence of phytotoxicity of Nemarioc-AL and Nemafric-BL phytonematicides to crops raises credibility challenges that could negate their registration as commercial products. Responses of plants to phytonematicides are characterized by the existence of stimulation, neutral, and inhibition phases, with the mid-point of the former being referred to as the Mean Concentration Stimulation Point (MCSP = Dm + Rh/2). The objective of this study was to determine the MCSP and the overall sensitivity (∑k) of Nemarioc-AL and Nemafric-BL phytonematicides on rough lemon seedling rootstocks using the Curve-fitting Allelochemical Response Dosage (CARD) computer-based model. Two parallel greenhouse experiments were initiated, with seven dilutions of each phytonematicide arranged in a randomised complete block design, replicated nine times. Six-month-old rough lemon seedlings were transplanted into 20-cm-diameter plastic pots filled with steam-pasteurised river sand (300°C for 3 h) and Hygromix-T growing mixture. Treatments at 0, 2, 4, 8, 16, 32 and 164% dilutions were applied weekly at 300 ml/plant. At 84 days after the treatments, analysis-of-variance-significant plant variables were subjected to the CARD model to generate appropriate biological indices. Computed MCSP values for Nemarioc-AL and Nemafric-BL phytonematicides on rough lemon were 29 and 38%, respectively, whereas ∑k values were 1 and 0, respectively. At the applied concentrations, rough lemon seedlings were highly sensitive to Nemarioc-AL and Nemafric-BL phytonematicides.
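The quoted relation MCSP = Dm + Rh/2 is simple arithmetic over the CARD indices. The Dm and Rh values below are illustrative stand-ins (not the study's CARD outputs), chosen so the result matches the reported 29% for Nemarioc-AL on rough lemon.

```python
def mcsp(dm, rh):
    """Mean Concentration Stimulation Point from CARD indices Dm and Rh."""
    return dm + rh / 2.0

print(mcsp(15.0, 28.0))  # 29.0 (% concentration)
```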

Keywords: crude extracts, cucurbitacins, effective microbes, fruit extracts

Procedia PDF Downloads 150
27907 Vehicular Emission Estimation of Islamabad by Using Copert-5 Model

Authors: Muhammad Jahanzaib, Muhammad Z. A. Khan, Junaid Khayyam

Abstract:

Islamabad is the capital of Pakistan, with a population of 1.365 million people and a vehicular fleet size of 0.75 million. The vehicular fleet is growing annually at a rate of 11%. Vehicular emissions are a major source of black carbon (BC). In developing countries like Pakistan, most vehicles consume conventional fuels like petrol, diesel, and CNG. These fuels are major emitters of pollutants like CO, CO2, NOx, CH4, VOCs, and particulate matter (PM10). Carbon dioxide and methane are leading contributors to global warming, with global shares of 9-26% and 4-9%, respectively. NOx is the precursor of nitrates, which ultimately form aerosols that are noxious to human health. In this study, COPERT (Computer Programme to Calculate Emissions from Road Transport) was used for vehicular emission estimation in Islamabad. COPERT is a Windows-based program developed for the calculation of emissions from the road transport sector. The emissions were calculated for the year 2016 and include pollutants like CO, NOx, VOC, and PM, as well as energy consumption. Different variables were input to the model for emission estimation, including meteorological parameters, average vehicular trip length and respective time duration, fleet configuration, activity data, degradation factor, and fuel effect. The estimated emissions of CO, CH4, CO2, NOx, and PM10 were found to be 9814.2, 44.9, 279196.7, 3744.2 and 304.5 tons, respectively.
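At its core, the bottom-up calculation COPERT automates is emissions = vehicle count × annual mileage × per-kilometre emission factor, summed over fleet categories. The sketch below shows only that skeleton; all counts, mileages, and emission factors are illustrative, not Islamabad's COPERT inputs.

```python
fleet = {
    # category: (vehicle count, annual km per vehicle, CO emission factor g/km)
    "petrol_car": (400_000, 12_000, 2.5),
    "diesel_car": (200_000, 15_000, 0.6),
    "cng_car":    (150_000, 10_000, 1.8),
}

def co_tons(fleet):
    """Total CO emissions of the fleet in tonnes per year."""
    grams = sum(count * km * ef for count, km, ef in fleet.values())
    return grams / 1e6                     # grams -> tonnes

print(round(co_tons(fleet)))  # 16500
```

COPERT itself layers speed-dependent factors, cold-start corrections, degradation, and fuel effects on top of this skeleton.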

Keywords: COPERT Model, emission estimation, PM10, vehicular emission

Procedia PDF Downloads 267
27906 Wavelet Coefficients Based on Orthogonal Matching Pursuit (OMP) Based Filtering for Remotely Sensed Images

Authors: Ramandeep Kaur, Kamaljit Kaur

Abstract:

In recent years, remote sensing technology has been growing rapidly. Image enhancement is one of the most commonly used image processing operations, and noise reduction plays a very important role in digital image processing; various techniques have been put forward to reduce the noise of remotely sensed images. Noise reduction using wavelet coefficients based on Orthogonal Matching Pursuit (OMP) has less effect on edges than the available methods, but it is not as well established among edge-preservation techniques. In this paper, we therefore provide a new technique, minimum-patch-based noise reduction OMP, which reduces the noise in an image and uses an edge-preservation patch to preserve the edges of the image, presenting superior results compared with the existing OMP technique. Experimental results show that the proposed minimum patch approach outperforms existing techniques.
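The core Orthogonal Matching Pursuit step that OMP-based filtering builds on can be sketched in a few lines: greedily pick the dictionary atom most correlated with the residual, then re-fit all selected atoms by least squares. The random dictionary and 3-sparse test signal below are illustrative, not the paper's wavelet setup.

```python
import numpy as np

def omp(D, y, k):
    """Recover a k-sparse code x with D @ x ≈ y by greedy atom selection."""
    residual, support = y.copy(), []
    coef = np.zeros(0)
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ residual))))
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(0)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)           # unit-norm dictionary atoms
x_true = np.zeros(128)
x_true[[5, 40, 77]] = [1.5, -2.0, 0.8]   # 3-sparse code
y = D @ x_true                            # clean observation (noise omitted)

x_hat = omp(D, y, k=3)
print(np.flatnonzero(x_hat))              # indices of the selected atoms
```

For denoising, the same pursuit is run patch by patch over noisy wavelet coefficients rather than on a synthetic signal.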

Keywords: image denoising, minimum patch, OMP, WCOMP

Procedia PDF Downloads 394
27905 Study on the Focus of Attention of Special Education Students in Primary School

Authors: Tung-Kuang Wu, Hsing-Pei Hsieh, Ying-Ru Meng

Abstract:

Special education in Taiwan has been facing difficulties, including a shortage of teachers and a lack of resources. Some students who need to receive special education are thus not identified or admitted. Fortunately, information technologies can be applied to relieve some of these difficulties. For example, online multimedia courseware can be used to assist the learning of special education students and take a considerable workload off special education teachers. However, there may exist cognitive variations between students in special and regular education, which suggests that the design of online courseware requires different considerations. This study aims to investigate the difference in focus of attention (FOA) between special and regular education students of primary school in viewing the computer screen. The study is essential as it helps courseware developers determine where on the screen to put the learning elements that matter the most. It may also assist special education specialists to better understand the subtle differences among various subtypes of learning disabilities. This study involves 76 special education students (among them, 39 are students with mental retardation, MR, and 37 are students with learning disabilities, LDs) and 42 regular education students. The participants were asked to view a computer screen showing a picture partitioned into 3 × 3 areas, with each area filled with text or an icon. The subjects were then instructed to mark, on paper sheets given beforehand and also partitioned into 3 × 3 grids, the areas corresponding to the picture on the computer screen that they first set their eyes on. The data were then collected and analyzed. Major findings are listed: 1. In both the text and icon scenarios, significant differences exist in the first preferred FOA between special and regular education students. The first FOA for the former is mainly on area 1 (upper-left area, 53.8% / 51.3% for MR / LDs students in the text scenario, and 53.8% / 56.8% for MR / LDs students in the icon scenario), while the latter focus on area 5 (middle area, 50.0% and 57.1% in the text and icon scenarios). 2. The second most preferred areas in the text scenario for students with MR and LDs are area 2 (upper-middle, 20.5%) and area 5 (middle, 24.3%). In the icon scenario, the results are similar but lower in percentage. 3. Students with LDs who show preferences in FOA similar to regular education students (either in the text or icon scenario) tend to be of some specific subtype of learning disabilities. For instance, students with LDs who chose area 5 (middle area, in either the text or icon scenario) as their FOA are mostly ones that have a reading or writing disability. Also, three (out of 13) subjects in this category, after going through the re-diagnosis process, were excluded from the learning disabilities category. In summary, the findings suggest that when designing multimedia courseware for students with MR and LDs, the essential learning elements should be placed in areas 1, 2 and 5. In addition, FOA preference may also potentially be used as an indicator for diagnosing students with LDs.

Keywords: focus of attention, learning disabilities, mental retardation, on-line multimedia courseware, special education

Procedia PDF Downloads 167
27904 Methods for Business Process Simulation Based on Petri Nets

Authors: K. Shoylekova, K. Grigorova

Abstract:

Petri nets are the first standard for business process modeling, which is most probably one of the core reasons why every new standard created afterwards has to be reworked to the point of mapping the new standard onto Petri nets. The paper presents a business process repository based on a universal database. The repository provides the possibility for data about a given process to be stored in three different ways. The business process repository is developed with regard to transforming a given model into a Petri net so that it can be easily simulated. Two different techniques for business process simulation based on Petri nets, Yasper and Woflan, are discussed, and their advantages and drawbacks are outlined. The way of simulating business process models stored in the business process repository is shown.
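The token-game semantics that Petri-net simulators such as Yasper and Woflan execute can be sketched minimally. The two-transition net below is illustrative, not one of the paper's repository models.

```python
net = {
    # transition: (input places, output places)
    "register": (["start"], ["pending"]),
    "approve": (["pending"], ["done"]),
}
marking = {"start": 1, "pending": 0, "done": 0}

def enabled(t):
    """A transition is enabled when every input place holds a token."""
    return all(marking[p] > 0 for p in net[t][0])

def fire(t):
    """Fire t: consume a token from each input place, produce one per output."""
    assert enabled(t), f"{t} is not enabled"
    for p in net[t][0]:
        marking[p] -= 1
    for p in net[t][1]:
        marking[p] += 1

fire("register")
fire("approve")
print(marking)  # {'start': 0, 'pending': 0, 'done': 1}
```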

Keywords: business process repository, petri nets, simulation, Woflan, Yasper

Procedia PDF Downloads 373
27903 Facial Emotion Recognition Using Deep Learning

Authors: Ashutosh Mishra, Nikhil Goyal

Abstract:

A 3D facial emotion recognition model based on deep learning is proposed in this paper. Two convolution layers and a pooling layer are employed in the deep learning architecture, with pooling performed after the convolution process. The probabilities for the various classes of human faces are calculated using the sigmoid activation function. To verify the efficiency of the deep learning-based system, a set of face images from the Kaggle dataset is used to measure the accuracy of the model. The model's accuracy is about 65 percent, which is lower than that of other facial expression recognition techniques, despite the significant gains in representation precision afforded by the nonlinearity of deep image representations.
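The described architecture, two convolution layers, one pooling layer, and sigmoid-activated class probabilities, can be sketched in NumPy. The weights are random (untrained), and the 48×48 input size and 7 emotion classes are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def conv2d(img, k):
    """'Valid' 2D cross-correlation of a single-channel image with kernel k."""
    kh, kw = k.shape
    H, W = img.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
    return out

def maxpool2(x):
    """2x2 max pooling."""
    H, W = x.shape
    x = x[:H // 2 * 2, :W // 2 * 2]
    return x.reshape(H // 2, 2, W // 2, 2).max(axis=(1, 3))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
face = rng.random((48, 48))                      # toy grayscale face image
h = conv2d(face, rng.standard_normal((3, 3)))    # convolution layer 1 -> 46x46
h = conv2d(h, rng.standard_normal((3, 3)))       # convolution layer 2 -> 44x44
h = maxpool2(h)                                  # pooling layer -> 22x22
w = rng.standard_normal((7, h.size))             # dense head: 7 emotion classes
probs = sigmoid(0.01 * (w @ h.ravel()))          # per-class probabilities
print(probs.shape)  # (7,)
```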

Keywords: facial recognition, computational intelligence, convolutional neural network, depth map

Procedia PDF Downloads 235
27902 Establishment and Application of Numerical Simulation Model for Shot Peen Forming Stress Field Method

Authors: Shuo Tian, Xuepiao Bai, Jianqin Shang, Pengtao Gai, Yuansong Zeng

Abstract:

Shot peen forming is an essential forming process for aircraft metal wing panels. With the development of computer simulation technology, scholars have proposed a numerical simulation method for shot peen forming based on the stress field. Three shot peen forming indexes, crater diameter, shot speed, and surface coverage, are required as simulation parameters in the stress field method. It is necessary to establish the relationship between simulation and experimental process parameters in order to simulate the deformation under different shot peen forming parameters. Shot peen forming tests on 2024-T351 aluminum alloy workpieces were carried out using the uniform test design method, with three factors selected: air pressure, feed rate, and shot flow. According to the results, a second-order response surface model between the simulation parameters and the uniform test factors was established by the stepwise regression method using MATLAB. The response surface model was combined with the stress field method to simulate the shot peen forming deformation of the workpiece. Compared with the experimental results, the simulated values were smaller than the corresponding test values; the maximum and average errors were 14.8% and 9%, respectively.
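A second-order response surface fit of the kind described above can be sketched with plain least squares. This stands in for the MATLAB stepwise regression, two factors (air pressure p, feed rate f) stand in for the paper's three, and all ranges and coefficients are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
p = rng.uniform(0.2, 0.6, 30)            # air pressure, illustrative units
f = rng.uniform(1.0, 5.0, 30)            # feed rate, illustrative units
# synthetic "measured" response following a known quadratic law plus noise
d = 1.0 + 2.0 * p + 0.3 * f - 1.5 * p**2 - 0.02 * f**2 + 0.1 * p * f
d = d + rng.normal(0.0, 0.01, 30)

# full second-order design matrix: 1, p, f, p^2, f^2, p*f
X = np.column_stack([np.ones_like(p), p, f, p**2, f**2, p * f])
beta, *_ = np.linalg.lstsq(X, d, rcond=None)
pred = X @ beta
rel_err = np.abs(pred - d) / d
print(beta.round(2), float(rel_err.max()))
```

Stepwise regression would additionally drop terms whose coefficients are not statistically significant; the full quadratic fit shown here keeps all six.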

Keywords: shot peen forming, process parameter, response surface model, numerical simulation

Procedia PDF Downloads 94
27901 Raman Line Mapping on Melt Spun Polycarbonate/MWNT Fiber-Based Nanocomposites

Authors: Poonam Yadav, Dong Bok Lee

Abstract:

Raman spectroscopy was used for the characterization of multi-wall carbon nanotubes (MWNT) and polycarbonate/multi-wall carbon nanotube (PC/MWNT) based fibers with 0.55% and 0.75% MWNT (PC/MWNT55 and PC/MWNT75). The PC/MWNT55 and PC/MWNT75 fibers were prepared with a melt spinning device using nanocomposites made by two different routes, viz., solvent casting and melt extrusion. Fibers prepared from the melt-extruded nanocomposites showed smooth and uniform morphology compared to the solvent-casting-based nanocomposites. Raman mapping confirmed that the melt-extrusion-based nanocomposites had better dispersion of MWNT in polycarbonate (PC) than the solvent-casting-based ones.

Keywords: dispersion, melt extrusion, multi-wall carbon nanotube, mapping

Procedia PDF Downloads 351
27900 Performance-Based Quality Evaluation of Database Conceptual Schemas

Authors: Janusz Getta, Zhaoxi Pan

Abstract:

Performance-based quality evaluation of database conceptual schemas is an important aspect of the database design process. It is evident that different conceptual schemas lead to different logical schemas, and the performance of user applications strongly depends on the logical and physical database structures. This work presents the entire process of performance-based quality evaluation of conceptual schemas. First, we present the format used for the specification of conceptual schemas. Then, the paper proposes a new specification of object algebra for the representation of conceptual-level database applications. Transformation of conceptual schemas and expressions of object algebra into an implementation schema, and implementation in a particular database system, allows for precise estimation of the processing costs of database applications and, as a consequence, for precise evaluation of the performance-based quality of conceptual schemas. We then describe an experiment as a proof of concept for the evaluation procedure presented in the paper.
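The kind of processing-cost estimate the evaluation relies on can be illustrated with textbook page-I/O arithmetic for one query under two candidate schemas. The cardinalities, page size, and join method below are illustrative assumptions, not the paper's object-algebra cost model.

```python
TUPLES_PER_PAGE = 100

def pages(n_tuples):
    """Number of disk pages needed to hold n_tuples (ceiling division)."""
    return -(-n_tuples // TUPLES_PER_PAGE)

# Schema A: one denormalized table, answered by a single sequential scan.
cost_a = pages(1_000_000)

# Schema B: two tables combined with a naive block nested-loop join
# (outer scanned once, inner rescanned for every outer page).
outer, inner = pages(200_000), pages(50_000)
cost_b = outer + outer * inner

print(cost_a, cost_b)  # 10000 1002000
```

Comparing such estimates across the candidate conceptual schemas is what makes the quality evaluation performance-based rather than purely structural.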

Keywords: conceptual schema, implementation schema, logical schema, object algebra, performance evaluation, query processing

Procedia PDF Downloads 299
27899 Examining Risk Based Approach to Financial Crime in the Charity Sector: The Challenges and Solutions, Evidence from the Regulation of Charities in England and Wales

Authors: Paschal Ohalehi

Abstract:

Purpose - The purpose of this paper, which is part of a PhD thesis, is to examine the role of the risk-based approach in minimising financial crime in the charity sector, to offer recommendations for improving the quality of charity regulation whilst still retaining the risk-based approach as a regulatory framework, and to make a case for a new regulatory model. The increase in financial crime in the charity sector has put the role of regulation in minimising financial crime up for debate amongst researchers and practitioners. Although previous research has addressed the regulation of charities, research on the role of the risk-based approach in minimising financial crime in the charity sector is limited. Financial crime is a concern for all organisations, including charities. Design/methodology/approach - This research adopts a social constructionist epistemological position. It is carried out using semi-structured in-depth interviews amongst 24 randomly selected charity trustees divided into three classes: 10 small charities, 10 medium charities and 4 large charities. The researcher also interviewed 4 stakeholders (the NFA, the Charity Commission and two police forces differing in size and area of coverage) in the charity sector. Findings - The results of this research show that reliance on the risk-based approach to financial crime in the sector is weak and fragmented, with the research pointing to clear evidence of a disconnect between the regulator and the regulated, leading to little or no regulation of trustees' activities, limited monitoring of charities, and a lack of training and awareness of financial crime in the sector. Originality - This paper shows how the regulation of charities in general, and the risk-based approach in particular, can be improved in order to meet the expectations of the stakeholders, the public, the regulator and the regulated.

Keywords: risk, risk based approach, financial crime, fraud, self-regulation

Procedia PDF Downloads 383
27898 Influence of Instructors in Engaging Online Graduate Students in Active Learning in the United States

Authors: Ehi E. Aimiuwu

Abstract:

As of 2017, many online learning professionals, institutions, and journals are still wondering how instructors can keep students engaged in the online learning environment to facilitate active learning effectively. The purpose of this qualitative single-case and narrative research is to explore whether online professors understand their role as mentors and facilitators of students' academic success by keeping students engaged in active learning, based on personalized experience in the field. Data collection tools used in the study included the NVivo 12 Plus qualitative software, an interview protocol, a digital audiotape, an observation sheet, and transcription. Seven online professors in the United States, recruited via LinkedIn and residencies, were interviewed for this study. Eleven online teaching techniques from previous research were used as the study framework. The data analysis process, member checking, and key themes were used to achieve saturation. About 85.7% of the professors agreed on rubrics as the preferred online grading technique. About 57.1% agreed on professors logging in daily, students logging in about 2-5 times weekly, knowing students to increase accountability, email as the preferred communication tool, and computer access for adequate online learning. About 42.9% agreed on a syllabus for clear class expectations, participation to show what has been learned, and energizing students for creativity.

Keywords: class facilitation, class management, online teaching, online education, pedagogy

Procedia PDF Downloads 119
27897 Evaluated Nuclear Data Based Photon Induced Nuclear Reaction Model of GEANT4

Authors: Jae Won Shin

Abstract:

We develop an evaluated-nuclear-data-based photonuclear reaction model for GEANT4 for a more accurate simulation of photon-induced neutron production. The evaluated photonuclear data libraries from ENDF/B-VII.1 are taken as input. Incident photon energies up to 140 MeV, the threshold energy for pion production, are considered. To check the validity of the data-based model, we calculate the photoneutron production cross-sections and yields and compare them with experimental data. The results obtained from the developed model are found to be in good agreement with the experimental data for (γ,xn) reactions.

Keywords: ENDF/B-VII.1, GEANT4, photoneutron, photonuclear reaction

Procedia PDF Downloads 277
27896 Factors Influencing Agricultural Systems Adoption Success: Evidence from Thailand

Authors: Manirath Wongsim, Ekkachai Naenudorn, Nipotepat Muangkote

Abstract:

Information Technology (IT) plays an important role in business management strategies and can provide assistance in all phases of decision making. Thus, many organizations have come to see IT adoption as critical to how a company organizes, manages, and operates its processes. In order to implement IT successfully, it is important to understand the underlying factors that influence agricultural systems adoption success. Therefore, this research studies the factors that influence and impact successful IT adoption and the related agricultural performance. Case study and survey methodologies were adopted for this research. Case studies of two Thai organizations were carried out. The results of the two main case studies suggested 21 factors that may have an impact on IT adoption in agriculture in Thailand, which led to the development of a preliminary framework. Next, a survey instrument was developed based on the findings from the case studies. Questionnaires from two large-scale surveys were sent to selected members of Thailand's farming and computing communities, and 217 responses were gathered to test the research framework. The results indicate that the top five critical factors for ensuring IT adoption in agriculture were: 1) network and communication facilities; 2) software; 3) hardware; 4) farmers' IT knowledge; and 5) training and education. It is therefore now clear which factors influence IT adoption and which of those are critical success factors for ensuring IT adoption in agricultural organizations.

Keywords: agricultural systems adoption, factors influencing IT adoption, factors affecting in agricultural adoption

Procedia PDF Downloads 166
27895 Innovating Translation Pedagogy: Maximizing Teaching Effectiveness by Focusing on Cognitive Study

Authors: Dawn Tsang

Abstract:

This paper aims at synthesizing the difficulties in cognitive processes faced by translation majors in mainland China. The purpose is to develop possible solutions and innovations in terms of translation pedagogy, curriculum reform, and syllabus design. This research bases its analysis on students' instant feedback and interviews after training in translation and interpreting courses, and on translation faculty's teaching experiences. This research takes our translation majors as the starting point; they will be one of the focus groups. At present, our Applied Translation Studies Programme offers translation courses in the following areas: practical translation and interpreting, translation theories, culture and translation, and internship. It is a four-year translation programme, and our students start their introductory courses in Semester 1 of Year 1. The medium of instruction of our College is solely English. In general, our students' competency in English is strong. Yet in translation and especially interpreting classes, whether it is students' first attempt or they have already taken university English courses, students find class practices very challenging, if not mission impossible. Their biggest learning problem seems to be weak cognitive processing, in terms of a lack of intercultural competence, incomprehension of the English language and foreign cultures, inadequate aptitude and slow reaction, and inability to utilize one's vocabulary bank, etc. This being so, the research questions include: (1) What specific and common cognitive difficulties are students facing while learning translation and interpreting? (2) How should such difficulties be dealt with, and what implications can be drawn for curriculum reform and syllabus design in translation? (3) How much weight should cognitive study be given in the translation curriculum, i.e., what proportion of cognitive study belongs in translation/interpreting courses and in the translation major curriculum? (4) What can we as translation educators do to maximize teaching and learning effectiveness by incorporating the latest developments in cognitive study? We have collected translation students' instant feedback and conducted interviews with both students and teaching staff, in order to draw parallels with, as well as contrasts to, our own current teaching practices at United International College (UIC). We have collected 500 questionnaires so far. The main learning difficulties include: a poor vocabulary bank; a lack of listening and reading comprehension skills, in terms of not fully understanding the subtext; and aptitude in translation and interpreting, etc. This being so, we propose to reform and revitalize the translation curriculum and syllabi to address these difficulties. The aim is to maximize teaching effectiveness in translation by addressing the above-mentioned questions, with a special focus on the cognitive difficulties faced by translation majors.

Keywords: cognitive difficulties, teaching and learning effectiveness, translation curriculum reform, translation pedagogy

Procedia PDF Downloads 321
27894 Using Deep Learning Real-Time Object Detection Convolution Neural Networks for Fast Fruit Recognition in the Tree

Authors: K. Bresilla, L. Manfrini, B. Morandi, A. Boini, G. Perulli, L. C. Grappadelli

Abstract:

Image/video processing for fruit in the tree using hard-coded feature extraction algorithms has shown high accuracy in recent years. While accurate, these approaches are computationally intensive even with high-end hardware and too slow for real-time systems. This paper details the use of deep convolutional neural networks (CNNs), specifically the YOLO (You Only Look Once) algorithm with 24+2 convolution layers. Using deep-learning techniques eliminated the need to hard-code specific features for specific fruit shapes, colors, and/or other attributes. This CNN was trained on more than 5000 images of apple and pear fruits on a 960-core GPU (Graphical Processing Unit). On the test set, it showed an accuracy of 90%. After this, the trained model was transferred to an embedded device (Raspberry Pi gen. 3) with a camera for more portability. Based on the correlation between the number of fruits visible or detected in one frame and the real number of fruits on one tree, a model was created to accommodate this error rate. The speed of processing and detection of the whole platform was higher than 40 frames per second. This speed is fast enough for any grasping/harvesting robotic arm or other real-time applications.
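
The count-correction step described above, mapping per-frame detections to the true per-tree count, can be sketched as a simple linear fit. The sample counts below are illustrative, not the paper's data; a real system would calibrate against ground-truth counts from the orchard.

```python
# Fit a linear model mapping fruits detected in one frame to the actual
# number of fruits on the tree, to compensate for occluded fruit.

def fit_linear(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# detected fruits per frame vs. ground-truth fruits per tree (hypothetical)
detected = [35, 42, 28, 50, 38]
actual = [61, 74, 49, 88, 66]

a, b = fit_linear(detected, actual)

def estimate_tree_count(detected_in_frame):
    """Correct a single-frame detection count to a per-tree estimate."""
    return a * detected_in_frame + b
```

On this toy data the slope is above 1, reflecting that a single frame sees only part of the canopy.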

Keywords: artificial intelligence, computer vision, deep learning, fruit recognition, harvesting robot, precision agriculture

Procedia PDF Downloads 425
27893 Quantitative Evaluation of Diabetic Foot Wound Healing Using Hydrogel Nanosilver Based Dressing vs. Traditional Dressing: A Prospective Randomized Control Study

Authors: Ehsan A. Yahia, Ayman E. El-Sharkawey, Magda M. Bayoumi

Abstract:

Background: Wound dressings perform a crucial role in cutaneous wound management due to their ability to protect wounds and promote dermal and epidermal tissue regeneration. Aim: To evaluate the effectiveness of using hydrogel/nanosilver-based dressing vs. traditional dressing on diabetic foot wound healing. Methods: Sixty patients with type-2 diabetes hospitalized for diabetic foot wound treatment were recruited from selected surgical departments. A prospective randomized control study was carried out. Results: By the third week of treatment, the residual ulcer size in the hydrogel/nanosilver-based dressing group (15.11%) was smaller than in the traditional wound dressing group (33.44%), indicating a greater reduction rate. Moreover, the mean ulcer size (sq mm) in the hydrogel/nanosilver-based dressing group showed a faster healing rate (15.11±7.89) and was considerably smaller than in the traditional group by the third week (21.65±8.4). Conclusion: The hydrogel/nanosilver-based dressing showed better results than traditional dressing in managing diabetic foot ulcers.

Keywords: diabetes, wound care, diabetic foot, wound dressing, hydrogel nanosilver

Procedia PDF Downloads 118
27892 Fair Value Implementation of Financial Asset: Evidence in Indonesia’s Banking Sector

Authors: Alhamdi Alfi Fajri

Abstract:

The purpose of this study is to analyze and provide empirical evidence of the effect of fair value implementation for financial assets on information asymmetry in Indonesia's banking sector. This research tested the effect of fair value implementation for financial assets based on Statement of Financial Accounting Standard (PSAK) No. 55, and of fair value reliability measurement based on PSAK No. 60, on the level of information asymmetry. The scope of the research is Indonesia's banking sector. The test results show that the use of fair value based on PSAK No. 55 is significantly associated with information asymmetry. This positive relation is stronger than for the amortized cost implementation for financial assets. In addition, the fair value hierarchy based on PSAK No. 60 is significantly associated with information asymmetry. This research shows that the more reliable the measurement of fair value for financial assets, the more observable the fair value measurement and the lower the level of information asymmetry.
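
The association tested above can be illustrated with a minimal sketch: correlating the share of financial assets measured at fair value with a bid-ask-spread proxy for information asymmetry. The bank-year figures and the choice of spread as the asymmetry proxy are assumptions for illustration; the abstract does not specify the study's actual econometric model.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical bank-year observations
fair_value_share = [0.20, 0.35, 0.50, 0.65, 0.80]       # fraction of assets at fair value
bid_ask_spread = [0.012, 0.015, 0.019, 0.022, 0.027]    # information-asymmetry proxy

r = pearson_r(fair_value_share, bid_ask_spread)  # positive association
```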

Keywords: fair value, PSAK No. 55, PSAK No. 60, information asymmetry, bank

Procedia PDF Downloads 358
27891 Domain-Specific Languages Evaluation: A Literature Review and Experience Report

Authors: Sofia Meacham

Abstract:

In this abstract paper, Domain-Specific Language (DSL) evaluation will be presented based on the existing literature and on years of experience developing DSLs for several domains. The domains we worked on ranged from AI, business applications, and finance/accounting to health. In general, DSLs have been utilised in many domains to provide tailored and efficient solutions to specific problems. Although they are a reputable method among highly technical circles and have also been used by non-technical experts with success, to our knowledge there isn’t a commonly accepted method for evaluating them. Some methods define criteria that are adaptations of general software engineering quality criteria. Other literature focuses on the usability aspect of DSL evaluation and applies methods such as Human-Computer Interaction (HCI) and goal modeling. All these approaches are either hard to introduce, such as goal modeling, or seem to ignore the domain-specific focus of DSLs. In our experience, DSLs have domain-specificity at their core, and consequently the methods to evaluate them should also include domain-specific criteria at their core. Formulating domain-specific criteria requires synergy between the domain experts and the DSL developers, in the same way that DSLs cannot be developed without domain experts' involvement. Methods from agile and other software engineering practices, such as co-creation workshops, should be further emphasised and explored to facilitate this direction. In conclusion, our latest experience and plans for DSL evaluation will be presented and opened for discussion.

Keywords: domain-specific languages, DSL evaluation, DSL usability, DSL quality metrics

Procedia PDF Downloads 105
27890 Exergy Analysis and Evaluation of the Different Flowsheeting Configurations for CO₂ Capture Plant Using 2-Amino-2-Methyl-1-Propanol

Authors: Ebuwa Osagie, Vasilije Manovic

Abstract:

Exergy analysis identifies the location, sources, and magnitude of thermodynamic inefficiencies in a thermal system. Thus, both qualitative and quantitative assessments can be made with exergy, unlike energy, which supports quantitative assessment only. The main purpose of exergy analysis is to identify where exergy is destroyed. Reducing the exergy destruction and the losses associated with the capture plant systems can thus improve work potential. Furthermore, thermodynamic analysis of different configurations of the process helps to identify opportunities for reducing the steam requirements of each configuration. This paper presents a steady-state simulation and exergy analysis of a 2-amino-2-methyl-1-propanol (AMP)-based post-combustion capture (PCC) plant. The exergy analysis performed for the AMP-based plant and the different configurations revealed that the rich-split-with-intercooling configuration gave the highest exergy efficiency of 73.6%, while the intercooling configuration and the reference AMP-based plant achieved 57.3% and 55.8%, respectively.
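
The bookkeeping behind such an analysis can be sketched with the standard specific flow exergy of a stream relative to the dead state, e = (h - h0) - T0(s - s0), and an exergy efficiency defined as useful exergy out over exergy in. The property values below are illustrative; a real study would take enthalpies and entropies from the process simulator.

```python
T0 = 298.15  # dead-state temperature, K

def flow_exergy(h, s, h0, s0, t0=T0):
    """Specific flow exergy e = (h - h0) - T0*(s - s0), in kJ/kg."""
    return (h - h0) - t0 * (s - s0)

def exergy_efficiency(exergy_out, exergy_in):
    """Fraction of supplied exergy recovered in useful output."""
    return exergy_out / exergy_in

# hypothetical steam stream: h = 2800 kJ/kg, s = 6.5 kJ/(kg K);
# dead state: h0 = 104.9 kJ/kg, s0 = 0.367 kJ/(kg K)
e_in = flow_exergy(2800.0, 6.5, 104.9, 0.367)
e_out = 0.736 * e_in  # e.g. a configuration operating at 73.6% exergy efficiency
eta = exergy_efficiency(e_out, e_in)
```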

Keywords: 2-amino-2-methyl-1-propanol, modelling and simulation, post-combustion capture plant, exergy analysis, flowsheeting configurations

Procedia PDF Downloads 168
27889 Detecting and Thwarting Interest Flooding Attack in Information Centric Network

Authors: Vimala Rani P, Narasimha Malikarjunan, Mercy Shalinie S

Abstract:

Named Data Networking was brought forth as an instantiation of information-centric networking. Attackers can send a colossal number of spoofed Interests to take hold of the Pending Interest Table (PIT), an attack named the Interest Flooding Attack (IFA), since Interests are recorded in the PITs of the intermediate routers until the corresponding Data packets arrive or the entries exceed their time limit. These attacks can be detrimental to network performance. Traditional IFA detection techniques rely on criteria such as the PIT expiration rate or the Interest satisfaction rate, which cannot reliably distinguish an IFA from legitimate traffic; threshold-based traditional methods are also sensitive to the choice of threshold values. This article proposes an accurate IFA detection mechanism based on a Multiple Feature-based Extreme Learning Machine (MF-ELM). The accuracy of attack detection can be increased by presenting the entropy of Interest names, the Interest satisfaction rate, and PIT usage as features extracted for the MF-ELM classifier. Furthermore, we deploy a queue-based hostile Interest prefix mitigation mechanism. The inference from this real-time test bed is that the mechanism can help the network resist IFA with higher accuracy and efficiency.
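
The three features fed to the MF-ELM classifier can be sketched as follows. The sample traffic is hypothetical: under a flooding attack, spoofed Interests with randomized prefixes raise name entropy and PIT usage while lowering the satisfaction rate.

```python
import math
from collections import Counter

def name_entropy(interest_names):
    """Shannon entropy (bits) of the Interest name-prefix distribution."""
    counts = Counter(interest_names)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def satisfaction_rate(satisfied, sent):
    """Fraction of Interests answered by a Data packet."""
    return satisfied / sent if sent else 0.0

def pit_usage(pit_entries, pit_capacity):
    """Fraction of the Pending Interest Table currently occupied."""
    return pit_entries / pit_capacity

normal = ["/video/a", "/video/a", "/news/b", "/video/a", "/news/b"]
attack = ["/x/%d" % i for i in range(100)]  # randomized spoofed prefixes

features_normal = (name_entropy(normal), satisfaction_rate(95, 100), pit_usage(120, 1024))
features_attack = (name_entropy(attack), satisfaction_rate(12, 100), pit_usage(990, 1024))
```

Such feature vectors would then be passed to the ELM classifier; the classifier itself is not sketched here.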

Keywords: information-centric network, pending interest table, interest flooding attack, MF-ELM classifier, queue-based mitigation strategy

Procedia PDF Downloads 211
27888 Developing a Recommendation Library System Based on Android Application

Authors: Kunyanuth Kularbphettong, Kunnika Tenprakhon, Pattarapan Roonrakwit

Abstract:

In this paper, we present a recommendation library application for the Android system. The objective of this system is to support and advise users in using library resources via a mobile application. We describe the design approach and functional components of this system. The system was developed based on association rules mined with the Apriori algorithm. The work was divided by research purpose into two parts: developing the mobile application for online library service, and testing and evaluating the system. Questionnaires were used to measure user satisfaction with system usability among specialists and users. The results were satisfactory to both specialists and users.
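
The Apriori-style mining behind such a recommendation feature can be sketched in a few lines: find sets of items (here, borrowed books) that co-occur above a minimum support, then recommend co-borrowed titles. The transactions below are hypothetical.

```python
def apriori(transactions, min_support):
    """Return frequent itemsets (as frozensets) with their support counts."""
    current = {frozenset([i]) for t in transactions for i in t}
    frequent = {}
    while current:
        counts = {c: sum(1 for t in transactions if c <= t) for c in current}
        current = {c for c, n in counts.items() if n >= min_support}
        frequent.update({c: counts[c] for c in current})
        # candidate generation: join frequent k-itemsets into (k+1)-itemsets
        current = {a | b for a in current for b in current
                   if len(a | b) == len(a) + 1}
    return frequent

# hypothetical borrowing transactions (one set of book topics per loan)
borrows = [
    {"python", "databases"},
    {"python", "databases", "networks"},
    {"python", "algorithms"},
    {"databases", "networks"},
]

freq = apriori(borrows, min_support=2)
```

A recommendation step would then suggest, for a user borrowing "python", items from frequent itemsets containing it, such as "databases".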

Keywords: online library, Apriori algorithm, Android application, black box

Procedia PDF Downloads 492
27887 A Fuzzy Logic Based Health Assessment Platform

Authors: J. Al-Dmour, A. Sagahyroon, A. Al-Ali, S. Abusnana

Abstract:

Radio Frequency Identification (RFID) systems have emerged as one of the possible valuable solutions that can be utilized in healthcare systems. Nowadays, RFID tags are available with built-in sensors for human vital signs such as body temperature, blood pressure, heart rate, blood sugar level, and blood oxygen saturation. This work proposes the design, implementation, and testing of an integrated mobile RFID-based health care system. The system consists of a wireless mobile vital signs data acquisition unit (RFID-DAQ) integrated with a fuzzy-logic-based software algorithm to monitor and assess patients’ conditions. The system was implemented and tested in the ‘Rashid Center for Diabetes and Research’, Ajman, UAE. System testing results were compared with the Modified Early Warning System (MEWS) that is currently used in practice. We demonstrate that the proposed and implemented system exhibits an accuracy level that is comparable to, and sometimes better than, the widely adopted MEWS.
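
The fuzzy-logic idea in such an assessment platform can be sketched with triangular membership functions: a vital sign is mapped to degrees of membership in "normal", "elevated", and "critical" sets, then aggregated into a severity score. The breakpoints and weights below are illustrative assumptions, not the paper's actual rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def heart_rate_severity(hr):
    """Aggregate fuzzy memberships into a severity score in [0, 1]."""
    normal = tri(hr, 50, 75, 100)
    elevated = tri(hr, 90, 115, 140)
    critical = tri(hr, 130, 170, 210)
    weights = (0.0, 0.5, 1.0)       # 0 = healthy ... 1 = danger
    degrees = (normal, elevated, critical)
    total = sum(degrees)
    return sum(w * d for w, d in zip(weights, degrees)) / total if total else None

severity_ok = heart_rate_severity(72)   # firmly in the "normal" set
severity_hi = heart_rate_severity(135)  # between "elevated" and "critical"
```

A full platform would combine such per-sign scores across all monitored vital signs, analogous to how MEWS sums per-parameter scores.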

Keywords: healthcare, fuzzy logic, MEWS, RFID

Procedia PDF Downloads 353
27886 Excitation Modeling for Hidden Markov Model-Based Speech Synthesis Based on Wavelet Analysis

Authors: M. Kiran Reddy, K. Sreenivasa Rao

Abstract:

The conventional Hidden Markov Model (HMM)-based speech synthesis system (HTS) uses only a pulse excitation model, which differs significantly from the natural excitation signal. Hence, buzziness can be perceived in speech generated using HTS. This paper proposes an efficient excitation modeling method that can significantly reduce the buzziness and improve the quality of HMM-based speech synthesis. The proposed approach models the pitch-synchronous residual frames extracted from the residual excitation signal. Each pitch-synchronous residual frame is parameterized using 30 wavelet coefficients, which are found to accurately capture the perceptually important information present in the residual waveform. In the synthesis phase, the residual frames are reconstructed from the generated wavelet coefficients and pitch-synchronously overlap-added to generate the excitation signal. The proposed excitation modeling method is integrated into an HMM-based speech synthesis system. Evaluation results indicate that speech synthesized with the proposed excitation model is significantly better than speech generated using state-of-the-art excitation modeling methods.
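
The analysis/synthesis round trip described above can be sketched with a one-level Haar transform standing in for the paper's (unspecified) wavelet: a pitch-synchronous residual frame is converted to coefficients, and the frame is reconstructed from them in the synthesis phase. The toy frame below is illustrative; the paper keeps 30 coefficients per frame, whereas this sketch keeps all of them and so reconstructs exactly.

```python
import math

S = math.sqrt(2.0)

def haar_analysis(frame):
    """One-level Haar transform: approximation coefficients then details."""
    approx = [(frame[i] + frame[i + 1]) / S for i in range(0, len(frame), 2)]
    detail = [(frame[i] - frame[i + 1]) / S for i in range(0, len(frame), 2)]
    return approx + detail

def haar_synthesis(coeffs):
    """Invert haar_analysis: rebuild the frame from its coefficients."""
    half = len(coeffs) // 2
    approx, detail = coeffs[:half], coeffs[half:]
    frame = []
    for a, d in zip(approx, detail):
        frame.extend([(a + d) / S, (a - d) / S])
    return frame

residual_frame = [0.0, 0.8, -0.3, 0.1, 0.05, -0.6, 0.2, 0.0]  # toy residual
coeffs = haar_analysis(residual_frame)
reconstructed = haar_synthesis(coeffs)
```

Truncating the detail coefficients before synthesis would give the lossy parameterization the paper relies on, keeping only the perceptually important structure.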

Keywords: excitation modeling, hidden Markov models, pitch-synchronous frames, speech synthesis, wavelet coefficients

Procedia PDF Downloads 252