Search results for: software fault prediction
6313 The Design of a Vehicle Traffic Flow Prediction Model for a Gauteng Freeway Based on an Ensemble of Multi-Layer Perceptron
Authors: Tebogo Emma Makaba, Barnabas Ndlovu Gatsheni
Abstract:
The cities of Johannesburg and Pretoria, both located in the Gauteng province, are separated by a distance of 58 km. The traffic queues on the Ben Schoeman freeway which connects these two cities can stretch for almost 1.5 km. Vehicle traffic congestion impacts negatively on business and the commuters' quality of life. The goal of this paper is to identify variables that influence the flow of traffic and to design a vehicle traffic prediction model, which will predict the traffic flow pattern in advance. The model will enable motorists to make appropriate travel decisions ahead of time. The data used was collected by Mikro’s Traffic Monitoring (MTM). A Multi-Layer Perceptron (MLP) was used individually to construct the model, and the MLP was also combined with the bagging ensemble method to train on the data. The cross-validation method was used for evaluating the models. The results obtained from the techniques were compared using predictive performance and prediction costs. The cost was computed using a combination of the loss matrix and the confusion matrix. The designed prediction models show that the status of the traffic flow on the freeway can be predicted using the following parameters: travel time, average speed, traffic volume and day of the month. The implication of this work is that commuters will be able to spend less time travelling on the route and more time with their families. The logistics industry will save more than twice what they are currently spending.
Keywords: bagging ensemble methods, confusion matrix, multi-layer perceptron, vehicle traffic flow
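As an illustration only (not the authors' code): a minimal Python sketch of the modelling setup described above, an MLP alone versus an MLP inside a bagging ensemble, scored with cross-validation. The feature names follow the abstract, while the synthetic data and all hyperparameters are assumptions.

```python
# Minimal sketch: MLP alone vs. bagged MLP, scored with cross-validation.
# Synthetic stand-in data; hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.ensemble import BaggingClassifier  # 'estimator=' needs scikit-learn >= 1.2
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Columns: travel time, average speed, traffic volume, day of month (standardized)
X = rng.normal(size=(500, 4))
y = (X[:, 0] - X[:, 1] + 0.5 * X[:, 2] > 0).astype(int)  # 1 = congested

mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
bagged = BaggingClassifier(estimator=mlp, n_estimators=10, random_state=0)

for name, model in [("MLP alone", mlp), ("Bagged MLP", bagged)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean CV accuracy = {scores.mean():.3f}")
```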
Procedia PDF Downloads 345
6312 Springback Prediction for Sheet Metal Cold Stamping Using Convolutional Neural Networks
Abstract:
Cold stamping has been widely applied in the automotive industry for the mass production of a great range of automotive panels. Predicting the springback to ensure the dimensional accuracy of the cold-stamped components is a critical step. The main approaches for the prediction and compensation of springback in cold stamping include running Finite Element (FE) simulations and conducting experiments, which require forming process expertise and can be time-consuming and expensive for the design of cold stamping tools. Machine learning technologies have been proven and successfully applied in learning complex system behaviours using representative samples, and they show promising potential as supporting design tools for metal forming technologies. This study, for the first time, presents a novel application of a Convolutional Neural Network (CNN) based surrogate model to predict the springback fields for variable U-shape cold bending geometries. A dataset is created based on the U-shape cold bending geometries and the corresponding FE simulation results, and is then used to train the CNN surrogate model. The results show that the surrogate model can achieve near-indistinguishable full-field predictions in real time when compared with the FE simulation results. The application of CNNs for efficient springback prediction can be adopted in industrial settings to aid both conceptual and final component designs for designers without manufacturing knowledge.
Keywords: springback, cold stamping, convolutional neural networks, machine learning
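A minimal sketch of such a surrogate, assuming a gridded geometry field as input and a full springback field as output; the architecture, grid size and random tensors below are illustrative stand-ins, not the authors' network or their FE-derived dataset.

```python
# Minimal sketch of a CNN surrogate mapping a geometry field to a
# springback field. Architecture and data shapes are assumptions.
import torch
import torch.nn as nn

class SpringbackCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),  # predicted springback field
        )

    def forward(self, x):          # x: (batch, 1, H, W) encoded geometry
        return self.net(x)

model = SpringbackCNN()
geometry = torch.randn(8, 1, 64, 64)   # stand-in for encoded U-shape geometries
target = torch.randn(8, 1, 64, 64)     # stand-in for FE springback fields
loss = nn.functional.mse_loss(model(geometry), target)
loss.backward()                        # one illustrative training step
print(float(loss))
```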
Procedia PDF Downloads 151
6311 Application of GeoGebra into Teaching and Learning of Linear and Quadratic Equations amongst Senior Secondary School Students in Fagge Local Government Area of Kano State, Nigeria
Authors: Musa Auwal Mamman, S. G. Isa
Abstract:
This study was carried out in order to investigate the effectiveness of the GeoGebra software in teaching and learning of linear and quadratic equations amongst senior secondary school students in Fagge Local Government Area, Kano State, Nigeria. Five research items were raised as objectives, research questions and hypotheses, respectively. A random sampling method was used to select 398 students from a population of 2,098 SS2 students. The experimental group was taught using the GeoGebra software, while the control group was taught using the conventional teaching method. The instrument used for the study was the Mathematics Performance Test (MPT), which was administered at the beginning and at the end of the study. The results of the study revealed that students taught with the GeoGebra software (experimental group) performed better than students taught with the traditional teaching method. A t-test was used to analyze the data obtained from the study.
Keywords: GeoGebra software, mathematics performance, random sampling, mathematics teaching
Procedia PDF Downloads 248
6310 Characterization of Double Shockley Stacking Fault in 4H-SiC Epilayer
Authors: Zhe Li, Tao Ju, Liguo Zhang, Zehong Zhang, Baoshun Zhang
Abstract:
In-grown stacking faults (IGSFs) in 4H-SiC epilayers can cause increased leakage current and reduce the blocking voltage of 4H-SiC power devices. The double Shockley stacking fault (2SSF) is a common type of IGSF with double slips on the basal planes. In this study, a 2SSF in a 4H-SiC epilayer grown by chemical vapor deposition (CVD) is characterized. The nucleation site of the 2SSF is discussed, and a model for the 2SSF nucleation is proposed. Homo-epitaxial 4H-SiC is grown on a commercial 4-degree off-cut substrate by a home-built hot-wall CVD. Defect-selective etching (DSE) is conducted with melted KOH at 500 degrees Celsius for 1-2 min. Room-temperature cathodoluminescence (CL) is conducted at a 20 kV acceleration voltage. Low-temperature photoluminescence (LTPL) is conducted at 3.6 K with the 325 nm He-Cd laser line. In the CL image, a triangular area with bright contrast is observed. Two partial dislocations (PDs) with a 20-degree angle between them show linear dark contrast on the edges of the IGSF. CL and LTPL spectra are measured to verify the IGSF’s type. The CL spectrum shows the maximum photoemission at 2.431 eV and negligible bandgap emission. In the LTPL spectrum, four phonon replicas are found at 2.468 eV, 2.438 eV, 2.420 eV and 2.410 eV, respectively. The Egx is estimated to be 2.512 eV. A shoulder red-shifted from the main peak in CL, and a slight protrusion at the same wavelength in LTPL, are identified as the so-called Egx lines. Based on the CL and LTPL results, the IGSF is identified as a 2SSF. Back etching by neutral loop discharge and DSE are conducted to track the origin of the 2SSF, and the nucleation site is found to be a threading screw dislocation (TSD) in this sample. A nucleation mechanism model is proposed for the formation of the 2SSF. Steps introduced by the off-cut and the TSD on the surface are both suggested to be two C-Si bilayers in height. The intersections of these two types of steps lie along the [11-20] direction from the TSD, with a four-bilayer step at each intersection. The nucleation of the 2SSF during growth is proposed as follows. Firstly, the upper two bilayers of the four-bilayer step grow down and block the lower two at one intersection, and an IGSF is generated. Secondly, the step-flow grows over the IGSF successively and forms an AC/ABCABC/BA/BC stacking sequence. A 2SSF is thus formed and extends by the step-flow growth. In conclusion, a triangular IGSF is characterized by the CL approach. Based on the CL and LTPL spectra, the estimated Egx is 2.512 eV and the IGSF is identified as a 2SSF. By back etching, the 2SSF nucleation site is found to be a TSD. A model for the 2SSF nucleation from an intersection of off-cut- and TSD-introduced steps is proposed.
Keywords: cathodoluminescence, defect-selective etching, double Shockley stacking fault, low-temperature photoluminescence, nucleation model, silicon carbide
Procedia PDF Downloads 319
6309 Delineato: Designing Distraction-Free GUIs
Authors: Fernando Miguel Campos, Fernando Jesus Aguiar Campos, Pedro Filipe Campos
Abstract:
A large number of software products offer a wide range of features. This is called featuritis or creeping featurism and tends to rise with each release of the product. Featuritis often adds unnecessary complexity to software, leading to longer learning curves, confusing users and degrading their experience. We take a look at an emerging design approach, the so-called “What You Get Is What You Need” concept, which argues that products should be focused, simple and minimalistic in their interfaces in order to help users conduct their tasks in distraction-free environments. This is not as simple to implement as it might sound, since developers need to cut down features. Our contribution illustrates and evaluates this design method through a novel distraction-free diagramming tool named Delineato Pro for Mac OS X, in which the user is confronted with an empty canvas when launching the software and where tools only show up when really needed.
Keywords: diagramming, HCI, usability, user interface
Procedia PDF Downloads 528
6308 Structural Analysis of Archaeoseismic Records Linked to the 5 July 408 - 410 AD Utica Strong Earthquake (NE Tunisia)
Authors: Noureddine Ben Ayed, Abdelkader Soumaya, Saïd Maouche, Ali Kadri, Mongi Gueddiche, Hayet Khayati-Ammar, Ahmed Braham
Abstract:
The archaeological monument of Utica, located in north-eastern Tunisia, was founded in the 8th century BC by the Phoenicians as a port on the trade route connecting Phoenicia and the Straits of Gibraltar in the Mediterranean Sea. The city flourished as an important settlement during the Roman period but was then suddenly abandoned, falling into disuse and progressive oblivion in the first half of the fifth century AD. This decline can be attributed to the destructive earthquake of 5 July 408 - 410 AD that affected this historic city, as documented in 1906 by the seismologist Fernand De Montessus De Ballore. The magnitude of the Utica earthquake was estimated at 6.8 by the Tunisian National Institute of Meteorology (INM). In order to highlight the damage caused by this earthquake, a field survey was carried out at the Utica ruins to detect and analyse the earthquake archaeological effects (EAEs) using structural geology methods. This approach allowed us to highlight several types of structural damage, including: (1) folded mortar pavements, (2) cracks affecting the mosaic and walls of a water basin in the "House of the Grand Oecus", (3) displaced columns, (4) block extrusion in masonry walls, (5) undulations in mosaic pavements, and (6) tilted walls. The structural analysis of these EAEs and the data measurements reveal a seismic cause for all evidence of deformation in the Utica monument. The maximum horizontal strain of the ground (e.g., SHmax) inferred from the building-oriented damage in Utica shows a NNW-SSE direction under a compressive tectonic regime. For the seismogenic source of this earthquake, we propose the active E-W to NE-SW trending Utique - Ghar El Melh reverse fault, passing through the Utica monument and extending towards the Ghar El Melh Lake, as the causative tectonic structure. The active fault trace is well supported by instrumental seismicity, geophysical data (e.g., gravity, seismic profiles) and geomorphological analyses. In summary, we find that the archaeoseismic records detected at Utica are similar to those observed at many other archaeological sites affected by destructive ancient earthquakes around the world. Furthermore, the calculated orientation of the average maximum horizontal stress (SHmax) closely matches the present-day stress field, as highlighted by earthquake focal mechanisms in this region.
Keywords: Tunisia, Utica, seismogenic fault, archaeological earthquake effects
Procedia PDF Downloads 47
6307 Design and Burnback Analysis of Three Dimensional Modified Star Grain
Authors: Almostafa Abdelaziz, Liang Guozhu, Anwer Elsayed
Abstract:
The determination of grain geometry is an important and critical step in the design of a solid propellant rocket motor. In this study, the design process involved parametric geometry modeling in CAD, MATLAB coding of performance prediction, and a 2D star grain ignition experiment. The 2D star grain burnback was achieved by creating a new surface at each web increment and calculating the geometrical properties at each step, as sketched below. The 2D star grain was further modified to burn as a tapered 3D star grain. A zero-dimensional method was used to calculate the internal ballistic performance. Experimental and theoretical results were compared in order to validate the performance prediction of the solid rocket motor. The results show that the use of a 3D grain geometry decreases the pressure inside the combustion chamber and enhances the volumetric loading ratio.
Keywords: burnback analysis, rocket motor, star grain, three dimensional grains
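A minimal sketch of that burnback step, assuming a simple polygonal star port offset outward by successive web increments. This toy geometry and the Python/shapely offsetting are illustrative stand-ins for the authors' parametric CAD model, and the perimeter here is not clipped by the motor case.

```python
# Minimal sketch: offset the star port outward by web increments and
# record the burning perimeter at each step. Toy geometry; no case clipping.
import math
from shapely.geometry import Polygon

def star_port(n_points=6, r_inner=10.0, r_outer=20.0):
    pts = []
    for k in range(2 * n_points):
        r = r_outer if k % 2 == 0 else r_inner
        a = math.pi * k / n_points
        pts.append((r * math.cos(a), r * math.sin(a)))
    return Polygon(pts)

port = star_port()
web_step = 1.0                           # mm per increment (assumed)
for i in range(10):
    burned = port.buffer(i * web_step)   # new surface at each web increment
    print(f"web={i * web_step:5.1f} mm  burn perimeter={burned.exterior.length:8.2f} mm")
```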
Procedia PDF Downloads 246
6306 Design of a Tool for Generating Test Cases from BPMN
Authors: Prat Yotyawilai, Taratip Suwannasart
Abstract:
Business Process Model and Notation (BPMN) is increasingly important for business processes and the creation of functional models; it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent, but although most studies use UML models in software testing, few use the BPMN model to create test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph. Both the component details and the flow graph are used in generating test cases.
Keywords: software testing, test case, BPMN, flow graph
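As an illustration of the core step (not the tool itself): a Python sketch that turns extracted BPMN components into a flow graph and derives test cases as paths from start to end event. The node names and the path criterion (all simple paths) are assumptions about the design described in the abstract.

```python
# Minimal sketch: BPMN components -> flow graph -> path-based test cases.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("start", "check_order"),
    ("check_order", "approve"),   # exclusive gateway, branch 1
    ("check_order", "reject"),    # exclusive gateway, branch 2
    ("approve", "ship"),
    ("ship", "end"),
    ("reject", "end"),
])

# Each start-to-end path becomes one candidate test case.
for i, path in enumerate(nx.all_simple_paths(g, "start", "end"), 1):
    print(f"Test case {i}: " + " -> ".join(path))
```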
Procedia PDF Downloads 557
6305 Simulation of Kinetic Friction in L-Bending of Sheet Metals
Authors: Maziar Ramezani, Thomas Neitzert, Timotius Pasang
Abstract:
This paper presents an experimental and numerical investigation of the springback behavior of sheet metals during the L-bending process, with emphasis on Stribeck-type friction modeling. The coefficient of friction in the Stribeck curve depends on sliding velocity and contact pressure. The springback behavior of mild steel and aluminum alloy 6022-T4 sheets was studied experimentally and using numerical simulations with the ABAQUS software with two types of friction model: Coulomb friction and Stribeck friction. The influence of forming speed on springback behavior was studied experimentally and numerically. The results showed that the Stribeck-type friction model gives better results in predicting springback in sheet metal forming. The FE prediction errors for mild steel and 6022-T4 AA are 23.8% and 25.5%, respectively, using the Coulomb friction model, and 11% and 13%, respectively, using the Stribeck friction model. These results show that the Stribeck model is suitable for the simulation of sheet metal forming, especially at higher forming speeds.
Keywords: friction, L-bending, springback, Stribeck curves
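For illustration, a Python sketch of a generic Stribeck-type friction law, where the friction coefficient decays from its static toward its kinetic value with sliding velocity and picks up a viscous term. The functional form and all constants are generic textbook assumptions, not the parameters used in the paper's ABAQUS model (which also includes contact-pressure dependence).

```python
# Minimal sketch of a Stribeck-type friction curve: mu as a function of
# sliding velocity v. Form and constants are generic assumptions.
import numpy as np

def stribeck_mu(v, mu_s=0.20, mu_c=0.10, v_s=0.05, k_visc=0.02):
    """Friction coefficient vs sliding velocity v (m/s)."""
    return mu_c + (mu_s - mu_c) * np.exp(-(v / v_s) ** 2) + k_visc * v

for v in [0.001, 0.01, 0.1, 0.5, 1.0]:
    print(f"v = {v:5.3f} m/s -> mu = {stribeck_mu(v):.3f}")
```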
Procedia PDF Downloads 493
6304 Agile Manifesto Construct for the Film Industry
Authors: Kiri Trier, Theresa Treffers
Abstract:
Amid continuous volatility, such as production stops due to the COVID-19 pandemic and video-on-demand players monopolizing the film industry, filmmakers are stuck in traditional, linear content development processes. The industry has to become more agile in order to react quickly and easily to changes. Since content development with agile project management is scientifically and empirically almost unrecorded, and agile methods are lacking beyond software development, we examined whether the agile manifesto values and principles from software development can be adapted to the film industry to enable agility and digitalization of content development in the industry. We conducted an online questionnaire with 184 German filmmakers (producers, authors, directors, actors, film financiers) as a first cross-sectional assessment of the adaptability of the agile manifesto from software development to the film industry; factor analysis was used to validate the construct. Our results show that it is crucial to digitalize traditional content development into agile, end-to-end content development, with tools, lean processes, new collaboration structures, and holacracy, to prepare for any volatility. Overall, we derived a first construct of an agile manifesto for the film industry, with four values related to nine principles of its own. Our findings help to build a better understanding of the agile manifesto beyond software development as a guideline for implementing agility in the film industry.
Keywords: agile manifesto, agile project management, agility, film industry
Procedia PDF Downloads 202
6303 Benefits of Gamification in Agile Software Project Courses
Authors: Nina Dzamashvili Fogelström
Abstract:
This paper examines the concepts of Game-Based Learning and Gamification. A literature survey found increased interest in these concepts in academia and limited evidence of a positive effect on student motivation and academic performance, but also a certain scepticism about adding games to traditional educational activities. A small-scale empirical study presented in this paper aims to evaluate the student experience and the usefulness of Game-Based Learning and Gamification for a better understanding of threshold concepts in software engineering project courses. The participants of the study were 22 second-year students from the bachelor’s program in software engineering at Blekinge Institute of Technology. As part of the course instruction, the students were introduced to a digital game specifically designed to simulate an agile software project. The game mechanics were designed to allow manipulation of the agile concept of team velocity. After playing the game, the students were surveyed to measure the degree of perceived increase in understanding of the studied threshold concept. The students were also asked whether they would like to have games included in their education. The results show that the majority of the students found the game helpful in increasing their understanding of the threshold concept, and most indicated that they would like to see games included in their education. These results are encouraging. Since the study was of small scale and based on convenience sampling, more studies in the area are recommended.
Keywords: agile development, gamification, game based learning, digital games, software engineering, threshold concepts
Procedia PDF Downloads 168
6302 Towards the Use of Software Product Metrics as an Indicator for Measuring Mobile Applications Power Consumption
Authors: Ching Kin Keong, Koh Tieng Wei, Abdul Azim Abd. Ghani, Khaironi Yatim Sharif
Abstract:
Maintaining the factory-default battery endurance rate over time while supporting a huge number of running applications on energy-restricted mobile devices has created a new challenge for mobile application developers. While trying to meet customers’ unlimited expectations, developers are barely aware of the efficient use of energy by the application itself. Thus, developers need a set of valid energy consumption indicators to assist them in developing energy-saving applications. In this paper, we present a few software product metrics that can be used as indicators to measure the energy consumption of Android-based mobile applications early in the design stage. In particular, Trepn Profiler (a power profiling tool for Qualcomm processors) was used to collect mobile application power consumption data, which were then analyzed against 23 software metrics in this preliminary study. The results show that McCabe cyclomatic complexity, number of parameters, nested block depth, number of methods, weighted methods per class, number of classes, total lines of code and method lines have a direct relationship with the power consumption of a mobile application.
Keywords: battery endurance, software metrics, mobile application, power consumption
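As an illustration of the analysis step (not the study's data): a Python sketch correlating per-application software product metrics with measured power consumption. The metric names follow the abstract; the values are synthetic stand-ins for Trepn Profiler measurements.

```python
# Minimal sketch: Pearson correlation between software metrics and
# measured power consumption. Synthetic stand-in values.
import pandas as pd

df = pd.DataFrame({
    "cyclomatic_complexity": [12, 45, 8, 33, 21],
    "num_methods":           [30, 120, 15, 80, 55],
    "total_loc":             [900, 5200, 400, 3100, 1800],
    "power_mw":              [210, 640, 150, 480, 330],  # measured consumption
})

# Correlation of each metric with power consumption.
print(df.corr(method="pearson")["power_mw"].drop("power_mw"))
```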
Procedia PDF Downloads 395
6301 Automatic Verification Technology of Virtual Machine Software Patch on IaaS Cloud
Authors: Yoji Yamato
Abstract:
In this paper, we propose an automatic verification technology for software patches in user virtual environments on IaaS Cloud, aimed at decreasing the verification costs of patches. Nowadays, IaaS services have spread, and many users can customize virtual machines on IaaS Cloud like their own private servers. Regarding software patches for the OS or middleware installed on virtual machines, users need to apply and verify these patches by themselves, which increases their operation costs. Our proposed method replicates user virtual environments, extracts verification test cases for the user virtual environments from a test case DB, distributes patches to the virtual machines in the replicated environments and runs those test cases automatically on the replicated environments. We have implemented the proposed method on OpenStack using Jenkins and confirmed its feasibility. Using the implementation, we confirmed the reduction of test case creation effort enabled by our proposed idea of two-tier abstraction of software functions and test cases. We also evaluated the automatic verification performance of environment replication, test case extraction and test case execution.
Keywords: OpenStack, cloud computing, automatic verification, Jenkins
Procedia PDF Downloads 491
6300 Ontology-Based Fault Detection and Diagnosis System: Querying and Reasoning Examples
Authors: Marko Batic, Nikola Tomasevic, Sanja Vranes
Abstract:
One of the strongholds of the ubiquitous efforts toward energy conservation and energy efficiency improvement is the retrofit of high energy consumers in buildings. In general, HVAC systems are the largest energy consumers in buildings. However, they usually suffer from mal-operation and/or malfunction, causing even higher energy consumption than necessary. Various Fault Detection and Diagnosis (FDD) systems can be successfully employed for this purpose, especially at the single device/unit level. In the case of more complex systems, where multiple devices operate in the context of the same building, significant energy efficiency improvements can only be achieved through comprehensive FDD systems relying on additional higher-level knowledge, such as the devices’ geographical location, served area, and their intra- and inter-system dependencies. This paper presents a comprehensive FDD system that relies on a common knowledge repository storing all critical information. The discussed system is deployed as a test-bed platform at the Fiumicino and Malpensa airports in Italy. This paper presents the advantages of implementing the knowledge base with an ontology and demonstrates the improved functionality of such a system through examples of typical queries and reasoning that enable the derivation of high-level energy conservation measures (ECMs). Therefore, key SPARQL queries and SWRL rules, based on the two instantiated airport ontologies, are elaborated. The detection of high-level irregularities in the operation of airport heating/cooling plants is discussed, and an estimation of energy savings is reported.
Keywords: airport ontology, knowledge management, ontology modeling, reasoning
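To illustrate the kind of SPARQL query the paper discusses, the Python/rdflib sketch below finds HVAC units whose measured state deviates from a setpoint. The ontology namespace, class and property names, and the inline data are hypothetical placeholders, not the actual airport ontology vocabulary.

```python
# Minimal sketch: SPARQL query over a toy FDD ontology. All vocabulary
# and data below are hypothetical placeholders.
from rdflib import Graph

TTL = """
@prefix fdd: <http://example.org/airport-fdd#> .
fdd:ahu1 a fdd:HVACUnit ; fdd:servesArea "Terminal 1" ;
    fdd:measuredTemperature 27.5 ; fdd:temperatureSetpoint 22.0 .
fdd:ahu2 a fdd:HVACUnit ; fdd:servesArea "Terminal 2" ;
    fdd:measuredTemperature 22.4 ; fdd:temperatureSetpoint 22.0 .
"""
g = Graph()
g.parse(data=TTL, format="turtle")

query = """
PREFIX fdd: <http://example.org/airport-fdd#>
SELECT ?unit ?area ?temp ?setpoint
WHERE {
    ?unit a fdd:HVACUnit ;
          fdd:servesArea ?area ;
          fdd:measuredTemperature ?temp ;
          fdd:temperatureSetpoint ?setpoint .
    FILTER (ABS(?temp - ?setpoint) > 3.0)   # possible mal-operation
}
"""
for row in g.query(query):
    print(f"{row.unit} serving {row.area}: {row.temp} vs setpoint {row.setpoint}")
```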
Procedia PDF Downloads 540
6299 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring the download of videos at peak hours. The system consists of a web server belonging to a video content provider, an internet communications provider and an application running on a client’s computer. The client using the application software communicates to the video provider a list of the client’s future video demands. The video provider calculates which videos are going to be most in demand for download in the immediate future, and proceeds to request from the internet provider the optimal hours for the downloading. The download times are sent to the application software, which uses the pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user’s hard disk, which is only accessed by the application software on the client’s computer. When the client is ready to watch a video, the application searches the list of videos currently stored in that area of the hard disk; if the video exists, it is used directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.
Keywords: internet optimization, video download, future demands, secure storage
Procedia PDF Downloads 138
6298 Lab Support: A Computer Laboratory Class Management Support System
Authors: Eugenia P. Ramirez, Kevin Matthe Caramancion, Mia Eleazar
Abstract:
Getting the attention of students is a constant challenge for instructors/lecturers. Although some networking and entertainment websites are blocked in computer laboratories, these websites have unlimited ways of attracting students. Thus, when an instructor gives a specific set of instructions, some students may not be able to follow the steps sequentially, and the instructor has to physically go to the specific remote terminal and show the student the details. Sometimes, during an examination in a laboratory set-up, a proctor may prefer to give detailed, written instructions rather than verbal ones. Even the mere calling of a specific student at any time will distract the whole class, especially when activities are being performed. What is needed is: application software that is able to lock the student's monitor and at the same time display the instructor’s screen; software that is powerful enough to process on its own side and manipulate a specific user’s terminal with free configuration, that is, without restrictions at the server level, which is a required functionality for a modern and optimal server structure; and software that is able to send text messages to students, per terminal or in groups. These features are found in LabSupport. This paper outlines the LabSupport application software framework for efficiently managing computer laboratory sessions, comprising different modules: screen viewer, demonstration mode, monitor locking system, text messaging, and class management. This paper's ultimate aim is to provide a system that increases instructor productivity.
Keywords: application software, broadcast messaging, class management, locking system
Procedia PDF Downloads 441
6297 Estimation of Seismic Ground Motion and Shaking Parameters Based on Microtremor Measurements at Palu City, Central Sulawesi Province, Indonesia
Authors: P. S. Thein, S. Pramumijoyo, K. S. Brotopuspito, J. Kiyono, W. Wilopo, A. Furukawa, A. Setianto
Abstract:
In this study, we estimated seismic ground motion parameters based on microtremor measurements at Palu City. Several earthquakes have struck along the Palu-Koro Fault in recent years; the Mw 6.3 event (USGS epicenter) that occurred on January 23, 2005, caused several casualties. We conducted a microtremor survey to estimate the strong ground motion distribution during such an earthquake. From this survey, we produced maps of peak ground acceleration, peak ground velocity, seismic vulnerability index and ground shear strain for Palu City. We performed single-station microtremor observations at 151 sites in Palu City, and also conducted a microtremor array investigation at 8 sites to obtain a representative determination of the soil conditions of the subsurface structures in Palu City. From the array observations, Palu City corresponds to relatively soft soil conditions with Vs ≤ 300 m/s; the predominant periods from horizontal-to-vertical spectral ratios (HVSRs) are in the range of 0.4 to 1.8 s and the frequencies are in the range of 0.7 to 3.3 Hz. Strong ground motions of the Palu area were predicted based on the empirical stochastic Green’s function method. Peak ground acceleration and velocity exceed 400 gal and 30 kine in some areas, which would with high probability cause severe damage to buildings. The microtremor survey results showed that the hilly areas have a low seismic vulnerability index and ground shear strain, whereas the coastal alluvium is composed of material with high seismic vulnerability and ground shear strain.
Keywords: Palu-Koro fault, microtremor, peak ground acceleration, peak ground velocity, seismic vulnerability index
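For illustration, a Python sketch of the HVSR computation behind such a survey: the spectral ratio of horizontal to vertical microtremor components, whose peak gives the predominant frequency. Real processing adds windowing, smoothing and window selection; the three traces here are synthetic stand-ins, not the Palu records.

```python
# Minimal sketch of an H/V spectral ratio (HVSR) from three-component
# microtremor records. Synthetic traces; no windowing or smoothing.
import numpy as np

fs = 100.0                       # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)
ns = np.random.randn(t.size)     # north-south component (stand-in)
ew = np.random.randn(t.size)     # east-west component (stand-in)
ud = np.random.randn(t.size)     # vertical component (stand-in)

freqs = np.fft.rfftfreq(t.size, 1 / fs)
h = np.sqrt(np.abs(np.fft.rfft(ns)) * np.abs(np.fft.rfft(ew)))  # geometric mean
v = np.abs(np.fft.rfft(ud))
hvsr = h / np.maximum(v, 1e-12)

band = (freqs > 0.5) & (freqs < 10)          # band of engineering interest
f0 = freqs[band][np.argmax(hvsr[band])]
print(f"predominant frequency ~ {f0:.2f} Hz, period ~ {1 / f0:.2f} s")
```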
Procedia PDF Downloads 406
6296 Effects of Global Validity of Predictive Cues upon L2 Discourse Comprehension: Evidence from Self-paced Reading
Authors: Binger Lu
Abstract:
It remains unclear whether second language (L2) speakers can use discourse context cues to predict upcoming information during online comprehension as native speakers do. Some researchers propose that L2 learners may have a reduced ability to generate predictions during discourse processing. At the same time, there is evidence that discourse-level cues are weighed more heavily in L2 processing than in L1. Previous studies showed that L1 prediction is sensitive to the global validity of predictive cues. The current study explores whether and to what extent L2 learners can dynamically and strategically adjust their prediction according to the global validity of predictive cues in L2 discourse comprehension, as native speakers do. In a self-paced reading experiment, Chinese native speakers (N=128), C-E bilinguals (N=128), and English native speakers (N=128) read high-predictable (e.g., Jimmy felt thirsty after running. He wanted to get some water from the refrigerator.) and low-predictable (e.g., Jimmy felt sick this morning. He wanted to get some water from the refrigerator.) discourses in two-sentence frames. The global validity of predictive cues was manipulated by varying the ratio of predictable (e.g., Bill stood at the door. He opened it with the key.) and unpredictable fillers (e.g., Bill stood at the door. He opened it with the card.), such that across conditions, the predictability of the final word of the fillers ranged from 100% to 0%. The dependent variable was reading time on the critical region (the target word and the following word), analyzed with linear mixed-effects models in R. C-E bilinguals showed reliable prediction across all validity conditions (β = -35.6 ms, SE = 7.74, t = -4.601, p < .001), and Chinese native speakers showed a significant effect (β = -93.5 ms, SE = 7.82, t = -11.956, p < .001) in two of the four validity conditions (namely, the High-validity and MedLow conditions, where fillers ended with predictable words in 100% and 25% of cases, respectively), whereas English native speakers did not predict at all (β = -2.78 ms, SE = 7.60, t = -.365, p = .715). There was neither a main effect (χ²(3) = .256, p = .968) nor an interaction (Predictability × Background × Validity, χ²(3) = 1.229, p = .746; Predictability × Validity, χ²(3) = 2.520, p = .472; Background × Validity, χ²(3) = 1.281, p = .734) of Validity with speaker groups. The results suggest that prediction occurs in L2 discourse processing but to a much lesser extent in L1, with a significant effect in some conditions of L1 Chinese and a null effect in L1 English processing, consistent with the view that L2 speakers are more sensitive to discourse cues than L1 speakers. Additionally, the pattern of L1 and L2 predictive processing was not affected by the global validity of predictive cues. C-E bilinguals’ predictive processing could be partly transferred from their L1, as prior research showed that discourse information plays a more significant role in L1 Chinese processing.
Keywords: bilingualism, discourse processing, global validity, prediction, self-paced reading
Procedia PDF Downloads 139
6295 Geographical Data Visualization Using Video Games Technologies
Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava
Abstract:
In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the National Institute of Statistics and Geography of Mexico (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotations, atmospheric correction and enhancement). The resulting image becomes our grayscale color map, to be fused with the LIDAR data, which was selected using the same coordinates as the Landsat image. The LIDAR data is translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting image is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico. They would download the software and images corresponding to a geological place of interest to a smartphone and could virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material
Procedia PDF Downloads 247
6294 Predicting National Football League (NFL) Match with Score-Based System
Authors: Marcho Setiawan Handok, Samuel S. Lemma, Abdoulaye Fofana, Naseef Mansoor
Abstract:
This paper proposes a method to predict the outcome of National Football League matches using data from 2019 to 2022 and compares it with other popular models. The model uses open-source statistical data for each team, such as passing yards, rushing yards, fumbles lost, and scoring, each with offensive and defensive components. For instance, a data set of anticipated values for a specific matchup is created by comparing the offensive passing yards obtained by one team to the defensive passing yards given up by the opposition. We evaluated the model’s performance by contrasting its results with those of established prediction algorithms. This research uses a neural network to predict the score of a National Football League match and, from that, the winner of the game.
Keywords: game prediction, NFL, football, artificial neural network
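As a hedged illustration of the score-based idea: a Python sketch that feeds offensive-vs-defensive stat differentials for a matchup into a small neural network predicting each team's score, with the higher predicted score picking the winner. The features follow the abstract; the data, coefficients and network size are assumptions.

```python
# Minimal sketch: stat differentials -> predicted score -> winner.
# Synthetic data; network size is an assumption.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
# Columns: passing-yards diff, rushing-yards diff, fumbles-lost diff, scoring diff
X = rng.normal(size=(400, 4))
y = 21 + 4 * X[:, 0] + 3 * X[:, 1] - 2 * X[:, 2] + 5 * X[:, 3] + rng.normal(size=400)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=1)
model.fit(X, y)

home = model.predict(rng.normal(size=(1, 4)))[0]   # home team's matchup features
away = model.predict(rng.normal(size=(1, 4)))[0]   # away team's matchup features
print(f"predicted {home:.1f} vs {away:.1f} -> winner: {'home' if home > away else 'away'}")
```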
Procedia PDF Downloads 86
6293 Role of von Willebrand Factor Antigen as Non-Invasive Biomarker for the Prediction of Portal Hypertensive Gastropathy in Patients with Liver Cirrhosis
Authors: Mohamed El Horri, Amine Mouden, Reda Messaoudi, Mohamed Chekkal, Driss Benlaldj, Malika Baghdadi, Lahcene Benmahdi, Fatima Seghier
Abstract:
Background/aim: Recently, the von Willebrand factor antigen (vWF-Ag) has been identified as a new marker of portal hypertension (PH) and its complications, though few studies have addressed its role in the prediction of esophageal varices. vWF-Ag is considered a non-invasive approach that spares patients the burden, cost, drawbacks and unpleasantness of repeated endoscopic examinations. In our study, we aimed to evaluate the ability of this marker to predict another complication of portal hypertension, portal hypertensive gastropathy (PHG), which is likewise diagnosed endoscopically. Patients and methods: This is a prospective study including 124 cirrhotic patients with no history of bleeding who underwent screening endoscopy for PH-related complications such as esophageal varices (EVs) and PHG. Routine biological tests were performed, as well as vWF-Ag testing by both ELFA and immunoturbidimetric techniques. The diagnostic performance of the marker was assessed using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and receiver operating characteristic curves. Results: 124 patients were enrolled in this study, with a mean age of 58 years [CI: 55 – 60 years] and a sex ratio of 1.17. Viral etiologies were found in 50% of patients. Screening endoscopy revealed the presence of PHG in 20.2% of cases, while EVs were found in 83.1% of cases. vWF-Ag levels were significantly increased in patients with PHG compared to those without: 441% [CI: 375 – 506] versus 279% [CI: 253 – 304], respectively (p < 0.0001). Based on the area under the receiver operating characteristic curve (AUC), vWF-Ag was a good predictor of the presence of PHG: with a cut-off higher than 320% and an AUC of 0.824, vWF-Ag had 84% sensitivity, 74% specificity, a 44.7% positive predictive value, a 94.8% negative predictive value, and 75.8% diagnostic accuracy. Conclusion: vWF-Ag is a good non-invasive, low-cost marker for excluding the presence of PHG in patients with liver cirrhosis. Using this marker as part of a selective screening strategy might reduce the need for endoscopic screening and the cost of managing these patients.
Keywords: von Willebrand factor, portal hypertensive gastropathy, prediction, liver cirrhosis
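To illustrate the diagnostic-performance analysis (not the study's patient data): a Python sketch computing AUC and the usual metrics at the reported 320% cut-off, on synthetic values drawn around the group means given in the abstract.

```python
# Minimal sketch: ROC AUC and cut-off metrics for a biomarker.
# Synthetic data loosely shaped like the reported group statistics.
import numpy as np
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(2)
has_phg = rng.random(124) < 0.2                          # ~20% prevalence
vwf = np.where(has_phg, rng.normal(441, 80, 124), rng.normal(279, 60, 124))

print("AUC:", round(roc_auc_score(has_phg, vwf), 3))

pred = vwf > 320                                         # reported cut-off
tn, fp, fn, tp = confusion_matrix(has_phg, pred).ravel()
print("sensitivity:", round(tp / (tp + fn), 3))
print("specificity:", round(tn / (tn + fp), 3))
print("PPV:", round(tp / (tp + fp), 3), " NPV:", round(tn / (tn + fn), 3))
```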
Procedia PDF Downloads 208
6292 A Systematic Snapshot of Software Outsourcing Challenges
Authors: Issam Jebreen, Eman Al-Qbelat
Abstract:
Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results show the challenges organizations most often encounter when outsourcing software development projects. A poor outsourcing relationship was identified as the most significant challenge, referenced by 35% of the papers. Lack of quality was the second most significant challenge, referenced by 33% of the papers. Language and cultural differences were the third most significant challenge, referenced by 24% of the papers. A non-competitive price was another challenge, referenced by 21% of the papers, as were poor coordination and communication, also referenced by 21% of the papers. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zones were also challenges faced by organizations. Other challenges included poor project management, lack of technical capabilities, high vendor employee turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delay, geopolitical and country instability, differences in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.
Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing
Procedia PDF Downloads 93
6291 Stock Price Prediction with 'Earnings' Conference Call Sentiment
Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu
Abstract:
Major public corporations worldwide use conference calls to report their quarterly earnings. These 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call scripts and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China and Indonesia, over the six years 2011Q1 – 2017Q2. A random forest with frequency-based sentiment scores using the Loughran-McDonald dictionary did better than a control model with only financial indicators. When stock prices went up 20 days after the earnings release, our model had predicted correctly 77% of the time; when the model predicted 'up,' actual stock prices went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoders, and word2vec variants.
Keywords: earnings call script, random forest, sentiment analysis, stock price prediction
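As an illustration of the pipeline (not the study's code or data): a Python sketch that scores each call script by counting positive/negative words and feeds the score, alongside a financial indicator, to a random forest predicting 20-day price direction. The tiny word lists are stand-ins for the full Loughran-McDonald dictionary.

```python
# Minimal sketch: frequency-based sentiment score + random forest.
# Tiny stand-in word lists and data; not the actual LM dictionary.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

LM_POSITIVE = {"strong", "growth", "improved", "record"}
LM_NEGATIVE = {"decline", "loss", "weak", "impairment"}

def sentiment_score(script: str) -> float:
    words = script.lower().split()
    pos = sum(w in LM_POSITIVE for w in words)
    neg = sum(w in LM_NEGATIVE for w in words)
    return (pos - neg) / max(len(words), 1)

scripts = ["record growth and strong margins", "weak quarter with impairment loss"]
scores = np.array([sentiment_score(s) for s in scripts])

# One financial indicator column plus the sentiment score per call (stand-ins).
X = np.column_stack([scores, [0.05, -0.02]])
y = np.array([1, 0])                     # 1 = price up 20 days after release
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(clf.predict(X))
```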
Procedia PDF Downloads 294
6290 A Comparative Assessment of Industrial Composites Using Thermography and Ultrasound
Authors: Mosab Alrashed, Wei Xu, Stephen Abineri, Yifan Zhao, Jörn Mehnen
Abstract:
Thermographic inspection is a relatively new Non-Destructive Testing (NDT) technique which has been gathering increasing interest due to its relatively low-cost hardware and extremely fast data acquisition. This technique is especially promising in the area of rapid automated damage detection and quantification. In collaboration with a major industry partner from the aerospace sector, advanced thermography-based NDT software for impact-damaged composites is introduced. The software is based on correlation analysis of time-temperature profiles in combination with an image enhancement process. The prototype software aims to a) better visualise the damage in a relatively easy-to-use way and b) automatically and quantitatively measure the properties of the degradation. Since degradation properties play an important role in the identification of degradation types, tests on artificially damaged specimens have been performed and the results analyzed.
Keywords: NDT, correlation analysis, image processing, damage, inspection
Procedia PDF Downloads 549
6289 Forecasting Direct Normal Irradiation at Djibouti Using Artificial Neural Network
Authors: Ahmed Kayad Abdourazak, Abderafi Souad, Zejli Driss, Idriss Abdoulkader Ibrahim
Abstract:
In this paper, an Artificial Neural Network (ANN) is used to predict solar irradiation in Djibouti for the first time, which is useful for the integration of Concentrating Solar Power (CSP) and for site selection for new or future solar plants as part of solar energy development. An ANN algorithm was developed to establish a forward/reverse correspondence between latitude, longitude, altitude and monthly solar irradiation. For this purpose, German Aerospace Centre (DLR) data for eight Djibouti sites were used for training and testing in a standard three-layer network with the Levenberg-Marquardt backpropagation algorithm. The results show very good agreement for solar irradiation prediction in Djibouti and prove that the proposed approach can be used as an efficient tool for the prediction of solar irradiation, providing helpful information for site selection, design and planning of solar plants.
Keywords: artificial neural network, solar irradiation, concentrated solar power, Levenberg-Marquardt
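A minimal sketch of the mapping described above, (latitude, longitude, altitude, month) to monthly irradiation, with a small feed-forward network. scikit-learn has no Levenberg-Marquardt trainer, so this Python sketch uses its default solver as a stand-in; the data values are synthetic, not the DLR measurements for the eight Djibouti sites.

```python
# Minimal sketch: small ANN regressor for (lat, lon, alt, month) -> irradiation.
# Default solver stands in for Levenberg-Marquardt; synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)
# Columns: latitude, longitude, altitude (m), month (1-12)
X = np.column_stack([
    rng.uniform(10.9, 12.8, 96),    # approx. Djibouti latitude range (assumed)
    rng.uniform(41.7, 43.4, 96),    # approx. longitude range (assumed)
    rng.uniform(0, 1500, 96),
    rng.integers(1, 13, 96),
])
y = 5.5 + 0.3 * np.sin(2 * np.pi * X[:, 3] / 12) + rng.normal(0, 0.2, 96)  # kWh/m2/day

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=3)
model.fit(X, y)
print("predicted irradiation:", model.predict(X[:3]).round(2))
```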
Procedia PDF Downloads 355
6288 Applying the Regression Technique for Prediction of the Acute Heart Attack
Authors: Paria Soleimani, Arezoo Neshati
Abstract:
Myocardial infarction is one of the leading causes of death in the world, and some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply. Because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort, so early detection and successful treatment of these symptoms is vital. Therefore, the importance and usefulness of a system designed to assist physicians in the early diagnosis of acute heart attacks is obvious. The purpose of this study is to determine how well a predictive model would perform based only on patient-reportable clinical history factors, without using diagnostic tests or physical exams. This type of prediction model might have application outside of the hospital setting, giving accurate advice to patients and influencing them to seek care in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical factors that can be reported by patients were studied. Three logistic regression models were built on the basis of the 28 features to predict the risk of heart attacks. The best logistic regression model in terms of performance had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, nausea, and vomiting were selected as the main features.
Keywords: coronary heart disease, acute heart attacks, prediction, logistic regression
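To illustrate the modelling (not the 711-patient cohort): a Python sketch fitting a logistic regression on binary patient-reportable symptoms and evaluating it with a C-index, which for a binary outcome equals the ROC AUC. The feature names follow the abstract's selected variables; the data and coefficients are synthetic assumptions.

```python
# Minimal sketch: logistic regression on patient-reportable symptoms,
# evaluated with C-index (= ROC AUC for a binary outcome). Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(4)
n = 711
# Columns: severe chest pain, back pain, cold sweats, shortness of breath,
# nausea, vomiting (binary, patient-reportable)
X = rng.integers(0, 2, size=(n, 6))
logit = -2 + 2.0 * X[:, 0] + 0.8 * X[:, 2] + 1.2 * X[:, 3]   # assumed effects
y = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(X, y)
p = model.predict_proba(X)[:, 1]
print("C-index (AUC):", round(roc_auc_score(y, p), 3))
print("accuracy:", round(model.score(X, y), 3))
```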
Procedia PDF Downloads 452
6287 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment
Authors: Kim Byung-Kon, Kim Young-Jin
Abstract:
As project opportunities for the Architecture, Engineering and Construction (AEC) industry have grown more complex and larger, the utilization of BIM (Building Information Modeling) technologies for 3D design and simulation practices has increased significantly; typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management in the AEC industry for virtual design and construction. As of now, commercial BIM software operates under a single-user environment, which is why the initial costs for its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use the services and resources provided by BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward saving the costs of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addresses how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure, so that BIM services may grow spontaneously in step with the demand for cloud resources. To this end, the author surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of the foregoing survey and analysis, the author proposes how to optimally develop the usage metering functions of cloud BIM software.
Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM
Procedia PDF Downloads 507
6286 Prioritization of Mutation Test Generation with Centrality Measure
Authors: Supachai Supmak, Yachai Limpiyakorn
Abstract:
Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. The industry generally delivers products under time-to-market constraints and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code with the highest PageRank values is focused on first when developing test cases, as these modules are vulnerable to defects or anomalies that may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite while still maintaining the same criteria as the original set of test cases.
Keywords: software testing, mutation test, network centrality measure, test case prioritization
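As an illustration of the prioritization idea (not the paper's implementation): a Python sketch that treats source modules and their call/dependency edges as a network, computes PageRank, and ranks the modules for mutation test generation. The module graph is a made-up example, not a real codebase.

```python
# Minimal sketch: PageRank over a module dependency graph as a
# prioritization order for mutation test generation. Toy graph.
import networkx as nx

g = nx.DiGraph()
g.add_edges_from([
    ("ui", "auth"), ("ui", "orders"),
    ("orders", "auth"), ("orders", "db"),
    ("auth", "db"), ("reports", "db"),
])

ranks = nx.pagerank(g)
priority = sorted(ranks, key=ranks.get, reverse=True)
for module in priority:
    print(f"{module}: PageRank = {ranks[module]:.3f}")
# Mutation test generation would start with the top-ranked modules.
```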
Procedia PDF Downloads 113
6285 A Convolution Neural Network PM-10 Prediction System Based on a Dense Measurement Sensor Network in Poland
Authors: Piotr A. Kowalski, Kasper Sapala, Wiktor Warchalowski
Abstract:
PM10 is suspended dust that primarily has a negative effect on the respiratory system: it is responsible for attacks of coughing and wheezing, asthma, and acute, violent bronchitis. Indirectly, PM10 also negatively affects the rest of the body, including increasing the risk of heart attack and stroke. Unfortunately, Poland cannot boast of good air quality, in particular due to large PM concentration levels. Therefore, based on the dense network of Airly sensors, it was decided to address the problem of predicting suspended particulate matter concentration. Due to the very complicated nature of this issue, a machine learning approach was used. For this purpose, Convolutional Neural Networks (CNNs) were adopted, these currently being among the leading information processing methods in the field of computational intelligence. The aim of this research is to show the influence of particular CNN network parameters on the quality of the obtained forecast. The forecast itself is made on the basis of parameters measured by Airly sensors and is carried out for the subsequent day, hour by hour. The evaluation of the learning process for the investigated models was mostly based on the mean square error criterion; however, during model validation, a number of other quantitative evaluation methods were also taken into account. The presented pollution prediction model has been verified with real weather and air pollution data taken from the Airly sensor network. The dense, distributed network of Airly measurement devices enables access to current and archival data on air pollution, temperature, suspended particulate matter PM1.0, PM2.5 and PM10, CAQI levels, as well as atmospheric pressure and air humidity. In this investigation, PM2.5 and PM10 values, temperature and wind information, as well as external forecasts of temperature and wind for the next 24 h, served as input data. Due to the specificity of CNNs, this data is transformed into tensors and then processed. The network consists of an input layer, an output layer, and many hidden layers; in the hidden layers, convolutional and pooling operations are performed. The output of the system is a vector of 24 elements containing the predicted PM10 concentration for each hour of the upcoming 24-hour period. Over 1,000 models based on the CNN methodology were tested during the study, several of which gave the best results and were then compared with models based on linear regression. The numerical tests, carried out on real 'big' data, fully confirmed the positive properties of the presented method. Models based on the CNN technique allow prediction of PM10 dust concentration with a much smaller mean square error than the currently used methods based on linear regression. What is more, the use of neural networks increased the correlation coefficient (R²) by about 5 percent compared to the linear model; during the simulation, the R² coefficient was 0.92, 0.76, 0.75, 0.73, and 0.73 for the 1st, 6th, 12th, 18th, and 24th hour of prediction, respectively.
Keywords: air pollution prediction (forecasting), machine learning, regression task, convolutional neural networks
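A minimal sketch of such a forecaster, assuming a multi-channel tensor of recent sensor history (PM2.5, PM10, temperature, wind, plus external forecasts) and a 24-element output of hourly PM10 predictions. The channel count, window length and layer sizes are illustrative assumptions, not the authors' tuned architecture.

```python
# Minimal sketch: 1D CNN over a sensor-history window producing 24 hourly
# PM10 predictions. Shapes and layer sizes are assumptions.
import torch
import torch.nn as nn

class PM10Net(nn.Module):
    def __init__(self, channels=6, window=48):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        self.head = nn.Linear(32 * 8, 24)   # 24 hourly PM10 predictions

    def forward(self, x):                   # x: (batch, channels, window)
        z = self.conv(x).flatten(1)
        return self.head(z)

model = PM10Net()
history = torch.randn(16, 6, 48)            # stand-in tensors of sensor data
pred = model(history)
print(pred.shape)                            # torch.Size([16, 24])
```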
Procedia PDF Downloads 150
6284 Conflicts Identification Approach among Stakeholders in Goal-Oriented Requirements Analysis
Authors: Muhammad Suhaib
Abstract:
Requirements analysis is one of the most important parts of software engineering, for both system application development and project requirements, and conflicts often arise during the requirements gathering and analysis phase. Research, development, and technology have turned the world into a global village. This research aims to identify conflicts during the requirements gathering phase of the software development life cycle. During the requirements elicitation/gathering phase, it is very difficult to understand the main objectives of stakeholders. After completion of the requirements elicitation task, the final results are used for the Software Requirements Specification (SRS), which is the most important outcome of the requirements analysis phase and the foundation between the developers and the stakeholders or customers. The proposed methodology will help identify these conflicts in a very easy manner during the initial phase of the project.
Keywords: goal oriented requirements analysis, conflicts identification model, requirements analysis, requirements engineering
Procedia PDF Downloads 137