Search results for: OSU tidal prediction software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6891

6141 Springback Prediction for Sheet Metal Cold Stamping Using Convolutional Neural Networks

Authors: Lei Zhu, Nan Li

Abstract:

Cold stamping has been widely applied in the automotive industry for the mass production of a great range of automotive panels. Predicting the springback to ensure the dimensional accuracy of the cold-stamped components is a critical step. The main approaches for the prediction and compensation of springback in cold stamping involve running Finite Element (FE) simulations and conducting experiments, which require forming-process expertise and can be time-consuming and expensive for the design of cold stamping tools. Machine learning technologies have been proven and successfully applied in learning complex system behaviours using representative samples, and they exhibit promising potential as supporting design tools for metal forming technologies. This study, for the first time, presents a novel application of a Convolutional Neural Network (CNN) based surrogate model to predict the springback fields for variable U-shape cold bending geometries. A dataset is created from the U-shape cold bending geometries and the corresponding FE simulation results, and is then used to train the CNN surrogate model. The results show that the surrogate model can achieve near-indistinguishable full-field predictions in real time when compared with the FE simulation results. The application of CNNs to efficient springback prediction can be adopted in industrial settings to aid both conceptual and final component designs for designers without manufacturing knowledge.
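
To make the surrogate idea concrete, here is a minimal sketch of what such an encoder-decoder CNN surrogate could look like in PyTorch; the layer sizes, the 64x64 geometry maps, and the random stand-in tensors are illustrative assumptions, not details from the paper:

```python
import torch
import torch.nn as nn

class SpringbackCNN(nn.Module):
    """Encoder-decoder surrogate: geometry map in, springback field out."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(16, 1, 2, stride=2),                # 32 -> 64
        )
    def forward(self, x):
        return self.decoder(self.encoder(x))

# Train on (geometry map, FE-simulated springback field) pairs.
model = SpringbackCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
geometry = torch.randn(8, 1, 64, 64)   # stand-in for U-shape geometry maps
fe_field = torch.randn(8, 1, 64, 64)   # stand-in for FE springback fields
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(geometry), fe_field)
    loss.backward()
    opt.step()
```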

Keywords: springback, cold stamping, convolutional neural networks, machine learning

Procedia PDF Downloads 143
6140 Design of a Tool for Generating Test Cases from BPMN

Authors: Prat Yotyawilai, Taratip Suwannasart

Abstract:

Business Process Model and Notation (BPMN) plays an important role in modeling business processes and creating functional models; it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent, yet while most studies use UML models in software testing, few use the BPMN model to create test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before a flow graph is created. Both the component details and the flow graph are then used to generate test cases.
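
As a rough illustration of the flow-graph step, the sketch below builds a directed flow graph from hypothetical BPMN elements and enumerates start-to-end paths as test case skeletons; the element names and the use of networkx are illustrative assumptions, not the tool's design:

```python
import networkx as nx

# Hypothetical flow extracted from a BPMN model: nodes are events,
# tasks and gateways; edges are sequence flows.
flow = nx.DiGraph()
flow.add_edges_from([
    ("start", "check_order"),
    ("check_order", "gateway"),   # exclusive gateway
    ("gateway", "approve"),       # condition: order valid
    ("gateway", "reject"),        # condition: order invalid
    ("approve", "end"),
    ("reject", "end"),
])

# Each simple path from the start event to the end event becomes
# one test case skeleton.
for i, path in enumerate(nx.all_simple_paths(flow, "start", "end"), 1):
    print(f"Test case {i}: {' -> '.join(path)}")
```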

Keywords: software testing, test case, BPMN, flow graph

Procedia PDF Downloads 553
6139 Design and Burnback Analysis of Three Dimensional Modified Star Grain

Authors: Almostafa Abdelaziz, Liang Guozhu, Anwer Elsayed

Abstract:

The determination of grain geometry is a critical step in the design of a solid propellant rocket motor. In this study, the design process involved parametric geometry modeling in CAD, MATLAB coding of performance prediction, and a 2D star grain ignition experiment. The 2D star grain burnback is achieved by creating a new surface at each web increment and calculating the geometrical properties at each step. The 2D star grain is further modified to burn as a tapered 3D star grain. A zero-dimensional method is used to calculate the internal ballistic performance. Experimental and theoretical results were compared in order to validate the performance prediction of the solid rocket motor. The results show that the use of a 3D grain geometry decreases the pressure inside the combustion chamber and enhances the volumetric loading ratio.
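
For readers unfamiliar with zero-dimensional methods, the sketch below applies the standard steady-state mass balance rho_p * A_b * a * Pc^n = Pc * A_t / c*, giving Pc = (rho_p * a * c* * A_b / A_t)^(1/(1-n)); the propellant constants and burning-area steps are illustrative stand-ins, not the study's values:

```python
def chamber_pressure(A_b, A_t, rho_p=1800.0, a=5e-5, n=0.35, c_star=1500.0):
    """Steady-state zero-dimensional chamber pressure in Pa.

    Mass balance: rho_p * A_b * a * P**n = P * A_t / c_star
    => P = (rho_p * a * c_star * A_b / A_t) ** (1 / (1 - n))
    a, n: burn-rate law r = a * P**n (illustrative SI values).
    """
    Kn = A_b / A_t
    return (rho_p * a * c_star * Kn) ** (1.0 / (1.0 - n))

# Burnback: as the star grain regresses web increment by web increment,
# recompute the burning area A_b and hence the pressure at each step.
web_steps = [0.30, 0.32, 0.335, 0.33, 0.31]   # stand-in A_b values, m^2
for A_b in web_steps:
    print(f"A_b = {A_b:.3f} m^2 -> Pc = {chamber_pressure(A_b, A_t=2e-3)/1e6:.2f} MPa")
```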

Keywords: burnback analysis, rocket motor, star grain, three dimensional grains

Procedia PDF Downloads 236
6138 Agile Manifesto Construct for the Film Industry

Authors: Kiri Trier, Theresa Treffers

Abstract:

Amid continuous volatility, such as production stops due to the COVID-19 pandemic and video-on-demand players monopolizing the film industry, filmmakers are stuck in traditional, linear content development processes. The industry has to become more agile in order to react quickly and easily to changes. Since content development in agile project management is scientifically-empirically hardly recorded at all, and agile methods are lacking beyond software development, we examined whether the agile manifesto values and principles from software development can be adapted to the film industry to enable agility and digitalization of content development in the industry. We conducted an online questionnaire with 184 German filmmakers (producers, authors, directors, actors, film financiers) as a first cross-sectional assessment of the adaptability of the agile manifesto from software development to the film industry; factor analysis was used to validate the construct. Our results show that it is crucial to digitalize traditional content development into agile, end-to-end content development, with tools, lean processes, new collaboration structures, and holacracy, to prepare for any volatility. Overall, we derived a first construct of an agile manifesto for the film industry, with four values related to nine dedicated principles. Our findings help to provide a better understanding of the agile manifesto beyond software development, as a guideline for implementing agility in the film industry.

Keywords: agile manifesto, agile project management, agility, film industry

Procedia PDF Downloads 193
6137 Benefits of Gamification in Agile Software Project Courses

Authors: Nina Dzamashvili Fogelström

Abstract:

This paper examines the concepts of Game-Based Learning and Gamification. A literature survey found increased interest in these concepts in academia, limited evidence of a positive effect on student motivation and academic performance, but also certain scepticism about adding games to traditional educational activities. A small-scale empirical study presented in this paper aims to evaluate student experience and the usefulness of Game-Based Learning and Gamification for a better understanding of threshold concepts in software engineering project courses. The participants of the study were 22 second-year students from the bachelor's program in software engineering at Blekinge Institute of Technology. As part of the course instruction, the students were introduced to a digital game specifically designed to simulate an agile software project. The game mechanics were designed to allow manipulation of the agile concept of team velocity. After the application of the game, the students were surveyed to measure the degree of perceived increase in understanding of the studied threshold concept. The students were also asked whether they would like to have games included in their education. The results show that the majority of students found the game helpful in increasing their understanding of the threshold concept, and most indicated that they would like to see games included in their education. These results are encouraging; since the study was of small scale and based on convenience sampling, more studies in the area are recommended.

Keywords: agile development, gamification, game based learning, digital games, software engineering, threshold concepts

Procedia PDF Downloads 164
6136 Simulation of Kinetic Friction in L-Bending of Sheet Metals

Authors: Maziar Ramezani, Thomas Neitzert, Timotius Pasang

Abstract:

This paper presents an experimental and numerical investigation of the springback behavior of sheet metals during the L-bending process, with emphasis on Stribeck-type friction modeling. The coefficient of friction in the Stribeck curve depends on sliding velocity and contact pressure. The springback behavior of mild steel and aluminum alloy 6022-T4 sheets was studied experimentally and with numerical simulations in the ABAQUS software using two friction models: Coulomb friction and Stribeck friction. The influence of forming speed on springback behavior was studied experimentally and numerically. The results showed that the Stribeck-type friction model gives better predictions of springback in sheet metal forming. The FE prediction errors for mild steel and 6022-T4 AA are 23.8% and 25.5%, respectively, using the Coulomb friction model, and 11% and 13%, respectively, using the Stribeck friction model. These results show that the Stribeck model is suitable for the simulation of sheet metal forming, especially at higher forming speeds.
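
A common Stribeck-type parameterization makes the velocity dependence concrete; the sketch below uses illustrative constants and only the velocity term (the paper's model also depends on contact pressure):

```python
import numpy as np

def stribeck_mu(v, mu_s=0.18, mu_c=0.10, v_s=0.01, k_v=0.002):
    """Stribeck-type friction coefficient vs. sliding velocity v (m/s).

    mu_s: static coefficient, mu_c: Coulomb (kinetic) coefficient,
    v_s: Stribeck velocity scale, k_v: viscous term.
    All constants are illustrative, not the paper's fitted values.
    """
    return mu_c + (mu_s - mu_c) * np.exp(-(v / v_s) ** 2) + k_v * v

v = np.linspace(0.0, 0.5, 6)
print(np.round(stribeck_mu(v), 4))  # decays from mu_s toward mu_c, then a viscous rise
```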

Keywords: friction, L-bending, springback, Stribeck curves

Procedia PDF Downloads 488
6135 Towards the Use of Software Product Metrics as an Indicator for Measuring Mobile Applications Power Consumption

Authors: Ching Kin Keong, Koh Tieng Wei, Abdul Azim Abd. Ghani, Khaironi Yatim Sharif

Abstract:

Maintaining the factory-default battery endurance rate over time while supporting a huge number of running applications on energy-restricted mobile devices has created a new challenge for mobile application developers. While trying to meet customers' unlimited expectations, developers are barely aware of how efficiently the application itself uses energy. Developers therefore need a set of valid energy consumption indicators to assist them in developing energy-saving applications. In this paper, we present a few software product metrics that can be used as indicators to measure the energy consumption of Android-based mobile applications early in the design stage. In particular, Trepn Profiler (a power profiling tool for Qualcomm processors) was used to collect mobile application power consumption data, which were then analyzed against 23 software metrics in this preliminary study. The results show that McCabe cyclomatic complexity, number of parameters, nested block depth, number of methods, weighted methods per class, number of classes, total lines of code and method lines have a direct relationship with the power consumption of mobile applications.
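
The kind of analysis described can be illustrated with a minimal correlation sketch; the metric values and power readings below are stand-ins, not the study's measurements:

```python
import numpy as np
from scipy.stats import pearsonr

# Stand-in measurements for a handful of apps: one software metric
# (e.g., McCabe cyclomatic complexity) vs. profiled power draw (mW).
cyclomatic = np.array([12, 25, 31, 44, 58, 73])
power_mw   = np.array([310, 420, 455, 530, 610, 705])

r, p_value = pearsonr(cyclomatic, power_mw)
print(f"Pearson r = {r:.3f}, p = {p_value:.4f}")
# Repeating this for each of the 23 metrics identifies which ones
# (complexity, parameters, nesting depth, ...) track power consumption.
```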

Keywords: battery endurance, software metrics, mobile application, power consumption

Procedia PDF Downloads 391
6134 Automatic Verification Technology of Virtual Machine Software Patch on IaaS Cloud

Authors: Yoji Yamato

Abstract:

In this paper, we propose an automatic verification technology for software patches in user virtual environments on IaaS Cloud, aimed at decreasing the verification costs of patches. IaaS services have spread in recent years, and many users can customize virtual machines on IaaS Cloud like their own private servers. Regarding software patches for the OS or middleware installed on virtual machines, users need to apply and verify these patches by themselves, a task that increases users' operation costs. Our proposed method replicates user virtual environments, extracts verification test cases for the environments from a test case DB, distributes patches to the virtual machines in the replicated environments, and runs those test cases automatically there. We have implemented the proposed method on OpenStack using Jenkins and confirmed its feasibility. Using the implementation, we confirmed the effectiveness of our proposed 2-tier abstraction of software functions and test cases in reducing test case creation effort. We also evaluated the performance of automatic verification in terms of environment replication, test case extraction and test case execution.
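
A structural sketch of the proposed flow is given below; every helper is a stand-in for the real OpenStack snapshot/clone calls, the test case DB, and the Jenkins job triggers, none of which are APIs named in the paper:

```python
# Structural sketch only: all names below are hypothetical stand-ins.

TEST_CASE_DB = {            # 2-tier abstraction: abstract software function
    "web server": ["http_get_returns_200", "tls_handshake_ok"],
    "database":   ["select_roundtrip", "replication_lag_ok"],
}

def verify_patch(vm_id, patch, installed_functions):
    replica = f"replica-of-{vm_id}"          # stand-in: clone the user VM
    print(f"apply {patch} to {replica}")     # stand-in: patch distribution
    results = {}
    for fn in installed_functions:           # pull generic tests per function
        for tc in TEST_CASE_DB.get(fn, []):
            print(f"jenkins: run {tc} on {replica}")
            results[tc] = True               # stand-in: Jenkins job result
    return all(results.values())

print(verify_patch("vm-42", "openssl-security-patch", ["web server", "database"]))
```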

Keywords: OpenStack, cloud computing, automatic verification, Jenkins

Procedia PDF Downloads 482
6133 Internet Optimization by Negotiating Traffic Times

Authors: Carlos Gonzalez

Abstract:

This paper describes a system to optimize the use of the internet by clients requiring the download of videos at peak hours. The system consists of a web server belonging to a provider of video contents, a provider of internet communications, and an application running on a client's computer. The client, using the application software, communicates to the video provider a list of the client's future video demands. The video provider calculates which videos are going to be most in demand for download in the immediate future and proceeds to request from the internet provider the optimal hours for the downloading. The download times are sent to the application software, which uses the pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos are saved in a special protected section of the user's hard disk, accessed only by the application software on the client's computer. When the client is ready to watch a video, the application searches the list of videos currently stored in that area of the hard disk; if the video exists, it is used directly without the need for internet access. We found that the best way to optimize video download traffic is through negotiation between the internet communication provider and the video content provider.
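
A minimal sketch of the negotiation idea follows; the demand aggregation and the off-peak hours are illustrative assumptions, not the paper's actual protocol:

```python
from collections import Counter

# Clients report their expected future video demands to the video provider.
client_demands = {
    "client_a": ["video_1", "video_3"],
    "client_b": ["video_1", "video_2"],
    "client_c": ["video_1"],
}

# The video provider ranks videos by expected demand ...
demand = Counter(v for wants in client_demands.values() for v in wants)
popular = [video for video, _ in demand.most_common()]

# ... and the internet provider answers with pre-negotiated off-peak hours.
off_peak_hours = [2, 3, 4, 5]   # stand-in for the negotiated schedule

schedule = {video: off_peak_hours[i % len(off_peak_hours)]
            for i, video in enumerate(popular)}
print(schedule)   # e.g. {'video_1': 2, 'video_3': 3, 'video_2': 4}
```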

Keywords: internet optimization, video download, future demands, secure storage

Procedia PDF Downloads 135
6132 Lab Support: A Computer Laboratory Class Management Support System

Authors: Eugenia P. Ramirez, Kevin Matthe Caramancion, Mia Eleazar

Abstract:

Getting the attention of students is a constant challenge for instructors/lecturers. Although some networking and entertainment websites are blocked in computer laboratories, these websites have unlimited ways of attracting students. Thus, when an instructor gives a specific set of instructions, some students may not be able to follow the steps sequentially, and the instructor has to physically go to the specific remote terminal and show the student the details. Sometimes, during an examination in a laboratory set-up, a proctor may prefer to give detailed, text-written instructions rather than verbal instructions. Even the mere calling of a specific student at any time will distract the whole class, especially when activities are being performed. What is needed is: application software that can lock the student's monitor and at the same time display the instructor's screen; software powerful enough to do its processing on its own side and manipulate a specific user's terminal freely, that is, without restrictions at the server level, a required functionality for a modern and optimal server structure; and software that can send text messages to students, per terminal or in groups. These features are found in LabSupport. This paper outlines the LabSupport application software framework for efficiently managing computer laboratory sessions, covering its different modules: screen viewer, demonstration mode, monitor locking system, text messaging, and class management. The paper's ultimate aim is to provide a system that increases instructor productivity.

Keywords: application software, broadcast messaging, class management, locking system

Procedia PDF Downloads 437
6131 Geographical Data Visualization Using Video Games Technologies

Authors: Nizar Karim Uribe-Orihuela, Fernando Brambila-Paz, Ivette Caldelas, Rodrigo Montufar-Chaveznava

Abstract:

In this paper, we present the advances corresponding to the implementation of a strategy to visualize geographical data using a Software Development Kit (SDK) for video games. We use multispectral images from the Landsat 7 platform and Laser Imaging Detection and Ranging (LIDAR) data from the Mexican National Institute of Statistics and Geography (INEGI). We select a place of interest to visualize from the Landsat platform and apply some processing to the image (rotation, atmospheric correction and enhancement). The resulting image serves as our grayscale color map, to be fused with the LIDAR data, which were selected using the same coordinates as the Landsat image. The LIDAR data are translated to 8-bit raw data. Both images are fused in software developed using Unity (an SDK employed for video games). The resulting scene is then displayed and can be explored by moving around. The idea is that the software could be used by students of geology and geophysics at the Engineering School of the National University of Mexico: they would download the software and the images corresponding to a geological place of interest to a smartphone and could virtually visit and explore the site with a virtual reality visor such as Google Cardboard.
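
The heightmap preparation step can be sketched as follows; the array sizes, elevation range, and the "terrain.raw" filename are assumptions for illustration, not details from the implementation:

```python
import numpy as np

def lidar_to_8bit(elev):
    """Scale LIDAR elevations (m) to the 8-bit raw heightmap format."""
    lo, hi = elev.min(), elev.max()
    return np.round(255.0 * (elev - lo) / (hi - lo)).astype(np.uint8)

# Stand-in arrays; in the paper both rasters cover the same coordinates.
lidar = np.random.uniform(2200.0, 2900.0, size=(513, 513))            # elevations, m
landsat_gray = np.random.randint(0, 256, size=(513, 513), dtype=np.uint8)

heightmap = lidar_to_8bit(lidar)
heightmap.tofile("terrain.raw")   # 8-bit raw heightmap to load as terrain
# landsat_gray is then draped over the terrain as the grayscale texture.
```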

Keywords: virtual reality, interactive technologies, geographical data visualization, video games technologies, educational material

Procedia PDF Downloads 239
6130 A Systematic Snapshot of Software Outsourcing Challenges

Authors: Issam Jebreen, Eman Al-Qbelat

Abstract:

Outsourcing software development projects can be challenging, and there are several common challenges that organizations face. A study was conducted with a sample of 46 papers on outsourcing challenges, and the results show that there are several common challenges faced by organizations when outsourcing software development projects. Poor outsourcing relationship was identified as the most significant challenge, with 35% of the papers referencing it. Lack of quality was the second most significant challenge, with 33% of the papers referencing it. Language and cultural differences were the third most significant challenge, with 24% of the papers referencing it. Non-competitive price was another challenge faced by organizations, with 21% of the papers referencing it. Poor coordination and communication were also identified as a challenge, with 21% of the papers referencing it. Opportunistic behavior, lack of contract negotiation, inadequate user involvement, and constraints due to time zone were also challenges faced by organizations. Other challenges faced by organizations included poor project management, lack of technical capabilities, vendor employee high turnover, poor requirement specification, IPR issues, poor management of budget, schedule, and delay, geopolitical and country instability, the difference in development methodologies, failure to manage end-user expectations, and poor monitoring and control. In conclusion, outsourcing software development projects can be challenging, but organizations can mitigate these challenges by selecting the right outsourcing partner, having a well-defined contract and clear communication, having a clear understanding of the requirements, and implementing effective project management practices.

Keywords: software outsourcing, vendor, outsourcing challenges, quality model, continent, country, global outsourcing, IT workforce outsourcing

Procedia PDF Downloads 86
6129 Effects of Global Validity of Predictive Cues upon L2 Discourse Comprehension: Evidence from Self-paced Reading

Authors: Binger Lu

Abstract:

It remains unclear whether second language (L2) speakers can use discourse context cues to predict upcoming information as native speakers do during online comprehension. Some researchers propose that L2 learners may have a reduced ability to generate predictions during discourse processing. At the same time, there is evidence that discourse-level cues are weighed more heavily in L2 processing than in L1. Previous studies showed that L1 prediction is sensitive to the global validity of predictive cues. The current study aims to explore whether and to what extent L2 learners can dynamically and strategically adjust their prediction in accord with the global validity of predictive cues in L2 discourse comprehension, as native speakers do. In a self-paced reading experiment, Chinese native speakers (N=128), C-E bilinguals (N=128), and English native speakers (N=128) read high-predictable (e.g., Jimmy felt thirsty after running. He wanted to get some water from the refrigerator.) and low-predictable (e.g., Jimmy felt sick this morning. He wanted to get some water from the refrigerator.) discourses in two-sentence frames. The global validity of predictive cues was manipulated by varying the ratio of predictable (e.g., Bill stood at the door. He opened it with the key.) and unpredictable fillers (e.g., Bill stood at the door. He opened it with the card.), such that across conditions, the predictability of the final word of the fillers ranged from 100% to 0%. The dependent variable was reading time on the critical region (the target word and the following word), analyzed with linear mixed-effects models in R. C-E bilinguals showed reliable prediction across all validity conditions (β = -35.6 ms, SE = 7.74, t = -4.601, p < .001), and Chinese native speakers showed a significant effect (β = -93.5 ms, SE = 7.82, t = -11.956, p < .001) in two of the four validity conditions (namely, the High-validity and MedLow conditions, where fillers ended with predictable words in 100% and 25% of cases, respectively), whereas English native speakers did not predict at all (β = -2.78 ms, SE = 7.60, t = -.365, p = .715). There was neither a main effect (χ²(3) = .256, p = .968) nor an interaction (Predictability × Background × Validity, χ²(3) = 1.229, p = .746; Predictability × Validity, χ²(3) = 2.520, p = .472; Background × Validity, χ²(3) = 1.281, p = .734) of Validity with speaker groups. The results suggest that prediction occurs in L2 discourse processing but to a much lesser extent in L1, with a significant effect in some conditions of L1 Chinese and a null effect in L1 English processing, consistent with the view that L2 speakers are more sensitive to discourse cues compared with L1 speakers. Additionally, the pattern of L1 and L2 predictive processing was not affected by the global validity of predictive cues. C-E bilinguals' predictive processing may be partly transferred from their L1, as prior research showed that discourse information plays a more significant role in L1 Chinese processing.

Keywords: bilingualism, discourse processing, global validity, prediction, self-paced reading

Procedia PDF Downloads 137
6128 A Comparative Assessment of Industrial Composites Using Thermography and Ultrasound

Authors: Mosab Alrashed, Wei Xu, Stephen Abineri, Yifan Zhao, Jörn Mehnen

Abstract:

Thermographic inspection is a relatively new technique for Non-Destructive Testing (NDT) which has been gathering increasing interest due to its relatively low-cost hardware and extremely fast data acquisition. The technique is especially promising in the area of rapid, automated damage detection and quantification. In collaboration with a major industry partner from the aerospace sector, advanced thermography-based NDT software for impact-damaged composites is introduced. The software is based on correlation analysis of time-temperature profiles in combination with an image enhancement process. The prototype software aims to a) better visualise the damage in a relatively easy-to-use way and b) automatically and quantitatively measure the properties of the degradation. Since degradation properties play an important role in the identification of degradation types, tests on artificially damaged specimens have been performed and the results analyzed.
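
A minimal sketch of correlation analysis over time-temperature profiles is shown below; the frame data, sound-region choice, and threshold are illustrative assumptions, not the partner software's actual pipeline:

```python
import numpy as np

# Stand-in thermographic sequence: frames x height x width.
frames = np.random.rand(200, 64, 64).astype(np.float32)
reference = frames[:, :8, :8].mean(axis=(1, 2))   # profile of a sound region

# Correlate every pixel's time-temperature profile with the reference;
# damaged zones cool differently, so their correlation drops.
t = frames.reshape(200, -1)
t_c = t - t.mean(axis=0)
r_c = reference - reference.mean()
corr = (t_c * r_c[:, None]).sum(axis=0) / (
    np.linalg.norm(t_c, axis=0) * np.linalg.norm(r_c) + 1e-12)
damage_map = (corr < 0.95).reshape(64, 64)   # threshold is illustrative
```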

Keywords: NDT, correlation analysis, image processing, damage, inspection

Procedia PDF Downloads 544
6127 Predicting National Football League (NFL) Match with Score-Based System

Authors: Marcho Setiawan Handok, Samuel S. Lemma, Abdoulaye Fofana, Naseef Mansoor

Abstract:

This paper proposes a method to predict the outcome of National Football League matches using data from 2019 to 2022 and compares it with other popular models. The model uses open-source statistical data on each team, such as passing yards, rushing yards, fumbles lost, and scoring. Each statistic has offensive and defensive components. For instance, a data set of anticipated values for a specific matchup is created by comparing the offensive passing yards gained by one team to the defensive passing yards given up by the opposition. We evaluated the model's performance by contrasting its results with those of established prediction algorithms. This research uses a neural network to predict the score of a National Football League match and, from the score, the winner of the game.
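
The matchup-feature idea can be sketched as follows; the statistics, stand-in training data, and the small scikit-learn network are illustrative assumptions rather than the paper's exact model:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Matchup features: one team's offensive stat vs. the opponent's defensive
# counterpart, e.g. offensive passing yards minus passing yards allowed.
def matchup_features(team, opponent):
    keys = ["pass_yds", "rush_yds", "fumbles_lost", "scoring"]
    return [team[f"off_{k}"] - opponent[f"def_{k}"] for k in keys]

home = {"off_pass_yds": 255, "off_rush_yds": 120, "off_fumbles_lost": 0.6, "off_scoring": 24}
away = {"def_pass_yds": 230, "def_rush_yds": 105, "def_fumbles_lost": 0.8, "def_scoring": 21}

X = np.array([matchup_features(home, away)])       # one upcoming matchup
X_train = np.random.randn(400, 4) * 30             # stand-in historical matchups
y_train = X_train @ [0.05, 0.04, -2.0, 0.6] + np.random.randn(400)  # stand-in margins

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000).fit(X_train, y_train)
print("Predicted score margin:", model.predict(X)[0])  # the sign gives the winner
```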

Keywords: game prediction, NFL, football, artificial neural network

Procedia PDF Downloads 78
6126 A Study on How to Develop the Usage Metering Functions of BIM (Building Information Modeling) Software under Cloud Computing Environment

Authors: Kim Byung-Kon, Kim Young-Jin

Abstract:

As project opportunities for the Architecture, Engineering and Construction (AEC) industry have grown more complex and larger, the use of BIM (Building Information Modeling) technologies for 3D design and simulation practices has increased significantly; typical applications of BIM technologies include clash detection and design alternatives based on 3D planning, which have been extended to construction management technology in the AEC industry for virtual design and construction. For now, commercial BIM software operates in a single-user environment, which is why the initial costs of its introduction are very high. Cloud computing, one of the most promising next-generation Internet technologies, enables simple Internet devices to use the services and resources provided with BIM software. Recently in Korea, studies linking BIM and cloud computing technologies have been directed toward saving the costs of building BIM-related infrastructure and providing various BIM services for small- and medium-sized enterprises (SMEs). This study addressed how to develop the usage metering functions of BIM software under a cloud computing architecture in order to archive and use BIM data and create an optimal revenue structure so that BIM services may grow spontaneously, considering the demand for cloud resources. To this end, the authors surveyed relevant cases and then analyzed needs and requirements from the AEC industry. Based on the results and findings of the survey and analysis, the authors propose how to optimally develop the usage metering functions of cloud BIM software.

Keywords: construction IT, BIM (Building Information Modeling), cloud computing, BIM-based cloud computing, 3D design, cloud BIM

Procedia PDF Downloads 504
6125 Role of von Willebrand Factor Antigen as Non-Invasive Biomarker for the Prediction of Portal Hypertensive Gastropathy in Patients with Liver Cirrhosis

Authors: Mohamed El Horri, Amine Mouden, Reda Messaoudi, Mohamed Chekkal, Driss Benlaldj, Malika Baghdadi, Lahcene Benmahdi, Fatima Seghier

Abstract:

Background/aim: Recently, the von Willebrand factor antigen (vWF-Ag) has been identified as a new marker of portal hypertension (PH) and its complications, although few studies have addressed its role in the prediction of esophageal varices. vWF-Ag is considered a non-invasive approach that could spare patients the burden, cost, drawbacks and unpleasant, repeated examinations of endoscopy. In our study, we aimed to evaluate the ability of this marker to predict another complication of portal hypertension, portal hypertensive gastropathy (PHG), which is also diagnosed by endoscopic tools. Patients and methods: This is a prospective study including 124 cirrhotic patients with no history of bleeding who underwent screening endoscopy for PH-related complications such as esophageal varices (EVs) and PHG. Routine biological tests were performed, as well as vWF-Ag testing by both ELFA and immunoturbidimetric techniques. The diagnostic performance of the marker was assessed using sensitivity, specificity, positive predictive value, negative predictive value, accuracy, and receiver operating characteristic curves. Results: 124 patients were enrolled in this study, with a mean age of 58 years [CI: 55–60 years] and a sex ratio of 1.17. Viral etiologies were found in 50% of patients. Screening endoscopy revealed the presence of PHG in 20.2% of cases, while EVs were found in 83.1% of cases. vWF-Ag levels were significantly increased in patients with PHG compared to those without PHG: 441% [CI: 375–506] versus 279% [CI: 253–304], respectively (p < 0.0001). Using the area under the receiver operating characteristic curve (AUC), vWF-Ag was a good predictor of the presence of PHG: at a cutoff higher than 320% and with an AUC of 0.824, vWF-Ag had 84% sensitivity, 74% specificity, a 44.7% positive predictive value, a 94.8% negative predictive value, and 75.8% diagnostic accuracy. Conclusion: vWF-Ag is a good non-invasive, low-cost marker for excluding the presence of PHG in patients with liver cirrhosis. Using this marker as part of a selective screening strategy might reduce the need for endoscopic screening and the cost of managing these patients.
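
For clarity, the cutoff-based metrics reported above can be computed as in the sketch below; the simulated cohort is a stand-in built from the reported group means, not the study data:

```python
import numpy as np

def diagnostic_performance(vwf_ag, has_phg, cutoff=320.0):
    """Sensitivity/specificity/PPV/NPV of 'vWF-Ag > cutoff' for PHG."""
    pred = vwf_ag > cutoff
    tp = np.sum(pred & has_phg);   fn = np.sum(~pred & has_phg)
    fp = np.sum(pred & ~has_phg);  tn = np.sum(~pred & ~has_phg)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
            "accuracy": (tp + tn) / len(vwf_ag)}

# Stand-in cohort (the study's n=124, PHG in ~20% of patients).
rng = np.random.default_rng(0)
has_phg = rng.random(124) < 0.20
vwf_ag = np.where(has_phg, rng.normal(441, 80, 124), rng.normal(279, 70, 124))
print(diagnostic_performance(vwf_ag, has_phg))
```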

Keywords: von willebrand factor, portal hypertensive gastropathy, prediction, liver cirrhosis

Procedia PDF Downloads 198
6124 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases. Prioritization of mutation test generation has been a critical element of industry practice that contributes to the evaluation of test cases. The industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code modules with the highest PageRank values are focused on first when developing test cases, as these modules are vulnerable to defects or anomalies that may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite while still maintaining the same criteria as the original set of test cases.
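
A minimal sketch of the prioritization step follows; the dependency graph and its edges are hypothetical, and the paper's exact graph construction may differ:

```python
import networkx as nx

# Dependency graph of source modules: an edge A -> B means A calls/uses B.
dep = nx.DiGraph()
dep.add_edges_from([
    ("ui", "auth"), ("ui", "orders"),
    ("orders", "db"), ("orders", "auth"),
    ("auth", "db"), ("reports", "db"),
])

# Modules with high PageRank are depended on by many others, so defects
# there propagate widely: generate their mutation tests first.
ranks = nx.pagerank(dep)
for module, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{module:8s} {score:.3f}")
```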

Keywords: software testing, mutation test, network centrality measure, test case prioritization

Procedia PDF Downloads 107
6123 Stock Price Prediction with 'Earnings' Conference Call Sentiment

Authors: Sungzoon Cho, Hye Jin Lee, Sungwhan Jeon, Dongyoung Min, Sungwon Lyu

Abstract:

Major public corporations worldwide use conference calls to report their quarterly earnings. These 'earnings' conference calls allow for questions from stock analysts. We investigated whether it is possible to identify sentiment from the call script and use it to predict stock price movement. We analyzed call scripts from six companies, two each from Korea, China and Indonesia, over the six years 2011Q1–2017Q2. A random forest with frequency-based sentiment scores from the Loughran-McDonald dictionary did better than a control model using only financial indicators. When stock prices went up 20 days after the earnings release, our model had predicted correctly 77% of the time. When the model predicted 'up,' actual stock prices went up 65% of the time. This preliminary result encourages us to investigate advanced sentiment scoring methodologies such as topic modeling, auto-encoders, and word2vec variants.
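
A minimal sketch of the frequency-based scoring and classification is given below; the abbreviated word lists, features, and labels are stand-ins (the real Loughran-McDonald lists contain thousands of words):

```python
from sklearn.ensemble import RandomForestClassifier

# Abbreviated stand-ins for the Loughran-McDonald positive/negative lists.
LM_POS = {"achieve", "improve", "strong", "growth"}
LM_NEG = {"decline", "loss", "adverse", "weak"}

def lm_sentiment(script: str) -> float:
    """Frequency-based score: (pos - neg) / total words of the call script."""
    words = script.lower().split()
    pos = sum(w in LM_POS for w in words)
    neg = sum(w in LM_NEG for w in words)
    return (pos - neg) / max(len(words), 1)

# Features per earnings call: [sentiment, financial indicators ...];
# label: did the stock rise 20 days after the earnings release?
X = [[lm_sentiment("strong growth in all segments"), 0.12, 1.8],
     [lm_sentiment("adverse conditions led to a loss"), -0.05, 2.3]]
y = [1, 0]
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```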

Keywords: earnings call script, random forest, sentiment analysis, stock price prediction

Procedia PDF Downloads 290
6122 Conflicts Identification Approach among Stakeholders in Goal-Oriented Requirements Analysis

Authors: Muhammad Suhaib

Abstract:

Requirements analysis is one of the most important parts of software engineering, for both system application development and project requirements, and conflicts often arise during the requirements gathering and analysis phase. This research aims to identify conflicts during the requirements gathering phase of the software development life cycle. Research, development, and technology have converted the world into a global village. During the requirements elicitation/gathering phase, it is very difficult to understand the main objectives of stakeholders. After completion of the requirements elicitation task, the final results are used for the Software Requirements Specification (SRS), the most important outcome of the requirements analysis phase and the foundation between the developers and the stakeholders or customers. The proposed methodology will help to identify these conflicts easily during the initial phase of the project.

Keywords: goal oriented requirements analysis, conflicts identification model, requirements analysis, requirements engineering

Procedia PDF Downloads 129
6121 Forecasting Direct Normal Irradiation at Djibouti Using Artificial Neural Network

Authors: Ahmed Kayad Abdourazak, Abderafi Souad, Zejli Driss, Idriss Abdoulkader Ibrahim

Abstract:

In this paper, an Artificial Neural Network (ANN) is used to predict solar irradiation in Djibouti for the first time, which is useful for the integration of Concentrating Solar Power (CSP) and for site selection for new or future solar plants as part of solar energy development. An ANN algorithm was developed to establish a forward/reverse correspondence between latitude, longitude, altitude and monthly solar irradiation. For this purpose, German Aerospace Centre (DLR) data for eight Djibouti sites were used for training and testing a standard three-layer network with the Levenberg-Marquardt backpropagation algorithm. The results show very good agreement for solar irradiation prediction in Djibouti and prove that the proposed approach can serve as an efficient tool for the prediction of solar irradiation, providing helpful information for site selection, design and planning of solar plants.
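
A rough equivalent of the described three-layer network is sketched below; note that scikit-learn has no Levenberg-Marquardt trainer (the study uses MATLAB's), so L-BFGS stands in here, and the data rows are illustrative:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

# Inputs: latitude, longitude, altitude (m), month -> monthly irradiation.
# Stand-in rows; the study trains on DLR data for eight Djibouti sites.
X = np.array([[11.59, 43.15, 14, 1],
              [11.15, 42.71, 625, 1],
              [10.98, 42.86, 400, 7]], dtype=float)
y = np.array([5.8, 6.1, 6.5])   # stand-in irradiation values, kWh/m2/day

scaler = StandardScaler().fit(X)
# L-BFGS substitutes for Levenberg-Marquardt on this small 3-layer network.
net = MLPRegressor(hidden_layer_sizes=(10,), solver="lbfgs",
                   max_iter=5000).fit(scaler.transform(X), y)
print(net.predict(scaler.transform([[11.55, 42.92, 300, 7]])))
```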

Keywords: artificial neural network, solar irradiation, concentrated solar power, Levenberg-Marquardt

Procedia PDF Downloads 352
6120 Meet Automotive Software Safety and Security Standards Expectations More Quickly

Authors: Jean-François Pouilly

Abstract:

This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs: formal methods and abstract interpretation techniques identify potential weaknesses early in the development process, reducing the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter: with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency and cost savings; 3) enhance software dependability: combining static analysis using abstract interpretation with full context sensitivity and hardware memory awareness allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.

Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods

Procedia PDF Downloads 12
6119 Applying the Regression Technique for Prediction of the Acute Heart Attack

Authors: Paria Soleimani, Arezoo Neshati

Abstract:

Myocardial infarction is one of the leading causes of death in the world, and some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply. Because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort; early detection and successful treatment of these symptoms is vital to saving lives. Therefore, the importance and usefulness of designing a system to assist physicians in the early diagnosis of acute heart attacks is obvious. The purpose of this study is to determine how well a predictive model would perform based only on patient-reportable clinical history factors, without using diagnostic tests or physical exams. This type of prediction model might have application outside of the hospital setting, giving accurate advice to patients and influencing them to seek care in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical factors that can be reported by patients were studied. Three logistic regression models were built on the basis of the 28 features to predict the risk of heart attacks. The best logistic regression model in terms of performance had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, nausea, and vomiting were selected as the main features.
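
A sketch of fitting and scoring such a model follows; for a binary-outcome logistic regression, the C-index equals the ROC AUC. The simulated patient data are stand-ins, not the study's records:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

# Stand-in rows of patient-reportable factors (1 = present):
# [severe chest pain, back pain, cold sweats, shortness of breath, nausea]
rng = np.random.default_rng(1)
X = rng.integers(0, 2, size=(711, 5))
logit = X @ np.array([2.0, 0.8, 1.2, 1.0, 0.6]) - 2.5   # assumed effect sizes
y = (rng.random(711) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
prob = model.predict_proba(X)[:, 1]
print("C-index (ROC AUC):", round(roc_auc_score(y, prob), 3))
print("Accuracy:", round(model.score(X, y), 3))
```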

Keywords: coronary heart disease, acute heart attacks, prediction, logistic regression

Procedia PDF Downloads 446
6118 Gamification of a Business Intelligence Tool

Authors: Stephen Miller

Abstract:

The act of applying game mechanics and dynamics (traditionally used in video games) to business applications is being widely trialed in an effort to make conventional business software more participative, fun and engaging. This new trend, named 'gamification', has its believers and, of course, its critics, who still need convincing that the concept is an effective and beneficial business tool worthy of investment. The literature reveals that user engagement with business intelligence (BI) tools is much lower than expected and that investors are failing to get a good return on investment (ROI). A software prototype will therefore be designed and developed to add gamification to a BI tool and determine its effect upon the user engagement levels of test participants. The experimental study will be evaluated using the comprehensive User Engagement Scale (UES) to see if there are improvements in areas such as aesthetics, perceived usability, endurability, novelty, felt involvement and focused attention. The results of this unique study should demonstrate whether or not 'gamifying' a BI tool has the potential to increase an individual's motivation to use BI software more often.

Keywords: business intelligence, gamification, human computer interaction, user engagement

Procedia PDF Downloads 582
6117 A Convolution Neural Network PM-10 Prediction System Based on a Dense Measurement Sensor Network in Poland

Authors: Piotr A. Kowalski, Kasper Sapala, Wiktor Warchalowski

Abstract:

PM10 is suspended dust that primarily has a negative effect on the respiratory system: it is responsible for attacks of coughing and wheezing, asthma, or acute, violent bronchitis. Indirectly, PM10 also negatively affects the rest of the body, including increasing the risk of heart attack and stroke. Unfortunately, Poland is a country that cannot boast of good air quality, in particular due to large PM concentration levels. Therefore, based on the dense network of Airly sensors, it was decided to address the problem of predicting suspended particulate matter concentration. Due to the very complicated nature of this issue, a machine learning approach was used. For this purpose, Convolutional Neural Networks (CNNs) were adopted, these currently being the leading information processing methods in the field of computational intelligence. The aim of this research is to show the influence of particular CNN network parameters on the quality of the obtained forecast. The forecast itself is made on the basis of parameters measured by Airly sensors and is carried out for the subsequent day, hour after hour. The evaluation of the learning process for the investigated models was mostly based on the mean square error criterion; however, during model validation, a number of other quantitative evaluation methods were taken into account. The presented pollution prediction model has been verified against real weather and air pollution data taken from the Airly sensor network. The dense and distributed network of Airly measurement devices enables access to current and archival data on air pollution, temperature, suspended particulate matter PM1.0, PM2.5, and PM10, CAQI levels, as well as atmospheric pressure and air humidity. In this investigation, PM2.5 and PM10, temperature and wind information, as well as external forecasts of temperature and wind for the next 24 h, served as input data. Due to the specificity of the CNN-type network, this data is transformed into tensors and then processed. The network consists of an input layer, an output layer, and many hidden layers; in the hidden layers, convolutional and pooling operations are performed. The output of the system is a vector of 24 elements containing the prediction of PM10 concentration for the upcoming 24-hour period. Over 1000 models based on the CNN methodology were tested during the study. Several that gave the best results were selected, and a comparison was made with other models based on linear regression. The numerical tests, carried out on real 'big' data, fully confirmed the positive properties of the presented method. Models based on the CNN technique allow prediction of PM10 dust concentration with a much smaller mean square error than the currently used methods based on linear regression. What is more, the use of neural networks increased the determination coefficient (R²) by about 5 percent compared to the linear model. During the simulation, the R² coefficient was 0.92, 0.76, 0.75, 0.73, and 0.73 for the 1st, 6th, 12th, 18th, and 24th hour of prediction, respectively.
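
A minimal PyTorch sketch of such a network is given below; the 48-hour input window, channel choices, and layer sizes are illustrative assumptions, not the architecture selected in the study:

```python
import torch
import torch.nn as nn

# Inputs: recent hourly histories of [PM2.5, PM10, temperature, wind] as
# 4 channels x 48 time steps; output: the next 24 hourly PM10 values.
class PM10Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(4, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),                              # 48 -> 24
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),                              # 24 -> 12
        )
        self.head = nn.Linear(32 * 12, 24)                # 24-hour forecast vector
    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = PM10Net()
x = torch.randn(16, 4, 48)   # stand-in batch of sensor histories
print(model(x).shape)        # torch.Size([16, 24])
```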

Keywords: air pollution prediction (forecasting), machine learning, regression task, convolution neural networks

Procedia PDF Downloads 145
6116 A Machine Learning Model for Dynamic Prediction of Chronic Kidney Disease Risk Using Laboratory Data, Non-Laboratory Data, and Metabolic Indices

Authors: Amadou Wurry Jallow, Adama N. S. Bah, Karamo Bah, Shih-Ye Wang, Kuo-Chung Chu, Chien-Yeh Hsu

Abstract:

Chronic kidney disease (CKD) is a major public health challenge with high prevalence, rising incidence, and serious adverse consequences. Developing effective risk prediction models is a cost-effective approach to predicting and preventing complications of chronic kidney disease (CKD). This study aimed to develop an accurate machine learning model that can dynamically identify individuals at risk of CKD using various kinds of diagnostic data, with or without laboratory data, at different follow-up points. Creatinine is a key component used to predict CKD. These models will enable affordable and effective screening for CKD even with incomplete patient data, such as the absence of creatinine testing. This retrospective cohort study included data on 19,429 adults provided by a private research institute and screening laboratory in Taiwan, gathered between 2001 and 2015. Univariate Cox proportional hazard regression analyses were performed to determine the variables with high prognostic values for predicting CKD. We then identified interacting variables and grouped them according to diagnostic data categories. Our models used three types of data gathered at three points in time: non-laboratory, laboratory, and metabolic indices data. Next, we used subgroups of variables within each category to train two machine learning models (Random Forest and XGBoost). Our machine learning models can dynamically discriminate individuals at risk for developing CKD. All the models performed well using all three kinds of data, with or without laboratory data. Using only non-laboratory-based data (such as age, sex, body mass index (BMI), and waist circumference), both models predict chronic kidney disease as accurately as models using laboratory and metabolic indices data. Our machine learning models have demonstrated the use of different categories of diagnostic data for CKD prediction, with or without laboratory data. The machine learning models are simple to use and flexible because they work even with incomplete data and can be applied in any clinical setting, including settings where laboratory data is difficult to obtain.

Keywords: chronic kidney disease, glomerular filtration rate, creatinine, novel metabolic indices, machine learning, risk prediction

Procedia PDF Downloads 102
6115 Prediction of Dubai Financial Market Stocks Movement Using K-Nearest Neighbor and Support Vector Regression

Authors: Abdulla D. Alblooshi

Abstract:

The stock market is a representation of human behavior and psychology, such as fear, greed, and discipline, which are manifested in the form of price movements during trading sessions. Predicting stock movement and prices is therefore a challenging effort. However, trading sessions produce a large amount of data that can be utilized to train an AI agent to predict stock movement, and predicting stock market price action would be advantageous. In this paper, the stock movement data of three DFM-listed stocks are studied using historical price movements and technical indicator values, and are used to train an agent with KNN and SVM methods to predict future price movement. A simple MATLAB Toolbox script is written to process and classify the information and output the prediction. It also compares the different learning methods and parameters using metrics like RMSE, MAE, and R².
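
A minimal scikit-learn analogue of the described setup (the study itself uses MATLAB) might look like the sketch below; the features and synthetic returns are stand-ins:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Stand-in feature rows per trading day: [RSI, MACD, 10d MA, 50d MA];
# target: next-day return. The study builds these from DFM price history.
rng = np.random.default_rng(7)
X = rng.normal(size=(500, 4))
y = 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.05 * rng.normal(size=500)
X_tr, X_te, y_tr, y_te = X[:400], X[400:], y[:400], y[400:]

for name, model in [("KNN", KNeighborsRegressor(n_neighbors=5)),
                    ("SVR", SVR(kernel="rbf", C=1.0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.3f} "
          f"MAE={mean_absolute_error(y_te, pred):.3f} "
          f"R2={r2_score(y_te, pred):.3f}")
```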

Keywords: KNN, ANN, SVM, stocks, technical indicators, RSI, MACD, moving averages, RMSE, MAE

Procedia PDF Downloads 164
6114 Modeling of Water Erosion in the M'Goun Watershed Using OpenGIS Software

Authors: M. Khal, Ab. Algouti, A. Algouti

Abstract:

Water erosion is the major form of the erosion that shapes the earth's surface. Modeling water erosion requires the use of GIS software and programs, whether commercial or closed-source. The very high prices of commercial GIS licenses motivate users and researchers to find open-source software as relevant and applicable as proprietary GIS. The objective of this study is the modeling of water erosion and the hydrogeological and morphophysical characterization of the Oued M'Goun watershed (southern flank of the Central High Atlas), carried out with free GIS programs. Very pertinent results are obtained by executing tasks and algorithms in a simple and easy way. The various geoscientific and geostatistical analyses of a digital elevation model (SRTM, 30 m resolution), combined with the processing and interpretation of satellite imagery, allowed us to characterize the studied region and to map the areas most vulnerable to water erosion.

Keywords: central High-Atlas, hydrogeology, M’Goun watershed, OpenGis, water erosion

Procedia PDF Downloads 156
6113 Neuronal Networks for the Study of the Effects of Cosmic Rays on Climate Variations

Authors: Jossitt Williams Vargas Cruz, Aura Jazmín Pérez Ríos

Abstract:

The variations of solar dynamics have become a relevant topic of study due to the effects of climate change generated on the earth. One of the most disconcerting aspects of the sun's variable influence on climate is the role played by sunspots (an extra-atmospheric variable) in the modulation of cosmic rays (CR, also an extra-atmospheric variable). CRs influence the earth's climate by affecting cloud formation (an atmospheric variable), and the solar cycle's influence is associated with the presence of solar storms: when magnetic activity is greater, less CR enters the earth's atmosphere. The different methods of climate prediction in Colombia do not take the extra-atmospheric variables into account. Therefore, correlations between atmospheric and extra-atmospheric variables were studied in order to implement a Python code based on neural networks to predict the extra-atmospheric variable with the highest correlation.

Keywords: correlations, cosmic rays, sun, sunspots, variations

Procedia PDF Downloads 68
6112 A Wall Law for Two-Phase Turbulent Boundary Layers

Authors: Dhahri Maher, Aouinet Hana

Abstract:

The presence of bubbles in the boundary layer introduces corrections into the log law, which must be taken into account. In this work, a logarithmic wall law is presented for bubbly two-phase flows. The wall law is based on the postulation of additional turbulent viscosity associated with bubble wakes in the boundary layer, and it contains an empirical constant, deduced from experimental data, accounting both for shear-induced turbulence interaction and for bubble non-linearity. The wall friction prediction achieved with the wall law was compared to experimental data for a turbulent boundary layer developing on a vertical flat plate in the presence of millimetric bubbles. Very good agreement between the experimental and numerical wall friction predictions was observed, and the agreement was especially noticeable at low void fraction, where bubble-induced turbulence plays a significant role.
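
To make the correction concrete, the sketch below evaluates the single-phase log law u+ = (1/κ) ln(y+) + B alongside an illustrative additive bubble correction; the form and constant of the correction are assumptions for illustration, not the paper's fitted law:

```python
import numpy as np

KAPPA, B = 0.41, 5.2   # standard single-phase log-law constants

def u_plus(y_plus, alpha=0.0, c_b=2.0):
    """Log law with an illustrative bubble correction.

    Single phase: u+ = (1/kappa) * ln(y+) + B. In the bubbly layer, the
    wake-induced extra eddy viscosity is modelled here as an additive
    shift proportional to the void fraction alpha; c_b stands in for the
    empirical constant the paper deduces from experiments.
    """
    return np.log(y_plus) / KAPPA + B - c_b * alpha

y_plus = np.array([30.0, 100.0, 300.0])
print(u_plus(y_plus))              # single-phase profile
print(u_plus(y_plus, alpha=0.02))  # shifted profile at 2% void fraction
```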

Keywords: bubbly flows, log law, boundary layer, CFD

Procedia PDF Downloads 275