Search results for: computational techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8328

7368 Computational Assistance of the Research, Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subjects Continuity

Authors: Urbánek Jiří J., Krahulec Josef, Urbánek Jiří F., Johanidesová Jitka

Abstract:

This paper deals with computational assistance for the research and modelling of the continuity of critical infrastructure subjects. It enables the use of the prevailing MS Office environment (SmartArt, etc.) for mathematical models built with the DYVELOP (Dynamic Vector Logistics of Processes) method, which serves for the investigation and modelling of crisis situations within critical infrastructure organizations. The first part of the paper introduces the entities, operators and actors of the DYVELOP method. It uses just three operators of Boolean algebra and four types of entities: Environments, Process Systems, Cases and Controlling. The Process Systems (PrS) have five “brothers”: Management PrS, Transformation PrS, Logistic PrS, Event PrS and Operation PrS. The Cases have three “sisters”: Process Cell Case, Use Case and Activity Case. All of them require special Ctrl actors to control their functions, except the Environment (ENV), which can do without Ctrl. The model’s maps, named Blazons, can express mathematically and graphically the relationships among entities, actors and processes. In the second part of the paper, the rich blazons of the DYVELOP method are used for discovering and modelling cycling cases and their phases; the blazons are best comprehended through a live PowerPoint presentation. The crisis management of an energy critical infrastructure organization must use these cycles to cope successfully with crisis situations. Cycling through these cases several times is a necessary condition for encompassing both the emergency event and the mitigation of the organization’s damages. An uninterrupted, continuous cycling process makes crisis management fruitful and is a good indicator, and controlling actor, of organizational continuity and its possibilities for sustainable development. Reliable rules are derived for the safe and reliable continuity of an energy critical infrastructure organization in a crisis situation.

Keywords: blazons, computational assistance, DYVELOP method, critical infrastructure

Procedia PDF Downloads 375
7367 An Integrated Cognitive Performance Evaluation Framework for Urban Search and Rescue Applications

Authors: Antonio D. Lee, Steven X. Jiang

Abstract:

A variety of techniques and methods are available to evaluate cognitive performance in Urban Search and Rescue (USAR) applications. However, traditional cognitive performance evaluation techniques typically incorporate either the conscious or systematic aspect, failing to take into consideration the subconscious or intuitive aspect. This leads to incomplete measures and produces ineffective designs. In order to fill the gaps in past research, this study developed a theoretical framework to facilitate the integration of situation awareness (SA) and intuitive pattern recognition (IPR) to enhance the cognitive performance representation in USAR applications. This framework provides guidance to integrate both SA and IPR in order to evaluate the cognitive performance of the USAR responders. The application of this framework will help improve the system design.

Keywords: cognitive performance, intuitive pattern recognition, situation awareness, urban search and rescue

Procedia PDF Downloads 324
7366 Quantitative Characterization of Single Orifice Hydraulic Flat Spray Nozzle

Authors: Y. C. Khoo, W. T. Lai

Abstract:

The single orifice hydraulic flat spray nozzle was evaluated with two global imaging techniques to characterize various aspects of the resulting spray: high resolution flow visualization and Particle Image Velocimetry (PIV). A CCD camera with 29 million pixels was used to capture shadowgraph images that visualize ligament formation and collapse as well as droplet interaction. Quantitative analysis was performed to obtain sizing information for the droplets and ligaments. The same camera was then used with a PIV system to evaluate the overall velocity field of the spray, from nozzle exit to droplet discharge. The PIV images were further post-processed to determine the inclusion angle of the spray. The results from these investigations provided significant quantitative understanding of the spray structure and behavior.
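The core of a PIV measurement is cross-correlating small interrogation windows between two frames to recover the local displacement. A minimal sketch of that step, run on a synthetic image rather than real spray data (window size and imposed shift are illustrative assumptions, not values from the study):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the integer pixel shift between two interrogation
    windows via FFT-based cross-correlation, as used in PIV."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the window into negative values
    shift = []
    for p, s in zip(peak, corr.shape):
        p = int(p)
        shift.append(p if p <= s // 2 else p - s)
    return tuple(shift)  # (dy, dx)

# Synthetic "particle" image and a copy shifted by (3, 5) pixels
rng = np.random.default_rng(0)
frame = rng.random((64, 64))
shifted = np.roll(frame, (3, 5), axis=(0, 1))
print(piv_displacement(frame, shifted))  # -> (3, 5)
```

A production PIV system repeats this over a grid of windows and applies sub-pixel peak fitting; this sketch only recovers the integer shift.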

Keywords: spray, flow visualization, PIV, shadowgraph, quantitative sizing, velocity field

Procedia PDF Downloads 374
7365 Short-Term Physiological Evaluation of Augmented Reality System for Thanatophobia Psychotherapy

Authors: Kais Siala, Mohamed Kharrat, Mohamed Abid

Abstract:

Exposure therapies encourage patients to gradually begin facing their painful memories of the trauma in order to reduce fear and anxiety. In this context, virtual reality techniques are widely used for the treatment of different kinds of phobia. The particular case of the fear of death (thanatophobia) is addressed in this paper. For this purpose, we propose to simulate a Near-Death Experience (NDE) using augmented reality techniques, and in particular the Out-of-Body Experience (OBE), which is the first stage of an NDE. In this paper, we present the technical aspects of this simulation as well as its short-term impact in terms of physiological measures. The non-linear Poincaré plot is used to describe the difference in Heart Rate Variability (HRV) between In-Body and Out-of-Body conditions.
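The Poincaré plot mentioned above plots each RR interval against the next one; its standard descriptors SD1 (short-term variability) and SD2 (long-term variability) are the spreads perpendicular to and along the identity line. A minimal sketch of that computation, on invented RR intervals rather than the study's recordings:

```python
import math

def poincare_sd1_sd2(rr_ms):
    """SD1/SD2 descriptors of a Poincaré plot built from successive
    RR intervals (ms): SD1 captures short-term HRV, SD2 long-term."""
    x = rr_ms[:-1]          # RR(n)
    y = rr_ms[1:]           # RR(n+1)
    n = len(x)
    # Spread perpendicular to (SD1) and along (SD2) the identity line
    d1 = [(yi - xi) / math.sqrt(2) for xi, yi in zip(x, y)]
    d2 = [(yi + xi) / math.sqrt(2) for xi, yi in zip(x, y)]
    def sd(v):
        m = sum(v) / n
        return math.sqrt(sum((vi - m) ** 2 for vi in v) / (n - 1))
    return sd(d1), sd(d2)

rr = [800, 810, 790, 820, 805, 795, 815, 800]  # illustrative intervals
sd1, sd2 = poincare_sd1_sd2(rr)
print(round(sd1, 1), round(sd2, 1))
```

Comparing SD1/SD2 between the In-Body and Out-of-Body recordings is the kind of contrast the abstract describes.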

Keywords: Out-of-Body simulation, physiological measure, augmented reality, phobia psychotherapy, HRV, Poincaré plot

Procedia PDF Downloads 301
7364 Linking Excellence in Biomedical Knowledge and Computational Intelligence Research for Personalized Management of Cardiovascular Diseases within Personal Health Care

Authors: T. Rocha, P. Carvalho, S. Paredes, J. Henriques, A. Bianchi, V. Traver, A. Martinez

Abstract:

The main goal of the LiNK project is to join competences in intelligent processing in order to create a research ecosystem addressing two central scientific and technical challenges for personal health care (PHC) deployment: i) how to merge clinical evidence knowledge into computational decision support systems for PHC management, and ii) how to provide personalized services, i.e., solutions adapted to the specific user's needs and characteristics. The goal of one of the work packages (WP2), designated Sustainable Linking and Synergies for Excellence, is the definition, implementation and coordination of the activities necessary to create and strengthen durable links between the LiNK partners. This work focuses on the strategy followed to define the Research Tracks (RTs), which will support a set of actions pursued along the LiNK project. These include common research activities, knowledge transfer among the researchers of the consortium, and PhD student and post-doc co-advisement. Moreover, the RTs will establish the basis for the definition of concepts and their evolution into project proposals.

Keywords: LiNK Twin European Project, personal health care, cardiovascular diseases, research tracks

Procedia PDF Downloads 214
7363 CFD Simulation on Gas Turbine Blade and Effect of Twisted Hole Shape on Film Cooling Effectiveness

Authors: Thulodin Mat Lazim, Aminuddin Saat, Ammar Fakhir Abdulwahid, Zaid Sattar Kareem

Abstract:

Film cooling is one of the cooling systems investigated for application to gas turbine blades. Gas turbines use film cooling, in addition to internal turbulence cooling, to protect the blades' outer surface from hot gases. The present study concentrates on the numerical investigation of film cooling performance for a row of twisted cylindrical holes in a modern turbine blade. The adiabatic film effectiveness and the heat transfer coefficient are determined numerically on a flat plate downstream of a row of inclined hole exits with different cross-sectional areas, using Computational Fluid Dynamics (CFD). The swirling motion of the film coolant is induced by the twist of the film cooling holes, which are inclined at an angle α to the vertical direction and to the blade surface. The hole angle α relative to the impinging mainstream was varied among 90°, 65°, 45°, 30° and 20°. The film cooling effectiveness on the turbine blade surface was evaluated using 3D CFD. Results showed that the twisted hole of rectangular cross-section has the highest effectiveness among the cross-sections considered, at blowing ratios of 0.5, 1, 1.5 and 2.
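For reference, the adiabatic film effectiveness evaluated above is conventionally defined as η = (T_gas − T_aw)/(T_gas − T_coolant). A tiny sketch with illustrative temperatures (not values from the study's CFD results):

```python
def film_effectiveness(t_gas, t_aw, t_coolant):
    """Adiabatic film cooling effectiveness:
    eta = (T_gas - T_aw) / (T_gas - T_coolant).
    eta = 1 -> wall sees pure coolant; eta = 0 -> no protection."""
    return (t_gas - t_aw) / (t_gas - t_coolant)

# Illustrative temperatures in kelvin (invented for the example)
print(film_effectiveness(t_gas=1600.0, t_aw=1150.0, t_coolant=700.0))  # -> 0.5
```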

Keywords: turbine blade cooling, film cooling, geometry shape of hole, turbulent flow

Procedia PDF Downloads 534
7362 Virtual Experiments on Coarse-Grained Soil Using X-Ray CT and Finite Element Analysis

Authors: Mohamed Ali Abdennadher

Abstract:

Digital rock physics, an emerging field leveraging advanced imaging and numerical techniques, offers a promising approach to investigating the mechanical properties of granular materials without extensive physical experiments. This study focuses on using X-Ray Computed Tomography (CT) to capture the three-dimensional (3D) structure of coarse-grained soil at the particle level, combined with finite element analysis (FEA) to simulate the soil's behavior under compression. The primary goal is to establish a reliable virtual testing framework that can replicate laboratory results and offer deeper insights into soil mechanics. The methodology involves acquiring high-resolution CT scans of coarse-grained soil samples to visualize internal particle morphology. These CT images undergo processing through noise reduction, thresholding, and watershed segmentation techniques to isolate individual particles, preparing the data for subsequent analysis. A custom Python script is employed to extract particle shapes and conduct a statistical analysis of particle size distribution. The processed particle data then serves as the basis for creating a finite element model comprising approximately 500 particles subjected to one-dimensional compression. The FEA simulations explore the effects of mesh refinement and friction coefficient on stress distribution at grain contacts. A multi-layer meshing strategy is applied, featuring finer meshes at inter-particle contacts to accurately capture mechanical interactions and coarser meshes within particle interiors to optimize computational efficiency. Despite the known challenges in parallelizing FEA to high core counts, this study demonstrates that an appropriate domain-level parallelization strategy can achieve significant scalability, allowing simulations to extend to very high core counts. 
The results show a strong correlation between the finite element simulations and laboratory compression test data, validating the effectiveness of the virtual experiment approach. Detailed stress distribution patterns reveal that soil compression behavior is significantly influenced by frictional interactions, with frictional sliding, rotation, and rolling at inter-particle contacts being the primary deformation modes under low to intermediate confining pressures. These findings highlight that CT data analysis combined with numerical simulations offers a robust method for approximating soil behavior, potentially reducing the need for physical laboratory experiments.

Keywords: X-Ray computed tomography, finite element analysis, soil compression behavior, particle morphology

Procedia PDF Downloads 16
7361 A New Multi-Target, Multi-Agent Search and Rescue Path Planning Approach

Authors: Jean Berger, Nassirou Lo, Martin Noel

Abstract:

Perfectly suited for natural or man-made emergency and disaster management situations such as floods, earthquakes, tornadoes or tsunamis, multi-target search path planning for a team of rescue agents is known to be computationally hard, and most techniques developed so far fall short of successfully estimating the optimality gap. A novel mixed-integer linear programming (MIP) formulation is proposed to optimally solve the multi-target, multi-agent discrete search and rescue (SAR) path planning problem. Aimed at maximizing the cumulative probability of successful target detection, it captures anticipated feedback information associated with possible observation outcomes resulting from projected path execution, while modeling agent discrete actions over all possible moving directions. Problem modeling further takes advantage of a network representation to encompass decision variables, expedite compact constraint specification, and lead to substantial problem-solving speed-up. The proposed MIP approach uses the CPLEX optimization machinery, efficiently computing near-optimal solutions for practical-size problems, while giving a robust upper bound obtained from Lagrangian relaxation of the integrality constraints. Should a target eventually be positively detected during plan execution, a new problem instance would simply be reformulated from the current state and then solved over the next decision cycle. A computational experiment shows the feasibility and the value of the proposed approach.
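To illustrate the objective being maximized, here is a toy brute-force version for a single agent on a 2x2 grid. This is not the paper's MIP/CPLEX formulation, and the priors and detection probability are invented, but it shows how cumulative detection probability accrues along a search path:

```python
from itertools import product

# Prior probability that the (single) target is in each cell, and the
# probability of detecting it when searching the cell it occupies.
prior = {(0, 0): 0.1, (0, 1): 0.4, (1, 0): 0.2, (1, 1): 0.3}
p_detect = 0.8
moves = [(0, 1), (0, -1), (1, 0), (-1, 0), (0, 0)]

def success_probability(path):
    """Cumulative probability of detecting the target along the path.
    Repeated visits keep discounting the cell's remaining mass."""
    remaining = dict(prior)
    total = 0.0
    for cell in path:
        total += remaining[cell] * p_detect
        remaining[cell] *= (1 - p_detect)
    return total

def best_path(start, horizon):
    """Enumerate all feasible move sequences (tiny instances only;
    the MIP formulation scales where enumeration cannot)."""
    best, best_p = None, -1.0
    for seq in product(moves, repeat=horizon):
        pos, path, ok = start, [start], True
        for dy, dx in seq:
            pos = (pos[0] + dy, pos[1] + dx)
            if pos not in prior:
                ok = False
                break
            path.append(pos)
        if ok:
            p = success_probability(path)
            if p > best_p:
                best, best_p = path, p
    return best, best_p

path, p = best_path((0, 0), horizon=3)
print(path, round(p, 3))
```

With these numbers the best 3-move plan visits all four cells once, achieving the maximum cumulative detection probability of 0.8.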

Keywords: search path planning, search and rescue, multi-agent, mixed-integer linear programming, optimization

Procedia PDF Downloads 366
7360 Crashworthiness Optimization of an Automotive Front Bumper in Composite Material

Authors: S. Boria

Abstract:

In recent years, it has become possible to improve the crashworthiness of an automotive body structure from the very beginning of the design stage, thanks to the development of specific optimization tools. It is well known that finite element codes can help the designer investigate the crashing performance of structures under dynamic impact. Therefore, by coupling nonlinear mathematical programming procedures and statistical techniques with FE simulations, it is possible to optimize the design with a reduced number of analytical evaluations. In engineering applications, optimization methods that are based on statistical techniques and utilize estimated models, called meta-models, are quickly spreading. A meta-model is an approximation of a detailed simulation model based on a dataset of inputs identified by the design of experiments (DOE); the number of simulations needed to build it depends on the number of variables. Among the various meta-modeling techniques, the Kriging method stands out in accuracy, robustness and efficiency when applied to crashworthiness optimization. Such a meta-model was therefore applied in this work in order to improve the structural optimization of a composite bumper for a racing car subjected to frontal impact. The specific energy absorption is the objective function to maximize, and the geometrical parameters, subject to some design constraints, are the design variables. The LS-DYNA code was interfaced with the LS-OPT tool in order to find the optimized solution through a domain reduction strategy. With the use of the Kriging meta-model, the crashworthiness characteristics of the composite bumper were improved.

Keywords: composite material, crashworthiness, finite element analysis, optimization

Procedia PDF Downloads 251
7359 Constraint-Based Computational Modelling of Bioenergetic Pathway Switching in Synaptic Mitochondria from Parkinson's Disease Patients

Authors: Diana C. El Assal, Fatima Monteiro, Caroline May, Peter Barbuti, Silvia Bolognin, Averina Nicolae, Hulda Haraldsdottir, Lemmer R. P. El Assal, Swagatika Sahoo, Longfei Mao, Jens Schwamborn, Rejko Kruger, Ines Thiele, Kathrin Marcus, Ronan M. T. Fleming

Abstract:

Degeneration of substantia nigra pars compacta dopaminergic neurons is one of the hallmarks of Parkinson's disease. These neurons have a highly complex axonal arborisation and a high energy demand, so any reduction in ATP synthesis could lead to an imbalance between supply and demand, thereby impeding normal neuronal bioenergetic requirements. Synaptic mitochondria exhibit increased vulnerability to dysfunction in Parkinson's disease. After biogenesis in and transport from the cell body, synaptic mitochondria become highly dependent upon oxidative phosphorylation. We applied a systems biochemistry approach to identify the metabolic pathways used by neuronal mitochondria for energy generation. The mitochondrial component of an existing manual reconstruction of human metabolism was extended with manual curation of the biochemical literature and specialised using omics data from Parkinson's disease patients and controls, to generate reconstructions of synaptic and somal mitochondrial metabolism. These reconstructions were converted into stoichiometrically and flux-consistent constraint-based computational models. These models predict that Parkinson's disease is accompanied by an increase in the rate of glycolysis and a decrease in the rate of oxidative phosphorylation within synaptic mitochondria. This is consistent with independent experimental reports of a compensatory switching of bioenergetic pathways in the putamen of post-mortem Parkinson's disease patients. Ongoing work, in the context of the SysMedPD project, is aimed at the computational prediction of mitochondrial drug targets to slow the progression of neurodegeneration in the subset of Parkinson's disease patients with overt mitochondrial dysfunction.

Keywords: bioenergetics, mitochondria, Parkinson's disease, systems biochemistry

Procedia PDF Downloads 289
7358 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life problems in order to solve them; it is a discipline that aids decision makers who are faced with conflicting alternatives in making an optimal decision. MCDM problems can be classified into two main categories, based on purpose and data type: Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM). Although various MADM techniques have been developed for the problems encountered, their methodology is limited in modelling real life; moreover, objective results are hard to obtain, and findings are generally derived from subjective data. Newer, modified techniques built on approaches such as fuzzy logic are better at modelling real life, but their complex structure makes them hard to apply, so these comprehensive techniques have not found a place in real-world applications. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an information-based approach. For this purpose, a detailed literature review has been conducted, and current approaches have been analyzed along with their advantages and disadvantages. The approach is then introduced: performance values of the criteria are calculated in two steps, first by determining the distribution of each attribute and standardizing the values, then by calculating the information of each attribute as informational energy.
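The informational energy of an attribute (in Onicescu's sense) can be sketched as follows: normalize the attribute's performance values into a distribution and sum the squared probabilities. The values below are invented for illustration:

```python
def informational_energy(values):
    """Informational energy of an attribute: normalize the performance
    values into a distribution p and return sum(p_i^2). Uniform values
    give the minimum 1/n; concentration on few alternatives raises it."""
    total = sum(values)
    p = [v / total for v in values]
    return sum(pi * pi for pi in p)

uniform = informational_energy([5, 5, 5, 5])   # evenly spread attribute
skewed = informational_energy([9, 1, 1, 1])    # concentrated attribute
print(uniform, round(skewed, 3))  # -> 0.25 0.583
```

An attribute whose values barely discriminate between alternatives stays near the minimum 1/n, while a discriminating attribute carries more "information" under this measure.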

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 218
7357 Effect of Relaxation Techniques on Immunological Properties of Breast Milk

Authors: Ahmed Ali Torad

Abstract:

Background: Breast feeding maintains the maternal-fetal immunological link, favours the transmission of immune competence from the mother to her infant, and is considered an important contributory factor in the neonatal immune defense system. Purpose: This study was conducted to investigate the effect of relaxation techniques on the immunological properties of breast milk. Subjects and Methods: Thirty breast-feeding mothers, each with a single, mature infant without any complications, participated in the study. Subjects were recruited from the outpatient clinic of the obstetrics department of El Kasr El-Aini university hospital in Cairo. Mothers were randomly divided, by coin toss, into two equal groups: Group A (relaxation training group, experimental), composed of 15 women who received a relaxation training program in addition to breast feeding and nutritional advice, and Group B (control), composed of 15 women who received breast feeding and nutritional advice only. Results: The mean mother's age was 28.4 ± 3.68 and 28.07 ± 4.09 years for groups A and B respectively. There were statistically significant differences between pre- and post-intervention values for cortisol level, IgA level, leucocyte count, and infant's weight and height, and statistically significant differences between the two groups only in the post-intervention values of the immunological variables (cortisol, IgA, leucocyte count). Conclusion: We conclude that relaxation techniques have a statistically significant effect on the immunological properties of breast milk.

Keywords: relaxation, breast, milk, immunology, lactation

Procedia PDF Downloads 112
7356 Comparison Between Fuzzy and P&O Control for MPPT for Photovoltaic System Using Boost Converter

Authors: M. Doumi, A. Miloudi, A. G. Aissaoui, K. Tahir, C. Belfedal, S. Tahir

Abstract:

Studies on photovoltaic systems are increasing rapidly because they represent a large, secure, essentially inexhaustible and broadly available resource as a future energy supply. However, the output power of photovoltaic modules is influenced by the intensity of solar radiation, the temperature of the solar cells, and so on. Therefore, to maximize the efficiency of the photovoltaic system, it is necessary to track the maximum power point of the PV array; for this, a Maximum Power Point Tracking (MPPT) technique is used. Among the available MPPT techniques are perturb and observe (P&O) and the fuzzy logic controller (FLC). The fuzzy control method was compared with the P&O method, one of the most widely used conventional methods in this area. Both techniques were analyzed and simulated. MPPT using fuzzy logic shows superior performance and more reliable control than the P&O technique for this application.
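The P&O logic compared here is simple enough to sketch in a few lines: perturb the operating voltage, observe the power change, and reverse direction when power drops. The PV curve below is a toy stand-in (a parabola with its maximum at an assumed 17 V), not a real module model:

```python
def pv_power(v):
    """Toy PV power curve with its maximum power point at V = 17.0 V.
    (Illustrative only; a real model would use the diode equations.)"""
    return max(0.0, 60.0 - 0.5 * (v - 17.0) ** 2)

def perturb_and_observe(v0=10.0, step=0.2, iterations=100):
    """Classic P&O: perturb the operating voltage, observe the power
    change, and keep moving in the direction that increased power."""
    v, direction = v0, +1
    p_prev = pv_power(v)
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()
print(round(v_mpp, 1))  # oscillates near the true MPP at 17 V
```

The steady-state oscillation around the MPP visible here is exactly the drawback that fuzzy-logic MPPT aims to reduce by adapting the step size.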

Keywords: photovoltaic system, MPPT, perturb and observe, fuzzy logic

Procedia PDF Downloads 595
7355 Dynamic Bandwidth Allocation in Fiber-Wireless (FiWi) Networks

Authors: Eman I. Raslan, Haitham S. Hamza, Reda A. El-Khoribi

Abstract:

Fiber-Wireless (FiWi) networks are a promising candidate for future broadband access networks. These networks combine an optical back end, where different passive optical network (PON) technologies are realized, with a wireless front end, where different wireless technologies are adopted, e.g. LTE, WiMAX, Wi-Fi, and Wireless Mesh Networks (WMNs). The convergence of optical and wireless technologies requires designing architectures with robust, efficient and effective bandwidth allocation schemes. Different bandwidth allocation algorithms have been proposed for FiWi networks, aiming to enhance the different segments of these networks, including the wireless and optical subnetworks. In this survey, we differentiate between the bandwidth allocation algorithms according to the segment of the FiWi network they enhance, classifying them into wireless, optical and hybrid bandwidth allocation techniques.
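As one concrete flavor of the optical-segment algorithms surveyed, a generic limited-service grant rule (in the spirit of IPACT-style DBA for PONs; the parameters below are illustrative assumptions, not drawn from the survey) can be sketched as:

```python
def limited_service_grants(requests, cycle_capacity, max_grant):
    """Generic limited-service DBA sketch: each ONU is granted at most
    max_grant slots per cycle, and never more than what remains of the
    cycle capacity. Ungranted demand waits for the next cycle."""
    grants, remaining = [], cycle_capacity
    for req in requests:
        g = min(req, max_grant, remaining)
        grants.append(g)
        remaining -= g
    return grants

# Three ONUs report queue sizes (in slots); the cycle holds 20 slots
print(limited_service_grants([12, 3, 9], cycle_capacity=20, max_grant=8))
# -> [8, 3, 8]
```

Real DBA schemes add excess redistribution, inter-ONU fairness and, in FiWi settings, coordination with the wireless scheduler; this sketch shows only the grant-capping idea.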

Keywords: fiber-wireless (FiWi), dynamic bandwidth allocation (DBA), passive optical networks (PON), media access control (MAC)

Procedia PDF Downloads 527
7354 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks can help in making the decision as to whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles and the cancer. Principal component analysis is used to reduce the image size and to extract features. These features then serve as inputs to a neural network capable of deciding whether the eye is cancerous or not, through the experience it acquires over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, “Mile Research”, while the abnormal ones were obtained from another database, “eyecancer”. The experimental results for the proposed system show 100% accuracy in detecting cancer and making the right decision.
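The Prewitt step of the pipeline convolves the image with two 3x3 kernels and takes the gradient magnitude. A self-contained sketch on a tiny synthetic image (not the databases used in the paper):

```python
def prewitt_edges(img):
    """Gradient magnitude via the two Prewitt kernels; img is a 2-D
    list of gray levels. Border pixels are left at zero."""
    kx = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]]   # horizontal gradient
    ky = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]]   # vertical gradient
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            gx = sum(kx[a][b] * img[i + a - 1][j + b - 1]
                     for a in range(3) for b in range(3))
            gy = sum(ky[a][b] * img[i + a - 1][j + b - 1]
                     for a in range(3) for b in range(3))
            out[i][j] = (gx * gx + gy * gy) ** 0.5
    return out

# Synthetic image: dark left half, bright right half -> vertical edge
img = [[0] * 3 + [9] * 3 for _ in range(5)]
edges = prewitt_edges(img)
print(edges[2])  # -> [0.0, 0.0, 27.0, 27.0, 0.0, 0.0]
```

In the paper's pipeline, the resulting edge map feeds the segmentation of the iris and sclera circles before PCA feature extraction.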

Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera

Procedia PDF Downloads 500
7353 [Keynote Talk]: Applying p-Balanced Energy Technique to Solve Liouville-Type Problems in Calculus

Authors: Lina Wu, Ye Li, Jia Liu

Abstract:

We are interested in solving Liouville-type problems to explore constancy properties for maps or differential forms on Riemannian manifolds. Geometric structures on manifolds, the existence of constancy properties for maps or differential forms, and the energy growth of maps or differential forms are intertwined. In this article, we concentrate on discovering solutions to Liouville-type problems where the manifolds are Euclidean spaces (i.e. flat Riemannian manifolds) and the maps become real-valued functions. Liouville-type results on vanishing properties for functions are obtained. The original contribution of our research is to extend the q-energy of a function from finite in Lq space to infinite in non-Lq space by applying the p-balanced technique with q = p = 2. Tools such as Hölder's inequality and tests for series are used to evaluate limits and integrals of function energy. The calculation ideas and computational techniques for solving Liouville-type problems shown in this article, which are applied in Euclidean spaces, can be generalized into a successful algorithm that works for both maps and differential forms on Riemannian manifolds. This algorithm has a far-reaching impact on the research of Liouville-type problems in general settings involving infinite energy. The p-balanced technique in this algorithm provides a clue to success on the road to q-energy extension from finite to infinite.
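As a sketch of the kind of estimate involved (our notation, not reproduced from the paper): with the q-energy of a function u over a ball B_R defined as usual, Hölder's inequality with the exponent pair (q, q/(q−1)) bounds the 1-energy by the q-energy and the volume of the ball:

```latex
\[
E_q(u; B_R) = \int_{B_R} |\nabla u|^{q} \, dx,
\qquad
\int_{B_R} |\nabla u| \, dx
  \le \Big( \int_{B_R} |\nabla u|^{q} \, dx \Big)^{1/q} \, |B_R|^{1 - 1/q},
\quad q = p = 2 .
\]
```

When E_q stays controlled while R grows, estimates of this type force the gradient to vanish in the limit, which is the flavor of the Liouville-type vanishing results described above.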

Keywords: differential forms, Hölder inequality, Liouville-type problems, p-balanced growth, p-harmonic maps, q-energy growth, tests for series

Procedia PDF Downloads 227
7352 Multidimensional Modeling of Solidification Process of Multi-Crystalline Silicon under Magnetic Field for Solar Cell Technology

Authors: Mouhamadou Diop, Mohamed I. Hassan

Abstract:

Molten metal flow in a metallurgical plant is highly turbulent and presents a complex coupling with heat transfer, phase transfer, chemical reactions, momentum transport, etc. Molten silicon flow has a significant effect on the directional solidification of multicrystalline silicon, affecting the temperature field and the emerging crystallization interface as well as the transport of species and impurities during the casting process. Owing to the complexity and limits of reliable measuring techniques, computational models of fluid flow are useful tools to study and quantify these problems. The overall objective of this study is to investigate the potential of a traveling magnetic field for efficient operating control of the molten metal flow. A multidimensional numerical model is developed for the calculation of the Lorentz force, the molten metal flow, and the related phenomena, and is applied to a laboratory-scale silicon crystallization furnace. The model is used to study the effects of the magnetic force on the molten flow and their interdependencies. Coupled and decoupled, steady and unsteady models of the molten flow and the crystallization interface are compared. This study will allow us to retrieve the optimal traveling-magnetic-field parameter range for crystallization furnaces and the optimal numerical simulation strategy for industrial applications.

Keywords: multidimensional, numerical simulation, solidification, multicrystalline, traveling magnetic field

Procedia PDF Downloads 241
7351 Non-Centrifugal Cane Sugar Production: Heat Transfer Study to Optimize the Use of Energy

Authors: Fabian Velasquez, John Espitia, Henry Hernadez, Sebastian Escobar, Jader Rodriguez

Abstract:

Non-centrifugal cane sugar (NCS) is a concentrated product obtained through the evaporation of the water content of sugarcane juice in open heat exchangers (OE). The heat supplied to the evaporation stages is obtained from cane bagasse through the thermochemical process of combustion, and the thermal energy released is transferred to the OE by the flue gas. Therefore, optimization of energy usage becomes essential for the proper design of the production process. To optimize energy use, it is necessary to model and simulate the heat transfer between the combustion gases and the juice, and to understand the major mechanisms involved. The main objective of this work was to simulate the heat transfer phenomena between the flue gas and the open heat exchangers using a Computational Fluid Dynamics (CFD) model. The simulation results were compared with field measurements. The numerical temperature profiles along the flue gas pipeline at the measurement points are in good accordance with the field data. This study could therefore be of special interest for the design of the NCS production process and the optimization of its energy use.

Keywords: mathematical modeling, design variables, computational fluid dynamics, overall thermal efficiency

Procedia PDF Downloads 120
7350 A Finite Element/Finite Volume Method for Dam-Break Flows over Deformable Beds

Authors: Alia Alghosoun, Ashraf Osman, Mohammed Seaid

Abstract:

A coupled two-layer finite volume/finite element method is proposed for solving the dam-break flow problem over deformable beds. The governing equations consist of the well-balanced two-layer shallow water equations for the water flow and a linear elastic model for the bed deformations. Deformations in the topography can be caused by a brutal localized force or simply by a class of sliding displacements on the bathymetry. This deformation of the bed is a source of perturbations on the water surface, generating water waves which propagate with different amplitudes and frequencies. Coupling conditions at the interface are also investigated in the current study, and a two-mesh procedure is proposed for the transfer of information through the interface. In the present work, a new procedure is implemented at the soil-water interface using the finite element and two-layer finite volume meshes, with a conservative distribution of the forces at their intersections. The finite element method employs quadratic elements on an unstructured triangular mesh, and the finite volume method uses the Rusanov scheme to reconstruct the numerical fluxes. The coupled numerical method is highly efficient, accurate and well balanced, and it can handle complex geometries as well as rapidly varying flows. Numerical results are presented for several test examples of dam-break flows over deformable beds. A mesh convergence study is performed for both methods; the overall model provides new insight into these problems at minimal computational cost.
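The Rusanov (local Lax-Friedrichs) flux used in the finite volume part can be sketched for the single-layer 1-D shallow water equations as follows (a minimal illustration, not the authors' two-layer, well-balanced implementation):

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def physical_flux(h, hu):
    """Flux of the 1-D shallow water equations, with state U = (h, hu)."""
    u = hu / h
    return (hu, hu * u + 0.5 * G * h * h)

def rusanov_flux(ul, ur):
    """Rusanov (local Lax-Friedrichs) numerical flux between the left
    and right states ul = (h, hu) and ur = (h, hu)."""
    fl, fr = physical_flux(*ul), physical_flux(*ur)
    # Largest wave speed |u| + sqrt(g h) over the two states
    s = max(abs(ul[1] / ul[0]) + math.sqrt(G * ul[0]),
            abs(ur[1] / ur[0]) + math.sqrt(G * ur[0]))
    return tuple(0.5 * (a + b) - 0.5 * s * (qr - ql)
                 for a, b, ql, qr in zip(fl, fr, ul, ur))

# Consistency check: with equal states the numerical flux reduces to
# the physical flux, a basic requirement of any conservative scheme.
state = (2.0, 1.0)
print(rusanov_flux(state, state), physical_flux(*state))
```

The dissipation term proportional to s is what keeps the scheme stable across the sharp fronts generated by a dam break.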

Keywords: dam-break flows, deformable beds, finite element method, finite volume method, hybrid techniques, linear elasticity, shallow water equations

Procedia PDF Downloads 174
7349 Use of Multivariate Statistical Techniques for Water Quality Monitoring Network Assessment, Case Study: Jequetepeque River Basin

Authors: Jose Flores, Nadia Gamboa

Abstract:

Proper water quality management requires the establishment of a monitoring network. Therefore, evaluation of the efficiency of water quality monitoring networks is needed to ensure high-quality data collection of critical chemical quality parameters. Unfortunately, in some Latin American countries, water quality monitoring programs are not sustainable in terms of recording historical data or covering environmentally representative sites, wasting time, money, and valuable information. In this study, multivariate statistical techniques, such as principal component analysis (PCA) and hierarchical cluster analysis (HCA), are applied to identify the most significant monitoring sites as well as the critical water quality parameters in the monitoring network of the Jequetepeque River basin in northern Peru. The Jequetepeque River basin, like others in Peru, shows socio-environmental conflicts due to the economic activities developed in the area. Water pollution by trace elements in the upper part of the basin is mainly related to mining activity, while agricultural land lost to salinization is caused by the extensive use of groundwater in the lower part of the basin. Since the 1980s, the water quality in the basin has been assessed intermittently by public and private organizations, and recently the National Water Authority established permanent water quality networks in 45 basins in Peru. Although many countries use multivariate statistical techniques for assessing water quality monitoring networks, these instruments have never been applied for that purpose in Peru. For this reason, the main contribution of this study is to demonstrate that the application of multivariate statistical techniques can serve as an instrument for optimizing monitoring networks, using the fewest monitoring sites and the most significant water quality parameters, which would reduce costs and improve water quality management in Peru.
The main socio-economic activities and the principal stakeholders related to water management in the basin are also identified. Finally, water quality management programs are discussed in terms of their efficiency and sustainability.
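
The PCA step described above can be sketched with an SVD-based implementation; the data below are synthetic stand-ins for the basin's site-by-parameter matrix, and the HCA step is not shown:

```python
import numpy as np

def pca(X, n_components=2):
    """Principal component analysis via SVD.
    Rows of X are monitoring sites, columns are water quality parameters."""
    Xc = X - X.mean(axis=0)                  # center each parameter
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T        # site coordinates on the PCs
    explained = S ** 2 / np.sum(S ** 2)      # variance ratio per component
    return scores, Vt[:n_components], explained[:n_components]

# Hypothetical data: 10 sites x 5 parameters (e.g., pH, EC, As, Cu, SO4)
rng = np.random.default_rng(42)
X = rng.normal(size=(10, 5))
scores, loadings, ratio = pca(X)
print(scores.shape, ratio)  # (10, 2) and the variance explained by PC1, PC2
```

Sites with similar scores are candidates for consolidation, and parameters with small loadings on the leading components are candidates for removal from the monitoring program.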

Keywords: PCA, HCA, Jequetepeque, multivariate statistical techniques

Procedia PDF Downloads 349
7348 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security

Authors: Shanshan Zhu, Mohammad Nasim

Abstract:

Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard-coded rules have become ineffective against the evolving patterns of fraud seen in recent times. Against this background, the present study probes distinct methodologies that exploit emerging AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, such as gradient boosting, random forests, and neural networks, and recommend integrating these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. To overcome the data challenge, we designed synthetic data and then conducted pattern recognition as well as unsupervised and supervised learning analyses on the transaction data to identify suspicious activities. Using actual financial statistics, we compare the performance of our model against conventional models in terms of accuracy, speed, and adaptability. The results of this study illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks that are highly relevant to the financial domain, and they underscore the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
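
The unsupervised anomaly detection step can be illustrated with a deliberately simple baseline; a z-score rule like the one below is a stand-in for the GAN/GNN detectors compared in the study, and the transaction amounts are invented:

```python
import numpy as np

def zscore_anomalies(amounts, threshold=2.5):
    """Flag transactions whose amount lies more than `threshold` standard
    deviations from the mean. A deliberately simple unsupervised stand-in
    for the GAN/GNN detectors compared in the study."""
    amounts = np.asarray(amounts, dtype=float)
    z = np.abs(amounts - amounts.mean()) / amounts.std()
    return np.where(z > threshold)[0]

# Nine ordinary card payments and one outlier
tx = [12.0, 9.5, 11.2, 10.8, 500.0, 10.1, 9.9, 11.5, 10.4, 10.7]
print(zscore_anomalies(tx))  # flags index 4, the 500.0 transaction
```

Real-time systems apply the same idea with richer features (velocity, merchant graph structure, device fingerprints) and learned, rather than fixed, decision boundaries.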

Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection

Procedia PDF Downloads 19
7347 Calculation of Orbital Elements for Sending Interplanetary Probes

Authors: Jorge Lus Nisperuza Toledo, Juan Pablo Rubio Ospina, Daniel Santiago Umana, Hector Alejandro Alvarez

Abstract:

This work develops and implements computational codes to calculate optimal launch trajectories for sending a probe from the Earth to different planets of the Solar System, making use of Hohmann and non-Hohmann transfer trajectories and gravitational assistance at intermediate steps. Specifically, the orbital elements, plots, and dynamic simulations of the trajectories for sending a probe from the Earth towards the planets Mercury, Venus, Mars, Jupiter, and Saturn are obtained. A detailed study was made of the position and orbital velocity state vectors of the planets considered in order to determine the optimal trajectories of the probe. For this purpose, computer codes were developed and implemented to obtain the orbital elements of the Mariner 10 (Mercury), Magellan (Venus), Mars Global Surveyor (Mars), and Voyager 1 (Jupiter and Saturn) missions, as an exercise in corroborating the algorithms. This exercise validates the computational codes, allowing us to find the orbital elements and trajectory simulations of three future interplanetary missions with specific launch windows.
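
The Hohmann transfer calculation at the core of such codes reduces to a few lines. The sketch below uses the standard two-body formulas with mean heliocentric orbital radii (planetary ephemerides and gravity assists, as used in the study, are not modeled):

```python
import math

MU_SUN = 1.32712440018e11  # Sun's gravitational parameter [km^3/s^2]

def hohmann_dv(r1, r2, mu=MU_SUN):
    """Total delta-v [km/s] for a Hohmann transfer between circular,
    coplanar orbits of radii r1, r2 [km]."""
    a = 0.5 * (r1 + r2)                        # transfer ellipse semi-major axis
    v1 = math.sqrt(mu / r1)                    # circular speed at departure
    v2 = math.sqrt(mu / r2)                    # circular speed at arrival
    vp = math.sqrt(mu * (2.0 / r1 - 1.0 / a))  # transfer speed at r1
    va = math.sqrt(mu * (2.0 / r2 - 1.0 / a))  # transfer speed at r2
    return abs(vp - v1) + abs(v2 - va)

def transfer_time(r1, r2, mu=MU_SUN):
    """Time of flight [s]: half the period of the transfer ellipse."""
    a = 0.5 * (r1 + r2)
    return math.pi * math.sqrt(a ** 3 / mu)

# Earth -> Mars with mean heliocentric orbital radii
r_earth, r_mars = 1.496e8, 2.279e8  # km
print(round(hohmann_dv(r_earth, r_mars), 2))          # ~5.6 km/s total
print(round(transfer_time(r_earth, r_mars) / 86400))  # ~259 days
```

The time-of-flight result is what fixes the launch window: departure must be timed so the target planet arrives at the transfer ellipse's far end at the same moment as the probe.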

Keywords: gravitational assistance, Hohmann’s trajectories, interplanetary mission, orbital elements

Procedia PDF Downloads 177
7346 Auteur 3D Filmmaking: From Hitchcock’s Protrusion Technique to Godard’s Immersion Aesthetic

Authors: Delia Enyedi

Abstract:

Throughout film history, the regular return of 3D cinema has been discussed in connection with crises caused by the advent of television or the competition of the Internet. In addition, the three waves of stereoscopic 3D (from 1952 up to 1983) and its current digital version have been blamed for adding a challenging technical distraction to the viewing experience. By discussing the films Dial M for Murder (1954) and Goodbye to Language (2014), the paper aims to analyze the response of recognized auteurs to the use of 3D techniques in filmmaking. For Alfred Hitchcock, the solution to attaining perceptual immersion paradoxically resided in restraining the signature effect of 3D, namely protrusion. In Jean-Luc Godard’s vision, 3D techniques allowed him to explore perceptual absorption by means of depth of field, which he had long advocated as central to cinema. Thus, both directors contribute to the foundation of an auteur aesthetic in 3D filmmaking.

Keywords: Alfred Hitchcock, authorship, 3D filmmaking, Jean-Luc Godard, perceptual absorption, perceptual immersion

Procedia PDF Downloads 283
7345 Documenting the 15th Century Prints with RTI

Authors: Peter Fornaro, Lothar Schmitt

Abstract:

The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. 
At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.
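
The per-pixel fitting at the heart of RTI can be sketched with the classic six-term Polynomial Texture Map (PTM) model, the simple reflectance model whose limitations with gloss are discussed above; the light directions and coefficients here are synthetic:

```python
import numpy as np

def fit_ptm(light_dirs, intensities):
    """Least-squares fit of the six-term biquadratic used by Polynomial
    Texture Maps to one pixel's intensities under known light directions.
    light_dirs: (n, 2) array of projected light directions (lu, lv)."""
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    A = np.stack([lu ** 2, lv ** 2, lu * lv, lu, lv, np.ones_like(lu)], axis=1)
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Predict the pixel intensity for a new light direction."""
    return float(coeffs @ np.array([lu ** 2, lv ** 2, lu * lv, lu, lv, 1.0]))

# Synthetic check: recover known coefficients from noiseless samples
true = np.array([0.1, -0.2, 0.05, 0.3, 0.4, 0.5])
rng = np.random.default_rng(1)
dirs = rng.uniform(-1.0, 1.0, size=(20, 2))
inten = np.array([relight(true, u, v) for u, v in dirs])
coeffs = fit_ptm(dirs, inten)
print(np.allclose(coeffs, true))  # True
```

Because the biquadratic varies smoothly with light direction, it cannot reproduce the sharp specular highlights of glossy paste prints, which is precisely the limitation the enhanced analytical model aims to overcome.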

Keywords: art history, computational photography, paste prints, reflectance transformation imaging

Procedia PDF Downloads 273
7344 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques

Authors: Jonathan Iworiso

Abstract:

Forecasting the equity premium out-of-sample is a major concern for researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars over the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding-window method. A broad category of sophisticated regression models involving model complexity was employed. The RT models, including Ridge, Forward-Backward (FOBA) Ridge, the Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The forecasts provide meaningful economic information on mean-variance portfolio investment for investors who are timing the market to earn future gains at minimal risk. Thus, the forecasting models appear to benefit an investor who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts at minimal risk.
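
The recursive expanding-window scheme can be sketched with closed-form ridge regression, one of the RT models listed above; LASSO and the other shrinkage estimators differ only in the fitting step. The predictors and returns below are synthetic, not real equity premium data:

```python
import numpy as np

def expanding_window_ridge(y, X, start, lam=1.0):
    """At each month t >= start, fit ridge regression on all data up to
    t - 1 and forecast y[t]; also return the historical-average benchmark.
    A simplified stand-in for the RT estimators discussed in the abstract."""
    preds, bench = [], []
    k = X.shape[1]
    for t in range(start, len(y)):
        Xt, yt = X[:t], y[:t]
        # Closed-form ridge: beta = (X'X + lam I)^{-1} X'y
        beta = np.linalg.solve(Xt.T @ Xt + lam * np.eye(k), Xt.T @ yt)
        preds.append(X[t] @ beta)
        bench.append(yt.mean())  # prevailing-mean benchmark
    return np.array(preds), np.array(bench)

# Synthetic predictors with a genuine (illustrative) signal
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))
y = X @ np.array([0.5, -0.3, 0.2]) + 0.05 * rng.normal(size=120)
preds, bench = expanding_window_ridge(y, X, start=30, lam=0.1)
print(preds.shape)  # 90 recursive out-of-sample forecasts
```

Comparing the out-of-sample mean squared errors of `preds` and `bench` mirrors the statistical-predictability test against the historical-average benchmark described above.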

Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains

Procedia PDF Downloads 99
7343 Scientific Development as Diffusion on a Social Network: An Empirical Case Study

Authors: Anna Keuchenius

Abstract:

Broadly speaking, scientific development is studied either in a qualitative manner with a focus on the behavior and interpretations of academics, as in the sociology of science and science studies, or in a quantitative manner with a focus on the analysis of publications, as in scientometrics and bibliometrics. Both come with different sets of methodologies and few cross-references. This paper contributes to bridging this divide by approaching the process of scientific progress from a qualitative sociological angle on the one hand and using quantitative and computational techniques on the other. As a case study, we analyze the diffusion of Granovetter's hypothesis from his 1973 paper 'The Strength of Weak Ties.' A network is constructed of all scientists who have referenced this particular paper, with directed edges to all other researchers who are concurrently referenced alongside Granovetter's 1973 paper. Studying the structure and growth of this network over time, we find that Granovetter's hypothesis is used by distinct communities of scientists, each with its own key narrative into which the hypothesis is fit. The diffusion within the communities shares similarities with the diffusion of an innovation, in which innovators, early adopters, and an early and late majority can clearly be distinguished. Furthermore, the network structure shows that each community is clustered around one or a few hub scientists who are disproportionately often referenced and seem largely responsible for carrying the hypothesis into their scientific subfield. The larger implication of this case study is that the diffusion of scientific hypotheses and ideas is not the spreading of well-defined objects over a network. Rather, diffusion is a process in which the object itself dynamically changes in concurrence with its spread.
It is therefore argued that the methodology presented in this paper has potential beyond the scientific domain, in the study of the diffusion of other objects that are not well defined, such as opinions, behaviors, and ideas.
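
The network construction described above can be sketched in simplified form: count how often pairs of references co-occur in the bibliographies of papers that cite the focal work. This is an undirected simplification of the study's directed, scientist-level network, and the reference labels are hypothetical:

```python
from collections import defaultdict
from itertools import combinations

def co_reference_weights(citing_papers):
    """Count how often two references appear together in the bibliographies
    of papers citing the focal work. An undirected simplification of the
    directed, scientist-level network built in the study."""
    weights = defaultdict(int)
    for refs in citing_papers:
        for a, b in combinations(sorted(refs), 2):
            weights[(a, b)] += 1
    return dict(weights)

# Three hypothetical citing papers and their other references
papers = [{"Burt 1992", "Coleman 1988"},
          {"Burt 1992", "Coleman 1988", "Putnam 2000"},
          {"Coleman 1988", "Putnam 2000"}]
w = co_reference_weights(papers)
print(w[("Burt 1992", "Coleman 1988")])  # 2: co-cited in two papers
```

Thresholding these weights and tracking them over publication years yields the growing network whose communities and hub scientists the study analyzes.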

Keywords: diffusion of innovations, network analysis, scientific development, sociology of science

Procedia PDF Downloads 299
7342 An Integrative Computational Pipeline for Detection of Tumor Epitopes in Cancer Patients

Authors: Tanushree Jaitly, Shailendra Gupta, Leila Taher, Gerold Schuler, Julio Vera

Abstract:

Genomics-based personalized medicine is a promising approach to fighting aggressive tumors based on a patient's specific tumor mutation and expression profiles. A remarkable case is dendritic cell-based immunotherapy, in which tumor epitopes targeting a patient's specific mutations are used to design a vaccine that helps stimulate cytotoxic T cell mediated anticancer immunity. Here we present a computational pipeline for epitope-based personalized cancer vaccines using patient-specific haplotype and cancer mutation profiles. In the proposed workflow, we analyze Whole Exome Sequencing and RNA Sequencing patient data to detect patient-specific mutations and their expression levels. Epitopes including the tumor mutations are computationally predicted using the patient's haplotype and filtered based on their expression level, binding affinity, and immunogenicity. We calculate the binding energy for each filtered major histocompatibility complex (MHC)-peptide complex using docking studies and use this feature to further select good epitope candidates.
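
The filtering step described above can be sketched as follows. The field names, peptide sequences, and thresholds are illustrative (500 nM is a commonly used binder cutoff for predicted MHC IC50 values, not necessarily the one used in this pipeline):

```python
def filter_epitopes(candidates, max_ic50=500.0, min_tpm=1.0):
    """Keep peptides whose predicted MHC binding IC50 [nM] is below
    max_ic50 (500 nM is a common binder cutoff) and whose source
    transcript is expressed above min_tpm, then rank by affinity.
    Field names and thresholds are illustrative."""
    kept = [c for c in candidates
            if c["ic50"] < max_ic50 and c["tpm"] >= min_tpm]
    return sorted(kept, key=lambda c: c["ic50"])

candidates = [
    {"peptide": "SIINFEKLM", "ic50": 120.0, "tpm": 8.0},
    {"peptide": "ALDEFGHIK", "ic50": 650.0, "tpm": 12.0},  # weak binder
    {"peptide": "KVAELVHFL", "ic50": 45.0, "tpm": 0.2},    # not expressed
    {"peptide": "GLCTLVAML", "ic50": 300.0, "tpm": 3.5},
]
ranked = filter_epitopes(candidates)
print([c["peptide"] for c in ranked])  # ['SIINFEKLM', 'GLCTLVAML']
```

In the full pipeline, the surviving candidates would additionally be scored by immunogenicity prediction and by docking-derived binding energies before final vaccine selection.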

Keywords: cancer immunotherapy, epitope prediction, NGS data, personalized medicine

Procedia PDF Downloads 244
7341 Determination of Weathering at Kilistra Ancient City by Using Non-Destructive Techniques, Central Anatolia, Turkey

Authors: İsmail İnce, Osman Günaydin, Fatma Özer

Abstract:

Stones used in the construction of historical structures are exposed to various direct or indirect atmospheric effects depending on climatic conditions. Building stones deteriorate partially or fully as a result of this exposure. Historic structures are important symbols of any cultural heritage; therefore, it is important to protect and restore them. The aim of this study is to determine the weathering conditions at the ancient city of Kilistra, located southwest of the city of Konya in Central Anatolia, which was built by carving into pyroclastic rocks during the Byzantine era. For this purpose, the petrographic and mechanical properties of the pyroclastic rocks were determined. In the assessment of the weathering of structures in the ancient city, in-situ non-destructive testing methods (i.e., Schmidt hardness rebound value and relative humidity measurement) were applied.

Keywords: cultural heritage, Kilistra ancient city, non-destructive techniques, weathering

Procedia PDF Downloads 355
7340 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques

Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad

Abstract:

In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01.

Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet

Procedia PDF Downloads 133
7339 A Study on Thermal and Flow Characteristics by Solar Radiation for Single-Span Greenhouse by Computational Fluid Dynamics Simulation

Authors: Jonghyuk Yoon, Hyoungwoon Song

Abstract:

Recently, there has been growing interest in smart farming, which applies modern Information and Communication Technologies (ICT) to agriculture, since it provides a methodology to optimize production efficiency by managing the growing conditions of crops automatically. In order to obtain high performance and stability for a smart greenhouse, it is important to identify the effect of various working parameters such as ventilation fan capacity and vent opening area. In the present study, a 3-dimensional CFD (Computational Fluid Dynamics) simulation of a single-span greenhouse was conducted using the commercial program Ansys CFX 18.0. The numerical simulation was implemented to determine the internal thermal and flow characteristics. In order to numerically model solar radiation, which spreads over a wide range of wavelengths, a multiband model that discretizes the spectrum into finite wavelength bands based on Wien's law is applied in the simulation. In addition, the absorption coefficient of the vinyl cover, which varies with the wavelength band, is applied based on the Beer-Lambert law. To validate the numerical method, the numerical results for the temperature at specific monitoring points were compared with experimental data. The average error rates between them were 12.2-14.2%, and the numerical temperature distributions are in good agreement with the experimental data. The results of the present study can provide useful information for the design of various greenhouses. This work was supported by the Korea Institute of Planning and Evaluation for Technology in Food, Agriculture, Forestry and Fisheries (IPET) through the Advanced Production Technology Development Program, funded by the Ministry of Agriculture, Food and Rural Affairs (MAFRA) (315093-03).
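
The band-wise Beer-Lambert attenuation described above can be sketched directly. The absorption coefficients and film thickness below are illustrative assumptions, not values from the study:

```python
import math

def transmitted_fraction(alpha, thickness):
    """Beer-Lambert law: fraction of radiation in one wavelength band
    transmitted through a layer of absorption coefficient alpha [1/m]."""
    return math.exp(-alpha * thickness)

# Hypothetical band-dependent absorption coefficients for a vinyl cover;
# the values are illustrative, not taken from the study.
bands = {"visible": 5.0, "near_ir": 40.0, "far_ir": 400.0}  # [1/m]
t = 2.0e-4  # 0.2 mm film
for name, alpha in bands.items():
    print(name, round(transmitted_fraction(alpha, t), 4))
```

In the multiband radiation model, each band carries the fraction of the solar spectrum assigned to it by Wien's (displacement) law, and its own attenuation is computed with the band's absorption coefficient, so the cover can be nearly transparent in the visible band while absorbing strongly in the far infrared.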

Keywords: single-span greenhouse, CFD (computational fluid dynamics), solar radiation, multiband model, absorption coefficient

Procedia PDF Downloads 131