Search results for: virtual simulation gaming
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6143

4133 A Crop Growth Subroutine for Watershed Resources Management (WRM) Model 1: Description

Authors: Kingsley Nnaemeka Ogbu, Constantine Mbajiorgu

Abstract:

Vegetation has a marked effect on runoff and has become an important component in hydrologic models. The Watershed Resources Management (WRM) model, a process-based, continuous, distributed-parameter simulation model developed for hydrologic and soil erosion studies at the watershed scale, lacks a crop growth component. As such, the model assumes constant vegetation and hydraulic parameter values throughout the duration of a hydrologic simulation. Our approach is to develop a crop growth algorithm based on the original plant growth model used in the Environmental Policy Integrated Climate (EPIC) model. This paper describes the development of a single crop growth model capable of simulating all crops using unique parameter values for each crop. Simulated crop growth processes will reflect the vegetative seasonality of the natural watershed system. An existing model was employed for evaluating vegetative resistance through the hydraulic and vegetative parameters incorporated into the WRM model. The improved WRM model will be able to evaluate the seasonal variation of the vegetative roughness coefficient with depth of flow, further enhancing the model's capability for accurate hydrologic studies.
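As a rough, hypothetical sketch of the heat-unit-driven growth logic the abstract borrows from EPIC (all parameter values below are illustrative, not taken from the paper), daily heat units accumulate toward a potential total, and leaf area develops along an S-curve of the resulting index:

```python
import math

def heat_unit_index(tmax, tmin, t_base, phu):
    """Cumulative heat-unit index (0..1): daily mean temperature above a
    base temperature, accumulated toward the potential heat units PHU."""
    total, hui = 0.0, []
    for hi, lo in zip(tmax, tmin):
        total += max(0.0, (hi + lo) / 2.0 - t_base)
        hui.append(min(1.0, total / phu))
    return hui

def leaf_area_index(hui, lai_max=3.0, l1=8.0, l2=10.0):
    """EPIC-style S-curve LAI development driven by the heat-unit index."""
    return [lai_max * h / (h + math.exp(l1 - l2 * h)) for h in hui]

# 120 synthetic growing-season days (illustrative temperatures)
tmax = [22.0 + 6.0 * math.sin(math.pi * d / 120) for d in range(120)]
tmin = [t - 10.0 for t in tmax]
hui = heat_unit_index(tmax, tmin, t_base=8.0, phu=1200.0)
lai = leaf_area_index(hui)
```

In a WRM-style integration, an LAI series of this kind would drive the seasonal roughness and interception parameters instead of the constant values currently assumed.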

Keywords: runoff, roughness coefficient, PAR, WRM model

Procedia PDF Downloads 380
4132 Model Based Design of Fly-by-Wire Flight Controls System of a Fighter Aircraft

Authors: Nauman Idrees

Abstract:

Modeling and simulation during the conceptual design phase are the most effective means of system testing, resulting in time and cost savings compared to testing hardware prototypes, which are mostly unavailable during the conceptual design phase. This paper uses the model-based design (MBD) method in designing the fly-by-wire flight controls system of a fighter aircraft using Simulink. The process begins with system definition and layout, where modeling requirements and system components were identified, followed by a hierarchical system layout to identify the sequence of operation, the interfaces of the system with the external environment, and the internal interfaces between components. In the second step, each component within the system architecture was modeled along with its physical and functional behavior. Finally, all modeled components were combined to form the fly-by-wire flight controls system of the aircraft as per the system architecture developed. The system model developed using this method can be simulated using any simulation software to ensure that the desired requirements are met even without the development of a physical prototype, resulting in time and cost savings.
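The component-wise MBD workflow described above can be mimicked in miniature outside Simulink: model each component (controller, actuator, plant) separately, then close the loop and simulate. The sketch below is a generic, hypothetical pitch-rate loop with illustrative time constants and gains, not the paper's actual aircraft model:

```python
def simulate(kp=2.0, ki=1.0, tau_act=0.05, tau_q=0.5, dt=0.001,
             t_end=10.0, cmd=1.0):
    """Close the loop: PI controller -> actuator (1st-order lag) ->
    simplified pitch-rate dynamics. Returns the final pitch rate."""
    q = 0.0        # plant state: pitch rate
    delta = 0.0    # actuator state: elevator deflection
    integ = 0.0    # controller integrator
    for _ in range(int(t_end / dt)):
        err = cmd - q
        integ += err * dt
        u = kp * err + ki * integ              # controller component
        delta += (u - delta) / tau_act * dt    # actuator component
        q += (delta - q) / tau_q * dt          # airframe component
    return q

q_final = simulate()  # tracks the unit pitch-rate command
```

Each function argument here plays the role of one modeled component parameter; in a real MBD flow these would be separate, individually verified blocks.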

Keywords: fly-by-wire, flight controls system, model based design, Simulink

Procedia PDF Downloads 120
4131 Serious Game for Learning: A Model for Efficient Game Development

Authors: Zahara Abdulhussan Al-Awadai

Abstract:

In recent years, serious games have gained increasing interest as a tool to support learning across different educational and training fields, serving as a powerful educational tool for improving learning outcomes. In this research, we discuss the potential of virtual experiences and games research outside the games industry and explore the multifaceted impact of serious games and related technologies on various aspects of our lives. We highlight the use of serious games as a tool to improve education and other applications with a purpose beyond entertainment. One of the main contributions of this research is a proposed model that facilitates the design and development of serious games in a flexible and easy-to-use way. This is achieved by exploring different requirements to develop a model that describes a serious game structure with a focus on both the educational and the entertainment aspects of serious games.

Keywords: game development, requirements, serious games, serious game model

Procedia PDF Downloads 68
4130 Near Shore Wave Manipulation for Electricity Generation

Authors: K. D. R. Jagath-Kumara, D. D. Dias

Abstract:

The sea waves carry thousands of gigawatts of power globally. Although there are a number of approaches to harnessing offshore energy, they tend to be expensive, practically challenging, and vulnerable to storms. Therefore, this paper considers using near-shore waves for generating mechanical and electrical power. It introduces two new approaches, wave manipulation and a variable-duct turbine, for intercepting very wide wave fronts and for coping with fluctuations of the wave height and the sea level, respectively. The first approach effectively allows capturing much more energy with a much narrower turbine rotor. The second allows using a rotor with a smaller radius that captures the energy of higher wave fronts at higher sea levels while preventing the rotor from submerging totally. To illustrate the effectiveness of the approach, the paper contains a description and the simulation results of a scale model of a wave manipulator. It then includes the results of testing a physical model of the manipulator and a single-duct, axial-flow turbine in a laboratory wave flume. The paper also compares theoretical predictions, simulation results, and wave flume tests with respect to the incident energy, the loss in wave manipulation, the minimal loss, the brake torque, and the angular velocity.
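For context, the deep-water energy flux per metre of wave-front width can be estimated from the standard formula P = ρg²Hs²Te/(64π); the sea state below is illustrative. Because power scales linearly with the intercepted front width, funneling a wide front into a narrow duct, as proposed above, raises the power available to a narrow rotor:

```python
import math

def wave_power_per_metre(hs, te, rho=1025.0, g=9.81):
    """Deep-water wave energy flux in W per metre of wave-front width:
    P = rho * g^2 * Hs^2 * Te / (64 * pi)."""
    return rho * g ** 2 * hs ** 2 * te / (64.0 * math.pi)

# ~2 m significant wave height, 8 s energy period (illustrative sea state)
p_narrow = wave_power_per_metre(hs=2.0, te=8.0)   # one metre of front
p_wide = 50.0 * p_narrow   # a 50 m manipulated front feeding one turbine
```

For these numbers the flux is roughly 16 kW per metre of front, so the width of front a manipulator can intercept dominates the available power.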

Keywords: near-shore sea waves, renewable energy, wave energy conversion, wave manipulation

Procedia PDF Downloads 484
4129 ePAM: Advancing Sustainable Mobility through Digital Parking, AI-Driven Vehicle Recognition, and CO₂ Reporting

Authors: Robert Monsberger

Abstract:

The increasing scarcity of resources and the pressing challenge of climate change demand transformative technological, economic, and societal approaches. In alignment with the European Green Deal's goal of achieving net-zero greenhouse gas emissions by 2050, this paper presents the development and implementation of an electronic parking and mobility system (ePAM). This system offers a distinct, integrated solution aimed at promoting climate-positive mobility, reducing individual vehicle use, and advancing the digital transformation of off-street parking. The core objectives include the accurate recognition of electric vehicles and occupant counts using advanced camera-based systems with very high accuracy. This capability enables the dynamic categorization and classification of vehicles to provide fair and automated tariff adjustments. The study also seeks to replace physical barriers with virtual 'digital gates' using augmented reality, significantly improving user acceptance, as shown in the studies conducted. The system is designed to operate as an end-to-end software solution, enabling a fully digital and paperless parking management system by leveraging license plate recognition (LPR) and metadata processing. By eliminating physical infrastructure such as gates and terminals, the system significantly reduces resource consumption, maintenance complexity, and operational costs while enhancing energy efficiency. The platform also integrates CO₂ reporting tools to support compliance with upcoming EU emission trading schemes and to incentivize eco-friendly transportation behaviors. By fostering the adoption of electric vehicles and ride-sharing models, the system contributes to the optimization of traffic flows and the minimization of search traffic in urban centers. The platform's open data interfaces enable seamless integration into multimodal transport systems, facilitating a transition from individual to public transportation modes. This study emphasizes sustainability, data privacy, and compliance with the AI Act, aiming to achieve a market share of at least 4.5% in the DACH region by 2030. ePAM sets a benchmark for innovative mobility solutions, driving significant progress toward climate-neutral urban mobility.

Keywords: sustainable mobility, digital parking, AI-driven vehicle recognition, license plate recognition, virtual gates, multimodal transport integration

Procedia PDF Downloads 10
4128 Progressive Type-I Interval Censoring with Binomial Removal-Estimation and Its Properties

Authors: Sonal Budhiraja, Biswabrata Pradhan

Abstract:

This work considers statistical inference based on progressive Type-I interval censored data with random removal. The scheme of progressive Type-I interval censoring with random removal can be described as follows. Suppose n identical items are placed on test at time T0 = 0, with k pre-fixed inspections at pre-specified times T1 < T2 < . . . < Tk, where Tk is the scheduled termination time of the experiment. At inspection time Ti, Ri of the remaining surviving units Si are randomly removed from the experiment. The removals follow a binomial distribution with parameters Si and pi for i = 1, . . . , k, with pk = 1. In this censoring scheme, the number of failures in different inspection intervals and the number of randomly removed items at the pre-specified inspection times are observed. Asymptotic properties of the maximum likelihood estimators (MLEs) are established under some regularity conditions. A β-content, γ-level tolerance interval (TI) is determined for the two-parameter Weibull lifetime model using the asymptotic properties of the MLEs. The minimum sample size required to achieve the desired β-content, γ-level TI is determined. The performance of the MLEs and the TI is studied via simulation.
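A minimal simulation of one run of the censoring scheme described above (Weibull lifetimes; a binomial number of survivors withdrawn at each inspection, with all survivors removed at the last one) can be sketched as follows; the parameter values are illustrative:

```python
import math, random

def simulate_scheme(n, times, probs, shape, scale, seed=1):
    """One run of progressive Type-I interval censoring with binomial
    removal. Returns failures per interval and removals per inspection."""
    rng = random.Random(seed)
    # Weibull(shape, scale) lifetimes via inverse-CDF sampling
    remaining = [scale * (-math.log(rng.random())) ** (1.0 / shape)
                 for _ in range(n)]
    failures, removals, prev = [], [], 0.0
    for t, p in zip(times, probs):
        d = sum(1 for x in remaining if prev < x <= t)  # failures in (prev, t]
        remaining = [x for x in remaining if x > t]     # survivors S_i
        # R_i ~ Binomial(S_i, p_i); p_k = 1 removes everyone at T_k
        r = len(remaining) if p == 1.0 else sum(rng.random() < p
                                                for _ in remaining)
        rng.shuffle(remaining)
        remaining = remaining[r:]                       # withdraw r survivors
        failures.append(d)
        removals.append(r)
        prev = t
    return failures, removals

fails, rems = simulate_scheme(n=100, times=[1.0, 2.0, 3.0],
                              probs=[0.2, 0.2, 1.0], shape=1.5, scale=2.5)
```

Every item is accounted for as either an interval failure or a removal, which is the bookkeeping the likelihood in such studies is built on.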

Keywords: asymptotic normality, consistency, regularity conditions, simulation study, tolerance interval

Procedia PDF Downloads 254
4127 A Study of the Trap of Multi-Homing in Customers: A Comparative Case Study of Digital Payments

Authors: Shari S. C. Shang, Lynn S. L. Chiu

Abstract:

In the digital payment market, some consumers use only one payment wallet while many others play multi-homing with a variety of payment services. With the diffusion of new payment systems, we examined the determinants of the adoption of multi-homing behavior. This study aims to understand how a digital payment provider dynamically expands business touch points with cross-business strategies to enrich the digital ecosystem and avoid the trap of multi-homing in customers. By synthesizing the platform ecosystem literature, we constructed a two-dimensional research framework with one determinant of user digital behavior, from offline to online intentions, and the other determinant of digital payment touch points, from convenient accessibility to cross-business platforms. To explore on a broader scale, we selected 12 digital payments from five countries: the UK, the US, Japan, Korea, and Taiwan. From the interplay of user digital behaviors and payment touch points, we group the study cases into four types: (1) Channel Initiated: users originated from retailers with high access to in-store shopping and face-to-face guidance for payment adoption. Providers offer rewards for customer loyalty and secure the retailer's efficient cash flow management. (2) Social Media Dependent: users are usually digital natives with high access to social media or the internet who shop and pay digitally. Providers might not own physical or online shops but are licensed to aggregate money flows through virtual ecosystems. (3) Early Life Engagement: digital banks race to capture the next generation from popularity to profitability. This type of payment aims to give children a taste of financial freedom while letting parents track their spending. Providers seek to capitalize on the digital payment and e-commerce boom and hold on to new customers into adulthood.
(4) Traditional Banking: plastic credit cards are purposely designed as a control group to track the evolution of business strategies in digital payments. Traditional credit card users may follow the bank's digital strategy to land on different types of digital wallets or mostly keep using plastic credit cards. This research analyzed the business growth models and inter-firm coopetition strategies of the selected cases. Results of the multiple case analysis reveal that channel-initiated payments bundled rewards with retailers' business discounts for recurring purchases. They also extended other financial services, such as insurance, to fulfill customers' new demands. In contrast, social media dependent payments developed new usages and new value creation, such as P2P money transfer through network effects among virtual social ties, while early life engagements offer virtual banking products to children, who are digital natives but overlooked by incumbents. This has disrupted banking business domains in preparation for the metaverse economy. Lastly, the control group of traditional plastic credit cards has gradually converted to a BaaS (banking as a service) model depending on customers' preferences. Multi-homing behavior is not avoidable in digital payment competition. Payment providers may encounter multiple waves of multi-homing threats after a short period of success. A dynamic cross-business collaboration strategy should be explored to continuously evolve digital ecosystems and allow users a broader shopping experience and continual usage.

Keywords: digital payment, digital ecosystems, multi-homing users, cross-business strategy, user digital behavior intentions

Procedia PDF Downloads 169
4126 A Study on the False Alarm Rates of MEWMA and MCUSUM Control Charts When the Parameters Are Estimated

Authors: Umar Farouk Abbas, Danjuma Mustapha, Hamisu Idi

Abstract:

It is now a known fact that quality is an important issue in manufacturing industries. A control chart is an integral and powerful tool in statistical process control (SPC), in which the mean µ and standard deviation σ parameters are estimated from data. In general, the multivariate exponentially weighted moving average (MEWMA) and multivariate cumulative sum (MCUSUM) charts are used to detect small shifts in the joint monitoring of several correlated variables; the charts use information from past data, which makes them sensitive to small shifts. The aim of this paper is to compare the performance of the Shewhart x-bar, MEWMA, and MCUSUM control charts in terms of their false alarm rates when parameters are estimated under autocorrelation. A simulation was conducted in R to generate the average run length (ARL) values of each chart. The analysis shows that the MEWMA chart has lower false alarm rates than the MCUSUM chart at the various levels of parameter estimation and ARL0 (in-control) values considered. It was also noticed that the sample size has an adverse effect on the false alarm rates of the control charts.
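As a sketch of how such false-alarm (in-control ARL) figures are obtained by simulation, the MEWMA recursion and its T² statistic can be coded directly. The limit h ≈ 8.66 for p = 2 and λ = 0.1 is an approximate published value targeting ARL0 near 200, and the known-parameter, independent-data case is shown; the paper's estimated-parameter, autocorrelated setting would change only the data generation:

```python
import random

def mewma_run_length(lam=0.1, h=8.66, p=2, max_n=5000, rng=None):
    """In-control run length of a MEWMA chart monitoring p independent
    N(0,1) streams with known parameters (Sigma = I)."""
    rng = rng or random.Random()
    z = [0.0] * p
    for n in range(1, max_n + 1):
        x = [rng.gauss(0.0, 1.0) for _ in range(p)]
        z = [lam * xi + (1 - lam) * zi for xi, zi in zip(x, z)]
        # T^2 with the asymptotic covariance Sigma_Z = lam/(2-lam) * I
        t2 = (2 - lam) / lam * sum(zi * zi for zi in z)
        if t2 > h:
            return n
    return max_n

rng = random.Random(7)
arl0 = sum(mewma_run_length(rng=rng) for _ in range(200)) / 200
```

Averaging many simulated run lengths gives the ARL0 estimate; a shorter average run length at the same limit means a higher false alarm rate.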

Keywords: average run length, MCUSUM chart, MEWMA chart, false alarm rate, parameter estimation, simulation

Procedia PDF Downloads 226
4125 Learning and Teaching Strategies in Association with EXE Program for Master Course Students of Yerevan Brusov State University of Languages and Social Sciences

Authors: Susanna Asatryan

Abstract:

The author will introduce a single module on English teaching methodology for master's students specializing as foreign language teachers for high schools and professional educational institutions at Yerevan Brusov State University of Languages and Social Sciences. The overall aim of the presentation is to introduce learning and teaching strategies within the EXE computer program for the university's master's student-teachers, and to display the advantages of using this program. The learners interact with the teacher in the classroom, and they are also provided a virtual domain in which to carry out their learning procedures in association with assessment and self-assessment, so they become integrated into blended learning. As this strategy is in its piloting stage, the author has elaborated a single module embracing three main sections: teaching English vocabulary at high school, teaching English grammar at high school, and teaching English pronunciation at high school. The author will present these topics with their corresponding sections and subsections. The strong point is that, in preparing this module, we planned to display it on the blended learning landscape, for which the EXE program is highly effective, as it allows users to operate several tools for self-learning and self-testing/assessment. The author elaborated a single EXE file for each of the three topics. Each file starts with the section's subject-specific description, objectives, and pre-knowledge, followed by the theoretical part. The author illustrated her observations with appropriate charts, drawings, diagrams, recordings, video clips, photos, and pictures to make the learning process more effective and enjoyable, and embedded a video clip related to the current topic before or after each article.
EXE offers a wide range of tools for preparing different activities and exercises for the learners, both interactive/non-interactive and textual/non-textual. With these tools, multi-select, multiple-choice, cloze, drop-down, case study, gap-filling, matching, and various other types of activities have been elaborated and added to the appropriate sections. The learners' task is to prepare themselves for the coming module or seminar related to the teaching methodology of English vocabulary, grammar, and pronunciation. The teacher has an opportunity for face-to-face communication, as well as to connect with the learners through Moodle, or to offer a single EXE file to the learners for their self-study and self-assessment. The EXE environment also makes the students' feedback available.

Keywords: blended learning, EXE program, learning/teaching strategies, self-study/assessment, virtual domain

Procedia PDF Downloads 471
4124 Effect of Different Porous Media Models on Drug Delivery to Solid Tumors: Mathematical Approach

Authors: Mostafa Sefidgar, Sohrab Zendehboudi, Hossein Bazmara, Madjid Soltani

Abstract:

Based on findings from clinical applications, most drug treatments fail to eliminate malignant tumors completely, even though drug delivery through systemic administration may inhibit their growth. Therefore, a better understanding of tumor formation is crucial in developing more effective therapeutics. For this purpose, solid tumor modeling and simulation results are used to predict how therapeutic drugs are transported to tumor cells by blood flow through capillaries and tissues. The solid tumor is treated as a porous medium for fluid flow simulation. Most studies use the Darcy model for porous media, in which fluid friction is neglected and a few simplifying assumptions are made. In this study, the effect of these assumptions is examined by considering the Brinkman model as well. A multi-scale mathematical method that calculates fluid flow to a solid tumor is used to investigate how neglecting fluid friction affects the simulation. The mathematical model of our previous studies is extended by considering two models of the momentum equation for porous media: Darcy and Brinkman. The method involves processes such as fluid flow through the solid tumor as a porous medium, extravasation of blood flow from vessels, blood flow through vessels, and solute diffusion and convective transport in the extracellular matrix. The sprouting angiogenesis model is used to generate the capillary network, and the fluid flow governing equations are then used to calculate blood flow through the tumor-induced capillary network. Finally, the two porous media models are used for modeling fluid flow in normal and tumor tissues for three different tumor shapes. Simulations of interstitial fluid transport in a solid tumor demonstrate that the simplifications in the Darcy model affect the interstitial velocity: the Brinkman model predicts a lower interstitial velocity than the Darcy model does.
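The abstract's conclusion can be illustrated in one dimension: in a channel bounded by no-slip walls, the Darcy model gives a uniform plug velocity, while the Brinkman model retains fluid friction and develops boundary layers of width ≈ √K, lowering the mean interstitial velocity. The numbers below are illustrative, not tissue parameters from the paper:

```python
import math

def profiles(a=1e-4, K=1e-12, dpdx=-1e4, mu=1e-3, n=101):
    """Velocity across a channel of half-width a: the Darcy model gives a
    uniform plug u_d; the Brinkman model (mu_eff = mu) adds no-slip
    boundary layers of width sqrt(K)."""
    u_d = -K / mu * dpdx                  # Darcy velocity (uniform)
    delta = math.sqrt(K)                  # Brinkman screening length
    ys = [-a + 2 * a * i / (n - 1) for i in range(n)]
    u_b = [u_d * (1 - math.cosh(y / delta) / math.cosh(a / delta))
           for y in ys]
    return u_d, u_b

u_d, u_b = profiles()
u_mean = sum(u_b) / len(u_b)              # mean Brinkman velocity
```

The Brinkman profile vanishes at the walls and its mean stays below the Darcy plug value, matching the qualitative finding reported above.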

Keywords: solid tumor, porous media, Darcy model, Brinkman model, drug delivery

Procedia PDF Downloads 311
4123 Copula Autoregressive Methodology for Simulation of Solar Irradiance and Air Temperature Time Series for Solar Energy Forecasting

Authors: Andres F. Ramirez, Carlos F. Valencia

Abstract:

The increasing interest in the application of renewable energy strategies and the path toward diminishing the use of carbon-related energy sources have encouraged the development of novel strategies for the integration of solar energy into the electricity network. The correct inclusion of the fluctuating energy output of a photovoltaic (PV) energy system into an electric grid requires improvements in the forecasting and simulation methodologies for solar energy potential, and an understanding not only of the mean value of the series but of the associated underlying stochastic process. We present a methodology for the synthetic generation of solar irradiance (shortwave flux) and air temperature bivariate time series based on copula functions to represent the cross-dependence and temporal structure of the data. We explore the advantages of using this nonlinear time series method over traditional approaches that use a transformation of the data to normal distributions as an intermediate step. The use of copulas gives flexibility to represent the serial variability of the real data in the simulation and allows more control over the desired properties of the data. We use discrete zero-mass density distributions to capture the nature of solar irradiance, alongside vector generalized linear models for the time-dependent distributions of the bivariate series. We found that the copula autoregressive methodology used, including the zero-mass characteristics of the solar irradiance time series, generates a significant improvement over state-of-the-art strategies. These results will help to better understand the fluctuating nature of solar energy forecasting and the underlying stochastic process, and to quantify the potential of integrating a photovoltaic (PV) energy generating system into a country's electricity network.
Experimental analysis and real data application substantiate the usage and convenience of the proposed methodology for forecasting solar irradiance time series and solar energy across northern hemisphere, southern hemisphere, and equatorial zones.
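A toy version of the copula idea (a plain Gaussian copula stands in here for the paper's copula autoregressive construction) couples two latent AR(1) Gaussian series cross-sectionally, then maps them through quantile functions, including a point mass at zero for nighttime irradiance. All marginals and parameters below are invented for illustration:

```python
import math, random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def simulate(n=2000, phi=0.8, rho=0.6, p_zero=0.3, seed=3):
    """Bivariate series through a Gaussian copula: AR(1) temporal
    dependence (phi), cross-dependence (rho); the irradiance marginal is
    zero-inflated (nighttime), the temperature marginal is Gaussian."""
    rng = random.Random(seed)
    z1 = z2 = 0.0
    s = math.sqrt(1.0 - phi * phi)        # keeps latent variance at 1
    irr, temp = [], []
    for _ in range(n):
        e1 = rng.gauss(0.0, 1.0)
        e2 = rho * e1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        z1 = phi * z1 + s * e1
        z2 = phi * z2 + s * e2
        u1 = norm_cdf(z1)
        # zero mass below p_zero, exponential tail above it (W/m^2 scale)
        irr.append(0.0 if u1 < p_zero else
                   -300.0 * math.log(1.0 - (u1 - p_zero) / (1.0 - p_zero)))
        temp.append(20.0 + 5.0 * z2)      # Gaussian marginal, mean 20 C
    return irr, temp

irr, temp = simulate()
zero_frac = sum(1 for x in irr if x == 0.0) / len(irr)
```

Because the dependence lives in the latent Gaussian layer, the marginals (including the zero mass) can be chosen freely, which is the flexibility the abstract attributes to the copula approach.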

Keywords: copula autoregressive, solar irradiance forecasting, solar energy forecasting, time series generation

Procedia PDF Downloads 327
4122 Design and Simulation of Low Threshold Nanowire Photonic Crystal Surface Emitting Lasers

Authors: Balthazar Temu, Zhao Yan, Bogdan-Petrin Ratiu, Sang Soon Oh, Qiang Li

Abstract:

Nanowire-based photonic crystal surface emitting lasers (PCSELs) reported in the literature have been designed using triangular, square, or honeycomb patterns. The triangular and square patterns have limited degrees of freedom in tuning the design parameters, which hinders the ability to design high quality factor (Q-factor) devices. Nanowire-based PCSELs designed using triangular and square patterns have been reported with lasing thresholds of 130 kW/cm² and 7 kW/cm², respectively. On the other hand, the honeycomb pattern gives more degrees of freedom in tuning the design parameters, which allows one to design high Q-factor devices. A deformed honeycomb pattern device was reported with a lasing threshold of 6.25 W/cm², corresponding to a simulated Q-factor of 5.84×10⁵. Despite this achievement, the design principles that can lead to the realization of even higher Q-factor honeycomb pattern PCSELs have not yet been investigated. In this work, we show that by deforming the honeycomb pattern and tuning the height and lattice constants of the nanowires, it is possible to achieve even higher Q-factor devices. Considering three different band edge modes, we investigate how the resonance wavelength changes as the device is deformed, which is useful in designing high Q-factor devices in different wavelength bands. We eventually establish the design and simulation of honeycomb PCSELs operating around the wavelength of 960 nm, as well as in the O and C bands, with Q-factors up to 7×10⁷. We also investigate the Q-factors of the undeformed device and establish that the mode at the band edge close to 960 nm attains the highest Q-factor of all the modes when the device is undeformed, and that the Q-factor degrades as the device is deformed. This work is a stepping stone toward the fabrication of very high Q-factor, nanowire-based honeycomb PCSELs, which are expected to have very low lasing thresholds.
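For context, Q-factors like those quoted above are commonly extracted from the decay of a resonant mode's time trace (e.g., from an FDTD run): with u(t) = exp(-ωt/2Q)cos(ωt), the envelope decay rate gives Q back. A self-contained numerical check, with time measured in optical cycles rather than any specific laser's parameters:

```python
import math

def q_from_decay(q_true=1.0e5, n_cycles=2000, spc=16):
    """Synthesize u(t) = exp(-w t / 2Q) cos(w t), sampled spc times per
    optical cycle, and recover Q from the envelope decay between the
    first sample and the strongest sample of the last cycle."""
    u = [math.exp(-math.pi * i / (spc * q_true)) *
         math.cos(2.0 * math.pi * i / spc)
         for i in range(n_cycles * spc)]
    a1 = abs(u[0])                                   # envelope at t = 0
    i2 = max(range((n_cycles - 1) * spc, n_cycles * spc),
             key=lambda i: abs(u[i]))
    a2 = abs(u[i2])
    return -math.pi * (i2 / spc) / math.log(a2 / a1)

q_est = q_from_decay()
```

For Q around 10⁷, as reported above, the envelope decays so slowly that production solvers use harmonic-inversion techniques instead of waiting for an appreciable decay, but the underlying definition is the same.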

Keywords: designing nanowire PCSEL, designing PCSEL on silicon substrates, low threshold nanowire laser, simulation of photonic crystal lasers

Procedia PDF Downloads 20
4121 Generative AI in Higher Education: Pedagogical and Ethical Guidelines for Implementation

Authors: Judit Vilarmau

Abstract:

Generative AI is emerging rapidly and transforming higher education in many ways, occasioning new challenges and disrupting traditional models and methods. The studies reviewed highlight its impact on ethics, curricula, and pedagogical methods. Students increasingly use generative AI for study, as a virtual tutor, and as a resource for generating works and completing assignments. It is therefore crucial for educators to make sure that students use generative AI with ethical considerations in mind. Generative AI also has relevant benefits for educators: it can help them personalize learning experiences and promote self-regulation. Educators must explore tools like ChatGPT to innovate without forgetting an ethical and pedagogical perspective. Eighteen studies were systematically reviewed, and the findings provide implementation guidelines with pedagogical and ethical considerations.

Keywords: ethics, generative artificial intelligence, guidelines, higher education, pedagogy

Procedia PDF Downloads 92
4120 Numerical Simulation of Production of Microspheres from Polymer Emulsion in Microfluidic Device toward Using in Drug Delivery Systems

Authors: Nizar Jawad Hadi, Sajad Abd Alabbas

Abstract:

Because of their ability to encapsulate and release drugs in a controlled manner, microspheres fabricated from polymer emulsions using microfluidic devices have shown promise for drug delivery applications. In this study, the effects of velocity, density, viscosity, and surface tension, as well as channel diameter, on microsphere generation were investigated using the Ansys Fluent software. The software was set up with the physical properties of the polymer emulsion, such as density, viscosity, and surface tension. Simulations were then performed to predict fluid flow and microsphere production and to improve the design of drug delivery applications based on changes in these parameters. The effects of the capillary and Weber numbers are also studied. The results showed that the size of the microspheres can be controlled by adjusting the flow velocity and the diameter of the channel. Narrower channel widths and higher flow rates produced smaller microspheres, which could improve drug delivery efficiency, while lower interfacial surface tension also resulted in smaller microspheres. The viscosity and density of the polymer emulsion significantly affected the size of the microspheres, with higher viscosities and densities producing smaller microspheres. The loading and drug release properties of the microspheres created with the microfluidic technique were also predicted. The results showed that the microspheres can efficiently encapsulate drugs and release them in a controlled manner over a period of time, owing to the high surface-area-to-volume ratio of the microspheres, which allows efficient drug diffusion. The ability to tune the manufacturing process using factors such as velocity, density, viscosity, channel diameter, and surface tension offers a potential opportunity to design drug delivery systems with greater efficiency and fewer side effects.
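The trends reported above (smaller droplets at higher continuous-phase flow rates and at higher viscous-to-interfacial stress ratios) are consistent with standard microfluidic scalings. A minimal sketch using the squeezing-regime plug-length correlation L/w ≈ 1 + α·Qd/Qc and the capillary number, with purely illustrative values:

```python
def plug_length_ratio(q_d, q_c, alpha=1.0):
    """Squeezing-regime T-junction correlation L/w = 1 + alpha*Qd/Qc
    (Garstecki-type scaling; alpha is an O(1) geometric constant)."""
    return 1.0 + alpha * q_d / q_c

def capillary_number(mu_c, u_c, gamma):
    """Continuous-phase Ca: ratio of viscous to interfacial stresses."""
    return mu_c * u_c / gamma

ca = capillary_number(mu_c=5e-2, u_c=0.01, gamma=5e-3)
# raising the continuous-phase rate Qc shrinks the plug
sizes = [plug_length_ratio(q_d=1.0, q_c=q) for q in (2.0, 5.0, 10.0)]
```

Such correlations complement CFD simulations like the one described above: the simulation resolves the full two-phase flow, while the scaling gives a quick sanity check on the predicted size trends.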

Keywords: polymer emulsion, microspheres, numerical simulation, microfluidic device

Procedia PDF Downloads 70
4119 The Influence of Fiber Volume Fraction on Thermal Conductivity of Pultruded Profile

Authors: V. Lukášová, P. Peukert, V. Votrubec

Abstract:

Thermal conductivity in the x, y, and z directions was measured on a pultruded profile manufactured by the technology of pulling glass fibers through a polyester matrix. The measurements showed considerable variability of thermal conductivity in the different directions, which was expected to be caused by variations in the fiber volume fraction. The cross-section of the pultruded profile was scanned, and image analysis revealed an uneven distribution of the fibers and the matrix in the cross-section. The distribution of these inequalities was processed into a Voronoi diagram over the observed area of the pultruded profile cross-section. To verify whether the variation of the fiber volume fraction in the pultruded profile can affect its thermal conductivity, numerical simulations in ANSYS Fluent were performed, based on the geometry reconstructed from the image analysis, with the aim of quantifying thermal conductivity numerically. Images with different volume fractions were chosen for this purpose. The measured thermal conductivities were compared with the calculated ones, and the evaluated data proved a strong correlation between volume fraction and thermal conductivity of the pultruded profile. Based on the presented results, a modification of the production technology may be proposed.
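The strong fraction-conductivity correlation found above is consistent with the classical rule-of-mixtures bounds; a minimal sketch with typical handbook conductivities for E-glass and polyester (assumed here for illustration, not taken from the paper):

```python
def k_parallel(vf, kf, km):
    """Voigt (parallel) bound: conduction along the fibers."""
    return vf * kf + (1.0 - vf) * km

def k_series(vf, kf, km):
    """Reuss (series) bound: conduction transverse to the fibers."""
    return 1.0 / (vf / kf + (1.0 - vf) / km)

# E-glass fiber ~1.0 W/(m K), polyester matrix ~0.2 W/(m K) (typical values)
bounds = {vf: (k_series(vf, 1.0, 0.2), k_parallel(vf, 1.0, 0.2))
          for vf in (0.4, 0.5, 0.6)}
```

Both bounds rise monotonically with the fiber volume fraction, and the gap between them explains why the measured conductivity differs between the fiber direction and the transverse directions of the profile.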

Keywords: pultrusion profile, volume fraction, thermal conductivity, numerical simulation

Procedia PDF Downloads 351
4118 Optical Simulation of HfO₂ Film - Black Silicon Structures for Solar Cells Applications

Authors: Gagik Ayvazyan, Levon Hakhoyan, Surik Khudaverdyan, Laura Lakhoyan

Abstract:

Black Si (b-Si) is a nano-structured Si surface formed by a self-organized, maskless process, with needle-like features discernible by their black color. The combination of the low reflectivity and the semi-conductive properties of Si makes b-Si a prime candidate for application in solar cells as an antireflection surface. However, surface recombination losses significantly reduce the efficiency of b-Si solar cells. Surface passivation using suitable dielectric films can minimize these losses. Several works have demonstrated that excellent passivation of b-Si nanostructures can be reached using Al₂O₃ films. However, the negative fixed charge present in Al₂O₃ films should provide good field-effect passivation only for p- and p+-type Si surfaces. HfO₂ thin films have not been practically tested for the passivation of b-Si. HfO₂ could provide an alternative for n- and n+-type Si surface passivation, since it has been shown to exhibit a positive fixed charge. Using optical simulation by the finite-difference time-domain (FDTD) method, the possibility of b-Si passivation by HfO₂ films has been analyzed. The FDTD modeling revealed that b-Si layers with HfO₂ films effectively suppress reflection in the wavelength range 400–1000 nm and across a wide range of incidence angles. The light-trapping performance primarily depends on the geometry of the needles and the film thickness. With decreasing periodicity and increasing height of the needles, the reflectance decreases and the absorption increases significantly. An increase in film thickness results in an even greater decrease in the calculated reflection coefficient of the model structures and, consequently, an improvement in the antireflection characteristics in the visible range. The excellent surface passivation and low reflectance results prove the potential of combining the b-Si surface and the HfO₂ film for solar cell applications.
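The antireflection effect of the needle layer can be reproduced in reduced form by treating b-Si as a graded effective-index film and computing normal-incidence reflectance with the transfer-matrix method (a simplification of the full FDTD treatment described above; the thickness, slice count, and indices are illustrative):

```python
import cmath, math

def reflectance(n_layers, d_layers, n_in, n_sub, lam):
    """Normal-incidence transfer-matrix reflectance of a lossless stack
    (layers listed from the incidence side toward the substrate)."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in zip(n_layers, d_layers):
        ph = 2.0 * math.pi * n * d / lam
        mj = [[cmath.cos(ph), 1j * cmath.sin(ph) / n],
              [1j * n * cmath.sin(ph), cmath.cos(ph)]]
        m = [[m[0][0] * mj[0][0] + m[0][1] * mj[1][0],
              m[0][0] * mj[0][1] + m[0][1] * mj[1][1]],
             [m[1][0] * mj[0][0] + m[1][1] * mj[1][0],
              m[1][0] * mj[0][1] + m[1][1] * mj[1][1]]]
    b = m[0][0] + m[0][1] * n_sub
    c = m[1][0] + m[1][1] * n_sub
    return abs((n_in * b - c) / (n_in * b + c)) ** 2

lam, n_si = 600e-9, 3.5
r_flat = reflectance([], [], 1.0, n_si, lam)          # bare planar Si
# b-Si needles as a 600 nm linearly graded effective-index layer
ns = [1.0 + (n_si - 1.0) * (i + 0.5) / 20.0 for i in range(20)]
r_graded = reflectance(ns, [600e-9 / 20.0] * 20, 1.0, n_si, lam)
```

Bare silicon reflects about 31% at normal incidence, while the graded layer suppresses this by an order of magnitude; the full FDTD model additionally captures the angular and sub-wavelength effects of the real needle geometry.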

Keywords: antireflection, black silicon, HfO₂, passivation, simulation, solar cell

Procedia PDF Downloads 149
4117 On the Cluster of the Families of Hybrid Polynomial Kernels in Kernel Density Estimation

Authors: Benson Ade Eniola Afere

Abstract:

Over the years, kernel density estimation has been extensively studied within the context of nonparametric density estimation. The fundamental components of kernel density estimation are the kernel function and the bandwidth. While the mathematical exploration of the kernel component has been relatively limited, its selection and development remain crucial. The Mean Integrated Squared Error (MISE), serving as a measure of discrepancy, provides a robust framework for assessing the effectiveness of any kernel function. A kernel function with a lower MISE is generally considered to perform better than one with a higher MISE. Hence, the primary aim of this article is to create kernels that exhibit significantly reduced MISE when compared to existing classical kernels. Consequently, this article introduces a cluster of hybrid polynomial kernel families. The proposed kernel functions are constructed heuristically by combining two kernels from the classical polynomial kernel family using probability axioms. We delve into the analysis of error propagation within these kernels. To assess their performance, simulation experiments and real-life datasets are employed. The obtained results demonstrate that the proposed hybrid kernels surpass their classical kernel counterparts in terms of performance.
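The abstract does not give the exact construction, so the sketch below assumes one simple way to combine two classical polynomial kernels under the probability axioms: a convex mixture of the Epanechnikov and biweight kernels, which still integrates to one and drops into a standard KDE:

```python
import numpy as np

def epanechnikov(u):
    return np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)

def biweight(u):
    return np.where(np.abs(u) <= 1, (15.0 / 16.0) * (1 - u**2)**2, 0.0)

def hybrid_kernel(u, w=0.5):
    """Convex combination of two polynomial kernels; for 0 <= w <= 1 it is
    nonnegative and integrates to 1, so it is itself a valid kernel."""
    return w * epanechnikov(u) + (1 - w) * biweight(u)

def kde(x_grid, data, h, kernel=hybrid_kernel):
    """Kernel density estimate on x_grid from the sample `data` with bandwidth h."""
    u = (x_grid[:, None] - data[None, :]) / h
    return kernel(u).mean(axis=1) / h

grid = np.linspace(-2, 2, 4001)
density = kde(grid, np.array([0.0, 0.5]), h=0.8)
```

Comparing the MISE of such hybrids against their parent kernels would then follow the Monte Carlo procedure the abstract describes.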

Keywords: classical polynomial kernels, cluster of families, global error, hybrid kernels, kernel density estimation, Monte Carlo simulation

Procedia PDF Downloads 98
4116 FPGA Based Vector Control of PM Motor Using Sliding Mode Observer

Authors: Hanan Mikhael Dawood, Afaneen Anwer Abood Al-Khazraji

Abstract:

The paper presents an investigation of a field-oriented control strategy for a Permanent Magnet Synchronous Motor (PMSM) based on hardware-in-the-loop (HIL) simulation over a wide speed range. A sensorless rotor position estimation using a sliding mode observer for the permanent magnet synchronous motor is illustrated, considering the effects of magnetic saturation between the d and q axes. The cross-saturation between the d and q axes has been calculated by finite-element analysis. The inductance measurement therefore accounts for saturation and cross-saturation, which are used to obtain suitable id characteristics in the base and flux-weakening regions. Real-time matrix multiplication in a Field Programmable Gate Array (FPGA) using a floating-point number system is implemented in the Quartus-II environment to develop the FPGA designs, which are then downloaded to the development kit. A dSPACE DS1103 board is utilized for Pulse Width Modulation (PWM) switching and the controller. The hardware-in-the-loop results are compared with those from the MATLAB simulation, and various dynamic conditions have been investigated.
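As a minimal, hypothetical illustration of the sliding mode observer idea (not the paper's full PMSM implementation), the sketch below estimates an unknown back-EMF in a single-phase R-L circuit as the low-pass-filtered equivalent control of the switching term; all parameter values are assumed round numbers:

```python
import numpy as np

# Illustrative single-phase model: L*di/dt = v - R*i - e, with unknown back-EMF e.
R, L = 1.0, 1e-3          # ohm, henry (assumed round values)
e_true, v = 5.0, 10.0     # unknown EMF to be estimated, applied voltage
k, dt = 10.0, 1e-5        # switching gain (must exceed |e|), time step
alpha = 0.001             # low-pass factor extracting the equivalent control

i = i_hat = e_hat = 0.0
for _ in range(200_000):
    i += dt * (v - R * i - e_true) / L        # plant (forward Euler)
    sw = k * np.sign(i_hat - i)               # switching injection term
    i_hat += dt * (v - R * i_hat - sw) / L    # observer forced onto i_hat = i
    e_hat += alpha * (sw - e_hat)             # filtered switching -> EMF estimate
```

Once the current error slides on i_hat = i, the average value of the switching term equals the back-EMF, so e_hat converges to e_true despite e never being measured.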

Keywords: magnetic saturation, rotor position estimation, sliding mode observer, hardware in the loop (HIL)

Procedia PDF Downloads 529
4115 Optimizing the Passenger Throughput at an Airport Security Checkpoint

Authors: Kun Li, Yuzheng Liu, Xiuqi Fan

Abstract:

High security standards and high screening efficiency seem to contradict each other in the airport security check process. Improving efficiency as far as possible while maintaining the same security standard is highly meaningful. This paper utilizes knowledge of Operations Research and Stochastic Processes to establish mathematical models to explore this problem. We analyze the current airport security check process and use the M/G/1 and M/G/k models of queueing theory to describe it. We find that the least efficient part is the Pre-Check lane, the bottleneck of the queueing system. To improve passenger throughput and reduce the variance of passengers' waiting time, we adjust our models, apply the Monte Carlo method, and put forward three modifications: adjust the ratio of Pre-Check lanes to regular lanes flexibly, determine the optimal number of security check screening lines based on cost analysis, and adjust the distribution of arrival and service times based on Monte Carlo simulation results. We also analyze the impact of cultural differences as a sensitivity analysis. Finally, we give recommendations for the current airport security check process.
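A small sketch of the modeling approach: simulate an M/G/1 queue with the Lindley recursion and check the mean wait against the Pollaczek-Khinchine formula. The arrival rate and service distribution below are illustrative, not the study's checkpoint data:

```python
import random

def simulate_mg1_wait(lam, service_sampler, n=200_000, seed=1):
    """Mean queueing wait via the Lindley recursion W' = max(0, W + S - A)."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n):
        total += w                       # record this customer's wait
        s = service_sampler(rng)         # service time
        a = rng.expovariate(lam)         # next interarrival time (Poisson arrivals)
        w = max(0.0, w + s - a)
    return total / n

# Uniform(0.3, 0.7) service at arrival rate 1 => utilization rho = 0.5
lam = 1.0
sim = simulate_mg1_wait(lam, lambda r: r.uniform(0.3, 0.7))
# Pollaczek-Khinchine: Wq = lam * E[S^2] / (2 * (1 - rho))
es, es2 = 0.5, 0.4**2 / 12 + 0.5**2
pk = lam * es2 / (2 * (1 - lam * es))
```

The simulated mean wait agrees with the closed-form value, which is the kind of cross-check a Monte Carlo checkpoint model needs before modifications are compared.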

Keywords: queueing theory, security check, stochastic process, Monte Carlo simulation

Procedia PDF Downloads 203
4114 Money Laundering and Governance in Cryptocurrencies: The Double-Edged Sword of Blockchain Technology

Authors: Jiaqi Yan, Yani Shi

Abstract:

With the growing popularity of bitcoin transactions, criminals have exploited bitcoin-like cryptocurrencies, and cybercrimes such as money laundering have thrived. Unlike traditional currencies, Internet-based virtual currencies can be used anonymously via the underpinning blockchain technology. In this paper, we analyze the double-edged sword features of blockchain technology in the context of money laundering. In particular, the traceability feature of blockchain-based systems facilitates a level of governance, while the decentralization feature of blockchain-based systems may bring governing difficulties. Based on the analysis, we propose guidelines for policy makers in governing blockchain-based cryptocurrency systems.

Keywords: cryptocurrency, money laundering, blockchain, decentralization, traceability

Procedia PDF Downloads 205
4113 Parallelization of Random Accessible Progressive Streaming of Compressed 3D Models over Web

Authors: Aayushi Somani, Siba P. Samal

Abstract:

Three-dimensional (3D) meshes are data structures which store the geometric information of an object or scene, generally in the form of vertices and edges. Current laser scanning and other geometric data acquisition technologies produce high-resolution samples, which lead to high-resolution meshes. While high-resolution meshes give better quality rendering and are hence often used, the processing as well as the storage of 3D meshes is currently resource-intensive. At the same time, web applications for data processing have become ubiquitous owing to their accessibility. For 3D meshes, the advancement of 3D web technologies such as WebGL and WebVR has enabled high-fidelity rendering of huge meshes. However, there exists a gap in the ability to stream huge meshes to native client and browser applications due to high network latency. There is also an inherent delay in loading WebGL pages with large and complex models. The focus of our work is to identify the challenges faced when such meshes are streamed to and processed on hand-held devices, owing to their limited resources. One of the solutions conventionally used in the graphics community to alleviate resource limitations is mesh compression. Our approach is a two-step approach for random accessible progressive compression and its parallel implementation. The first step partitions the original mesh into multiple sub-meshes; we then invoke data parallelism on these sub-meshes for their compression. Subsequent threaded decompression logic is implemented inside the web browser engine by modifying the WebGL implementation in the open-source Chromium engine. This concept can be used to completely revolutionize the way e-commerce and virtual reality technology work on consumer electronic devices: the objects can be compressed on the server and transmitted over the network, and the progressive decompression can be performed on the client device and rendered.
Multiple views currently used on e-commerce sites for viewing the same product from different angles can be replaced by a single progressive model for a better and smoother user experience. The approach can also be used in WebVR for widely used activities such as virtual reality shopping, watching movies and playing games. Our experiments and comparison with existing techniques show encouraging results in terms of latency (the compressed size is ~10-15% of the original mesh), processing time (a 20-22% increase over the serial implementation) and the quality of the user experience in the web browser.
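The two-step idea above (partition the mesh, then compress the sub-meshes in parallel) can be sketched as follows, using zlib as a stand-in for the actual progressive mesh codec and threads for the data parallelism; this illustrates the pipeline shape, not the paper's implementation:

```python
import zlib
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def partition_mesh(vertices, n_parts):
    """Split a (V, 3) float32 vertex array into contiguous sub-meshes."""
    return np.array_split(vertices, n_parts)

def compress_submesh(sub):
    """Stand-in codec: in practice a progressive mesh coder would go here."""
    return zlib.compress(sub.tobytes(), level=6)

def decompress_submesh(blob):
    return np.frombuffer(zlib.decompress(blob), dtype=np.float32).reshape(-1, 3)

vertices = np.random.default_rng(0).random((10_000, 3)).astype(np.float32)
parts = partition_mesh(vertices, n_parts=4)
with ThreadPoolExecutor() as pool:            # data parallelism over sub-meshes
    blobs = list(pool.map(compress_submesh, parts))
restored = np.vstack([decompress_submesh(b) for b in blobs])
```

Each compressed sub-mesh can be transmitted and decompressed independently, which is what makes the scheme random-accessible and progressive on the client side.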

Keywords: 3D compression, 3D mesh, 3D web, chromium, client-server architecture, e-commerce, level of details, parallelization, progressive compression, WebGL, WebVR

Procedia PDF Downloads 174
4112 Discrete Element Method Simulation of Crushable Pumice Sand

Authors: Sayed Hessam Bahmani, Rolando P. Orense

Abstract:

From an engineering point of view, pumice particles are problematic because of their crushability and compressibility due to their vesicular nature. Currently, information on the geotechnical characteristics of pumice sands is limited. While extensive empirical and laboratory tests can be implemented to characterize their behavior, these are generally time-consuming and expensive. These drawbacks have motivated attempts to study the effects of particle breakage of pumice sand through the Discrete Element Method (DEM). This method provides insights into the behavior of crushable granular material at both the micro and macro-level. In this paper, the results of single-particle crushing tests conducted in the laboratory are simulated using DEM through the open-source code YADE. This is done to better understand the parameters necessary to represent the pumice microstructure that governs its crushing features, and to examine how the resulting microstructure evolution affects a particle’s properties. The DEM particle model is then used to simulate the behavior of pumice sand during consolidated drained triaxial tests. The results indicate the importance of incorporating particle porosity and unique surface textures in the material characterization and show that interlocking between the crushed particles significantly influences the drained behavior of the pumice specimen.
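DEM particle-crushing models commonly assign each particle a strength drawn from a Weibull distribution calibrated against single-particle crushing tests like those described above; the sketch below samples such strengths by inverse-CDF and checks the survival fraction at the characteristic strength. The parameter values are assumed, not the paper's calibration:

```python
import numpy as np

def sample_strengths(n, sigma0, m, rng):
    """Inverse-CDF sampling of Weibull crushing strengths, where the survival
    probability is P_s(sigma) = exp(-(sigma / sigma0)**m)."""
    u = rng.random(n)
    return sigma0 * (-np.log(1.0 - u)) ** (1.0 / m)

rng = np.random.default_rng(42)
sigma0, m = 10.0, 3.0          # characteristic strength (MPa) and Weibull modulus
strengths = sample_strengths(100_000, sigma0, m, rng)
# Theory: a fraction exp(-1) ~ 0.368 of particles survives stress sigma0.
survival_at_sigma0 = float(np.mean(strengths > sigma0))
# A size effect is often added by scaling sigma0 with (d0 / d)**(3 / m),
# so smaller crushed fragments become statistically stronger.
```

In a DEM run, each contact force is converted to an induced particle stress and compared against the particle's sampled strength to decide whether it splits.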

Keywords: pumice sand, triaxial compression, simulation, particle breakage

Procedia PDF Downloads 252
4111 Interpretation of Sweep Frequency Response Analysis (SFRA) Traces for the Earth Fault Damage Practically Simulated on the Power Transformer Specially Developed for Performing Sweep Frequency Response Analysis for Various Transformers

Authors: Akshay A. Pandya, B. R. Parekh

Abstract:

This paper presents how earth fault damage in a transformer can be detected by Sweep Frequency Response Analysis (SFRA). The test methods used by the authors for presenting the results are described. A power transformer rated 10 kVA, 11000 V/440 V, 3-phase, 50 Hz, Dyn11 has been specially developed in-house for carrying out SFRA testing by practically simulating various transformer damages on it. An earth fault has been practically simulated on the HV "U" phase winding and the LV "W" phase winding separately. The results of these simulated faults are presented and discussed. The motivation of the presented work is to extend the guideline approach; there are plans to organize a database containing the collected measurement results. Since SFRA interpretation is based on experience, such databases are thought to be of great importance when interpreting SFRA responses. The SFRA responses have been evaluated against guidelines and experience, conclusions have been drawn regarding the usefulness of each simulation, and an overall conclusion has also been drawn.
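SFRA interpretation of the kind described often quantifies the deviation between a measured trace and a reference trace with a per-frequency-band correlation coefficient; a minimal sketch with synthetic traces (the band limits and trace shapes below are illustrative, not the 10 kVA transformer's data):

```python
import numpy as np

def band_correlation(ref_db, meas_db, freqs, bands):
    """Pearson correlation of two SFRA magnitude traces within each frequency band.
    Values near 1 indicate matching traces; lower values flag deviation."""
    out = {}
    for lo, hi in bands:
        idx = (freqs >= lo) & (freqs < hi)
        out[(lo, hi)] = float(np.corrcoef(ref_db[idx], meas_db[idx])[0, 1])
    return out

freqs = np.logspace(1, 6, 500)                  # 10 Hz to 1 MHz sweep
ref = -20 * np.log10(1 + freqs / 1e3)           # synthetic reference trace (dB)
meas = ref + 0.01 * np.sin(freqs / 1e5)         # nearly identical measurement
cc = band_correlation(ref, meas, freqs,
                      bands=[(10, 1e3), (1e3, 1e5), (1e5, 1e6)])
```

A database of such per-band coefficients for known fault types is one concrete way to systematize the experience-based interpretation the abstract calls for.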

Keywords: earth fault damage, power transformer, practical simulation, SFRA traces, transformer damages

Procedia PDF Downloads 289
4110 Analyzing the Effects of Adding Bitcoin to Portfolio

Authors: Shashwat Gangwal

Abstract:

This paper analyses the effect of adding Bitcoin to the portfolio (stocks, bonds, Baltic index, MXEF, gold, real estate and crude oil) of an international investor, using daily data available from 2nd of July, 2010 to 2nd of August, 2016. We conclude that adding Bitcoin to the portfolio, over the course of the considered period, always yielded a higher Sharpe ratio. This means that Bitcoin's returns offset its high volatility. Recognizing the fact that Bitcoin is a relatively new asset class, this paper gives readers a basic idea of the workings of the virtual currency, the increasing number of developments in the financial industry revolving around it, its unique features, and a detailed look into its continuously growing acceptance across different fronts (banks, merchants and countries) globally. We also construct optimal portfolios to reflect the highly lucrative and largely unexplored opportunities associated with investment in Bitcoin.
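The Sharpe ratio comparison underlying the conclusion can be sketched as follows; the return series below are synthetic stand-ins, not the paper's 2010-2016 data:

```python
import numpy as np

def sharpe_ratio(daily_returns, rf_daily=0.0, periods=252):
    """Annualized Sharpe ratio from a series of daily returns."""
    excess = np.asarray(daily_returns) - rf_daily
    return float(np.sqrt(periods) * excess.mean() / excess.std(ddof=1))

rng = np.random.default_rng(7)
base = rng.normal(0.0003, 0.01, 1500)     # stylized diversified-portfolio returns
btc = rng.normal(0.003, 0.05, 1500)       # stylized high-mean, high-vol asset
with_btc = 0.95 * base + 0.05 * btc       # a 5% allocation to the new asset

s_without, s_with = sharpe_ratio(base), sharpe_ratio(with_btc)
```

Whether the allocation raises the Sharpe ratio depends on whether the added mean return outweighs the added volatility, which is exactly the trade-off the paper measures on real data.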

Keywords: bitcoin, financial instruments, portfolio management, risk adjusted return

Procedia PDF Downloads 237
4109 Analog Input Output Buffer Information Specification Modelling Techniques for Single Ended Inter-Integrated Circuit and Differential Low Voltage Differential Signaling I/O Interfaces

Authors: Monika Rawat, Rahul Kumar

Abstract:

Input/Output Buffer Information Specification (IBIS) models are used to describe the analog behavior of the input/output (I/O) buffers of a digital device. They are widely used to perform signal integrity analysis. Advantages of using IBIS models include their simple structure, IP protection, and fast simulation time with reasonable accuracy. As the design complexity of drivers and receivers increases, capturing the exact behavior of the transistor-level model in the IBIS model becomes an essential task for achieving better accuracy. In this paper, an improvement to the existing methodology of generating IBIS models for complex I/O interfaces such as the Inter-Integrated Circuit (I2C) and Low Voltage Differential Signaling (LVDS) interfaces is proposed. Furthermore, the accuracy and computational performance of the standard method and the proposed approach with respect to SPICE are presented. The investigations will be useful for further improving the accuracy of IBIS models and enhancing their wider acceptance.
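An IBIS model describes a buffer largely through V-I and V-t tables, and a simulator evaluates the drive current by interpolating between table points. A toy sketch of that lookup (the table values below are illustrative, not from any real buffer):

```python
import numpy as np

# A toy V-I table in IBIS style: pad voltage (V) vs. sourced current (A).
vi_voltage = np.array([-0.5, 0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.3])
vi_current = np.array([0.060, 0.050, 0.041, 0.030, 0.019, 0.010, 0.004, 0.000])

def buffer_current(v_pad):
    """Drive current at a pad voltage, by linear interpolation between table
    points, as a behavioral simulator would evaluate an IBIS V-I table."""
    return float(np.interp(v_pad, vi_voltage, vi_current))
```

Accuracy improvements of the kind the paper proposes come down to how faithfully such tables (and the switching waveforms that go with them) capture the transistor-level behavior.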

Keywords: IBIS, signal integrity, open-drain buffer, low voltage differential signaling, behavior modelling, transient simulation

Procedia PDF Downloads 200
4108 Estimating X-Ray Spectra for Digital Mammography by Using the Expectation Maximization Algorithm: A Monte Carlo Simulation Study

Authors: Chieh-Chun Chang, Cheng-Ting Shih, Yan-Lin Liu, Shu-Jun Chang, Jay Wu

Abstract:

With the widespread use of digital mammography (DM), radiation dose evaluation of breasts has become important. X-ray spectra are one of the key factors that influence the absorbed dose of glandular tissue. In this study, we estimated the X-ray spectrum of DM using the expectation maximization (EM) algorithm with transmission measurement data. The interpolating polynomial model proposed by Boone was applied to generate the initial guess of the DM spectrum with the target/filter combination of Mo/Mo and a tube voltage of 26 kVp. The Monte Carlo N-Particle code (MCNP5) was used to tally the transmission data through aluminum sheets of 0.2 to 3 mm. The X-ray spectrum was reconstructed iteratively using the EM algorithm, and the influence of the initial guess on the EM reconstruction was evaluated. The percentage error of the average energy between the reference spectrum inputted for the Monte Carlo simulation and the spectrum estimated by the EM algorithm was -0.14%. The normalized root mean square error (NRMSE) and the normalized root max square error (NRMaSE) between both spectra were 0.6% and 2.3%, respectively. We conclude that the EM algorithm with transmission measurement data is a convenient and useful tool for estimating X-ray spectra for DM in clinical practice.
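The EM update for this kind of transmission-based spectrum estimation has a standard multiplicative form; the sketch below reconstructs a two-bin toy spectrum from noiseless transmission data. The attenuation coefficients and thicknesses are assumed values for illustration, not the MCNP5 setup:

```python
import numpy as np

# Attenuation coefficients (1/mm) for two toy energy bins, assumed values.
mu = np.array([1.0, 0.1])
thickness = np.array([0.0, 0.5, 1.0, 2.0, 4.0])   # aluminum thicknesses (mm)
A = np.exp(-np.outer(thickness, mu))              # a_ij = exp(-mu_j * t_i)

s_true = np.array([0.3, 0.7])                     # reference spectrum weights
m = A @ s_true                                    # noiseless transmission data

s = np.full(2, 0.5)                               # uniform initial guess
for _ in range(10_000):
    y = A @ s                                     # predicted transmission
    s *= (A.T @ (m / y)) / A.sum(axis=0)          # EM multiplicative update
```

The update preserves nonnegativity and has the true spectrum as a fixed point (m/y = 1 there), which is why EM is attractive for this inverse problem.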

Keywords: digital mammography, expectation maximization algorithm, X-ray spectrum, X-ray

Procedia PDF Downloads 735
4107 Fetal Movement Study Using Biomimics of the Maternal March

Authors: V. Diaz, B. Pardo, D. Villegas

Abstract:

Most premature babies have complications at birth. These complications can be reduced if an atmosphere of relaxation, similar to intrauterine life, is provided; to this end, there are programs where mothers lull and sway their babies. However, the conditions and the manner in which they do so may not be the most appropriate. Here we describe an investigation based on the biomimics of the kinematics of human fetal movement, which consists of determining the movements that the fetus experiences and the deformations of the components that surround the fetus during a gentle walk at week 32 of the gestation stage. This research is based on a 3D model that comprises the anatomical structure of the pelvis, fetus, muscles, uterus and its most important supporting elements (ligaments). Normal load conditions, according to the stage of gestation, and the kinematics of a gentle walk of a pregnant mother are applied to this model, focused on the pelvic bone; this allows a response to be obtained from the other elements of the model. The SolidWorks software was used for this modeling and the subsequent simulation. From this analysis, the curves that describe the movement of the fetus at three different points were obtained. Additionally, we found the deformation of the uterus and the ligaments that support it, showing the characteristics that these tissues can have when supporting the fetus. These data can be used for the construction of artifacts that help the normal development of premature infants.

Keywords: simulation, biomimic, uterine model, fetal movement study

Procedia PDF Downloads 166
4106 The Observable Method for the Regularization of Shock-Interface Interactions

Authors: Teng Li, Kamran Mohseni

Abstract:

This paper presents an inviscid regularization technique that is capable of regularizing shocks and sharp interfaces simultaneously in shock-interface interaction simulations. The direct numerical simulation of flows involving shocks has been investigated for many years, and many numerical methods have been developed to capture the shocks. However, most of these methods rely on numerical dissipation to regularize the shocks. Moreover, in high Reynolds number flows, the nonlinear terms in the hyperbolic Partial Differential Equations (PDEs) dominate, constantly generating small-scale features. This makes direct numerical simulation of shocks even harder. The same difficulty occurs in two-phase flows with sharp interfaces, where the nonlinear terms in the governing equations keep sharpening the interfaces into discontinuities. The main idea of the proposed technique is to average out the small scales that are below the resolution (observable scale) of the computational grid by filtering the convective velocity in the nonlinear terms of the governing PDE. This technique is named the "observable method", and it results in a set of hyperbolic equations called observable equations, namely, the observable Navier-Stokes or Euler equations. The observable method has been applied to flow simulations involving shocks, turbulence, and two-phase flows, and the results are promising. In the current paper, the observable method is examined for its performance in regularizing shocks and interfaces at the same time in shock-interface interaction problems. Bubble-shock interactions and the Richtmyer-Meshkov instability are particularly chosen to be studied. The observable Euler equations are numerically solved with pseudo-spectral discretization in space and the third-order Total Variation Diminishing (TVD) Runge-Kutta method in time. Results are presented and compared with existing publications. The interface acceleration and deformation and the shock reflection are particularly examined.
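The core operation, filtering the convective velocity so that sub-observable scales are averaged out, can be sketched with a Helmholtz-type low-pass filter applied spectrally in 1D; the filter length alpha and the test signal below are illustrative:

```python
import numpy as np

def helmholtz_filter(u, alpha, L=2 * np.pi):
    """Spectral Helmholtz-type low-pass filter: u_bar = (1 - alpha^2 d^2/dx^2)^-1 u.
    In Fourier space each mode is damped by 1 / (1 + (alpha * k)^2), so scales
    much smaller than alpha are averaged out while large scales pass through."""
    u_hat = np.fft.rfft(u)
    k = np.fft.rfftfreq(u.size, d=L / u.size) * 2 * np.pi   # wavenumbers
    return np.fft.irfft(u_hat / (1 + (alpha * k) ** 2), n=u.size)

# A resolved mode (k = 2) plus a sub-observable mode (k = 40) on a periodic grid.
x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(2 * x) + 0.3 * np.sin(40 * x)
u_bar = helmholtz_filter(u, alpha=0.5)
```

Using u_bar (rather than u) as the advecting velocity in the nonlinear term is what keeps the observable equations from sharpening features below the grid's observable scale.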

Keywords: compressible flow simulation, inviscid regularization, Richtmyer-Meshkov instability, shock-bubble interactions

Procedia PDF Downloads 352
4105 Imposing Speed Constraints on Arrival Flights: Case Study for Changi Airport

Authors: S. Aneeka, S.M. Phyoe, R. Guo, Z.W. Zhong

Abstract:

Arrival flights tend to spend long waiting times at holding stacks if the arrival airport is congested. However, the waiting time spent in the air in the vicinity of the arrival airport may be reduced if the delays are distributed to the cruising phase of the arrival flights by means of speed control. Here, a case study was conducted for the flights arriving at Changi Airport. The flights that were assigned holdings were simulated to fly at a reduced speed during the cruising phase. As the study involves a single airport and is limited to imposing speed constraints to arrivals within 200 NM from its location, the simulation setup in this study could be considered as an application of the Extended Arrival Management (E-AMAN) technique, which is proven to result in considerable fuel savings and more efficient management of delays. The objective of this experiment was to quantify the benefits of imposing cruise speed constraints to arrivals at Changi Airport and to assess the effects on controllers’ workload. The simulation results indicated considerable fuel savings, reduced aircraft emissions and reduced controller workload.
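The delay a flight can absorb by cruising slower over a fixed distance follows from simple arithmetic; a sketch with illustrative speeds (not the study's Changi traffic data):

```python
def delay_absorbed_min(distance_nm, v_nominal_kt, v_reduced_kt):
    """Extra flight time (minutes) gained by flying slower over a fixed distance:
    the delay absorbed in cruise instead of in a holding stack."""
    return 60.0 * distance_nm * (1.0 / v_reduced_kt - 1.0 / v_nominal_kt)

# Slowing from 450 kt to 420 kt ground speed over the final 200 NM
gain = delay_absorbed_min(200, 450, 420)   # about 1.9 minutes absorbed en route
```

Even a modest speed reduction over the 200 NM horizon absorbs on the order of two minutes of holding per flight, which is where the fuel and emissions savings come from.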

Keywords: aircraft emissions, air traffic flow management, controller workload, fuel consumption

Procedia PDF Downloads 147
4104 Emulation of a Wind Turbine Using Induction Motor Driven by Field Oriented Control

Authors: L. Benaaouinate, M. Khafallah, A. Martinez, A. Mesbahi, T. Bouragba

Abstract:

This paper concerns the modeling, simulation, and emulation of a wind turbine emulator for standalone wind energy conversion systems. By using an emulation system, we aim to reproduce the dynamic behavior of the wind turbine torque on the generator shaft: it provides the testing facilities to optimize generator control strategies in a controlled environment, without reliance on natural resources. The aerodynamic, mechanical, and electrical models have been detailed, as well as the control of the pitch angle using fuzzy logic for horizontal-axis wind turbines. The wind turbine emulator consists mainly of an induction motor driven by an AC power drive with torque control. The control of the induction motor and the mathematical models of the wind turbine are designed in the MATLAB/Simulink environment. The simulation results confirm the effectiveness of the induction motor control system and the functionality of the wind turbine emulator in providing all the necessary parameters of the wind turbine system, such as wind speed, output torque, power coefficient and tip speed ratio. The findings are of direct practical relevance.
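The aerodynamic torque such an emulator reproduces is often based on a heuristic Cp(lambda, beta) approximation; the sketch below uses one widely quoted set of coefficients (assumed here, not necessarily the paper's parameterization):

```python
import numpy as np

def power_coefficient(lam, beta):
    """Heuristic power coefficient Cp(tip speed ratio, pitch angle in degrees)
    for a horizontal-axis turbine, with a commonly used coefficient set."""
    inv_li = 1.0 / (lam + 0.08 * beta) - 0.035 / (beta**3 + 1.0)
    return (0.5176 * (116.0 * inv_li - 0.4 * beta - 5.0) * np.exp(-21.0 * inv_li)
            + 0.0068 * lam)

def rotor_torque(v_wind, omega, radius, rho=1.225, beta=0.0):
    """Aerodynamic shaft torque: T = P / omega, with P = 0.5*rho*A*Cp*v^3."""
    lam = omega * radius / v_wind                # tip speed ratio
    power = 0.5 * rho * np.pi * radius**2 * power_coefficient(lam, beta) * v_wind**3
    return power / omega
```

The emulator drives the induction motor so that the shaft torque tracks rotor_torque for the commanded wind speed, which is how wind speed, torque, Cp and tip speed ratio all become available without a real turbine.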

Keywords: electrical generator, induction motor drive, modeling, pitch angle control, real time control, renewable energy, wind turbine, wind turbine emulator

Procedia PDF Downloads 238