Search results for: complex network platform
2792 Machine Learning Facing the Behavioral Noise Problem in Imbalanced Data Using One Side Behavioral Noise Reduction: Application to Fraud Detection
Authors: Salma El Hajjami, Jamal Malki, Alain Bouju, Mohammed Berrada
Abstract:
With the expansion of machine learning and data mining in the context of Big Data analytics, a common problem that affects data is class imbalance: an uneven distribution of instances across classes. This problem is present in many real-world applications such as fraud detection, network intrusion detection, medical diagnostics, etc. In these cases, data instances labeled negatively are significantly more numerous than instances labeled positively. When this difference is too large, the learning system may struggle with the problem, since it is initially designed to work in relatively balanced class distribution scenarios. Another important problem, which usually accompanies imbalanced data, is the overlap of instances between the two classes, commonly referred to as noise or overlapping data. In this article, we propose an approach called One Side Behavioral Noise Reduction (OSBNR). This approach offers a way to deal with the problem of class imbalance in the presence of a high noise level. OSBNR is based on two steps. First, a cluster analysis is applied to group similar instances from the minority class into several behavior clusters. Second, we select and eliminate the instances of the majority class, considered as behavioral noise, which overlap with the behavior clusters of the minority class. The results of experiments carried out on a representative public dataset confirm that the proposed approach is effective for the treatment of class imbalance in the presence of noise.
Keywords: machine learning, imbalanced data, data mining, big data
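A minimal sketch of the two OSBNR-style steps described above, assuming scikit-learn with k-means as the clustering step and a simple per-cluster radius test to decide which majority-class instances count as behavioral noise; the cluster count and the radius rule are illustrative assumptions, not the authors' exact settings.

```python
import numpy as np
from sklearn.cluster import KMeans

def osbnr_like_filter(X_maj, X_min, n_clusters=5, random_state=0):
    """Step 1: group minority instances into behavior clusters.
    Step 2: drop majority instances that overlap those clusters."""
    km = KMeans(n_clusters=n_clusters, random_state=random_state, n_init=10).fit(X_min)
    centers = km.cluster_centers_
    # Radius of each behavior cluster = max distance of its members to the center
    radii = np.array([
        np.linalg.norm(X_min[km.labels_ == c] - centers[c], axis=1).max()
        for c in range(n_clusters)
    ])
    # A majority instance is treated as behavioral noise if it falls inside any cluster radius
    dists = np.linalg.norm(X_maj[:, None, :] - centers[None, :, :], axis=2)
    noisy = (dists <= radii).any(axis=1)
    return X_maj[~noisy]          # cleaned majority class

# Usage: X_clean = osbnr_like_filter(X_majority, X_minority)
```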
Procedia PDF Downloads 133

2791 Subjective Mapping Methodologies: Mapping Local Perceptions with Geographic Information Systems
Authors: A. Llopis Alvarez, D. Muller-Eie
Abstract:
Participatory GIS (geographic information systems) are designed for community mapping exercises in order to produce spatial representations of local knowledge. Ideally, participatory GIS caters to public participation through the use of spatial data in order to increase community-led policy- and decision-making. Having defined a spatial object, such as a neighborhood, subjective mapping involves attaining a description of the spatial, physical, social, and psychological characteristics of that spatial object. This paper highlights an emerging appreciation of the subjective component, particularly in spatial analyses. The beliefs, feelings, and behaviors associated with an urban area reflect its sense of place for an individual or a group. It is therefore important to understand what types of beliefs, emotions, and behavioral patterns are relevant to particular residents, groups, and urban scales. In this sense, residents' emotional attachment to their urban areas motivates civic engagement and facilitates awareness of their strengths and problems. Similarly, subjective perceptions act in complex ways to influence the formation and maintenance of social identity and quality of life. This paper reports on findings from a case study of the immigrant population in Norwegian cities, their residential conditions, and their relationship to quality of urban life. Cognitive mapping methodologies are used in this study to understand local perceptions of urban qualities. Measures to alleviate disadvantages and improve quality of urban life are thus more likely to be effective when they are informed by an understanding of a place as constructed by those who live in it, that is, their subjective perceptions about it.
Keywords: mapping methodologies, participatory GIS, perceptual maps, public participation, spatial analysis, subjective perceptions
Procedia PDF Downloads 146

2790 Optimization of Multi Commodities Consumer Supply Chain: Part 1-Modelling
Authors: Zeinab Haji Abolhasani, Romeo Marian, Lee Luong
Abstract:
This paper and its companions (Part II, Part III) concentrate on optimizing a class of supply chain problems known as the Multi-Commodities Consumer Supply Chain (MCCSC) problem. The MCCSC problem belongs to the production-distribution (P-D) planning category. It aims to determine facilities location, consumers' allocation, and facilities configuration in order to minimize the total cost (CT) of the entire network. These facilities can be manufacturer units (MUs), distribution centres (DCs), and retailers/end-users (REs), but are not limited to them. To address this problem, three major tasks are undertaken. First, a mixed-integer non-linear programming (MINP) mathematical model is developed. Then, the system's behavior under different conditions is observed using a simulation modeling tool. Finally, the optimal solution (minimum CT) of the system is obtained using a multi-objective optimization technique. Due to the large size of the problem and the uncertainties in finding the optimal solution, an integration of modeling and simulation methodologies is proposed, followed by the development of a new approach known as GASG, a genetic algorithm based on granular simulation, which is the subject of the methodology of this research. In Part II, the MCCSC is simulated using a discrete-event simulation (DES) device within an integrated environment of SimEvents and Simulink of the MATLAB® software package, followed by a comprehensive case study to examine the given strategy. The effect of genetic operators on the optimal/near-optimal solution obtained by the simulation model is discussed in Part III.
Keywords: supply chain, genetic algorithm, optimization, simulation, discrete event system
Procedia PDF Downloads 320

2789 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method
Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay
Abstract:
This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments prior to the construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post-construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to testing ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description, to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors' internal status. The model has been applied in a simulation of hospital wards and showed adaptability to a wide variety of situated behaviors and interactions.
Keywords: agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition
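A toy sketch of the event-based idea described above: an event binds several actors to a space and an activity, and each actor's willingness to participate depends on its internal status. The class names, attributes, and the fatigue rule are illustrative assumptions, not the authors' actual actor profiles.

```python
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    fatigue: float = 0.0                      # simple internal status

    def willing(self, activity: str) -> bool:
        # Cognitive rule: tired actors decline non-urgent activities
        return activity == "emergency" or self.fatigue < 0.7

@dataclass
class Space:
    name: str
    capacity: int
    occupants: list = field(default_factory=list)

def run_event(activity: str, space: Space, actors: list) -> bool:
    """An event succeeds only if every required actor joins and the space has room."""
    if len(actors) > space.capacity:
        return False
    if not all(a.willing(activity) for a in actors):
        return False
    space.occupants = [a.name for a in actors]
    for a in actors:
        a.fatigue += 0.1                      # activities accumulate fatigue
    return True

# Usage: run_event("ward_round", Space("Ward A", 6), [Actor("Dana", "nurse"), Actor("Lee", "doctor")])
```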
Procedia PDF Downloads 267

2788 Research of Seepage Field and Slope Stability Considering Heterogeneous Characteristics of Waste Piles: A Less Costly Way to Reduce High Leachate Levels and Avoid Accidents
Authors: Serges Mendomo Meye, Li Guowei, Shen Zhenzhong, Gan Lei, Xu Liqun
Abstract:
Because landfills are high-piled and large-volume, with complex layers of waste and a high leachate level, environmental pollution and slope instability are easily produced. It is therefore of great significance to research the heterogeneous seepage field and stability of landfills. This paper focuses on the heterogeneous characteristics of landfill piles and analyzes the seepage field and slope stability of the landfill using statistical and numerical analysis methods. The calculated results are compared with field measurements and literature data to verify the reliability of the model, which may provide a basis for the design and the safe, eco-friendly operation of landfills. The main innovations are as follows: (1) The saturated-unsaturated seepage equation of heterogeneous soil is derived theoretically. The heterogeneous landfill is regarded as composed of infinite layers of homogeneous waste, and a method for establishing the heterogeneous seepage model is proposed. The formation law of the stagnant water level of heterogeneous landfills is then studied. It is found that the maximum stagnant water level is higher when the heterogeneous seepage characteristics are considered, which harms the stability of the landfill. (2) Considering the heterogeneous weight and strength characteristics of waste, a method of establishing a heterogeneous stability model is proposed and extended to a three-dimensional stability study. It is found that the distribution of heterogeneous characteristics has a great influence on the stability of the landfill slope. During the operation and management of the landfill, the reservoir bank should also be considered alongside the capacity of the landfill.
Keywords: heterogeneous characteristics, leachate levels, saturated-unsaturated seepage, seepage field, slope stability
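For reference, a commonly used general form of the saturated-unsaturated seepage (Richards-type) equation that such models build on is sketched below; heterogeneity enters through the spatially varying conductivity and storage terms. This is a textbook form given here as an assumption, not the authors' exact derivation.

```latex
\frac{\partial}{\partial x_i}\!\left[k_r(h)\,K_{ij}(\mathbf{x})\,
\frac{\partial h}{\partial x_j} + k_r(h)\,K_{i3}(\mathbf{x})\right]
= \left[C(h) + \beta\,S_s(\mathbf{x})\right]\frac{\partial h}{\partial t}
```

where h is the pressure head, K_ij(x) the layer-dependent saturated conductivity tensor, k_r(h) the relative permeability, C(h) the specific moisture capacity, S_s(x) the specific storage, and β equals 1 in the saturated zone and 0 otherwise.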
Procedia PDF Downloads 259

2787 Optimizing Data Integration and Management Strategies for Upstream Oil and Gas Operations
Authors: Deepak Singh, Rail Kuliev
Abstract:
This abstract highlights the critical importance of optimizing data integration and management strategies in the upstream oil and gas industry. With its complex and dynamic nature generating vast volumes of data, efficient data integration and management are essential for informed decision-making, cost reduction, and maximizing operational performance. Challenges such as data silos, heterogeneity, real-time data management, and data quality issues are addressed, prompting the proposal of several strategies. These strategies include implementing a centralized data repository, adopting industry-wide data standards, employing master data management (MDM), utilizing real-time data integration technologies, and ensuring data quality assurance. Training and developing the workforce, "reskilling and upskilling" employees, and establishing robust data management training programs are an integral part of this strategy. The article also emphasizes the significance of data governance and best practices, as well as the role of technological advancements such as big data analytics, cloud computing, the Internet of Things (IoT), and artificial intelligence (AI) and machine learning (ML). To illustrate the practicality of these strategies, real-world case studies are presented, showcasing successful implementations that improve operational efficiency and decision-making. In the present study, by embracing the proposed optimization strategies, leveraging technological advancements, and adhering to best practices, upstream oil and gas companies can harness the full potential of data-driven decision-making, ultimately achieving increased profitability and a competitive edge in an ever-evolving industry.
Keywords: master data management, IoT, AI&ML, cloud computing, data optimization
Procedia PDF Downloads 73

2786 Role of Cellulose Fibers in Tuning the Microstructure and Crystallographic Phase of α-Fe₂O₃ and α-FeOOH Nanoparticles
Authors: Indu Chauhan, Bhupendra S. Butola, Paritosh Mohanty
Abstract:
It is well known that the properties of materials change as their size approaches the nanoscale, due to the high surface-area-to-volume ratio. In the last few decades, the tenet 'structure dictates function' has been quickly adopted by researchers working with nanomaterials. The design and exploitation of nanoparticles with tailored shape and size has become one of the primary goals of materials science researchers seeking to expose the properties of nanostructures. To date, various methods, including soft/hard template- and surfactant-assisted hydrothermal reaction, seed-mediated growth, capping-molecule-assisted synthesis, the polyol process, etc., have been adopted to synthesize nanostructures with controlled size, shape, and monodispersity. However, controlling the shape and size of nanoparticles remains an ultimate challenge of modern materials research. In particular, many efforts have been devoted to the rational and skillful control of hierarchical and complex nanostructures. In this work, the role of cellulose in manipulating nanostructures is discussed. Nanoparticles of α-Fe₂O₃ (diameter ca. 15 to 130 nm) were immobilized on the cellulose fiber surface by a single-step in situ hydrothermal method, whereas nanoflakes of α-FeOOH with a thickness of ca. 25 nm and a length of ca. 250 nm were obtained by the same method in the absence of cellulose fibers. A possible nucleation and growth mechanism for the formation of nanostructures on cellulose fibers is proposed. The covalent bond formation between the cellulose fibers and the nanostructures is discussed with supporting evidence from spectroscopic and other analytical studies, such as Fourier transform infrared spectroscopy and X-ray photoelectron spectroscopy.
Keywords: cellulose fibers, α-Fe₂O₃, α-FeOOH, hydrothermal, nanoflakes, nanoparticles
Procedia PDF Downloads 154

2785 English as a Medium of Instruction in Algerian Higher Business Degree Programmes
Authors: Sidi Ahmed Berrabah
Abstract:
English as a Medium of Instruction (EMI) is expanding rapidly around the world. A growing volume of research has been dedicated to investigating its introduction, with findings that describe a complex picture and suggest that the practicality and effectiveness of EMI are still subjects of debate. However, considerably less attention has been given to understanding EMI in contexts where its introduction has been discussed but not yet put into practice. One such context is Algeria, where discourse about a potential introduction of EMI has been going on for some time. The first courses where EMI is likely to be introduced are Business degree programmes. This study examines the current discourses and attitudes towards the potential implementation of EMI, and the language practices in Business degree programmes, in three Algerian universities. The research is conducted in three universities in three different regions of Algeria, with the aim of including both 'centre' and 'periphery' Algerian universities. To achieve these aims, a mixed research paradigm is used: questionnaires, semi-structured interviews, and classroom observations are used to gather data from three participant cohorts: university students of Business, lecturers of Business, and lecturers of English for specific purposes. The findings show that students and lecturers of Business are in favour of the introduction of English, instead of French or Standard Arabic, as a medium of instruction. The reason is that English is seen as having internationalisation and instrumental benefits, while French is too closely linked to the colonial history of the country. The favourable attitudes towards EMI, however, contrast with daily classroom practices at the departments of Business studies, where students and lecturers make practical choices about using their language repertoire based on their linguistic background and skills. Classrooms in the three Algerian universities featured fluid translanguaging practices that cannot be reduced to a monolingual EMI policy.
Keywords: EMI, Algerian universities, business degree programmes, translanguaging
Procedia PDF Downloads 221

2784 The Role of Questioning Ability as an Indicator of Scientific Thinking in Children Aged 5-9
Authors: Aliya K. Salahova
Abstract:
Scientific thinking is a fundamental cognitive skill that plays a crucial role in preparing young minds for an increasingly complex world. This study explores the connection between scientific thinking and the ability to ask questions in children aged 5-9. The research aims to identify and assess how questioning ability serves as an indicator of scientific thinking development in this age group. A longitudinal investigation was conducted over a span of 240 weeks, involving 72 children from diverse backgrounds. The participants were divided into an experimental group, engaging in weekly STEM activities, and a control group with no STEM involvement. The development of scientific thinking was evaluated through a comprehensive assessment of questioning skills, hypothesis formulation, logical reasoning, and problem-solving abilities. The findings reveal a significant correlation between the ability to ask questions and the level of scientific thinking in children aged 5-9. Participants in the experimental group exhibited a remarkable improvement in their questioning ability, which positively influenced their scientific thinking growth. In contrast, the control group, devoid of STEM activities, showed minimal progress in questioning skills and subsequent scientific thinking development. This study highlights the pivotal role of questioning ability as a key indicator of scientific thinking in young children. The results provide valuable insights for educators and researchers, emphasizing the importance of fostering and nurturing questioning skills to enhance scientific thinking capabilities from an early age. The implications of these findings are crucial for designing effective educational interventions to promote scientific curiosity and critical thinking in the next generation of scientific minds.
Keywords: scientific thinking, education, STEM, intervention, psychology, pedagogy, collaborative learning, longitudinal study
Procedia PDF Downloads 71

2783 Electrocatalysts for Lithium-Sulfur Energy Storage Systems
Authors: Mirko Ante, Şeniz Sörgel, Andreas Bund
Abstract:
Li-S (lithium-sulfur) battery systems provide very high specific energy (2600 Wh/kg) and volumetric energy density (2800 Wh/l). Hence, Li-S batteries are one of the key technologies for both the upcoming electromobility and stationary applications. Furthermore, the Li-S battery system is potentially cheap and environmentally benign. However, the technical implementation suffers from poor cycling stability, low charge and discharge rates, and an incomplete understanding of the complex polysulfide reaction mechanism. The aim of this work is to develop an effective electrocatalyst for the polysulfide reactions so that the electrode kinetics of the sulfur half-cell are improved. Accordingly, the overvoltage will be decreased and the efficiency of the cell increased. An enhanced electroactive surface additionally improves the charge and discharge rates. To reach this goal, functionalized electrocatalytic coatings are investigated to accelerate the kinetics of the polysulfide reactions. In order to determine a suitable electrocatalyst, apparent exchange current densities of a variety of materials (Ni, Co, Pt, Cr, Al, Cu, ITO, stainless steel) have been evaluated in a polysulfide-containing electrolyte by potentiodynamic measurements and a Butler-Volmer fit including diffusion limitation. The samples have been examined by scanning electron microscopy (SEM) after the potentiodynamic measurements. Up to now, our work shows that cobalt is a promising material with good electrocatalytic properties for the polysulfide reactions and good chemical stability in the system. Furthermore, an electrodeposition from a modified Watt's nickel electrolyte with a sulfur source seems to provide an autocatalytic effect, but the electrocatalytic behavior decreases after several cycles of the current-potential curve.
Keywords: electrocatalyst, energy storage, lithium sulfur battery, sulfur electrode materials
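The fitting described above can be read against the standard Butler-Volmer relation extended with a mass-transfer (diffusion) limitation. The expressions below are the textbook forms, given here as an assumption rather than the exact model used in the study.

```latex
i = i_0\left[\exp\!\left(\frac{\alpha_a F\,\eta}{RT}\right)
          - \exp\!\left(-\frac{\alpha_c F\,\eta}{RT}\right)\right],
\qquad
\frac{1}{i_{\mathrm{meas}}} = \frac{1}{i} + \frac{1}{i_{\mathrm{lim}}}
```

Here i_0 is the apparent exchange current density extracted from the fit, α_a and α_c are the anodic and cathodic transfer coefficients, η is the overpotential, and i_lim is the diffusion-limited current density that caps the measured current.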
Procedia PDF Downloads 372

2782 Analyzing the Impact of Knowledge Sharing on Product Innovation: A Moderated Mediation Framework of Employees Creativity and Top Management Support
Authors: Aqsa Akbar, Sadaf Ehsan, Suheera Khalid Sheikh
Abstract:
Purpose: In today's competitive world, situational dynamism presents complex challenges for organizations pursuing product innovation. Given the dire need to remain sustainable, this research examines the mechanism linking knowledge sharing and product innovation. For this, a moderated mediation framework is developed in which employees' creativity and top management support are suggested as viable factors affecting the knowledge sharing-product innovation relationship.
Design/Methodology/Approach: A survey-based quantitative research design was selected, with data collected via self-administered questionnaires from employees of Pakistan's e-commerce organizations. Almost 350 questionnaires were circulated and 285 were received back using a cross-sectional method. Data analysis was performed in SPSS 22.0 and AMOS.
Findings: The outcomes suggest that knowledge sharing is critical for companies undergoing product innovation. In addition, the findings disclose that employees' creativity partially mediates the relationship between knowledge sharing and product innovation. Furthermore, the moderating impact of top management support also substantiated the proposed hypothesis. Results are discussed in light of the literature review, followed by the study's limitations and future directions.
Originality/Value: The study contributes to a better understanding of why knowledge sharing is vital for product innovation. It adds to the literature by highlighting the mechanisms responsible for successful product innovation. Moreover, the study offers practical insights to Pakistan's e-commerce industry and suggests how to develop capabilities for product innovation.
Keywords: employees creativity, knowledge sharing, product innovation, top management support
Procedia PDF Downloads 92

2781 Surface Modified Quantum Dots for Nanophotonics, Stereolithography and Hybrid Systems for Biomedical Studies
Authors: Redouane Krini, Lutz Nuhn, Hicham El Mard Cheol Woo Ha, Yoondeok Han, Kwang-Sup Lee, Dong-Yol Yang, Jinsoo Joo, Rudolf Zentel
Abstract:
To use quantum dots (QDs) in the two-photon initiated polymerization (TPIP) technique for 3D patterning, QDs were modified on the surface with photosensitive end groups which are able to undergo photopolymerization. We were able to fabricate fluorescent 3D lattice structures from photopatternable QDs by TPIP for photonic devices such as photonic crystals and metamaterials. QDs of different diameters have different emission colors, and by mixing RGB QDs, white-light fluorescence from the polymeric structures has been created. Metamaterials are capable of unique interactions with the electrical and magnetic components of electromagnetic radiation, and for manipulating light it is crucial to have a negative refractive index. In combination with QDs via the TPIP technique, polymeric structures can be designed with properties which cannot be found in nature. This makes these artificial materials hugely important for real-life applications in photonics and optoelectronics. Understanding the interactions between nanoparticles and biological systems is of great interest in the biomedical research field. We developed a synthetic strategy for polymer-functionalized nanoparticles for biomedical studies to obtain hybrid systems of QDs and copolymers with a strong binding network in an inner shell, which can be further modified through their poly(ethylene glycol)-functionalized outer shell. These hybrid systems can be used as models for the investigation of cell penetration and drug delivery using a combination of cryo-TEM and fluorescence measurements.
Keywords: biomedical study models, lithography, photo induced polymerization, quantum dots
Procedia PDF Downloads 529

2780 Mannose-Functionalized Lipopolysaccharide Nanoparticles for Macrophage-Targeted Dual Delivery of Rifampicin and Isoniazid
Authors: Mumuni Sumaila, Viness Pillay, Yahya E. Choonara, Pradeep Kumar, Pierre P. Kondiah
Abstract:
Tuberculosis (TB) remains a serious challenge to public health globally, despite every effort put together to curb the disease. Currently available TB therapeutics have proven to be inefficient due to a multitude of drawbacks that range from serious adverse effects/drug toxicity to inconsistent bioavailability, which ultimately contributes to the emergence of drug-resistant TB. An effective 'cargo' system designed to cleverly deliver therapeutic doses of anti-TB drugs to infection sites and in a sustained-release manner may provide a better therapeutic choice towards winning the war against TB. In the current study, we investigated mannose-functionalized lipopolysaccharide hybrid nanoparticles for safety and efficacy towards macrophage-targeted simultaneous delivery of the two first-line anti-TB drugs, rifampicin (RF) and isoniazid (IS). RF-IS-loaded lipopolysaccharide hybrid nanoparticles were fabricated using the solvent injection technique (SIT), incorporating soy lecithin (SL) and low molecular weight chitosan (CS) as the lipid and polysaccharide components, respectively. Surface-functionalized nanoparticles were obtained through the reaction of the aldehyde group of mannose with the free amine functionality present at the surface of the nanoparticles. The functionalized nanocarriers were spherical with an average particle size and surface charge of 107.83 nm and +21.77 mV, respectively, and entrapment efficiencies (EE) were 53.52% and 69.80% for RF and IS, respectively. The FTIR spectrum revealed high-intensity bands between 1663 cm⁻¹ and 1408 cm⁻¹ wavenumbers (absent in non-functionalized nanoparticles), which could be attributed to the C=N stretching vibration produced by the formation of Schiff's base (–N=CH–) during the mannosylation reaction. In vitro release studies showed a sustained-release profile for RF and IS, with less than half of the total payload released over a 48-hour period. The nanocarriers were biocompatible and safe, with more than 80% cell viability achieved when incubated with RAW 264.7 cells at concentrations of 30 to 500 μg/mL over a 24-hour period. Cellular uptake studies (after a 24-hour incubation period with the murine macrophage cells, RAW 264.7) revealed a 13- and a 9-fold increase in intracellular accumulation of RF and IS, respectively, when compared with the unformulated RF+IS solution. A 6- and a 3-fold increase in intracellular accumulation of RF and IS, respectively, were observed when compared with the non-functionalized nanoparticles. Furthermore, fluorescent microscopy images showed nanoparticle internalization and accumulation within the RAW 264.7 cells, which was more significant for the mannose-functionalized system compared to the non-functionalized nanoparticles. The overall results suggest that the fabricated mannose-functionalized lipopolysaccharide nanoparticles are a safe and promising platform for macrophage-targeted delivery of anti-TB therapeutics. However, in vivo pharmacokinetic/pharmacodynamic studies are required to further substantiate the therapeutic efficacy of the nanosystem.
Keywords: anti-tuberculosis therapeutics, hybrid nanosystem, lipopolysaccharide nanoparticles, macrophage-targeted delivery
Procedia PDF Downloads 176

2779 Composite Coatings of Piezoelectric Quartz Sensors Based on Viscous Sorbents and Casein Micelles
Authors: Shuba Anastasiia, Kuchmenko Tatiana, Umarkhanov Ruslan
Abstract:
The development of new sensitive coatings for sensors is one of the key directions in the development of sensor technologies. Recently, there has been a trend towards the creation of multicomponent coatings for sensors, which make it possible to increase sensitivity and specificity and to improve the performance properties of sensors. When analyzing samples with a complex matrix of biological origin, the inclusion of micelles of bioactive substances (amino and nucleic acids, peptides, proteins) in the composition of the sensor coating can also increase the useful analytical information. The purpose of this work is to evaluate the analytical characteristics of composite coatings of piezoelectric quartz sensors based on medium-molecular viscous sorbents with incorporated micellar casein concentrate during the sorption of vapors of volatile organic compounds. The sorption properties of the coatings were studied by piezoelectric quartz microbalance. Macromolecular compounds (dicyclohexyl-18-crown-6, Triton X-100, lanolin, micellar casein concentrate) were used as sorbents. Highly volatile organic compounds of various classes (alcohols, acids, aldehydes, esters) and water were selected as test substances. It has been established that composite coatings with the inclusion of micellar casein are more stable and more selective to vapors of highly volatile compounds than to water vapor. The method and technique of forming a composite coating from molecular viscous sorbents do not affect the kinetic features of VOC sorption. When casein micelles are used, the features of kinetic sorption depend on the matrix of the coating.
Keywords: piezoquartz sensor, viscous sorbents, micellar casein, coating, volatile compounds
Procedia PDF Downloads 129

2778 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model
Authors: Can Huang, Xiaoliang Wang, Qingquan Liu
Abstract:
Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, which are accompanied by large deformation of soil, strong interface coupling, and three-dimensional effects. A meshless particle method, smoothed particle hydrodynamics (SPH), has great advantages in dealing with complex interface and multiphase coupling problems. This study presents an improved soil‒water coupled model to simulate LGIW problems based on the open source code DualSPHysics (v4.0). To address the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration technology is implemented in this code. An experimental example, subaerial landslide-generated water waves, is simulated to demonstrate the accuracy of the model. Then, the Huangtian LGIW, a real large-scale LGIW problem, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. The convergence analysis shows that a particle distance of 5.0 m provides a converged landslide deposit and surge wave for this example. The numerical results are in good agreement with the limited field survey data. The Huangtian application provides a typical reference for large-scale LGIW assessments, yielding reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH
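As background to the SPH formulation used above, the basic particle approximation on which both the soil and water phases rest is sketched below; these are the standard SPH interpolation and the common symmetric pressure-gradient form of the momentum equation used in DualSPHysics-like codes, stated here as an assumption rather than the paper's specific discretisation.

```latex
A(\mathbf{r}_i) \approx \sum_j \frac{m_j}{\rho_j}\,A_j\,
W\!\left(\mathbf{r}_i-\mathbf{r}_j,\,h\right),
\qquad
\frac{\mathrm{d}\mathbf{v}_i}{\mathrm{d}t}
= -\sum_j m_j\!\left(\frac{p_i}{\rho_i^2}+\frac{p_j}{\rho_j^2}\right)
\nabla_i W_{ij} + \mathbf{g}
```

where W is the smoothing kernel with smoothing length h, and m_j, ρ_j, p_j are the mass, density, and pressure of neighbour particle j.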
Procedia PDF Downloads 67

2777 Exploring the Underlying Factors of Student Dropout in Makawanpur Multiple Campus: A Comprehensive Analysis
Authors: Uttam Aryal, Shekhar Thapaliya
Abstract:
This research paper presents a comprehensive analysis of the factors contributing to student dropout at Makawanpur Multiple Campus, utilizing primary data collected directly from dropped-out students as well as regular students and academic staff. Employing a mixed-method approach combining qualitative and quantitative methods, the study delves into the complicated issue of student dropout. Data collection methods included surveys, interviews, and a thorough examination of academic records covering multiple academic years. The study focused on students who left their programs prematurely, as well as current students and academic staff, providing a well-rounded perspective on the issue. The analysis reveals a nuanced understanding of the factors influencing student dropout, encompassing both academic and non-academic dimensions. These factors include academic challenges, personal choices, socioeconomic barriers, peer influences, and institution-related issues. Importantly, the study highlights the most influential factors for dropout, such as the pursuit of education abroad, financial constraints, and employment opportunities, shedding light on the complex web of circumstances that lead students to discontinue their education. The insights derived from this study offer actionable recommendations for campus administrators, policymakers, and educators to develop targeted interventions aimed at reducing dropout rates and improving student retention. The study underscores the importance of addressing the diverse needs and challenges faced by students, with the ultimate goal of fostering a supportive academic environment that encourages student success and program completion.
Keywords: drop out, students, factors, opportunities, challenges
Procedia PDF Downloads 68

2776 An Absolute Femtosecond Rangefinder for Metrological Support in Coordinate Measurements
Authors: Denis A. Sokolov, Andrey V. Mazurkevich
Abstract:
In the modern world, there is an increasing demand for highly precise measurements in various fields, such as aircraft, shipbuilding, and rocket engineering. This has resulted in the development of measuring instruments that are capable of measuring the coordinates of objects within a range of up to 100 meters, with an accuracy of up to one micron. The calibration process for such optoelectronic measuring devices (trackers and total stations) involves comparing the measurement results from these devices to a reference measurement on a linear or spatial basis. The reference used in such measurements could be a reference base or a reference range finder with the capability to measure angle increments (EDM); the base would serve as a set of reference points for this purpose. The concept of the EDM for replicating the unit of measurement has been implemented on a mobile platform, which allows for angular changes in the direction of the laser radiation in two planes. To determine the distance to an object, a high-precision interferometer of our own design is employed. The laser radiation travels to the corner reflectors, which form a spatial reference with precisely known positions. When the femtosecond pulses from the reference arm and the measuring arm coincide, an interference signal is created, repeating at the frequency of the laser pulses. The distance between reference points determined by the interference signals is calculated in accordance with recommendations from the International Bureau of Weights and Measures for the indirect measurement of the time of light passage, following the definition of the meter. This distance is D/2 = c/(2nF), approximately 2.5 meters, where c is the speed of light in a vacuum, n is the refractive index of the medium, and F is the frequency of femtosecond pulse repetition. The achieved Type A uncertainty of measurements of the distance to reflectors 64 m away (N·D/2, where N is an integer) and spaced 1 m apart relative to each other does not exceed 5 microns. The angular uncertainty is calculated theoretically, since standard high-precision ring encoders will be used and are not a focus of research in this study. The Type B uncertainty components are not taken into account either, as the components that contribute most do not depend on the selected coordinate measuring method. This technology is being explored in the context of laboratory applications under controlled environmental conditions, where it is possible to achieve an advantage in terms of accuracy. In general, the EDM tests showed high accuracy, and theoretical calculations and experimental studies on an EDM prototype have shown that the Type A uncertainty of distance measurements to reflectors can be less than 1 micrometer. The results of this research will be utilized to develop a highly accurate mobile absolute range finder designed for the calibration of high-precision laser trackers and laser rangefinders, as well as other equipment, using a 64-meter laboratory comparator as a reference.
Keywords: femtosecond laser, pulse correlation, interferometer, laser absolute range finder, coordinate measurement
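A quick numeric check of the D/2 = c/(2nF) relation quoted above: solving for the pulse repetition frequency that yields the stated ≈2.5 m spacing, taking n ≈ 1 for air as an assumption, gives a repetition rate of roughly 60 MHz.

```python
c = 299_792_458.0      # speed of light in vacuum, m/s
n = 1.0                # refractive index of the medium (assumed ~1 for air)
half_D = 2.5           # spacing between interference coincidences, m (from the abstract)

# Rearranging D/2 = c / (2 * n * F) for the pulse repetition frequency F
F = c / (2 * n * half_D)
print(f"F = {F / 1e6:.1f} MHz")   # ~60.0 MHz
```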
Procedia PDF Downloads 64

2775 Effect of Ultrasound-Assisted Pretreatment on Saccharification of Spent Coffee Grounds
Authors: Shady S. Hassan, Brijesh K. Tiwari, Gwilym A. Williams, Amit K. Jaiswal
Abstract:
The EU has the highest per-capita coffee consumption in the world. Spent coffee grounds (SCG) are the main by-product of coffee brewing. SCG are either disposed of as solid waste or employed as compost, although the polysaccharides from such lignocellulosic biomass might be used as feedstock for fermentation processes. However, SCG, as a lignocellulose, have a complex structure, and a pretreatment process is required to facilitate efficient enzymatic hydrolysis of the carbohydrates. Commonly used pretreatment methods, such as chemical, physico-chemical, and biological techniques, are still insufficient to meet optimal industrial production requirements in a sustainable way. Ultrasound is a promising candidate as a sustainable green pretreatment solution for lignocellulosic biomass utilization in a large-scale biorefinery. Thus, ultrasound pretreatment of SCG without adding harsh chemicals was investigated as a green technology to enhance enzyme hydrolysis. In the present work, ultrasound pretreatment experiments were conducted on SCG using different ultrasound frequencies (25, 35, 45, 130, and 950 kHz) for 60 min. Regardless of ultrasound power, low ultrasound frequency is more effective than high ultrasound frequency in the pretreatment of biomass. Ultrasound pretreatment of SCG (at an ultrasound frequency of 25 kHz for 60 min) followed by enzymatic hydrolysis resulted in total reducing sugars of 56.1 ± 2.8 mg/g of biomass. Fourier transform infrared spectroscopy (FTIR) was employed to investigate changes in the functional groups of the biomass after pretreatment, while high-performance liquid chromatography (HPLC) was employed for the determination of glucose. Pretreatment of lignocellulose by low-frequency ultrasound in water only was found to be an effective green approach for SCG to improve saccharification and glucose yield compared to the native biomass. The pretreatment conditions will be optimized, and the enzyme hydrolysate will be used as a media component substitute for the production of ethanol.
Keywords: lignocellulose, ultrasound, pretreatment, spent coffee grounds
Procedia PDF Downloads 329

2774 Extraction of Cellulose Nanofibrils from Pulp Using Enzymatic Pretreatment and Evaluation of Their Papermaking Potential
Authors: Ajay Kumar Singh, Arvind Kumar, S. P. Singh
Abstract:
Cellulose nanofibrils (CNFs) have shown potential for extensive use in various fields, including papermaking, due to their unique characteristics. In this study, CNFs were prepared by fibrillating pulp obtained from raw materials (bagasse, hardwood, and softwood) using enzymatic pretreatment followed by mechanical refining. These nanofibrils, when examined under FE-SEM, show that partial fibrillation of the fiber surface has resulted in the production of nanofibers. Mixing these nanofibers with unrefined and normally refined fibers shows their reinforcing effect. This effect is manifested in the improvement of the physical and mechanical properties of paper, e.g. the tensile index and burst index. The tear index, however, was observed to decrease on blending with nanofibers. The optical properties of paper sheets made from blended fibers showed no significant change in comparison to those made from only mechanically refined pulp. Mixing normal pulp fibers with nanofibers increases the °SR and consequently decreases the drainage rate. These changes in the mechanical, optical, and other physical properties of the paper sheets made from nanofibril-blended pulp are explained by considering the distribution of the nanofibrils alongside microfibrils in the fibrous network. Since papers/boards with higher strength are usually observed to have diminished optical properties, which is a drawback in their quality, the present work has the potential for developing papers/boards having improved strength along with undiminished optical properties, utilising the concepts of nanoscience and nanotechnology.
Keywords: enzymatic pretreatment, mechanical refining, nanofibrils, paper properties
Procedia PDF Downloads 355

2773 For Post-traumatic Stress Disorder Counselors in China, the United States, and around the Globe, Cultural Beliefs Offer Challenges and Opportunities
Authors: Anne Giles
Abstract:
Trauma is generally defined as an experience, or multiple experiences, overwhelming a person's ability to cope. Over time, many people recover from the neurobiological, physical, and emotional effects of trauma on their own. For some people, however, troubling symptoms develop over time that can result in distress and disability. This cluster of symptoms is classified as Post-traumatic Stress Disorder (PTSD). People who meet the criteria for PTSD and other trauma-related disorder diagnoses often hold a set of understandable but unfounded beliefs about traumatic events that cause undue suffering. Becoming aware of unhelpful beliefs—termed "cognitive distortions"—and challenging them is the realm of Cognitive Behavior Therapy (CBT). A form of CBT found by researchers to be especially effective for PTSD is Cognitive Processing Therapy (CPT). Through the compassionate use of CPT, people identify, examine, challenge, and relinquish unhelpful beliefs, thereby reducing symptoms and suffering. Widely-held cultural beliefs can interfere with the progress of recovery from trauma-related disorders. Although highly revered, largely unquestioned, and often stabilizing, cultural beliefs can be founded in simplistic, dichotomous thinking, i.e., things are all right, or all wrong, all good, or all bad. The reality, however, is nuanced and complex. After studying examples of cultural beliefs from China and the United States and how these might interfere with trauma recovery, trauma counselors can help clients derive criteria for preserving helpful beliefs, discover, examine, and jettison unhelpful beliefs, reduce trauma symptoms, and live their lives more freely and fully.
Keywords: cognitive processing therapy (CPT), cultural beliefs, post-traumatic stress disorder (PTSD), trauma recovery
Procedia PDF Downloads 255

2772 Understanding Profit Shifting by Multinationals in the Context of Cross-Border M&A: A Methodological Exploration
Authors: Michal Friedrich
Abstract:
Cross-border investment has never been easier than in today's global economy. Despite recent initiatives tightening the international tax landscape, profit shifting and tax optimization by multinational entities (MNEs) in the context of cross-border M&A remain persistent and complex phenomena that warrant in-depth exploration. By synthesizing the outcomes of existing research, this study first aims to provide a methodological framework for identifying MNEs' profit-shifting behavior and quantifying its fiscal impacts via various macroeconomic and microeconomic approaches. The study also proposes additional methods and qualitative/quantitative measures for extracting insight into the profit-shifting behavior of MNEs in the context of their M&A activities at the industry and entity levels. To develop the proposed methods, this study applies knowledge of international tax law and known profit-shifting conduits (including dividends, interest, and royalties) to several model cases/types of cross-border acquisitions and post-acquisition integration activities by MNEs, and highlights important factors that encourage or discourage tax optimization. Follow-up research is envisaged to apply the methods outlined in this study to published data on real-world M&A transactions to gain practical country-by-country, industry- and entity-level insights. In conclusion, this study seeks to contribute to the ongoing discourse on profit shifting by providing a methodological toolkit for exploring the profit-shifting tendencies of MNEs in connection with their M&A activities and to serve as a backbone for further research. The study is expected to provide valuable insight to policymakers, tax authorities, and tax professionals alike.
Keywords: BEPS, cross-border M&A, international taxation, profit shifting, tax optimization
Procedia PDF Downloads 72

2771 Time Series Analysis of the Case of China-USA Trade: Examining the Enormity of Abnormal Pricing with the Exchange Rate during COVID-19
Authors: Md. Mahadi Hasan Sany, Mumenunnessa Keya, Sharun Khushbu, Sheikh Abujar
Abstract:
Since the beginning of China's economic reform, trade between the U.S. and China has grown rapidly, and it has increased further since China's accession to the World Trade Organization in 2001. The U.S. imports more than it exports from China; the trade war between China and the U.S. reduced the trade deficit in 2019, but in 2020 the opposite happened. In international and U.S. trade, Washington launched a full-scale trade war against China in March 2016, which was then followed by a catastrophic epidemic. The main goal of our study is to measure and predict trade relations between China and the U.S. before and after the arrival of the COVID epidemic. ML models use different data as input but lack the time dimension that is present in time series models, and are only able to predict the future from previously observed data. The LSTM (a well-known recurrent neural network) model is applied as the time series model best suited to trade forecasting. We have been able to create a sustainable forecasting system for trade between China and the US by closely monitoring a dataset published by the state website NZ Tatauranga Aotearoa from January 1, 2015, to April 30, 2021. Throughout the survey, we provided a 180-day forecast that outlined what would happen to trade between China and the US during COVID-19. In addition, we have illustrated that the LSTM model provides outstanding outcomes in time series data analysis compared to RFR and SVR (both ML models). The study looks at how the current COVID outbreak affects China-US trade. As a comparative study, the RMSE is calculated for LSTM, RFR, and SVR. From our time series analysis, it can be said that the LSTM model gives a very favorable indication of the future export situation in China-US trade.
Keywords: RFR, China-U.S. trade war, SVR, LSTM, deep learning, Covid-19, export value, forecasting, time series analysis
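A minimal sketch of the kind of LSTM forecaster described above, using Keras on a sliding window of trade values and rolling the model forward to mimic the 180-day forecast. The window length, layer sizes, and training settings are illustrative assumptions, not the authors' exact configuration, and `trade_values` is assumed to be a pre-scaled 1-D array loaded beforehand.

```python
import numpy as np
from tensorflow.keras import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(series, window=30):
    X, y = [], []
    for i in range(len(series) - window):
        X.append(series[i:i + window])
        y.append(series[i + window])
    return np.array(X)[..., None], np.array(y)   # shape: (samples, window, 1)

series = np.asarray(trade_values, dtype="float32")   # assumed pre-scaled daily export values
X, y = make_windows(series)

model = Sequential([LSTM(64, input_shape=(X.shape[1], 1)), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=50, batch_size=32, verbose=0)

# Roll the model forward 180 steps to produce a 180-day forecast
history = list(series[-X.shape[1]:])
for _ in range(180):
    nxt = model.predict(np.array(history[-X.shape[1]:])[None, :, None], verbose=0)[0, 0]
    history.append(nxt)
forecast = history[-180:]
```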
Procedia PDF Downloads 202

2770 Experimental Research of Smoke Impact on the Performance of Cylindrical Eight Channel Cyclone
Authors: Pranas Baltrėnas, Dainius Paliulis
Abstract:
Cyclones are widely used for separating particles from gas in energy production facilities. The efficiency of typical centrifugal air cleaning devices ranges from 85 to 90%, but a weakness of many cyclones is their low collection efficiency for particles less than 10 μm in diameter. Many factors affect cyclone efficiency: humidity, temperature, gas (air) composition, airflow velocity, etc. Many researchers have evaluated only the effect of the origin and size of PM on cyclone efficiency; the effect of gas (air) composition and temperature on cyclone efficiency still demands further study. Complex experimental research on the efficiency of a cylindrical eight-channel system with adjustable half-rings for removing fine dispersive particles (< 20 μm) was carried out. The impact of gaseous smoke components on the removal of wood ashes was analyzed. Gaseous components present in the smoke mixture with a dynamic viscosity lower than that of air at the same temperature decrease the d50 value and simultaneously increase the overall particulate matter removal efficiency in the cyclone; this effect is attributed to CO2 and CO, while O2 and NO have the opposite effect. Air temperature influences the d50 value: an increase in air temperature yields an increase in the d50 value, i.e., the overall particulate matter removal efficiency declines, the reason being the increasing dynamic viscosity of air. At 120 °C the d50 value is approximately 11.8% higher than at an air temperature of 20 °C. With an increase in smoke (gas) temperature from 20 °C to 50 °C, the aerodynamic resistance drops from 1605 to 1380 Pa in a 1-tier eight-channel cylindrical cyclone, from 1660 to 1420 Pa in a 2-tier eight-channel cylindrical cyclone, and from 1715 to 1450 Pa in a 3-tier eight-channel cylindrical cyclone. The reason for the decline in aerodynamic resistance is the declining gas density. The aim of the paper is to analyze the impact of gaseous smoke components on an eight-channel cyclone with a tangential inlet.
Keywords: cyclone, adjustable half-rings, particulate matter, efficiency, gaseous compounds, smoke
Procedia PDF Downloads 291

2769 Dialectics of Modern Law: Perspectives and Strategies of Resistance from the Margins
Authors: Nisar Alungal Chungath
Abstract:
“No human being is illegal” has become a dictum strongly upheld in the context of global immigration and migration, highlighting the ethical and moral dimensions of how societies and governments treat individuals and communities who have crossed political borders or are living in a country without legal authorization. It seeks to shift the focus from categorizing human beings as illegal immigrants to recognizing their inherent human rights and the complexities of their circumstances. As a complex social phenomenon, law has been a crucial instrument in shaping, regulating and governing human societies, and vice versa. The law has now become a vast political project of modern majoritarian regimes to democratically delegitimize and illegalize unpopular sections and minorities. Drawing from the theoretical frameworks of dialectics, the paper explores the philosophical underpinnings of the historical evolution and dynamic nature of modern law. The paper employs a phenomenological approach to analyze the dialectical relations between individuals, societies, and legal systems, aiming to shed light on the ethical and political implications of these interactions. By examining the historical essence of law, its relationship with social and cultural norms, and the role of power dynamics, this article argues for constantly maintaining the dialectics of law, that is, the dynamic interplay between legal norms, social practices, cultural values, and historical contexts, through a philosophical and phenomenological lens, in order to bridge the gap between universal principles and particular contexts. The paper sheds light on the dialectics of law in the context of the legal persecutions of modern secular democracies, such as India's Citizenship Amendment Act, 2019.
Keywords: phenomenology, dialectic, modern law, politics, resistance, margins
Procedia PDF Downloads 59

2768 Influence of Internal Topologies on Components Produced by Selective Laser Melting: Numerical Analysis
Authors: C. Malça, P. Gonçalves, N. Alves, A. Mateus
Abstract:
Regardless of the manufacturing process used, subtractive or additive, and of the material, purpose, and application, produced components are conventionally solid masses with a more or less complex shape depending on the production technology selected. Aspects such as reducing the weight of components, associated with the low volume of material required and almost non-existent material waste, speed and flexibility of production and, primarily, high mechanical strength combined with high structural performance, are competitive advantages in any industrial sector, from automotive, molds, aviation, aerospace, construction, pharmaceuticals and medicine to, more recently, human tissue engineering. Such features, properties, and functionalities are attained in metal components produced using the additive Rapid Prototyping technique for metal powders commonly known as Selective Laser Melting (SLM), with optimized internal topologies and varying densities. In order to produce components with high strength and high structural and functional performance, regardless of the type of application, three different internal topologies were developed and analyzed using numerical computational tools. The developed topologies were numerically submitted to mechanical compression and four-point bending tests. Finite Element Analysis results demonstrate how different internal topologies can contribute to improving mechanical properties, even with a high degree of porosity relative to fully dense components. The results are very promising, not only from the point of view of mechanical resistance but especially through the achievement of considerable variation in density without loss of high structural and functional performance.
Keywords: additive manufacturing, internal topologies, porosity, rapid prototyping, selective laser melting
Procedia PDF Downloads 333

2767 Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory
Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad
Abstract:
Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives.
Keywords: GAN, long short-term memory, synthetic data generation, traffic management
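A bare-bones illustration of the Gaussian-copula idea behind one of the synthesizers named above: map each traffic feature to normal scores through its empirical CDF, sample from a multivariate normal with the fitted correlation, and map the samples back through the empirical quantiles. Library tools such as GaussianCopula or TimeGAN wrap far more machinery; this numpy/scipy sketch is only a conceptual stand-in.

```python
import numpy as np
from scipy import stats

def fit_sample_gaussian_copula(data: np.ndarray, n_samples: int, seed: int = 0):
    """data: (n_rows, n_features) real traffic records -> synthetic rows of the same shape."""
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # 1) Empirical CDF ranks -> normal scores
    ranks = np.argsort(np.argsort(data, axis=0), axis=0) + 1
    z = stats.norm.ppf(ranks / (n + 1))
    # 2) Correlation structure of the normal scores
    corr = np.corrcoef(z, rowvar=False)
    # 3) Sample correlated normals and push them back through the empirical quantiles
    z_new = rng.multivariate_normal(np.zeros(d), corr, size=n_samples)
    u_new = stats.norm.cdf(z_new)
    synth = np.column_stack([np.quantile(data[:, j], u_new[:, j]) for j in range(d)])
    return synth
```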
Procedia PDF Downloads 32

2766 A Quinary Coding and Matrix Structure Based Channel Hopping Algorithm for Blind Rendezvous in Cognitive Radio Networks
Authors: Qinglin Liu, Zhiyong Lin, Zongheng Wei, Jianfeng Wen, Congming Yi, Hai Liu
Abstract:
The multi-channel blind rendezvous problem in distributed cognitive radio networks (DCRNs) refers to how users in the network can hop to the same channel at the same time slot without any prior knowledge (i.e., each user is unaware of other users' information). The channel hopping (CH) technique is a typical solution to this blind rendezvous problem. In this paper, we propose a quinary coding and matrix structure-based CH algorithm called QCMS-CH. The QCMS-CH algorithm can guarantee the rendezvous of users using only one cognitive radio in the scenario of asynchronous clocks (i.e., arbitrary time drift between users), heterogeneous channels (i.e., the available channel sets of users are distinct), and symmetric roles (i.e., all users play the same role). The QCMS-CH algorithm first represents a randomly selected channel (denoted by R) as a fixed-length quaternary number. It then encodes the quaternary number into a quinary bootstrapping sequence according to a carefully designed quaternary-quinary coding table with the prefix "R00". Finally, it builds a CH matrix column by column according to the bootstrapping sequence and six different types of elaborately generated subsequences. A user can access the CH matrix row by row and accordingly perform its channel hopping to attempt rendezvous with other users. We prove the correctness of QCMS-CH and derive an upper bound on its Maximum Time-to-Rendezvous (MTTR). Simulation results show that the QCMS-CH algorithm outperforms the state-of-the-art in terms of the MTTR and the Expected Time-to-Rendezvous (ETTR).
Keywords: channel hopping, blind rendezvous, cognitive radio networks, quaternary-quinary coding
Procedia PDF Downloads 95

2765 Copper Price Prediction Model for Various Economic Situations
Authors: Haidy S. Ghali, Engy Serag, A. Samer Ezeldin
Abstract:
Copper is an essential raw material used in the construction industry. During 2021 and the first half of 2022, the global market suffered significant fluctuations in copper raw material prices due to the aftermath of both the COVID-19 pandemic and the Russia-Ukraine war, which exposed consumers to unexpected financial risk. This paper therefore aims to develop two ANN-LSTM price prediction models, using Python, that can forecast the average monthly copper prices traded on the London Metal Exchange; the first is a multivariate model that forecasts the copper price one month ahead, and the second is a univariate model that predicts copper prices for the upcoming three months. Historical data on average monthly London Metal Exchange copper prices are collected from January 2009 to July 2022, and potential external factors are identified and employed in the multivariate model. These factors lie under three main categories: energy prices and economic indicators of the three major copper-exporting countries, depending on data availability. Before developing the LSTM models, the collected external parameters are analyzed with respect to the copper prices using correlation and multicollinearity tests in R software; the parameters are then further screened to select those that influence copper prices. The two LSTM models are then developed, and the dataset is divided into training, validation, and testing sets. The results show that the performance of the 3-month prediction model is better than that of the 1-month prediction model, but both models can still act as predicting tools for diverse economic situations.
Keywords: copper prices, prediction model, neural network, time series forecasting
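A small sketch of the correlation and multicollinearity screening step described above. The study performs this in R; the pandas/numpy version here is an illustrative equivalent, and the correlation and VIF thresholds are assumptions. Features weakly correlated with the copper price are dropped first, and variance inflation factors then flag redundant predictors.

```python
import numpy as np
import pandas as pd

def screen_features(df: pd.DataFrame, target: str = "copper_price",
                    min_corr: float = 0.3, max_vif: float = 10.0):
    corr = df.corr()[target].drop(target)
    kept = corr[corr.abs() >= min_corr].index.tolist()
    # Variance inflation factor: VIF_j = 1 / (1 - R_j^2), from regressing x_j on the others
    while len(kept) > 1:
        vifs = {}
        for col in kept:
            X = np.column_stack([np.ones(len(df))] + [df[c] for c in kept if c != col])
            beta, *_ = np.linalg.lstsq(X, df[col], rcond=None)
            resid = df[col] - X @ beta
            r2 = 1 - resid.var() / df[col].var()
            vifs[col] = 1.0 / max(1 - r2, 1e-12)
        worst = max(vifs, key=vifs.get)
        if vifs[worst] <= max_vif:
            break
        kept.remove(worst)          # drop the most collinear predictor and re-check
    return kept

# Usage: selected = screen_features(monthly_df)   # monthly_df is an assumed DataFrame of factors
```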
Procedia PDF Downloads 117

2764 Maximizing Profit Using Optimal Control by Exploiting the Flexibility in Thermal Power Plants
Authors: Daud Mustafa Minhas, Raja Rehan Khalid, Georg Frey
Abstract:
Next-generation power systems are equipped with abundantly available, free renewable energy sources (RES). During their low-cost operation, the price of electricity drops significantly and sometimes becomes negative. In such periods it is often recommended not to operate traditional power plants (e.g., coal power plants) in order to reduce losses. In fact, this is not a cost-effective solution, because these power plants incur shutdown and startup costs. Moreover, they require a certain time to shut down and also need a sufficient pause before starting up again, increasing inefficiency in the whole power network. Hence, there is always a trade-off between avoiding negative electricity prices and the startup costs of power plants. To exploit this trade-off and to increase the profit of a power plant, two main contributions are made: 1) introducing retrofit technology for a state-of-the-art coal power plant; 2) proposing an optimal control strategy for the power plant by exploiting different flexibility features. These flexibility features include improving the ramp rate of the power plant, reducing the startup time, and lowering the minimum load. The control strategy is formulated as a mixed-integer linear program (MILP), ensuring an optimal solution to the profit maximization problem. Extensive comparisons are made between pre- and post-retrofit coal power plants having the same efficiencies under different electricity price scenarios. It is concluded that if the power plant must remain in the market (providing services), more flexibility translates into a direct economic advantage for the plant operator.
Keywords: discrete optimization, power plant flexibility, profit maximization, unit commitment model
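A compact sketch of the trade-off formulated above as a PuLP mixed-integer program: binary on/off and start-up variables, a start-up cost, ramp limits, and hourly prices that may go negative. The plant parameters and prices are invented for illustration only and do not reflect the study's data.

```python
import pulp

prices = [40, 5, -15, -20, 10, 55]        # EUR/MWh, hourly (illustrative)
p_min, p_max, ramp = 150, 400, 120        # MW, MW, MW/h (illustrative flexibility limits)
c_fuel, c_start = 30, 8000                # EUR/MWh fuel cost, EUR per start

T = range(len(prices))
m = pulp.LpProblem("flexible_unit_commitment", pulp.LpMaximize)
p = pulp.LpVariable.dicts("p", T, lowBound=0)              # output (MW)
u = pulp.LpVariable.dicts("u", T, cat="Binary")            # on/off status
s = pulp.LpVariable.dicts("s", T, cat="Binary")            # start-up indicator

# Profit = revenue - fuel cost - start-up cost
m += pulp.lpSum(prices[t] * p[t] - c_fuel * p[t] - c_start * s[t] for t in T)
for t in T:
    m += p[t] >= p_min * u[t]              # minimum load when committed
    m += p[t] <= p_max * u[t]
    if t > 0:
        m += p[t] - p[t - 1] <= ramp       # ramp-up limit (improved by retrofit)
        m += p[t - 1] - p[t] <= ramp       # ramp-down limit
        m += s[t] >= u[t] - u[t - 1]       # start-up triggered when the unit switches on

m.solve(pulp.PULP_CBC_CMD(msg=False))
schedule = [(t, u[t].value(), p[t].value()) for t in T]
```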
Procedia PDF Downloads 145

2763 Harmonic Distortion Analysis in Low Voltage Grid with Grid-Connected Photovoltaic
Authors: Hedi Dghim, Ahmed El-Naggar, Istvan Erlich
Abstract:
Power electronic converters are being introduced into low voltage (LV) grids at an increasingly rapid rate due to the growing adoption of power electronic-based home appliances in residential grids. Photovoltaic (PV) systems are considered one of the potential renewable energy sources installed in distribution power systems. This trend has led to high distortion in the supply voltage, which consequently produces harmonic currents in the network and causes an inherent voltage unbalance. In order to investigate the effect of harmonic distortion, a case study of a typical LV grid configuration with high penetration of 3-phase and 1-phase rooftop-mounted PV from southern Germany was first considered. Electromagnetic transient (EMT) simulations were then carried out in the MATLAB/Simulink environment, using detailed models for power electronic-based loads, ohmic-based loads, and 1- and 3-phase PV. Note that the switching patterns of the power electronic circuits were considered in this study. Measurements were eventually performed to analyze the distortion levels when the PV systems operate under different solar irradiance. The characteristics of the load-side harmonic impedances were analyzed, and their harmonic contributions were evaluated for different distortion levels. The effect of high PV penetration on the harmonic distortion of both the positive and negative sequences was also investigated. The simulation results are presented as case studies. The current distortion levels are in agreement with the relevant standards, although the Total Harmonic Distortion (THD) increases under low PV power generation due to its inverse relation with the fundamental current.
Keywords: harmonic distortion analysis, power quality, PV systems, residential distribution system
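For reference, a small sketch of how a total harmonic distortion figure of the kind evaluated above can be computed from a sampled current waveform using numpy's FFT; the 50 Hz fundamental, sampling rate, and harmonic count are assumptions, not the study's settings.

```python
import numpy as np

def thd(signal: np.ndarray, fs: float, f1: float = 50.0, n_harmonics: int = 40) -> float:
    """THD = sqrt(sum of harmonic magnitudes^2) / fundamental magnitude."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    def mag(f):                       # magnitude at the bin closest to frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]
    fund = mag(f1)
    harm = np.sqrt(sum(mag(k * f1) ** 2 for k in range(2, n_harmonics + 1)))
    return harm / fund

# Example: a 50 Hz current with 5 % fifth and 3 % seventh harmonic -> THD ~ 0.058
fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
i = np.sin(2*np.pi*50*t) + 0.05*np.sin(2*np.pi*250*t) + 0.03*np.sin(2*np.pi*350*t)
print(f"THD = {thd(i, fs):.3f}")
```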
Procedia PDF Downloads 272