Search results for: discrete events simulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7404


1764 The Willingness to Pay of People in Taiwan for Flood Protection Standard of Regions

Authors: Takahiro Katayama, Hsueh-Sheng Chang

Abstract:

Global climate change has increased extreme rainfall, leading to serious floods around the world. In recent years, urbanization and population growth have also increased the extent of impervious surfaces, resulting in significant loss of life and property during floods, especially in the urban areas of Taiwan. In the past, the primary governmental response to floods was structural flood control, and the only flood protection standards in use were design standards. However, the design standards of flood control facilities are generally calculated from current hydrological conditions. In the face of future extreme events, these design standards are likely to be surpassed, causing direct and indirect damage to the public. To cope with the frequent occurrence of floods in recent years, a different standard, called FPSR (Flood Protection Standard of Regions), has been proposed for Taiwan. FPSR is mainly used for disaster reduction and ensures that hydraulic facilities can drain regional floodwater promptly for a specific return period. FPSR can convey a level of flood risk that is useful for land use planning and can reflect the disaster conditions that a region can bear. However, little has been reported on FPSR and its impact on the public in Taiwan. Hence, this study proposes a quantitative procedure to evaluate FPSR. The study examines the FPSR of the region, public perceptions of and knowledge about FPSR, and the public's WTP (willingness to pay) for FPSR. The research is conducted via literature review and a questionnaire survey. First, this study reviews the domestic and international research on FPSR and provides its theoretical framework.
Second, the CVM (Contingent Valuation Method) is employed, using a double-bounded dichotomous-choice, close-ended format to elicit households' WTP for raising the protection level and thereby understand the social costs. The sample comprises 700 citizens of Taichung City, Taiwan. Ongoing work will complete the surveys, identify the factors that determine WTP, and provide recommendations for future flood adaptation policies.
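A minimal sketch of how double-bounded dichotomous-choice answers can be turned into WTP intervals and a crude mean-WTP estimate. The bid design (follow-up bid doubles after a "yes", halves after a "no") and all bid values are assumptions for illustration, not the study's survey instrument; a full CVM analysis would fit an interval-censored likelihood rather than average midpoints.

```python
def wtp_interval(initial_bid, first_yes, second_yes):
    """Map a (yes/no, yes/no) answer pair to a WTP interval.

    Assumed design: the follow-up bid doubles after a 'yes' and
    halves after a 'no' (a common double-bounded convention).
    """
    if first_yes:
        higher = 2 * initial_bid
        # yes-yes: WTP at least the higher bid; cap the open interval
        return (higher, 2 * higher) if second_yes else (initial_bid, higher)
    lower = initial_bid / 2
    # no-yes: WTP between lower bid and initial bid; no-no: below lower bid
    return (lower, initial_bid) if second_yes else (0.0, lower)

def mean_wtp(responses):
    """Crude mean WTP: average of interval midpoints over respondents."""
    mids = [(lo + hi) / 2 for lo, hi in (wtp_interval(*r) for r in responses)]
    return sum(mids) / len(mids)

responses = [
    (500, True, False),   # yes to 500, no to 1000  -> WTP in (500, 1000)
    (500, False, True),   # no to 500, yes to 250   -> WTP in (250, 500)
    (500, True, True),    # yes-yes                 -> WTP in (1000, 2000)
]
print(mean_wtp(responses))  # 875.0
```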

Keywords: climate change, CVM (Contingent Valuation Method), FPSR (Flood Protection Standard of Regions), urban flooding

Procedia PDF Downloads 242
1763 The AU Culture Platform Approach to Measure the Impact of Cultural Participation on Individuals

Authors: Sendy Ghirardi, Pau Rausell Köster

Abstract:

The European Commission increasingly pushes cultural policies towards social outcomes, and local and regional authorities likewise call for culture-driven strategies for local development and prosperity; the measurement of cultural participation therefore becomes increasingly significant for evidence-based policy-making. Cultural participation involves various kinds of social and economic spillovers that combine social and economic objectives of value creation, including social sustainability and respect for human values. Traditionally, from the economic perspective, cultural consumption is measured by the value of financial transactions in purchasing, subscribing to, or renting cultural equipment and content, addressing the market value of cultural products and services. The main data sources are household spending surveys and merchandise trade surveys, among others. However, cultural consumption is characterized by a hedonic and affective dimension rather than a utilitarian one, and growing attention is now being paid to the social and psychological dimensions of culture. The aim of this work is to present a comprehensive approach to measuring the impacts of cultural participation and cultural users' behaviour, combining socio-psychological and economic approaches. The model combines contingent valuation techniques with the analysis of individual characteristics and perceptions of cultural experiences to evaluate the cognitive, aesthetic, emotive, and social impacts of cultural participation. The approach was designed on the basis of prior theoretical development: an in-depth literature review informed the theoretical model underlying the web platform used to measure the impacts of cultural experience on individuals.
The framework aims to become a democratic tool for evaluating the services that cultural or policy institutions can adopt, through an interactive platform that produces big data benefiting academia, cultural management, and policy. AU Culture is a prototype application that can be used on mobile phones or any other digital platform. The development of the AU Culture Platform has been funded by the Valencian Innovation Agency (Government of the Region of Valencia) and is part of the Horizon 2020 project MESOC.

Keywords: comprehensive approach, cultural participation, economic dimension, socio-psychological dimension

Procedia PDF Downloads 109
1762 The Transformation of Architecture through the Technological Developments in History: Future Architecture Scenario

Authors: Adel Gurel, Ozge Ceylin Yildirim

Abstract:

Nowadays, design and architecture are being reshaped by rapid advancements in technology, economics, politics, society, and culture. Architecture has been transforming with the latest developments since the inclusion of computers in design: integration of design into the computational environment has revolutionized architecture and opened new perspectives. The history of architecture shows the various technological developments through which architecture has transformed over time. Therefore, analyzing the integration between technology and the historical architectural process makes it possible to build a consensus on how architecture is to proceed. In this study, each period that arises from the integration of technology into architecture is addressed within the historical process. At the same time, changes in architecture brought about by technology are identified as important milestones, and predictions regarding the future of architecture are made. Developments in technology and its use in architecture over the years are analyzed comparatively in charts and graphs. The historical process of architecture and its transformation through technology are supported by a detailed literature review and consolidated by examining focal points of 20th-century architecture under the headings of parametric design, genetic architecture, simulation, and biomimicry. It is concluded from this historical research that developments in architecture cannot keep up with advancements in technology, that recent technological developments overshadow architecture, and that technology even decides the direction of architecture. Finally, a scenario is presented regarding the reach of technology in the future of architecture and the role of the architect.

Keywords: computer technologies, future architecture, scientific developments, transformation

Procedia PDF Downloads 179
1761 Photodetector Engineering with Plasmonic Properties

Authors: Hasan Furkan Kurt, Tugba Nur Atabey, Onat Cavit Dereli, Ahmad Salmanogli, H. Selcuk Gecim

Abstract:

The main goal of this article is to study the effect of plasmonic properties on the photocurrent generated by a photodetector. A typical photodetector is designed and simulated using finite element methods. To utilize the plasmonic effect, gold nanoparticles with different shapes, sizes, and morphologies are buried in the intrinsic region. The plasmonic effect arises through the interaction of the incoming light with the nanoparticles, by which the electrical properties of the photodetector are manipulated. In fact, using plasmonic nanoparticles not only increases the absorption bandwidth of the incoming light but also generates a high-intensity near-field close to the nanoparticles. Both properties strongly affect the generated photocurrent. The simulation results show that plasmonic nanoparticles significantly enhance the electrical properties of photodetectors. More importantly, the plasmonic properties of the gold nanoparticles can easily be manipulated by engineering their size, shape, and morphology. Another important phenomenon is plasmon-plasmon interaction inside the photodetector. It is shown that plasmon-plasmon interaction improves the electron-hole generation rate, by which the current generation is strongly enhanced. This is the key factor we focus on to improve the photodetector's electrical properties.
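A toy sketch of why nanoparticle size and environment shift the plasmon response: for a sphere much smaller than the wavelength, the dipolar (Fröhlich) resonance sits where Re(ε) = -2ε_host. The Drude parameters for gold below (plasma frequency, damping, background permittivity) are rough literature-style assumptions, not values from the article's finite element model.

```python
import numpy as np

# Assumed Drude parameters for gold (illustrative, not fitted):
EPS_INF = 9.5   # background permittivity from bound electrons
WP = 9.0        # plasma frequency, eV
GAMMA = 0.07    # damping rate, eV

def eps_drude(w):
    """Complex Drude permittivity at photon energy w (eV)."""
    return EPS_INF - WP**2 / (w**2 + 1j * GAMMA * w)

def frohlich_resonance(eps_host=1.0):
    """Photon energy (eV) where Re(eps) = -2*eps_host, i.e. the
    dipole resonance of a sub-wavelength sphere in a host medium."""
    w = np.linspace(1.0, 4.0, 4000)
    re = eps_drude(w).real
    return w[np.argmin(np.abs(re + 2.0 * eps_host))]

w0 = frohlich_resonance()
print(f"sphere resonance ~ {w0:.2f} eV")
# A higher-index host (e.g. the intrinsic semiconductor region)
# red-shifts the resonance, one knob for tuning absorption:
print(frohlich_resonance(eps_host=2.0) < w0)  # True
```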

Keywords: plasmonic photodetector, plasmon-plasmon interaction, gold nanoparticle, electrical properties

Procedia PDF Downloads 131
1760 Nondecoupling Signatures of Supersymmetry and an Lμ-Lτ Gauge Boson at Belle-II

Authors: Heerak Banerjee, Sourov Roy

Abstract:

Supersymmetry, one of the most celebrated fields of study for explaining experimental observations where the standard model (SM) falls short, is reeling from the lack of experimental vindication. At the same time, the idea of additional gauge symmetry, in particular, the gauged Lμ-Lτ symmetric models have also generated significant interest. They have been extensively proposed in order to explain the tantalizing discrepancy in the predicted and measured value of the muon anomalous magnetic moment alongside several other issues plaguing the SM. While very little parameter space within these models remain unconstrained, this work finds that the γ + Missing Energy (ME) signal at the Belle-II detector will be a smoking gun for supersymmetry (SUSY) in the presence of a gauged U(1)Lμ-Lτ symmetry. A remarkable consequence of breaking the enhanced symmetry appearing in the limit of degenerate (s)leptons is the nondecoupling of the radiative contribution of heavy charged sleptons to the γ-Z΄ kinetic mixing. The signal process, e⁺e⁻ →γZ΄→γ+ME, is an outcome of this ubiquitous feature. Taking the severe constraints on gauged Lμ-Lτ models by several low energy observables into account, it is shown that any significant excess in all but the highest photon energy bin would be an undeniable signature of such heavy scalar fields in SUSY coupling to the additional gauge boson Z΄. The number of signal events depends crucially on the logarithm of the ratio of stau to smuon mass in the presence of SUSY. In addition, the number is also inversely proportional to the e⁺e⁻ collision energy, making a low-energy, high-luminosity collider like Belle-II an ideal testing ground for this channel. This process can probe large swathes of the hitherto free slepton mass ratio vs. additional gauge coupling (gₓ) parameter space. More importantly, it can explore the narrow slice of Z΄ mass (MZ΄) vs. gₓ parameter space still allowed in gauged U(1)Lμ-Lτ models for superheavy sparticles. 
The spectacular finding that the signal significance is independent of individual slepton masses is an exciting prospect indeed. Further, the prospect that signatures of even superheavy SUSY particles that may have escaped detection at the LHC may show up at the Belle-II detector is an invigorating revelation.
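The dependences quoted above can be collected into a toy scaling, in arbitrary units: kinetic mixing grows like gₓ·log(m_stau/m_smu), the event rate like the mixing squared, and the rate falls with collision energy. The function below is only this scaling with an arbitrary normalization; it is not the paper's cross-section calculation.

```python
from math import log

def relative_signal_yield(g_x, slepton_mass_ratio, collision_energy_gev):
    """Toy gamma + missing-energy yield (arbitrary units) following
    the abstract's quoted dependences: mixing ~ g_x * log(m_ratio),
    rate ~ mixing^2, rate ~ 1/E.  Normalization is arbitrary."""
    mixing = g_x * log(slepton_mass_ratio)
    return mixing**2 / collision_energy_gev

# Degenerate sleptons: the nondecoupling contribution vanishes.
print(relative_signal_yield(1e-3, 1.0, 10.6))     # 0.0
# At equal coupling and mass ratio, a low-energy machine like
# Belle-II (~10.6 GeV) wins over a higher-energy collider:
print(relative_signal_yield(1e-3, 2.0, 10.6) >
      relative_signal_yield(1e-3, 2.0, 91.2))     # True
```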

Keywords: additional gauge symmetry, electron-positron collider, kinetic mixing, nondecoupling radiative effect, supersymmetry

Procedia PDF Downloads 120
1759 Defects Analysis, Components Distribution, and Properties Simulation in the Fuel Cells and Batteries by 2D and 3D Characterization Techniques

Authors: Amir Peyman Soleymani, Jasna Jankovic

Abstract:

The growing demand for clean and renewable energy has pushed the fuel cell and battery industries to produce more efficient devices at lower prices, which can be achieved through improvement of the electrodes. Microstructural characterization, as one of the main materials development tools, plays a pivotal role in the production of better clean-energy devices. In this study, methods for characterizing defects and component distribution were applied to polymer electrolyte membrane fuel cell (PEMFC) and Li-ion battery (LIB) electrodes in 2D and 3D. Particle distribution, porosity, mechanical defects, and component distribution were studied by scanning electron microscopy (SEM), SEM with focused ion beam (SEM-FIB), and scanning transmission electron microscopy equipped with energy-dispersive spectroscopy (STEM-EDS). The 3D results obtained from X-ray computed tomography (XCT) revealed the pathways for electron and ion conductivity and maps of defect progression. Computer-aided methods (Avizo) were employed to simulate the properties and performance of the electrode microstructures. Suggestions are provided to improve the performance of PEMFCs and LIBs by adjusting the microstructure and the distribution of components in the electrodes.
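One of the simplest quantities extracted from a segmented 3D XCT stack of the kind described above is the pore volume fraction. A minimal sketch on a hypothetical binary volume (the array and pore geometry are invented for illustration; commercial tools such as Avizo compute this and much more):

```python
import numpy as np

def porosity(volume):
    """Pore volume fraction of a segmented 3D XCT stack
    (True/1 = solid phase, False/0 = pore space)."""
    return 1.0 - volume.mean()

# Hypothetical 50^3-voxel segmented electrode volume with one
# cubic pore of 10^3 voxels inside:
vol = np.ones((50, 50, 50), dtype=bool)
vol[10:20, 10:20, 10:20] = False
print(round(porosity(vol), 6))   # 1000 / 125000 = 0.008
```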

Keywords: PEM fuel cells, Li-ion batteries, 2D and 3D imaging, materials characterizations

Procedia PDF Downloads 147
1758 In Patribus Fidelium: Leftist Discourses on Political Violence in Lebanon and Algeria: A Critical Discourse Analysis

Authors: Mehdi Heydari Sanglaji

Abstract:

The dramatic events of the 11 September, and their tragic repercussions, catapulted issues of the political violence in and from the ‘Muslim world’ onto the political discourse, be it in patriotic speeches of campaigning politicians or the TV and news punditry. Depending on what end of the political spectrum the politician/pundit pledges fealty to, the overall analyses of political violence in the West Asia and North Africa (WANA) tends towards two overarching categories: on the Right, the diagnosis has unanimously been, ‘they must hate our freedom.’ On the Left, however, there is the contention that the West has to be counted as the primary cause of such rage, for the years of plundering of lives and resources, through colonialism, the Cold War, coups, etc. All these analyses are premised on at least two presuppositions: the violence in and from the WANA region a) is always reactionary, in the sense that it happens only in response to something the West is or does; and b) must always already be condemned, as it is essentially immoral and wrong. It is the aim of this paper to challenge such viewpoints. Through a rigorous study of the historical discourses on political violence in the Leftist organizations active in Algeria and Lebanon, we claim there is a myriad of diverse reasons and justifications presented for advocating political violence in these countries that defy facile categorization. Inspecting such rhetoric for inciting political violence in Leftist discourses, and how some of these reasonings have percolated into other movements in the region (e.g., Islamist ones), will reveal a wealth of indigenous discourses on the subject that has been largely neglected by the Western Media punditry and even by the academia. The indigenous discourses on political violence, much of which overlaps with emancipatory projects in the region, partly follow grammar and logic, which may be different from those developed in the West, even by its more critical theories. 
Understanding such a different epistemology of violence, and the diverse contexts in which political violence might be justifiable in the mind of ‘the other,’ necessitates a historical, materialist, and genealogical study of the discourse already in practice in the WANA region. In that regard, both critical terrorism studies and critical discourse analysis provide exemplary tools of analysis. Capitalizing on such tools, this project focuses on unearthing a history of thought that renders moot the reduction of all instances of violence in the region to an Islamic culture or to imperialism/colonialism. The main argument of our research is that by studying the indigenous discourses on political violence, we will be far better equipped to understand the reasons for, and the possible solutions to, acts of terrorism in and from the region.

Keywords: political violence, terrorism, leftist organizations, West Asia/North Africa

Procedia PDF Downloads 121
1757 Pharmacophore-Based Modeling of a Series of Human Glutaminyl Cyclase Inhibitors to Identify Lead Molecules by Virtual Screening, Molecular Docking and Molecular Dynamics Simulation Study

Authors: Ankur Chaudhuri, Sibani Sen Chakraborty

Abstract:

In humans, glutaminyl cyclase activity is highly abundant in neuronal and secretory tissues and is preferentially restricted to the hypothalamus and pituitary. The N-terminal modification of β-amyloid (Aβ) peptides by the generation of pyroglutamyl (pGlu)-modified Aβs (pE-Aβs) is an important process in the initiation of neurotoxic plaque formation in Alzheimer's disease (AD). This process is catalyzed by glutaminyl cyclase (QC). The expression of QC is characteristically up-regulated in the early stage of AD, and the hallmark of QC inhibition is the prevention of the formation of pE-Aβs and plaques. A computer-aided drug design (CADD) process was employed to guide the design of potentially active compounds and to understand their inhibitory potency against human QC. This work elaborates ligand-based and structure-based pharmacophore exploration of QC using the known inhibitors. Three-dimensional (3D) quantitative structure-activity relationship (QSAR) methods were applied to 154 compounds with known IC50 values. All the inhibitors were divided into two sets, a training set and a test set. The training set was used to build the quantitative pharmacophore model based on the principle of structural diversity, whereas the test set was employed to evaluate the predictive ability of the pharmacophore hypotheses. A chemical-feature-based pharmacophore model was generated from the 92 known training-set compounds by the HypoGen module implemented in the Discovery Studio 2017 R2 software package. The best hypothesis (Hypo1) was selected based on the highest correlation coefficient (0.8906), lowest total cost (463.72), and lowest root mean square deviation (2.24 Å). A higher correlation coefficient indicates greater predictive ability of the hypothesis, whereas a lower root mean square deviation signifies a smaller deviation of experimental activity from the predicted one.
The best pharmacophore model (Hypo1) comprised four features: two hydrogen bond acceptors, one hydrogen bond donor, and one hydrophobic feature. Hypo1 was validated by several approaches, including test-set activity prediction, cost analysis, Fischer's randomization test, the leave-one-out method, and a ligand-profiler heat map. The predicted features were then used for virtual screening of potential compounds from the NCI, ASINEX, Maybridge, and ChemBridge databases; more than seven million compounds were screened. The hits were filtered by drug-likeness and pharmacokinetic properties, and the selected hits were docked to the high-resolution three-dimensional structure of the target protein glutaminyl cyclase (PDB ID: 2AFU/2AFW) for further filtering. To validate the molecular docking results, the most active compound from the dataset was selected as a reference molecule. From a density functional theory (DFT) study, ten molecules were selected based on their highest HOMO (highest occupied molecular orbital) energies and lowest band-gap values. Molecular dynamics simulations with explicit solvation of the final ten hits revealed that a large number of non-covalent interactions were formed with the binding site of human glutaminyl cyclase. The hit compounds reported in this study could help the future design of potent lead inhibitors against human glutaminyl cyclase.
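The two validation statistics quoted for Hypo1, the correlation coefficient and the root mean square deviation between experimental and predicted activities, are straightforward to compute. A minimal sketch on invented test-set values (the pIC50 numbers below are hypothetical, not the paper's data):

```python
import numpy as np

def validation_metrics(experimental, predicted):
    """Pearson correlation and RMSD between experimental and
    model-predicted activities (e.g. pIC50 values)."""
    exp, pred = np.asarray(experimental), np.asarray(predicted)
    r = np.corrcoef(exp, pred)[0, 1]
    rmsd = np.sqrt(np.mean((exp - pred) ** 2))
    return r, rmsd

# Hypothetical test-set activities for illustration:
experimental = [5.2, 6.1, 7.4, 8.0, 6.7]
predicted    = [5.0, 6.3, 7.1, 8.4, 6.5]
r, rmsd = validation_metrics(experimental, predicted)
print(f"r = {r:.3f}, RMSD = {rmsd:.3f}")
```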

Keywords: glutaminyl cyclase, hit lead, pharmacophore model, simulation

Procedia PDF Downloads 128
1756 Fuzzy Adaptive Control of an Intelligent Hybrid HPS (PV-Wind-Battery), Grid Power System Applied to a Dwelling

Authors: A. Derrouazin, N. Mekkakia-M, R. Taleb, M. Helaimi, A. Benbouali

Abstract:

Nowadays, the use of different renewable energy sources for electricity production concerns everyone, whether for domestic use in isolated sites or in towns. As conventional energy sources shrink, a need has arisen to look for alternative sources, with more emphasis on their optimal use. This paper presents the design of a sustainable hybrid power system (PV-Wind-Storage), assisted by the grid as a supplementary source, applied to a case-study residential house to meet its entire energy demand. A fuzzy control system model has been developed to optimize and control the power flow from these sources. The energy requirement is mainly fulfilled by PV and wind energy stored in a battery module for the critical load of the house, supplemented by the grid for base and peak loads. The system has been designed for a maximum daily household load of 3 kWh and can be scaled to higher values as required by an individual or community house, from 3 kWh/day to 10 kWh/day. The simulation work, using intelligent energy management, has resulted in an optimal yield leading to an average reduction in the cost of electricity of 50% per day.
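The source-priority logic described above (renewables first, battery for the critical load, grid for base and peak) can be sketched as a single dispatch step. This is a crisp rule-based stand-in for illustration, not the paper's fuzzy controller, and all limits (state-of-charge bounds, power values) are assumptions:

```python
def dispatch(pv_kw, wind_kw, load_kw, soc, soc_min=0.2, soc_max=0.9):
    """One decision step of a simplified energy manager:
    renewables first, then battery, grid as last resort.
    Returns (battery_kw, grid_kw); battery_kw > 0 means charging,
    battery_kw < 0 means discharging."""
    balance = pv_kw + wind_kw - load_kw
    if balance >= 0:                   # surplus: charge if there is room
        battery = balance if soc < soc_max else 0.0
        return battery, 0.0            # surplus beyond soc_max is curtailed
    if soc > soc_min:                  # deficit: discharge the battery
        return balance, 0.0
    return 0.0, -balance               # battery depleted: import from grid

print(dispatch(pv_kw=2.0, wind_kw=1.0, load_kw=1.5, soc=0.5))   # (1.5, 0.0)
print(dispatch(pv_kw=0.0, wind_kw=0.5, load_kw=1.5, soc=0.15))  # (0.0, 1.0)
```

A fuzzy controller would replace the hard soc_min/soc_max cutoffs with graded membership functions, but the flow of power between sources follows the same priorities.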

Keywords: photovoltaic (PV), wind turbine, battery, microcontroller, fuzzy control (FC), Matlab

Procedia PDF Downloads 642
1755 Perceptions toward Adopting Virtual Reality as a Learning Aid in Information Technology

Authors: S. Alfalah, J. Falah, T. Alfalah, M. Elfalah, O. Falah

Abstract:

The field of education is an ever-evolving area constantly enriched by newly discovered techniques provided by active research in all areas of technologies. The recent years have witnessed the introduction of a number of promising technologies and applications to enhance the teaching and learning experience. Virtual Reality (VR) applications are considered one of the evolving methods that have contributed to enhancing education in many fields. VR creates an artificial environment, using computer hardware and software, which is similar to the real world. This simulation provides a solution to improve the delivery of materials, which facilitates the teaching process by providing a useful aid to instructors, and enhances the learning experience by providing a beneficial learning aid. In order to assure future utilization of such systems, students’ perceptions were examined toward utilizing VR as an educational tool in the Faculty of Information Technology (IT) in The University of Jordan. A questionnaire was administered to IT undergraduates investigating students’ opinions about the potential opportunities that VR technology could offer and its implications as learning and teaching aid. The results confirmed the end users’ willingness to adopt VR systems as a learning aid. The result of this research forms a solid base for investing in a VR system for IT education.

Keywords: information, technology, virtual reality, education

Procedia PDF Downloads 280
1754 Monitoring Land Cover/Land Use Change in Rupandehi District by Optimising Remotely Sensed Image

Authors: Hritik Bhattarai

Abstract:

Land use and land cover play a crucial role in preserving and managing Earth's natural resources. Various factors, such as economic, demographic, social, cultural, technological, and environmental processes, contribute to changes in land use and land cover (LULC). Rupandehi District is significantly influenced by a combination of driving forces, including its geographical location, rapid population growth, economic opportunities, globalization, tourism activities, and political events. Urbanization and urban growth in the region have been occurring in an unplanned manner, with internal migration and natural population growth being the primary contributors. Internal migration, particularly from neighboring districts in the higher and lower Himalayan regions, has been high, leading to increased population growth and density. This study utilizes geospatial technology, specifically geographic information system (GIS), to analyze and illustrate the land cover and land use changes in the Rupandehi district for the years 2009 and 2019, using freely available Landsat images. The identified land cover categories include built-up area, cropland, Das-Gaja, forest, grassland, other woodland, riverbed, and water. The statistical analysis of the data over the 10-year period (2009-2019) reveals significant percentage changes in LULC. Notably, Das-Gaja shows a minimal change of 99.9%, while water and forest exhibit increases of 34.5% and 98.6%, respectively. Riverbed and built-up areas experience changes of 95.3% and 39.6%, respectively. Cropland and grassland, however, show concerning decreases of 102.6% and 140.0%, respectively. Other woodland also indicates a change of 50.6%. The most noteworthy trends are the substantial increase in water areas and built-up areas, leading to the degradation of agricultural and open spaces. This emphasizes the urgent need for effective urban planning activities to ensure the development of a sustainable city. 
While Das-Gaja appears unaffected, the decreasing trends in cropland and grassland, accompanied by the increase in built-up areas, are cause for concern. It is imperative for the relevant authorities to be aware of these trends and implement proactive measures for sustainable urban development.
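The class-by-class percentage changes reported above follow from a simple comparison of class areas between the two survey years. A minimal sketch on hypothetical areas (the hectare figures below are invented for illustration and are not the study's Landsat-derived values):

```python
def percent_change(area_start, area_end):
    """Percentage change in a land-cover class area between two
    survey years, relative to the starting year."""
    return (area_end - area_start) / area_start * 100.0

# Hypothetical class areas in hectares, 2009 vs 2019:
areas_2009 = {"built-up": 8000, "cropland": 60000, "water": 3000}
areas_2019 = {"built-up": 11000, "cropland": 52000, "water": 3900}

for cls in areas_2009:
    change = percent_change(areas_2009[cls], areas_2019[cls])
    print(f"{cls}: {change:+.1f}%")
```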

Keywords: land use and land cover, geospatial, urbanization, geographic information system, sustainable urban development

Procedia PDF Downloads 50
1753 Ambiguity Resolution for Ground-based Pulse Doppler Radars Using Multiple Medium Pulse Repetition Frequency

Authors: Khue Nguyen Dinh, Loi Nguyen Van, Thanh Nguyen Nhu

Abstract:

In this paper, we propose an adaptive method to resolve ambiguities, together with a ghost-target removal process, to extract targets detected by a ground-based pulse-Doppler radar using medium pulse repetition frequency (PRF) waveforms. The ambiguity resolution method is an adaptive implementation of the coincidence algorithm, applied to a two-dimensional (2D) range-velocity matrix to resolve range and velocity ambiguities simultaneously, with a proposed clustering filter to enhance the error resilience of the system. We consider multiple-target environments. The ghost-target removal process, based on the power after Doppler processing, is proposed to mitigate ghost detections and enhance the performance of ground-based radars using a short PRF schedule in multiple-target environments. Simulation results on a ground-based pulse-Doppler radar model are presented to show the effectiveness of the proposed approach.
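The coincidence idea can be illustrated in one dimension: each PRF reports an apparent range folded into its unambiguous interval; unfolding each by integer multiples of that interval and looking for a value on which all PRFs agree recovers the true range. This is a toy range-only version of the 2D range-velocity matrix described above; the PRF values, tolerance, and maximum range are illustrative assumptions:

```python
C = 3e8  # speed of light, m/s

def unambiguous_range(prf_hz):
    """Maximum unambiguous range for a given PRF."""
    return C / (2.0 * prf_hz)

def resolve_range(apparent_m, prfs_hz, max_range_m=200e3, tol_m=100.0):
    """Coincidence method: unfold each PRF's apparent range by
    multiples of its unambiguous range and return the candidate
    on which all PRFs agree within tol_m (None if no coincidence)."""
    candidate_sets = []
    for app, prf in zip(apparent_m, prfs_hz):
        r_ua = unambiguous_range(prf)
        n = int(max_range_m // r_ua) + 1
        candidate_sets.append([app + k * r_ua for k in range(n)])
    for cand in candidate_sets[0]:
        if all(any(abs(cand - c) < tol_m for c in s)
               for s in candidate_sets[1:]):
            return cand
    return None

# True target at 45 km observed with two medium PRFs:
prfs = [3000.0, 4000.0]          # unambiguous ranges: 50 km and 37.5 km
true_r = 45e3
apparent = [true_r % unambiguous_range(p) for p in prfs]
print(resolve_range(apparent, prfs) / 1e3)   # 45.0 (km)
```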

Keywords: ambiguity resolution, coincidence algorithm, medium PRF, ghosting removal

Procedia PDF Downloads 141
1752 Enhancing Throughput for Wireless Multihop Networks

Authors: K. Kalaiarasan, B. Pandeeswari, A. Arockia John Francis

Abstract:

Wireless multi-hop networks consist of one or more intermediate nodes along the path that receive and forward packets via wireless links. The backpressure algorithm provides throughput-optimal routing and scheduling decisions for multi-hop networks with dynamic traffic. Xpress, a cross-layer backpressure architecture, was designed to reach the capacity of wireless multi-hop networks; it coordinates the network layers by turning a mesh network into a wireless switch, with transmissions scheduled by a throughput-optimal backpressure algorithm. However, this architecture operates well below capacity due to out-of-order packet delivery and variable packet sizes. In this paper, we present Xpress-T, a throughput-optimal backpressure architecture with TCP support designed to reach the maximum throughput of wireless multi-hop networks. Xpress-T operates at the IP layer, so any transport protocol, including TCP, can run on top of it. The proposed design not only avoids bottlenecks but also handles out-of-order packet delivery and variable packet sizes, and it optimally load-balances traffic across paths when needed, improving fairness among competing flows. Our simulation results show that Xpress-T delivers 65% more throughput than Xpress.
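The core of backpressure scheduling can be sketched in a few lines: each link's weight is its queue-backlog differential times its rate, and the scheduler activates the heaviest link. This is a single-commodity toy of the max-weight rule underlying both Xpress and Xpress-T; the topology and queue values are invented for illustration:

```python
def backpressure_schedule(queues, links):
    """One max-weight decision: for each link (src, dst, rate_mbps),
    weight = max(Q[src] - Q[dst], 0) * rate; activate the heaviest
    link.  Ties go to the first link of maximum weight."""
    def weight(link):
        src, dst, rate = link
        return max(queues[src] - queues[dst], 0) * rate

    best = max(links, key=weight)
    return best, weight(best)

# Hypothetical 4-node chain with per-node packet backlogs:
queues = {"A": 9, "B": 4, "C": 1, "D": 0}
links = [("A", "B", 1.0), ("B", "C", 2.0), ("C", "D", 1.5)]

link, w = backpressure_schedule(queues, links)
print(link, w)   # ('B', 'C', 2.0) 6.0 -- big differential * fast link wins
```

Over time, activating the heaviest link drains the largest backlog differentials first, which is what makes the policy throughput-optimal under dynamic traffic.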

Keywords: backpressure scheduling and routing, TCP, congestion control, wireless multihop network

Procedia PDF Downloads 512
1751 The Application of Video Segmentation Methods for the Purpose of Action Detection in Videos

Authors: Nassima Noufail, Sara Bouhali

Abstract:

In this work, we develop a semi-supervised solution for action detection in videos and propose an efficient algorithm for video segmentation. The approach is divided into video segmentation, feature extraction, and classification. In the first part, a video is segmented into clips using the K-means algorithm; our goal is to find groups of frames based on similarity within the video. Applying K-means clustering to all frames is time-consuming; therefore, we first identify transition frames, where the scene in the video changes significantly, and then apply K-means clustering to these transition frames. We use two image filters, the Gaussian filter and the Laplacian of Gaussian (LoG). Each filter extracts a set of features from the frames: the Gaussian filter blurs the image and suppresses the higher frequencies, while the Laplacian of Gaussian detects regions of rapid intensity change. We then use this vector of filter responses as input to the K-means algorithm, whose output is a set of cluster centers. Each video frame pixel is then mapped to the nearest cluster center and painted with the corresponding color to form a visual map in which similar pixels are grouped. We then compute a cluster score indicating how near the clusters are to each other and plot a signal of clustering score versus frame number. Our hypothesis is that the evolution of this signal does not change while semantically related events are happening in the scene. We mark breakpoints where the root-mean-square level of the signal changes significantly; each breakpoint indicates the beginning of a new video segment. In the second part, for each segment from the first part, we randomly select a 16-frame clip and extract spatiotemporal features using a pre-trained convolutional 3D network (C3D).
The final C3D output is a 512-dimensional feature vector; hence we use principal component analysis (PCA) for dimensionality reduction. The final part is classification: the C3D feature vectors are used as input to train a multi-class linear support vector machine (SVM), which detects the action. We evaluated our experiment on the UCF101 dataset, which consists of 101 human action categories, and achieved an accuracy that outperforms the state of the art by 1.2%.
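The clustering step at the heart of the segmentation stage can be sketched with a minimal K-means over per-frame filter-response vectors. The 2-D response vectors below are synthetic stand-ins for real Gaussian/LoG features, constructed as two well-separated "scenes"; the deterministic evenly-spaced initialization is an assumption for reproducibility, not the paper's choice:

```python
import numpy as np

def kmeans(X, k, iters=20):
    """Minimal K-means: deterministic init from evenly spaced
    samples, then alternate assignment and centroid update."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)]
    for _ in range(iters):
        # distance of every sample to every center -> nearest label
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)
        labels = np.argmin(d, axis=1)
        centers = np.stack([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
    return labels, centers

# Synthetic filter responses from two visually distinct scenes:
rng = np.random.default_rng(1)
scene_a = rng.normal(0.0, 0.1, size=(30, 2))
scene_b = rng.normal(5.0, 0.1, size=(30, 2))
labels, _ = kmeans(np.vstack([scene_a, scene_b]), k=2)
# Every frame of a scene lands in one cluster, the basis for the
# cluster-score signal whose RMS breakpoints mark new segments:
print(labels[0] != labels[30])   # True
```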

Keywords: video segmentation, action detection, classification, K-means, C3D

Procedia PDF Downloads 67
1750 Effect of Atmospheric Turbulence on Hybrid FSO/RF Link Availability under Qatar's Harsh Climate

Authors: Abir Touati, Syed Jawad Hussain, Farid Touati, Ammar Bouallegue

Abstract:

Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence on the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, either link, both links simultaneously, or neither can be active. Based on the soft switching approach and a finite-state Markov chain (FSMC) process, we model the channel fading for the two links and derive a mathematical expression for the outage probability of the hybrid system. We then evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. Results show that FSO/RF soft switching yields a system outage probability lower than that of either link individually. A soft switching algorithm is being implemented on FPGAs using Raptor codes interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. Experimental results are compared to the above simulation results.
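The FSMC-based outage model can be illustrated with a minimal sketch: each link is reduced to a hypothetical two-state (up/down) Markov chain, and under soft switching an outage occurs only when both links are down. The transition probabilities below are invented for illustration, not taken from the paper's channel measurements.

```python
import numpy as np

def steady_state(P):
    """Stationary distribution pi of a finite-state Markov chain (pi P = pi,
    rows of P sum to 1), solved as a least-squares system."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones((1, n))])
    b = np.append(np.zeros(n), 1.0)
    return np.linalg.lstsq(A, b, rcond=None)[0]

# hypothetical two-state (up=0 / down=1) fading chains for each link
P_fso = np.array([[0.9, 0.1],    # FSO degrades under haze/turbulence
                  [0.4, 0.6]])
P_rf = np.array([[0.95, 0.05],
                 [0.5, 0.5]])

pi_fso, pi_rf = steady_state(P_fso), steady_state(P_rf)
p_out_fso, p_out_rf = pi_fso[1], pi_rf[1]
# soft switching: outage only when BOTH links are down (chains assumed independent)
p_out_hybrid = p_out_fso * p_out_rf
```

Under these assumptions the hybrid outage probability is the product of the individual ones, which is always below either link alone, matching the qualitative result reported in the abstract.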

Keywords: atmospheric turbulence, haze, hybrid FSO/RF, outage probability, refractive index

Procedia PDF Downloads 410
1749 Automatic Adjustment of Thresholds via Closed-Loop Feedback Mechanism for Solder Paste Inspection

Authors: Chia-Chen Wei, Pack Hsieh, Jeffrey Chen

Abstract:

Surface Mount Technology (SMT) is widely used in electronic assembly, in which electronic components are mounted onto the surface of the printed circuit board (PCB). Most defects in the SMT process are related to the quality of solder paste printing, and these defects lead to considerable manufacturing costs in the electronics assembly industry. Therefore, the solder paste inspection (SPI) machine, which controls and monitors the amount of solder paste printed, has become an important part of the production process. So far, SPI thresholds have been set using statistical analysis and experts' experience. Because production data are not normally distributed and production processes exhibit various sources of variation, defects related to solder paste printing still occur. To solve this problem, this paper proposes an online machine learning algorithm, called the automatic threshold adjustment (ATA) algorithm, together with a closed-loop architecture in the SMT process to determine the best threshold settings. Simulation experiments show that the proposed threshold settings improve the accuracy from 99.85% to 100%.
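The abstract does not publish the ATA algorithm itself; the following is a minimal closed-loop sketch of the idea, nudging an SPI volume threshold whenever downstream feedback reveals a false alarm or an escaped defect. The step size, the "flag when volume is below threshold" rule, and the data encoding are assumptions.

```python
def adjust_threshold(threshold, inspections, step=0.5):
    """Closed-loop threshold update: after each inspection outcome is
    confirmed downstream (e.g. at post-reflow test), nudge the SPI
    paste-volume threshold toward fewer false alarms and fewer escapes.

    inspections: iterable of (measured_volume, truly_defective) pairs.
    """
    for measured_volume, truly_defective in inspections:
        flagged = measured_volume < threshold   # SPI flags low paste volume
        if flagged and not truly_defective:     # false alarm -> loosen
            threshold -= step
        elif not flagged and truly_defective:   # escaped defect -> tighten
            threshold += step
    return threshold

# a board flagged at volume 79 turned out fine downstream, so loosen slightly
print(adjust_threshold(80.0, [(79.0, False)]))  # → 79.5
```

A production version would use per-pad statistics and a learning rate schedule rather than a fixed step, but the feedback loop is the core of the closed-loop architecture.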

Keywords: big data analytics, Industry 4.0, SPI threshold setting, surface mount technology

Procedia PDF Downloads 108
1748 Numerical Simulation of Heating Characteristics in a Microwave T-Prong Antenna for Cancer Therapy

Authors: M. Chaichanyut, S. Tungjitkusolmun

Abstract:

This research presents microwave (MW) ablation using T-prong monopole antennas. Three-dimensional (3D) finite-element methods (FEM) were utilized to analyze the tissue heat flux, temperature distributions (heating pattern), and volume of destruction during MW ablation in liver cancer tissue. Three configurations of T-prong monopole antenna were considered: the Three T-prong antenna, the Expand T-prong antenna, and the Arrow T-prong antenna. The 3D FEM solutions were based on Maxwell's equations and the bio-heat equation. The microwave power delivered was 10 W, and the duration of ablation in all cases was 300 s. In our numerical results, the peak heat flux and the hotspot occurred at the tip of the T-prong antenna in all cases, and the temperature distribution pattern of all antennas was teardrop-shaped. The Arrow T-prong antenna induced the highest temperature within the cancer tissue. Ablation is considered successful in the region where temperatures exceed 50°C (i.e., complete destruction). The Expand T-prong antenna achieved the largest volume of complete destruction of liver cancer tissue (6.05 cm³). The ablation-pattern axial ratio (widest/length) of the Expand T-prong and Arrow T-prong antennas was 1, whereas that of the Three T-prong antenna was about 1.15.
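The "complete destruction" criterion, counting the tissue volume driven above 50°C, can be sketched on a toy temperature field. The Gaussian-shaped field below merely imitates a teardrop-like heating pattern near an antenna tip; it is not a solution of the Maxwell/bio-heat FEM model, and all numbers are illustrative.

```python
import numpy as np

def ablated_volume(T, voxel_cm3, t_crit=50.0):
    """Volume of tissue heated above the critical temperature,
    i.e. the region of complete destruction."""
    return float(np.count_nonzero(T > t_crit)) * voxel_cm3

# toy axial/radial grid (cm); hottest near a hypothetical tip at z = 1, r = 0
z, r = np.meshgrid(np.linspace(0, 4, 40), np.linspace(0, 4, 40), indexing="ij")
T = 37.0 + 80.0 * np.exp(-((z - 1.0) ** 2 + r ** 2))  # body temp + heating

vol = ablated_volume(T, voxel_cm3=0.01)  # assumed uniform voxel volume
```

In the actual study this thresholding would be applied to the FEM temperature solution at t = 300 s to obtain figures such as the reported 6.05 cm³.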

Keywords: liver cancer, T-Prong antenna, finite element, microwave ablation

Procedia PDF Downloads 313
1747 Detection of Latent Fingerprints Recovered from Arson Simulation by a Novel Fluorescent Method

Authors: Somayeh Khanjani, Samaneh Nabavi, Shirin Jalili, Afshin Khara

Abstract:

Fingerprints are a ubiquitous source of evidence and are consequential for establishing identity; their detection and subsequent development are thus indispensable in criminal investigations. This becomes a difficult task under certain extreme conditions, such as fire. A fire scene may be accidental or arson, and evidence subjected to fire is often overlooked because of the misconception that it has been destroyed. Several scientific approaches exist to determine whether a fire was deliberate, and in such a scenario, fingerprints may be the most critical evidence linking the perpetrator to the crime. Fingerprints subjected to fire are exposed to high temperatures, soot deposition, electromagnetic radiation, and the subsequent force of water, phenomena that are believed to damage the fingerprint. A novel fluorescent reagent and a pre-existing small particle reagent were investigated for this purpose. The zinc carbonate-based fluorescent small particle reagent (SPR) was capable of developing latent fingerprints exposed to a maximum temperature of 800°C and may prove very useful in such cases. This fluorescent SPR is thus a potential method for developing fingerprints recovered from arson sites; it is cost-effective, non-hazardous, and suitable for developing fingerprints exposed to fire or arson.

Keywords: fingerprint, small particle reagent (SPR), arson, novel fluorescent

Procedia PDF Downloads 465
1746 Fire Characteristics of Commercial Flame-Retardant Polycarbonate under Different Oxygen Concentrations: Ignition Time and Heat Blockage

Authors: Xuelin Zhang, Shouxiang Lu, Changhai Li

Abstract:

Commercial flame-retardant polycarbonate samples of different thicknesses, used as the main interior carriage material of high-speed trains, were investigated in a Fire Propagation Apparatus under different external heat fluxes and oxygen concentrations from 12% to 40%, to study their fire characteristics and quantitatively analyze ignition time, mass loss rate, and heat blockage. The additives in the commercial flame-retardant polycarbonate made it intumescent, and the samples maintained a steady height before ignition when heated. The results showed that the transformed ignition time (1/t_ig)ⁿ increased linearly with external heat flux under each oxygen concentration after deducting the heat blockage due to pyrolysis products. The mass loss rate also varied linearly with external heat flux, and the slope of the fitted line of mass loss rate versus external heat flux decreased as the oxygen concentration increased. The heat blockage, which was independent of external heat flux, rose with increasing oxygen concentration. The acquired data, used as input to fire simulation models, are essential for evaluating the fire risk of commercial flame-retardant polycarbonate.
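The linear relation between the transformed ignition time (1/t_ig)ⁿ and the external heat flux, after deducting the heat blockage, amounts to a straight-line fit. The sketch below uses invented values for n, the blockage, and the data; the paper's actual constants differ.

```python
import numpy as np

def fit_ignition(q_ext, t_ig, n=0.5, q_blockage=2.0):
    """Linear fit of the transformed ignition time (1/t_ig)^n against the
    effective heat flux after deducting the blockage from pyrolysis products.
    n and q_blockage are hypothetical values, not the paper's constants."""
    x = np.asarray(q_ext, dtype=float) - q_blockage
    y = (1.0 / np.asarray(t_ig, dtype=float)) ** n
    slope, intercept = np.polyfit(x, y, 1)
    return slope, intercept

# synthetic data constructed to follow the expected linear trend exactly
q = [20, 30, 40, 50]                                 # external flux, kW/m^2
t = [1.0 / (0.004 * (qi - 2.0)) ** 2 for qi in q]    # ignition times, s

slope, intercept = fit_ignition(q, t)  # slope ≈ 0.004, intercept ≈ 0
```

With real measurements, the recovered slope varies with oxygen concentration, which is what the abstract analyzes.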

Keywords: ignition time, mass loss rate, heat blockage, fire characteristic

Procedia PDF Downloads 277
1745 The Situation in Afghanistan as a Step Forward in Putting an End to Impunity

Authors: Jelena Radmanovic

Abstract:

On 5 March 2020, the International Criminal Court decided to authorize an investigation into the crimes allegedly committed on the territory of Afghanistan after 1 May 2003. This determination has raised several controversies, including the recently imposed sanctions by the United States, furthering the United States' long-standing rejection of the authority of the International Criminal Court. The purpose of this research is to address the investigation in light of its importance for the prevention of impunity in cases where the perpetrators are nationals of states that are not parties to the Rome Statute. The difficulties the International Criminal Court has faced in establishing its jurisdiction where an involved state is not a party to the Rome Statute have become the most significant stumbling block undermining the importance, integrity, and influence of the Court. The Situation in Afghanistan raises further concern, bearing in mind that the Prosecutor's request for authorization of an investigation pursuant to Article 15, dated 20 November 2017, was initially rejected with the 'interests of justice' as the applied rationale. The first method used in the present research is a description of the actual events surrounding the aforementioned decisions and the reactions they provoked in the international community; with the second method, conceptual analysis, the research addresses the decisions pertaining to the International Criminal Court's jurisdiction and treats the Decision of 5 March 2020 as an example of good practice and a precedent that should be followed in all similar situations. The research parses the reasoning used by the International Criminal Court, giving greater attention to the latter decision authorizing the investigation and to the points raised by officials of the United States.
A finding of this research is that the International Criminal Court, together with similar judicial bodies (the Nuremberg and Tokyo Tribunals, the International Criminal Tribunal for the former Yugoslavia, and the International Criminal Tribunal for Rwanda), has presented the world with the possibility of ending impunity, attempting to prosecute those responsible for the gravest crimes known to humanity and showing that such persons should not enjoy the benefits of their immunities, with its focus primarily on the victims of those crimes. While this issue will certainly be addressed further in future situations brought before the International Criminal Court, the present research attempts to highlight the significance of the Situation in Afghanistan for the International Criminal Court and for international criminal justice as a whole, for the purpose of putting an end to impunity.

Keywords: Afghanistan, impunity, international criminal court, sanctions, United States

Procedia PDF Downloads 115
1744 Diplomacy in Times of Disaster: Management through Reputational Capital

Authors: Liza Ireni-Saban

Abstract:

The 6.6 magnitude earthquake that struck Bam, Iran, in 2003 made it impossible for the Iranian government to handle disaster relief efforts domestically. In this extreme event, the Iranian government reached out to the international community, creating a momentum that had to be sustained by trust-building efforts on all sides, a dynamic often termed 'disaster diplomacy'. The circumstances were all the more critical given the increasing political and economic isolation of Iran within the international community. The potential for disaster to open transformative political space has been recognized by dominant international political actors. Although the post-disaster relief efforts after Bam 2003 did not catalyze diplomatic activity on any side, a few international aid agencies successfully used disaster recovery to enhance their legitimacy and reputation within the international community. In terms of disaster diplomacy, an actor's reputational capital may affect its ability to build coalitions and alliances to achieve international political ends, and to negotiate and build understanding and trust with foreign publics. This study suggests that the post-disaster setting may benefit from the ecology-of-games framework for evaluating the role of bridging actors and mediators in facilitating collaborative governance networks. Recent developments in network theory and analysis provide measures of structural embeddedness with which to explore how reputational capital can be built through the brokerage roles of actors engaged in a disaster management network. This paper therefore maps the relations among the actors that participated in the post-disaster relief efforts after the 2003 Bam earthquake in order to assess the conditions under which actors may be strategically positioned to serve as mediating organizations in future disaster events experienced by isolated nations or nations in conflict.
The results indicate the strategic use of reputational capital by the Iranian Ministry of Foreign Affairs, which acted as a key broker in building a successful coordination system for reducing disaster vulnerabilities. International aid agencies rarely played brokerage roles coordinating peripheral actors, and U.S. foreign assistance (USAID), despite its coordination capacities, was prevented from serving a brokerage role in the system.

Keywords: coordination, disaster diplomacy, international aid organizations, Iran

Procedia PDF Downloads 145
1743 Vertical Accuracy Evaluation of Indian National DEM (CartoDEM v3) Using Dual Frequency GNSS Derived Ground Control Points for Lower Tapi Basin, Western India

Authors: Jaypalsinh B. Parmar, Pintu Nakrani, Ashish Chaurasia

Abstract:

A Digital Elevation Model (DEM) is an important dataset in GIS-based terrain analysis for many applications, including environmental and climate change studies and hydrologic modelling. The vertical accuracy of a DEM, which is geographically variable in nature, depends on several parameters that affect model simulation outcomes. Vertical accuracy assessment in the Indian landscape, especially in low-lying coastal urban terrain such as the lower Tapi Basin, is very limited. In the present study, an attempt has been made to evaluate the vertical accuracy of the 30 m resolution, open-source Indian National Cartosat-1 DEM v3 for the Lower Tapi Basin (LTB) in western India. An extensive field investigation was carried out using a stratified random fast-static DGPS survey across the entire study region, yielding 117 high-accuracy ground control points (GCPs). The DEM was compared against these GCPs, various statistical attributes were computed, and histograms of the vertical errors were evaluated.
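Comparing DEM elevations against GNSS-derived GCPs typically reduces to simple error statistics such as mean error (bias), RMSE, and LE90. A minimal sketch with made-up elevations; the paper's actual statistics come from its 117 GCPs.

```python
import numpy as np

def vertical_accuracy(dem_z, gcp_z):
    """Standard vertical-accuracy statistics of DEM elevations against
    ground control points (all values in metres)."""
    err = np.asarray(dem_z, dtype=float) - np.asarray(gcp_z, dtype=float)
    return {
        "ME": float(err.mean()),                    # mean error (bias)
        "RMSE": float(np.sqrt(np.mean(err ** 2))),  # root mean square error
        "LE90": float(1.6449 * err.std(ddof=1)),    # 90% linear error, normal assumption
    }

# hypothetical DEM vs DGPS elevations at four check points
stats = vertical_accuracy([12.3, 8.1, 5.0, 9.6], [12.0, 8.5, 5.2, 9.0])
```

Plotting a histogram of `err` then gives the vertical-error histograms mentioned in the abstract.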

Keywords: CartoDEM, Digital Elevation Model, GPS, lower Tapi basin

Procedia PDF Downloads 352
1742 A Robust and Adaptive Unscented Kalman Filter for the Air Fine Alignment of the Strapdown Inertial Navigation System/GPS

Authors: Jian Shi, Baoguo Yu, Haonan Jia, Meng Liu, Ping Huang

Abstract:

To adapt to the flexibility of modern warfare, a large number of guided weapons are launched from aircraft, so the inertial navigation system carried by the weapon must undergo an alignment process in the air. This article addresses three problems: inaccurate system modeling under large misalignment angles, reduced filtering accuracy caused by outliers, and noise changes in GPS signals. First, considering the large misalignment errors of the Strapdown Inertial Navigation System (SINS)/GPS, a more accurate model is built rather than relying on a small-angle approximation, and the Unscented Kalman Filter (UKF) is used to estimate the state. Then, to account for the impact of GPS noise changes on the fine alignment algorithm, an innovation-based adaptive filtering algorithm is introduced to estimate the GPS noise in real time. At the same time, to improve the anti-interference capability of the air fine alignment algorithm, a robust filtering algorithm based on outlier detection is combined with it. The algorithm improves alignment accuracy and robustness under interference conditions, as verified by simulation.
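The innovation-based adaptive step can be sketched as follows: the GPS measurement-noise covariance R is re-estimated from the sample covariance of recent filter innovations, minus the predicted part H P Hᵀ. This is a textbook-style simplification (no forgetting factor, zero-mean innovations assumed), not the authors' algorithm, and the toy numbers are invented.

```python
import numpy as np

def adaptive_R(innovations, H, P):
    """Innovation-based adaptive estimate of the measurement-noise covariance:
    R ≈ sample covariance of recent innovations minus H P H^T, where P is the
    predicted state covariance and H the measurement matrix."""
    d = np.asarray(innovations, dtype=float)   # shape (window, m)
    C = (d.T @ d) / len(d)                     # sample innovation covariance
    return C - H @ P @ H.T

# toy 1-D channel: innovations of unit sample variance, predicted part 0.5
innov = np.array([[1.0], [-1.0], [1.0], [-1.0]])
H = np.array([[1.0]])
P = np.array([[0.5]])
R = adaptive_R(innov, H, P)  # sample cov 1.0 minus 0.5 gives R ≈ [[0.5]]
```

In the full filter this R estimate would be refreshed over a sliding window and fed back into the UKF update at each step, with the outlier-detection stage discarding innovations that fail a gating test before they enter the window.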

Keywords: air alignment, fine alignment, inertial navigation system, integrated navigation system, UKF

Procedia PDF Downloads 154
1741 A Machining Method of Cross-Shape Nano Channel and Experiments for Silicon Substrate

Authors: Zone-Ching Lin, Hao-Yuan Jheng, Zih-Wun Jhang

Abstract:

The paper innovatively proposes using the concept of specific down force energy (SDFE) and an AFM machine to establish a method for machining a cross-shape nanochannel on a single-crystal silicon substrate. The paper develops a method of machining the cross-shape nanochannel groove at a fixed down force using SDFE theory, combining it with a planned cutting path for the cross-shape nanochannel up to the fifth machining layer, and finally achieves a cross-shape nanochannel with a cutting depth of around 20 nm. Since standing burrs may form at the machined nanochannel edge, the paper uses a smaller down force to cut the edge of the cross-shape nanochannel in order to lower the burr height and keep it below 0.54 nm, the limit set in this paper. Finally, the paper conducts experiments machining cross-shape nanochannel grooves on single-crystal silicon with an AFM probe and compares the simulation and experimental results, proving that the proposed machining method for cross-shape nanochannels is feasible.

Keywords: atomic force microscopy (AFM), cross-shape nanochannel, silicon substrate, specific down force energy (SDFE)

Procedia PDF Downloads 361
1740 Delineation of Soil Physical Properties Using Electrical Conductivity, Case Study: Volcanic Soil Simulation Model

Authors: Twin Aji Kusumagiani, Eleonora Agustine, Dini Fitriani

Abstract:

Changes in the physical properties of soil in agricultural areas affect soil fertility. They can be caused by excessive use of inorganic fertilizers and imbalanced organic fertilization. Measurable soil physical parameters include electrical conductivity, volumetric water content, porosity, and dielectric permittivity. This study used electrical conductivity (EC) and volumetric water content as the measured physical parameters. The study was conducted on volcanic soil obtained from agricultural land, conditioned with measured amounts of NPK fertilizer and salt; the conditioned soil volume measured 1 x 1 x 0.5 meters. Using this method, we can delineate the soil electrical conductivity of the land as it changes with the addition of inorganic NPK fertilizer and with soil salinity. The zone with an additional 1 kg of salt measured 60 cm in width, 20 cm in depth, and 1 cm in thickness, while the zone with an additional 10 kg of NPK fertilizer measured 70 cm in width, 20 cm in depth, and 3 cm in thickness. The salt addition changed the EC values from their original condition. The EC changes tended to occur at depths of 20 to 40 cm, reaching 9.45 dS/cm on line 1B and 9.35 dS/cm on line 1C, and tended to trend toward the northeast.

Keywords: EC, electrical conductivity, VWC, volume water content, NPK fertilizer, salt, volcanic soil

Procedia PDF Downloads 304
1739 Comparative Study of Deep Reinforcement Learning Algorithm Against Evolutionary Algorithms for Finding the Optimal Values in a Simulated Environment Space

Authors: Akshay Paranjape, Nils Plettenberg, Robert Schmitt

Abstract:

Traditional optimization methods such as evolutionary algorithms are widely used in production processes to find an optimal or near-optimal setting of control parameters based on a simulated environment space of a process. These algorithms are computationally intensive and therefore do not allow real-time optimization. This paper utilizes the Deep Reinforcement Learning (DRL) framework to find an optimal or near-optimal setting of control parameters. A model based on maximum a posteriori policy optimization (Hybrid-MPO) that can handle both numerical and categorical parameters is used as a benchmark for comparison. A comparative study shows that DRL can find solutions of similar quality to those of evolutionary algorithms while requiring significantly less time, making it preferable for real-time optimization. The results are confirmed in a large-scale validation study on datasets from production and other fields, in which a trained XGBoost model serves as a surrogate for the process simulation. Finally, multiple ways to improve the model are discussed.
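As a point of reference for the evolutionary baseline such papers compare against, a (1+1) evolution strategy on a toy "simulated environment" looks like this. The objective function, step size, and iteration budget are illustrative only; the paper's benchmark is the Hybrid-MPO model, not this sketch.

```python
import random

def one_plus_one_es(f, x0, sigma=0.5, iters=200, seed=0):
    """(1+1) evolution strategy: mutate the current parameter vector with
    Gaussian noise and keep the candidate if it scores at least as well.
    This is the simple, computationally cheap end of the evolutionary family."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    for _ in range(iters):
        cand = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(cand)
        if fc >= fx:            # greedy selection
            x, fx = cand, fc
    return x, fx

# toy simulated environment: reward is maximal at parameters (1, -2)
f = lambda x: -((x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2)
x_best, f_best = one_plus_one_es(f, [0.0, 0.0])
```

The cost that motivates DRL is visible here: every candidate requires a fresh call to the (in practice expensive) simulation `f`, whereas a trained policy amortizes that cost at inference time.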

Keywords: reinforcement learning, evolutionary algorithms, production process optimization, real-time optimization, hybrid-MPO

Procedia PDF Downloads 104
1738 Phenomenological Ductile Fracture Criteria Applied to the Cutting Process

Authors: František Šebek, Petr Kubík, Jindřich Petruška, Jiří Hůlka

Abstract:

The present study addresses the cutting process of circular cross-section rods, in which fracture is used to separate one rod into two pieces. By incorporating a phenomenological ductile fracture model into the explicit formulation of the finite element method, the process can be analyzed without the need for numerous physical experiments, which can be expensive when testing repeatedly under different conditions. In the present paper, AISI 1045 steel was examined: tensile tests of smooth and notched cylindrical bars were conducted, together with biaxial testing of notched tube specimens, to calibrate the material constants of the selected phenomenological ductile fracture models. These were implemented into Abaqus/Explicit through the user subroutine VUMAT and used for cutting process simulation. Because the calibration is based on variables that cannot be obtained directly from experiments, numerical simulations of the fracture tests are an inevitable part of the calibration. Finally, experiments on the cutting process were carried out, and the predictive capability of the selected fracture models is discussed. Concluding remarks summarize the experience gained with both the calibration and the application of the particular ductile fracture criteria.

Keywords: ductile fracture, phenomenological criteria, cutting process, explicit formulation, AISI 1045 steel

Procedia PDF Downloads 443
1737 Sustainable Development: Evaluation of an Urban Neighborhood

Authors: Harith Mohammed Benbouali

Abstract:

The concept of sustainable development is becoming increasingly important in our society. The efforts of specialized agencies, cleverly portrayed in the media, have spread environmental awareness widely. Far from the backward-looking nostalgia of the old environmental movement, the environment is now combined with progress. Many sectors include these concerns in their work in order to reduce the negative impact of human activities on the environment, and the quantitative dimension of development has given way to a qualitative one. However, this shift is not universal, and the initial objective is often abandoned in favor of economic considerations. Specialists in building and construction have constantly sought to further integrate the environmental dimension, creating a high environmental quality label for buildings. The well-being of neighborhood residents and the quality of buildings are likewise pressing topics in urban planning, yet quality of life is often treated as secondary, since financial concerns dominate to the detriment of the environment and the welfare of occupants. This work develops an analytical method based on multiple objective-linked indicators at the neighborhood scale. Quantifying these indicators allows a construction professional, developer, or community to score and compare different development alternatives for a neighborhood. The quantification relies on simulation tools and a multi-criteria aggregation.
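The multi-criteria aggregation can be sketched as a weighted sum over normalized indicators, which is the simplest aggregation rule consistent with the abstract; the indicator names, scores, and weights below are hypothetical.

```python
def aggregate(scores, weights):
    """Weighted-sum aggregation of normalized neighborhood indicators (0-1),
    so alternative development scenarios can be compared on one scale."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[k] * scores[k] for k in weights)

# hypothetical indicator scores for two development alternatives
scenario_a = {"energy": 0.7, "green_space": 0.5, "mobility": 0.8}
scenario_b = {"energy": 0.6, "green_space": 0.9, "mobility": 0.4}
weights = {"energy": 0.5, "green_space": 0.3, "mobility": 0.2}

# scenario A edges out scenario B under these particular weights
print(aggregate(scenario_a, weights), aggregate(scenario_b, weights))
```

Richer multi-criteria methods (outranking, AHP-derived weights) would replace the weighted sum without changing the comparison workflow.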

Keywords: sustainable development, environment, district, indicators, multi-criteria analysis, evaluation

Procedia PDF Downloads 304
1736 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand

Authors: Jefferson Hernandez, Juan Padilla

Abstract:

Estimation of the price elasticity of demand is a valuable tool for price setting, and given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold at gas stations, has proven a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced to deal with errors such as measurement or missing-data errors. To model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t, and LKJ distributions are studied. The Bayesian paradigm, through Markov Chain Monte Carlo (MCMC) algorithms, is adopted for model estimation. Simulation studies covering a wide range of situations were performed to evaluate parameter recovery for the proposed models and algorithms; the results revealed that the proposed algorithms recovered all model parameters quite well. A real data set was also analyzed to illustrate the proposed approach.
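As a non-Bayesian point of reference, the elasticity for a single fuel can be read off as the slope of a log-log regression of volume on price; the paper's model replaces this with a full MCMC treatment including random effects and cross-fuel correlation structures. The data below are synthetic with a built-in elasticity of -1.3.

```python
import numpy as np

def elasticity_ols(price, volume):
    """Point estimate of price elasticity as the slope of log(volume) on
    log(price). This is the simple frequentist counterpart of the quantity
    the Bayesian model estimates with full uncertainty."""
    slope, _ = np.polyfit(np.log(price), np.log(volume), 1)
    return slope

# synthetic gas-station data generated with true elasticity -1.3
p = np.array([2.0, 2.2, 2.5, 2.8, 3.0])   # price per unit
v = 1000.0 * p ** -1.3                    # demanded volume

e_hat = elasticity_ols(p, v)  # recovers -1.3 on this noiseless data
```

With noisy data and multiple correlated fuels per station, this point estimate becomes unreliable, which is the motivation for the hierarchical Bayesian formulation.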

Keywords: price elasticity, volume, correlation structures, Bayesian models

Procedia PDF Downloads 153
1735 Configuring Resilience and Environmental Sustainability to Achieve Superior Performance under Differing Conditions of Transportation Disruptions

Authors: Henry Ataburo, Dominic Essuman, Emmanuel Kwabena Anin

Abstract:

Recent trends of catastrophic events, such as the Covid-19 pandemic, the Suez Canal blockage, the Russia-Ukraine conflict, the Israel-Hamas conflict, and the climate change crisis, continue to devastate supply chains and the broader society. Prior authors have advocated for a simultaneous pursuit of resilience and sustainability as crucial for navigating these challenges. Nevertheless, the relationship between resilience and sustainability is a rather complex one: resilience and sustainability are considered unrelated, substitutes, or complements. Scholars also suggest that different firms prioritize resilience and sustainability differently for varied strategic reasons. However, we know little about whether, how, and when these choices produce different typologies of firms to explain differences in financial and market performance outcomes. This research draws inferences from the systems configuration approach to organizational fit to contend that a taxonomy of firms may emerge based on how firms configure resilience and environmental sustainability. The study further examines the effects of these taxonomies on financial and market performance in differing transportation disruption conditions. Resilience is operationalized as a firm’s ability to adjust current operations, structure, knowledge, and resources in response to disruptions, whereas environmental sustainability is operationalized as the extent to which a firm deploys resources judiciously and keeps the ecological impact of its operations to the barest minimum. 
Using primary data from 199 firms in Ghana and cluster analysis as the analytical tool, the study identifies four clusters of firms based on how they prioritize resilience and sustainability: Cluster 1 - "strong, moderate resilience, high sustainability firms," Cluster 2 - "high resilience, high sustainability firms," Cluster 3 - "high resilience, strong, moderate sustainability firms," and Cluster 4 - "weak, moderate resilience, strong, moderate sustainability firms." In addition, ANOVA and regression analysis revealed the following findings: only clusters 1 and 2 were significantly associated with both market and financial performance. Under high transportation disruption conditions, cluster 1 firms excel in market performance, whereas cluster 2 firms excel in financial performance. Conversely, under low transportation disruption conditions, cluster 1 firms excel in financial performance, whereas cluster 2 firms excel in market performance. The study provides theoretical and empirical evidence of how resilience and environmental sustainability can be configured to achieve specific performance objectives under different disruption conditions.
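A drastically simplified stand-in for the cluster analysis is a median split on each dimension, yielding a four-group taxonomy of firms; the scores and labels below are invented and only illustrate the configurational logic, not the study's actual clustering.

```python
import numpy as np

def taxonomy(resilience, sustainability):
    """Median-split taxonomy: classify each firm as high or low on each
    dimension, producing four configurational groups (a simplified stand-in
    for a proper cluster analysis)."""
    r = np.asarray(resilience, dtype=float)
    s = np.asarray(sustainability, dtype=float)
    r_hi, s_hi = r >= np.median(r), s >= np.median(s)
    return np.where(r_hi & s_hi, "high-R/high-S",
           np.where(r_hi, "high-R/low-S",
           np.where(s_hi, "low-R/high-S", "low-R/low-S")))

# hypothetical survey scores for four firms
labels = taxonomy([1, 2, 8, 9], [9, 1, 8, 2])
```

Group means of performance outcomes could then be compared across these labels, mirroring the ANOVA step in the study.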

Keywords: resilience, environmental sustainability, developing economy, transportation disruption

Procedia PDF Downloads 60