Search results for: interference mitigation

1069 Multi-Objective Optimization of Intersections

Authors: Xiang Li, Jian-Qiao Sun

Abstract:

As crucial components of the city traffic network, intersections have a significant impact on urban traffic performance. Despite the rapid development of transportation systems, increasing traffic volumes result in severe congestion, especially at intersections in urban areas. Effective regulation of vehicle flows at intersections has always been an important issue in traffic control. This study presents a multi-objective optimization method for intersections, modeled with cellular automata, to achieve better traffic performance. Vehicle conflicts and pedestrian interference are considered. Three categories of traffic performance are studied: transportation efficiency, energy consumption, and road safety. The left-turn signal type, signal timing, and lane assignment are optimized for different traffic flows. The multi-objective optimization problem is solved with the cell mapping method. The optimization results show the conflicting nature of the different performance measures. The influence of different traffic variables on intersection performance is investigated. The proposed optimization method is shown to be effective in regulating traffic at the intersection to meet multiple objectives. Transportation efficiency can usually be improved by the permissive left-turn signal, at the expense of safety. Right-turn traffic suffers significantly when right-turn lanes are shared with through vehicles. The effect of vehicle flow on intersection performance is significant, and variations in traffic volume can remarkably change the pattern of the optimization results. Pedestrians interfere strongly with the traffic system.
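
The abstract does not reproduce the optimization code itself; as a rough illustration of how a Pareto (non-dominated) set of signal-timing candidates could be extracted once each candidate has been scored on the three objectives, here is a minimal Python sketch. The candidate data, and the assumption that all three objectives (average delay, energy consumption, conflict count) are to be minimized, are hypothetical and not taken from the paper.

```python
# Minimal sketch: extract the Pareto (non-dominated) set of signal-timing
# candidates, assuming every objective is to be minimized.
# Objective vector per candidate: (average delay [s], energy [kJ/veh], conflicts [1/h])

def dominates(a, b):
    """True if candidate a is at least as good as b in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of a {name: objectives} dict."""
    return {name: obj for name, obj in candidates.items()
            if not any(dominates(other, obj)
                       for o_name, other in candidates.items() if o_name != name)}

# Hypothetical candidates: (left-turn type, cycle split) -> objective vector
candidates = {
    "protected-60/40":  (38.0, 210.0, 2.0),
    "permissive-60/40": (31.0, 205.0, 6.0),
    "protected-50/50":  (42.0, 220.0, 1.5),
    "permissive-50/50": (33.0, 200.0, 7.0),
}

print(pareto_front(candidates))
```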

Keywords: cellular automata, intersection, multi-objective optimization, traffic system

Procedia PDF Downloads 547
1068 Micro-Rest: Extremely Short Breaks in Post-Learning Interference Support Memory Retention over the Long Term

Authors: R. Marhenke, M. Martini

Abstract:

The distraction of attentional resources after learning hinders long-term memory consolidation compared to several minutes of post-encoding inactivity in the form of wakeful resting. We tested whether an 8-minute period of wakeful resting, compared to performing an adapted version of the d2 test of attention after learning, supports memory retention. Participants encoded and immediately recalled a word list, followed by either an 8-minute period of wakeful resting (eyes closed, relaxed) or an adapted version of the d2 test of attention (scanning and selecting specific characters while ignoring others). At the end of the experimental session (after 12-24 min) and again after 7 days, participants completed a surprise free recall test of both word lists. Our results showed no significant difference in memory retention between the experimental conditions. However, we found that participants who completed the first lines of the d2 test in less than the given time limit of 20 seconds, and thus had short unfilled intervals before switching to the next test line, remembered more words over the 12-24 minute and the 7-day retention intervals than participants who did not complete the first lines. This interaction occurred only for the first test lines, which had the highest temporal proximity to the encoding task, and not for later test lines. Differences in retention scores between the groups (completed the first line vs. did not complete it) seem to be largely independent of general performance in the d2 test. Implications and limitations of these exploratory findings are discussed.

Keywords: long-term memory, retroactive interference, attention, forgetting

Procedia PDF Downloads 101
1067 Typhoon Disaster Risk Assessment of Mountain Village: A Case Study of Shanlin District in Kaohsiung

Authors: T. C. Hsu, H. L. Lin

Abstract:

Taiwan is a mountainous country; 70% of its land is covered with mountains. Because of the extreme climate, mountain villages with sensitive and fragile environments are easily affected by inundation and debris flows caused by typhoons, which bring huge rainfall. Due to inappropriate development, overuse, and few access roads, disasters occur more frequently during downpours, and rescue actions are delayed. However, risk maps are generally established along administrative boundaries, and the difference between urban and rural areas is ignored. Neglecting the characteristics of mountain villages eventually underestimates the importance of factors related to vulnerability and reduces the effectiveness of the maps. In disaster management, there are different strategies and actions at each stage. According to the tasks of each stage, different risk indices and weights are needed to analyze disaster risk, so that threats can be confronted and impacts reduced appropriately at the right time. A risk map is important not only in the mitigation stage but also in the response stage, because some factors, such as the road network, are changed by the disaster. This study uses risk assessment to establish risk maps for the mitigation and response stages of Shanlin District, a mountain village in Kaohsiung, as a case study, through the Analytic Hierarchy Process (AHP). AHP helps to recognize the composition and weights of risk factors in mountain villages from experts' opinions collected through a survey, and the result is combined with existing potential hazard maps to produce the risk maps.
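
For readers unfamiliar with how AHP turns expert pairwise comparisons into factor weights, the following sketch shows the standard principal-eigenvector calculation and consistency check. The example comparison matrix and the factor names are hypothetical and not the ones elicited in this study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Return (weights, consistency_ratio) for a reciprocal pairwise-comparison matrix."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)                 # principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                             # normalized priority weights
    ci = (eigvals[k].real - n) / (n - 1)        # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    cr = ci / ri if ri > 0 else 0.0             # consistency ratio (should be < 0.1)
    return w, cr

# Hypothetical 3-factor comparison (e.g., rainfall, road access, population density)
A = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
weights, cr = ahp_weights(A)
print("weights:", weights, " CR:", round(cr, 3))
```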

Keywords: risk assessment, mountain village, risk map, analytic hierarchy process

Procedia PDF Downloads 369
1066 Variable vs. Fixed Window Width Code Correlation Reference Waveform Receivers for Multipath Mitigation in Global Navigation Satellite Systems with Binary Offset Carrier and Multiplexed Binary Offset Carrier Signals

Authors: Fahad Alhussein, Huaping Liu

Abstract:

This paper compares the multipath mitigation performance of code correlation reference waveform receivers with variable and fixed window widths, for binary offset carrier and multiplexed binary offset carrier signals typically used in global navigation satellite systems. In the variable window width method, the width is iteratively reduced until the distortion on the discriminator caused by multipath is eliminated. This distortion is measured as the Euclidean distance between the actual discriminator (obtained with the incoming signal) and the local discriminator (generated with a local copy of the signal). The variable window width method has shown better performance than the fixed window width. In particular, the former yields zero error for all delays for the BOC and MBOC signals considered, while the latter gives rather large nonzero errors for small delays in all cases. Due to its computational simplicity, the variable window width method is well suited for implementation in low-cost receivers.
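
The paper does not give an algorithm listing; the sketch below only illustrates the iterative logic described in the abstract: shrink the correlation reference-waveform window until the Euclidean distance between the multipath-affected discriminator and the clean local discriminator falls below a tolerance. The discriminator model used here is a deliberately simplified stand-in, not a BOC/MBOC signal simulation.

```python
import numpy as np

def discriminator(delays, width, multipath_bias=0.0):
    """Toy discriminator curve: linear in the code delay, with a multipath-induced
    bias that shrinks as the gating window narrows. Placeholder model only."""
    return delays + multipath_bias * width

def tune_window_width(delays, w0=1.0, shrink=0.9, tol=0.15, w_min=0.05):
    """Iteratively reduce the window width until the Euclidean distance between
    the multipath-affected and the clean local discriminator is below tol."""
    w = w0
    while w > w_min:
        d_actual = discriminator(delays, w, multipath_bias=0.2)   # with multipath
        d_local = discriminator(delays, w, multipath_bias=0.0)    # local replica
        if np.linalg.norm(d_actual - d_local) < tol:
            return w
        w *= shrink
    return w

delays = np.linspace(-0.5, 0.5, 101)
print("selected window width:", round(tune_window_width(delays), 3))
```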

Keywords: correlation reference waveform receivers, binary offset carrier, multiplexed binary offset carrier, global navigation satellite systems

Procedia PDF Downloads 95
1065 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although the geotechnical parameters are known to show great dispersion, deterministic analyses treat them as fixed and known. Probabilistic methods, in turn, incorporate the variability of key input parameters (random variables), resulting in a range of safety factor values and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in geotechnical practice: FOSM (First-Order, Second-Moment), Rosenblueth (Point Estimates), and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (the FOSM method, Monte Carlo, and Rosenblueth) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and perform the consequent risk analysis, which is used to calculate the risk and to analyze mitigation and control solutions. The results obtained by the three probabilistic methods were quite close. It should be noted that calculating the risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to carry out a thorough assessment of the geological-geotechnical model, incorporating uncertainty into feasibility, design, construction, operation, and closure by means of risk management.
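
As a worked illustration of the Monte Carlo branch of such an analysis, the sketch below samples cohesion and friction angle for a simple infinite-slope factor-of-safety expression and estimates the probability of failure as the fraction of samples with FS < 1. The slope geometry and parameter statistics are hypothetical and are not those of the slope analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical infinite-slope model (dry slope):
# FS = (c + gamma*h*cos^2(beta)*tan(phi)) / (gamma*h*sin(beta)*cos(beta))
gamma = 18.0            # unit weight [kN/m^3]
h = 5.0                 # depth of the failure surface [m]
beta = np.radians(30)   # slope angle

n = 100_000
c = rng.normal(10.0, 2.5, n)                 # cohesion [kPa], random variable
phi = np.radians(rng.normal(28.0, 3.0, n))   # friction angle, random variable

fs = (c + gamma * h * np.cos(beta)**2 * np.tan(phi)) / (gamma * h * np.sin(beta) * np.cos(beta))

print("mean FS:", round(fs.mean(), 3))
print("probability of failure P(FS < 1):", np.mean(fs < 1.0))
```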

Keywords: probabilistic methods, risk assessment, risk management, slope stability

Procedia PDF Downloads 346
1064 Robust Processing of Antenna Array Signals under Local Scattering Environments

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

An adaptive array beamformer is designed to automatically preserve the desired signals while cancelling interference and noise. Providing robustness against model mismatches and tracking possible environment changes calls for robust adaptive beamforming techniques. The design criterion yields the well-known generalized sidelobe canceller (GSC) beamformer. In practice, the knowledge of the desired steering vector can be imprecise, which often occurs due to estimation errors in the DOA of the desired signal or imperfect array calibration. In these situations, the signal of interest (SOI) is treated as interference, and the performance of the GSC beamformer is known to degrade. This undesired behavior results in a reduction of the array output signal-to-interference-plus-noise ratio (SINR). Therefore, it is worth developing robust techniques to deal with the problems caused by local scattering environments. As to the implementation of adaptive beamforming, the required computational complexity is enormous when the array beamformer is equipped with massive antenna array sensors. To alleviate this difficulty, a GSC with partial adaptivity, offering fewer adaptive degrees of freedom and a faster adaptive response, has been proposed in the literature. Unfortunately, it has been shown that conventional GSC-based adaptive beamformers are usually very sensitive to the mismatch problems caused by local scattering. In this paper, we present an effective GSC-based beamformer against the mismatch problems mentioned above. The proposed GSC-based array beamformer adaptively estimates the actual direction of the desired signal by using the presumed steering vector and the received array data snapshots. We utilize the predefined steering vector and a presumed angle tolerance range to carry out the estimation of an appropriate steering vector. A matrix associated with the direction vector of the signal sources is first created. Then projection matrices related to this matrix are generated and utilized to iteratively estimate the actual direction vector of the desired signal. As a result, the quiescent weight vector and the signal blocking matrix required for performing adaptive beamforming can be easily found. By utilizing the proposed GSC-based beamformer, we find that the performance degradation due to the considered local scattering environments can be effectively mitigated. To further enhance the beamforming performance, a signal subspace projection matrix is also introduced into the proposed GSC-based beamformer. Several computer simulation examples show that the proposed GSC-based beamformer outperforms the existing robust techniques.
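
To make the GSC structure referred to above concrete, here is a minimal numpy sketch of a standard, fully adaptive generalized sidelobe canceller: a quiescent weight along the presumed steering vector, a blocking matrix orthogonal to it, and adaptive weights computed from the sample covariance of the snapshots. It is a textbook illustration under simplified assumptions (known steering vector, uniform linear array), not the mismatch-robust estimator proposed in the paper.

```python
import numpy as np

def gsc_weights(snapshots, steering):
    """snapshots: (N_sensors, N_snapshots) complex array data.
    steering: presumed steering vector of the desired signal (N_sensors,)."""
    a = np.asarray(steering, dtype=complex)
    w_q = a / (a.conj() @ a)                     # quiescent weights (distortionless)
    _, _, vh = np.linalg.svd(a.conj()[np.newaxis, :])
    B = vh[1:, :].conj().T                       # blocking matrix: a^H B = 0
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    w_a = np.linalg.solve(B.conj().T @ R @ B, B.conj().T @ R @ w_q)
    return w_q - B @ w_a                         # overall GSC weight vector

# Example: 8-element ULA, desired signal from 10 degrees plus white noise.
N, K = 8, 400
rng = np.random.default_rng(0)
a0 = np.exp(1j * np.pi * np.arange(N) * np.sin(np.radians(10)))
x = np.outer(a0, rng.standard_normal(K)) \
    + 0.1 * (rng.standard_normal((N, K)) + 1j * rng.standard_normal((N, K)))
w = gsc_weights(x, a0)
print(np.abs(w.conj() @ a0))   # ~1: the presumed look direction is preserved
```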

Keywords: adaptive antenna beamforming, local scattering, signal blocking, steering mismatch

Procedia PDF Downloads 81
1063 Resilience and Adaptation to Water Scarcity in San Martín de las Palmas, Santiago Tilantongo, Nochixtlán Oaxaca

Authors: E. Montesinos-Pedro, L. G. Toscano-Flores, N. Domínguez-Ramírez

Abstract:

Water scarcity is a worldwide issue which, coupled with climate change, is a relevant problem that affects not only large cities but also rural areas. The Municipality of Santiago Tilantongo belongs to the district of Nochixtlán, Oaxaca, and is made up of 14 communities, one of which is San Martín de las Palmas. The community was founded in 1900; at that time the inhabitants were supplied with water from the rivers of the region, which were abundant (they filled containers in the river for that purpose). However, over the years the level of the rivers began to drop, and in 1994 specific wells were located to store water and at the same time make it drinkable, with support from the state of Oaxaca and the Procampo program. By the year 2000, the shortage of water in the supply sources was notorious, and the community requested support from the Oaxaca state government to solve the problem. The government's response consisted of the implementation of ferro-cement tanks (2005) and water wells (2010), both for rainwater collection; however, it was not enough. Nowadays the community has a population of 60 inhabitants who have resisted and adapted to water scarcity, not only through the programs implemented by the government but also through important strategies implemented by themselves. The objective of this research is to identify the adaptation strategies used by the community, analyze them, and propose improvements for water conservation and mitigation of this scarcity.

Keywords: adaptation, climate change, mitigation, resilience

Procedia PDF Downloads 64
1062 Simulating the Effect of Chlorine on Dynamic of Main Aquatic Species in Urban Lake with a Mini System Dynamic Model

Authors: Zhiqiang Yan, Chen Fan, Beicheng Xia

Abstract:

Urban lakes play an invaluable role in urban water systems, providing flood control, landscape, entertainment, and energy utilization, and they have suffered from severe eutrophication over the past few years. To investigate the ecological response of the main aquatic species and system stability to chlorine interference in shallow urban lakes, a mini system dynamics model, based on the competition and predation of the main aquatic species and the TP circulation, was developed. The main species of submerged macrophyte, phytoplankton, zooplankton, and benthos, and TP in water and sediment, were simulated as variables in the model, with the effect of chlorine represented by an attenuation equation. The model was validated with data collected in the Lotus Lake in Guangzhou from October 1, 2015 to January 31, 2016. Furthermore, eco-exergy was used to analyze the change in complexity of the shallow urban lake. The results showed that the correlation coefficients between observed and simulated values of all components were significant. Chlorine showed a significant inhibitory effect on Microcystis aeruginosa, Brachionus plicatilis, Diaphanosoma brachyurum Liévin, and Mesocyclops leuckarti (Claus). The outbreak of Spirogyra spp. inhibited the growth of Vallisneria natans (Lour.) Hara and caused a gradual decrease of eco-exergy, reflecting the breakdown of the ecosystem's internal equilibria. It was concluded that the study gives important insight into using chlorine to achieve eutrophication control and into understanding the underlying mechanisms.
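
The governing equations are not reproduced in the abstract; the sketch below only illustrates the general shape of such a mini system-dynamics model: two coupled state variables (phytoplankton and zooplankton) with logistic growth, grazing, and an extra mortality term driven by an exponentially attenuating chlorine concentration. All coefficients and the two-species reduction are hypothetical, not the model calibrated against the Lotus Lake data.

```python
import numpy as np
from scipy.integrate import solve_ivp

def chlorine(t, c0=2.0, k=0.3):
    """Chlorine forcing: first-order attenuation after dosing at t = 0."""
    return c0 * np.exp(-k * t)

def model(t, y, r=0.6, K=10.0, g=0.08, e=0.3, m=0.1, tox_p=0.25, tox_z=0.15):
    P, Z = y                                   # phytoplankton, zooplankton biomass
    c = chlorine(t)
    dP = r * P * (1 - P / K) - g * P * Z - tox_p * c * P   # growth - grazing - chlorine kill
    dZ = e * g * P * Z - m * Z - tox_z * c * Z             # assimilation - mortality - chlorine kill
    return [dP, dZ]

sol = solve_ivp(model, (0, 60), [3.0, 0.5], dense_output=True)
print(sol.y[:, -1])   # biomasses after 60 days of simulated dynamics
```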

Keywords: system dynamic model, urban lake, chlorine, eco-exergy

Procedia PDF Downloads 180
1061 Urban Heat Island Effects on Human Health in Birmingham and Its Mitigation

Authors: N. A. Parvin, E. B. Ferranti, L. A. Chapman, C. A. Pfrang

Abstract:

This study investigates the effects of the Urban Heat Island (UHI) on public health in Birmingham. Birmingham is located at the center of the West Midlands, and its weather is highly variable due to geographical factors. Residential developments, road networks, and infrastructure often replace open spaces and vegetation. This transformation causes the temperature of urban areas to increase and creates an "island" of higher temperatures in the urban landscape. Extreme heat in urban areas is influencing public health in the UK as well as worldwide. Birmingham is a densely built-up area with skyscrapers and congested buildings in the city center, which act as a barrier to air circulation. We will investigate the city with regard to heat- and cold-related human mortality and other impacts. We are using primary and secondary datasets to examine the effect of population shift and land-use change on the UHI in Birmingham. We will also use freely available weather data from the Birmingham Urban Observatory and will incorporate satellite data to determine urban spatial expansion and its effect on the UHI. We have produced a temperature map based on the summer 2020 datasets, covering 25 weather stations in Birmingham, to show the differences between diurnal and nocturnal summer and annual temperature trends. Some impacts of the UHI may be beneficial, such as the lengthening of the plant growing season, but most of them are highly negative. We are examining the various effects of urban heat on human health and investigating mitigation options.

Keywords: urban heat, public health, climate change

Procedia PDF Downloads 69
1060 Workplace Risk Assessment in a Paint Factory

Authors: Rula D. Alshareef, Safa S. Alqathmi, Ghadah K. Alkhouldi, Reem O. Bagabas, Farheen B. Hasan

Abstract:

Safety engineering is among the most crucial considerations in any work environment. Providing mentally, physically, and environmentally safe work conditions must be the top priority of any successful organization. Company X is a local paint production company in Saudi Arabia; in one month, the factory experienced two significant accidents, which indicates that workers' safety is being overlooked. The aim of the research is to examine the risks, assess the root causes, and recommend control measures that will eventually contribute to providing a safe workplace. The methodology is sectioned into three phases: risk identification, assessment, and finally, mitigation. In the identification phase, the team used the Rapid Entire Body Assessment (REBA) and the National Institute for Occupational Safety and Health Lifting Index (NIOSH LI) tools to holistically establish knowledge about the current risks posed in the factory. The physical hazards in the factory were assessed in two different operations, mixing and filling/packaging. In the risk assessment phase, the hazards were analyzed in depth through their severity and impact. After risk mitigation, the Rapid Entire Body Assessment (REBA) score decreased from 11 to 7, and the National Institute for Occupational Safety and Health Lifting Index (NIOSH LI) was reduced from 5.27 to 1.85.
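
For context on how the reported NIOSH lifting index values are obtained, the following sketch evaluates the revised NIOSH lifting equation in its metric form: the recommended weight limit is the 23 kg load constant multiplied by six task multipliers, and the lifting index is the actual load divided by that limit. The task parameters below are illustrative assumptions, not the measured values from the factory.

```python
def recommended_weight_limit(H, V, D, A, FM, CM):
    """Revised NIOSH lifting equation (metric). H, V, D in cm, A in degrees.
    FM (frequency) and CM (coupling) multipliers are read from the NIOSH tables."""
    LC = 23.0                      # load constant [kg]
    HM = 25.0 / H                  # horizontal multiplier
    VM = 1 - 0.003 * abs(V - 75)   # vertical multiplier
    DM = 0.82 + 4.5 / D            # distance multiplier
    AM = 1 - 0.0032 * A            # asymmetry multiplier
    return LC * HM * VM * DM * AM * FM * CM

def lifting_index(load_kg, rwl_kg):
    return load_kg / rwl_kg        # LI > 1 indicates elevated lifting risk

# Illustrative lifting task (not the factory's actual measurements):
rwl = recommended_weight_limit(H=40, V=30, D=60, A=30, FM=0.85, CM=0.95)
print("RWL [kg]:", round(rwl, 2), " LI:", round(lifting_index(20, rwl), 2))
```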

Keywords: ergonomics, safety, workplace risks, hazards, awkward posture, fatigue, work environment

Procedia PDF Downloads 50
1059 Flood Disaster Prevention and Mitigation in Nigeria Using Geographic Information System

Authors: Dinebari Akpee, Friday Aabe Gaage, Florence Fred Nwaigwu

Abstract:

Natural disasters like floods affect many parts of the world, including developing countries like Nigeria. As a result, many human lives are lost, properties are damaged, and much money is lost through damage to infrastructure. These hazards and losses can be mitigated and reduced by providing the general public with reliable spatial information about flood risks through flood inundation maps. Flood inundation maps are crucial for emergency action plans, urban planning, ecological studies, and insurance rates. Nigeria experienced the worst flood in its entire history this year. Many cities were submerged and completely under water due to torrential rainfall. Poor city planning and a lack of effective development control, among other factors, also contribute to the problem. A geographic information system (GIS) can be used to visualize the extent of flooding and to analyze flood maps to produce flood damage estimation maps and flood risk maps. In this research, the following steps were taken in the preparation of flood risk maps for the study area: (1) digitization of topographic data and preparation of a digital elevation model using ArcGIS; (2) flood simulation using a hydraulic model; and (3) integration of the first two steps to produce flood risk maps. The results show that GIS can play a crucial role in flood disaster control and mitigation.

Keywords: flood disaster, risk maps, geographic information system, hazards

Procedia PDF Downloads 191
1058 Modeling of Thermo Acoustic Emission Memory Effect in Rocks of Varying Textures

Authors: Vladimir Vinnikov

Abstract:

The paper proposes a model of an inhomogeneous rock mass with initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of thermal field, the medium heated instantaneously to a predetermined temperature. Crack growth occurs according to the concept of fracture mechanics provided that the stress intensity factor K exceeds the critical value of Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied with a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect. The influence of humidity on the thermally induced acoustic emission memory effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about the thermal impacts on rocks in the past and determine the degree of rock disturbance by means of non-destructive testing.
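
A compact numerical illustration of the mechanism described above (one acoustic emission event per crack that becomes unstable, with the memory effect appearing because previously activated cracks stay silent until the prior maximum thermal load is exceeded) can be sketched as follows. The thermal-stress expression, the fracture-mechanics criterion K = sigma*Y*sqrt(pi*a) > Kc, and all material values are simplified assumptions for illustration, not the parameters of the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

E, alpha, Y = 60e9, 8e-6, 1.12            # Young's modulus [Pa], thermal expansion, geometry factor
Kc = 0.8e6                                 # fracture toughness [Pa*sqrt(m)] (assumed)
a = rng.uniform(5e-5, 5e-4, 5000)          # initial crack half-lengths on grain boundaries [m]

activated = np.zeros(a.size, dtype=bool)   # memory: cracks that have already emitted
for dT in [20, 40, 40, 60, 80]:            # cyclic heating, amplitude grows from cycle to cycle
    sigma = E * alpha * dT                 # constrained thermal stress
    K = sigma * Y * np.sqrt(np.pi * a)     # stress intensity factor per crack
    new_events = (K > Kc) & ~activated     # one AE event per newly unstable crack
    activated |= new_events
    # Note: the repeated 40 K cycle produces no events, reproducing the memory effect.
    print(f"dT = {dT:3d} K  ->  AE events this cycle: {new_events.sum()}")
```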

Keywords: crack growth, cyclic heating and cooling, rock texture, thermo acoustic emission memory effect

Procedia PDF Downloads 246
1057 Cost Benefit Analysis: Evaluation among the Millimetre Wavebands and SHF Bands of Small Cell 5G Networks

Authors: Emanuel Teixeira, Anderson Ramos, Marisa Lourenço, Fernando J. Velez, Jon M. Peha

Abstract:

This article discusses cost-benefit analysis aspects of the millimetre wavebands (mmWaves) and the Super High Frequency (SHF) band. The decay of the carrier-to-noise-plus-interference ratio with coverage distance is assessed by considering two different path loss models: the two-slope urban micro Line-of-Sight (UMiLoS) model for the SHF band and the modified Friis propagation model for frequencies above 24 GHz. The equivalent supported throughput is estimated at the 5.62, 28, 38, 60 and 73 GHz frequency bands, and the influence of the carrier-to-noise-plus-interference ratio on the radio and network optimization process is explored. Mostly owing to the attenuation behaviour of the two-slope propagation model for the SHF band, the supported throughput at this band is higher than at the millimetre wavebands only for the longest cell lengths. The cost-benefit analysis of these pico-cellular networks was carried out for regular cellular topologies, considering unlicensed spectrum. For the shortest distances, an optimum of the revenue in percentage terms can be distinguished at cell lengths of R ≈ 10 m for the millimetre wavebands, while for the longest distances an optimum of the revenue is observed at R ≈ 550 m for 5.62 GHz. For the 5.62 GHz band, the profit is slightly lower than for the millimetre wavebands at the shortest values of R, and it starts to increase for cell lengths approximately equal to the ratio between the break-point distance and the co-channel reuse factor, reaching a maximum for values of R approximately equal to 550 m.
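
To make the two propagation assumptions concrete, the sketch below evaluates free-space (Friis) path loss and a generic two-slope model with a break-point distance, of the kind used for UMi LoS below 6 GHz. The exponents and break-point distance used here are common textbook values assumed for illustration; they are not necessarily the exact parameterization used in the paper.

```python
import numpy as np

C = 3e8  # speed of light [m/s]

def friis_path_loss_db(d, f):
    """Free-space path loss [dB] at distance d [m] and carrier frequency f [Hz]."""
    return 20 * np.log10(4 * np.pi * d * f / C)

def two_slope_path_loss_db(d, f, d_break=160.0, n1=2.0, n2=3.5):
    """Generic two-slope model: exponent n1 up to the break-point, n2 beyond it."""
    pl_1m = friis_path_loss_db(1.0, f)
    d = np.asarray(d, dtype=float)
    before = pl_1m + 10 * n1 * np.log10(np.maximum(d, 1.0))
    after = pl_1m + 10 * n1 * np.log10(d_break) + 10 * n2 * np.log10(d / d_break)
    return np.where(d <= d_break, before, after)

for d in (10, 100, 550):
    print(d, "m :",
          round(float(two_slope_path_loss_db(d, 5.62e9)), 1), "dB @ 5.62 GHz (two-slope),",
          round(float(friis_path_loss_db(d, 28e9)), 1), "dB @ 28 GHz (Friis)")
```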

Keywords: millimetre wavebands, SHF band, SINR, cost benefit analysis, 5G

Procedia PDF Downloads 117
1056 Enhancing the Implementation Strategy of Simultaneous Operations (SIMOPS) for the Major Turnaround at Pertamina Plaju Refinery

Authors: Fahrur Rozi, Daniswara Krisna Prabatha, Latief Zulfikar Chusaini

Abstract:

Amidst the backdrop of Pertamina Plaju Refinery, which stands as the oldest and historically least technologically advanced of Pertamina's refineries, lies a unique challenge. Originally integrating facilities established by Shell in 1904 and Stanvac (originally Standard Oil) in 1926, Plaju Refinery faces a primary challenge that does not solely revolve around complexity; instead, it lies in ensuring reliability, considering an operational history of over a century. In more than a century of existence, Plaju Refinery has never undergone a comprehensive major turnaround encompassing all its units. The usual practice involves partial turnarounds that are sequentially conducted across its primary, secondary, and tertiary units (utilities and offsite). However, a significant shift is on the horizon. In Q-IV of 2023, the refinery embarks on its first-ever major turnaround since its establishment. This decision was driven by the alignment of maintenance timelines across various units. Plaju Refinery's major turnaround was scheduled for October-November 2023, spanning 45 calendar days, with the objective of enhancing the operational reliability of all refinery units. The extensive job list for this turnaround encompasses 1583 tasks across 18 units/areas, involving approximately 9000 contracted workers. In this context, the Simultaneous Operations (SIMOPS) execution strategy emerges as a pivotal tool to optimize time efficiency and ensure safety. A Hazard Effect Management Process (HEMP) has been employed to assess the risk ratings of each task within the turnaround. Out of the tasks assessed, 22 are deemed high-risk and necessitate mitigation. The SIMOPS approach serves as a preventive measure against potential incidents. It is noteworthy that every turnaround period at Pertamina Plaju Refinery involves SIMOPS-related tasks. In this context, enhancing the implementation strategy of Simultaneous Operations (SIMOPS) becomes imperative to minimize the occurrence of incidents. At least four improvements have been introduced in the enhancement process for the major turnaround at Plaju Refinery. The first improvement involves conducting systematic risk assessment and potential hazard mitigation studies for SIMOPS tasks before task execution, as opposed to the previous on-site approach. The second improvement includes the completion of SIMOPS Job Mitigation and Work Matrices Sheets, which was often neglected in the past. The third improvement emphasizes building comprehensive awareness among workers/contractors regarding potential hazards and mitigation strategies for SIMOPS tasks before and during the major turnaround. The final improvement is the introduction of a daily program for inspecting and observing work in progress for SIMOPS tasks. Prior to these improvements, there was no established program for monitoring ongoing activities related to SIMOPS tasks during the turnaround. This study elucidates the steps taken to enhance SIMOPS within Pertamina, drawing from the experience of Plaju Refinery as a guide. An actual case study from our experience in the operational unit is provided. In conclusion, these efforts are essential for the success of the first-ever major turnaround at Plaju Refinery, with the SIMOPS strategy serving as a central component. Based on these experiences, enhancements have been made to Pertamina's official Internal Guidelines for Executing SIMOPS Risk Mitigation, benefiting all Pertamina units.

Keywords: process safety management, turn around, oil refinery, risk assessment

Procedia PDF Downloads 36
1055 The Three-Zone Composite Productivity Model of Multi-Fractured Horizontal Wells under Different Diffusion Coefficients in a Shale Gas Reservoir

Authors: Weiyao Zhu, Qian Qi, Ming Yue, Dongxu Ma

Abstract:

Due to the nano-micro pore structures and the massive multi-stage, multi-cluster hydraulic fracturing in shale gas reservoirs, the multi-scale seepage flows are much more complicated than in most other conventional reservoirs, and they are crucial for the economic development of shale gas. In this study, a new multi-scale non-linear flow model was established and simplified, based on different diffusion and slip correction coefficients. Because different flow laws exist in the fracture network and the matrix zone, a three-zone composite model was proposed. Then, according to the conformal transformation combined with the law of equivalent percolation resistance, the productivity equation of a horizontal fractured well was built, with consideration given to diffusion, slip, desorption, and absorption. An analytic solution was derived, and the interference of the multi-cluster fractures was analyzed. The results indicated that the diffusion of the shale gas occurs mainly in the transition and Fick diffusion regions. The matrix permeability was found to be influenced by slippage and diffusion, which are determined by the pore pressure and pore diameter according to the Knudsen number. With increased half-lengths of the fracture clusters, flow conductivity of the fractures, and permeability of the fracture network, the productivity of the fractured well also increased. Meanwhile, as the number of fractures increased, the distance between the fractures decreased, and the productivity increased only slowly due to the mutual interference of the fractures. For the fractured horizontal wells, the free gas was found to contribute the major part of the productivity, while the contribution of desorption increased with increased pressure differences.
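
Since the abstract states that the matrix flow regime (slip, transition, Fick diffusion) is determined by pore pressure and pore diameter through the Knudsen number, a small worked example of that classification may help. The mean-free-path expression and the regime boundaries are the standard kinetic-theory values; the methane molecular diameter and the pore conditions below are assumptions for illustration only.

```python
import numpy as np

K_B = 1.380649e-23      # Boltzmann constant [J/K]
D_CH4 = 0.38e-9         # effective molecular diameter of methane [m] (assumed)

def knudsen_number(pore_diameter, pressure, temperature=350.0):
    """Kn = mean free path / pore diameter, using the hard-sphere mean free path."""
    mean_free_path = K_B * temperature / (np.sqrt(2) * np.pi * D_CH4**2 * pressure)
    return mean_free_path / pore_diameter

def flow_regime(kn):
    if kn < 1e-3:
        return "continuum (Darcy)"
    if kn < 0.1:
        return "slip flow"
    if kn < 10:
        return "transition flow"
    return "free-molecular (Knudsen) diffusion"

# Nano-pores at two reservoir pressures (illustrative values):
for d_pore, p in [(5e-9, 30e6), (5e-9, 5e6), (50e-9, 30e6)]:
    kn = knudsen_number(d_pore, p)
    print(f"d = {d_pore*1e9:.0f} nm, p = {p/1e6:.0f} MPa -> Kn = {kn:.3f}, {flow_regime(kn)}")
```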

Keywords: multi-scale, fracture network, composite model, productivity

Procedia PDF Downloads 248
1054 Enhanced Acquisition Time of a Quantum Holography Scheme within a Nonlinear Interferometer

Authors: Sergio Tovar-Pérez, Sebastian Töpfer, Markus Gräfe

Abstract:

This work proposes a technique that decreases the detection acquisition time of quantum holography schemes to one-third, which opens up the possibility of imaging moving objects. Since its invention, quantum holography with undetected photons has gained interest in the scientific community, mainly due to its ability to tailor the detected wavelengths according to the needs of the scheme implementation. While this wavelength flexibility grants the scheme a wide range of possible applications, an important matter had yet to be addressed. Since the scheme uses digital phase-shifting techniques to retrieve the information of the object out of the interference pattern, it is necessary to acquire a set of at least four images of the interference pattern with well-defined phase steps to recover the full object information. Hence, the imaging method requires longer acquisition times to produce well-resolved images, and as a consequence, the measurement of moving objects remains out of reach of the imaging scheme. This work presents the use and implementation of a spatial light modulator along with a digital holographic technique called quasi-parallel phase shifting. This technique uses the spatial light modulator to build a structured phase image consisting of a chessboard pattern containing the different phase steps needed to digitally calculate the object information. Depending on the reduction in the number of needed frames, the acquisition time is reduced by a significant factor. This technique opens the door to the implementation of the scheme for moving objects; in particular, the application of the scheme to imaging live specimens comes one step closer.
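
As a minimal illustration of the quasi-parallel idea, the sketch below multiplexes the four steps of standard four-step phase shifting into a repeating 2x2 chessboard pattern within a single frame, extracts the four sub-images by pixel decimation, and recovers the phase with the usual arctangent formula. It is a synthetic toy (ideal fringes, no registration between neighbouring pixels), not the authors' optical implementation.

```python
import numpy as np

# Synthetic object phase and a single frame whose pixels carry one of four
# phase steps (0, pi/2, pi, 3*pi/2) arranged in a repeating 2x2 chessboard.
ny, nx = 256, 256
yy, xx = np.mgrid[0:ny, 0:nx]
phi_true = 2 * np.pi * (xx + yy) / 200.0                 # hypothetical object phase

steps = np.array([[0.0, np.pi / 2], [np.pi, 3 * np.pi / 2]])
step_map = steps[yy % 2, xx % 2]
frame = 0.5 + 0.5 * np.cos(phi_true + step_map)          # recorded intensity

# Quasi-parallel extraction: each phase step lives on its own pixel sub-lattice.
I0 = frame[0::2, 0::2]      # step 0
I1 = frame[0::2, 1::2]      # step pi/2
I2 = frame[1::2, 0::2]      # step pi
I3 = frame[1::2, 1::2]      # step 3*pi/2

phi_rec = np.arctan2(I3 - I1, I0 - I2)                    # four-step phase formula
err = np.angle(np.exp(1j * (phi_rec - phi_true[0::2, 0::2])))
print("max phase error [rad]:", np.abs(err).max())
```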

Keywords: quasi-parallel phase shifting, quantum imaging, quantum holography, quantum metrology

Procedia PDF Downloads 82
1053 Reliability of Dissimilar Metal Soldered Joint in Fabrication of Electromagnetic Interference Shielded Door Frame

Authors: Rehan Waheed, Hasan Aftab Saeed, Wasim Tarar, Khalid Mahmood, Sajid Ullah Butt

Abstract:

Electromagnetic Interference (EMI) shielded doors made from brass extruded channels need to be welded to shielded enclosures to attain optimum shielding performance. Control of welding-induced distortion is a problem when welding dissimilar metals like steel and brass. In this research, soldering of the steel-brass joint has been proposed to avoid weld distortion. The material used for the brass channel is UNS C36000. The thickness of the brass is defined by the manufacturing process, i.e., extrusion. The thickness of the shielded enclosure material (ASTM A36) can be varied to produce the joint between the dissimilar metals. Steel sections of different gauges are soldered to the brass using a (91% tin, 9% zinc) solder, and the strength of the joint is measured by standard test procedures. It is observed that thin steel sheets produce a stronger bond with the brass. The steel sections then need to be welded to the shielded enclosure steel sheets through the TIG welding process. Stresses and deformation in the vicinity of the soldered portion are calculated through FE simulation. Crack formation in the soldered area is also studied experimentally. It has been found that in thin sheets the deformation produced by the applied force is localized and has no effect on the soldered joint area, whereas in thick sheets pronounced cracks have been observed in the soldered joint. The shielding effectiveness of the EMI shielded door is compromised by these cracks. The shielding effectiveness of the specimens is tested and the results are compared.

Keywords: dissimilar metal, EMI shielding, joint strength, soldering

Procedia PDF Downloads 139
1052 Experimental and Computational Investigations on the Mitigation of Air Pollutants Using Pulsed Radio Waves

Authors: Gangadhara Siva Naga Venkata Krishna Satya Narayana Swamy Undi

Abstract:

Particulate matter (PM) pollution in ambient air is a major environmental health risk factor contributing to disease and mortality worldwide. Current air pollution control methods have limitations in reducing real-world ambient PM levels. This study demonstrates the efficacy of using pulsed radio wave technology as a distinct approach to lower outdoor particulate pollution. Experimental data were compared with computational models to evaluate the efficiency of pulsed waves in coagulating and settling PM. Results showed 50%+ reductions in PM2.5 and PM10 concentrations at the city scale, with particle removal rates exceeding gravity settling by over 3X. Historical air quality data further validated the significant PM reductions achieved in test cases. Computational analyses revealed the underlying coagulation mechanisms induced by the pulsed waves, supporting the feasibility of this strategy for ambient particulate control. The pulsed electromagnetic technology displayed robustness in sustainably managing PM levels across diverse urban and industrial environments. Findings highlight the promise of this advanced approach as a next-generation solution to mitigate particulate air pollution and associated health burdens globally. The technology's scalability and energy efficiency can help address a key gap in current efforts to improve ambient air quality.

Keywords: particulate matter, mitigation technologies, clean air, ambient air pollution

Procedia PDF Downloads 20
1051 Numerical Investigation of Tsunami Flow Characteristics and Energy Reduction through Flexible Vegetation

Authors: Abhishek Mukherjee, Juan C. Cajas, Jenny Suckale, Guillaume Houzeaux, Oriol Lehmkuhl, Simone Marras

Abstract:

The investigation of tsunami flow characteristics and the quantification of tsunami energy reduction through coastal vegetation are important for understanding the protective benefits of nature-based mitigation parks. In the present study, a three-dimensional non-hydrostatic incompressible Computational Fluid Dynamics model with a two-way coupled fluid-structure interaction (FSI) approach is used. After validating the numerical model against experimental data, tsunami flow characteristics are investigated by varying vegetation density, modulus of elasticity, the gap between stems, and the arrangement or distribution of vegetation patches. Streamwise depth-averaged velocity profiles, turbulent kinetic energy, and energy flux reflection and dissipation extracted from the numerical simulations are presented. These diagnostics are essential for assessing the importance of different parameters in the design of proper coastal defense systems. When a tsunami wave reaches the shore, it transforms into undular bores, which induce scour around offshore structures and sediment transport. The bed shear stress, instantaneous turbulent kinetic energy, and near-bed vorticity are also presented to estimate the importance of vegetation in preventing tsunami-induced scour and sediment transport.

Keywords: coastal defense, energy flux, fluid-structure interaction, natural hazards, sediment transport, tsunami mitigation

Procedia PDF Downloads 117
1050 Reduction of Specific Energy Consumption in Microfiltration of Bacillus velezensis Broth by Air Sparging and Turbulence Promoter

Authors: Jovana Grahovac, Ivana Pajcin, Natasa Lukic, Jelena Dodic, Aleksandar Jokic

Abstract:

To obtain purified biomass to be used in plant pathogen biocontrol or as a soil biofertilizer, it is necessary to eliminate residual broth components at the end of the fermentation process. The main drawback of membrane separation techniques is permeate flux decline due to membrane fouling. Fouling mitigation measures increase the pressure drop along the membrane channel due to the increased resistance to flow of the feed suspension, thus increasing the hydraulic power requirement. At the same time, these measures lead to an increase in the permeate flux due to the reduced resistance of the filtration cake on the membrane surface. Because of these opposing effects, the energy efficiency of fouling mitigation measures is limited, and the justification for their application is provided by information on the reduction of specific energy consumption compared to a case without any measures employed. In this study, the influence of a static mixer (Kenics) and air sparging (two-phase flow) on the reduction of specific energy consumption (ER) was investigated. Cultivation of Bacillus velezensis was carried out in a 3-L bioreactor (Biostat® Aplus) with a 2 L working volume, two parallel Rushton turbines, and no internal baffles. Cultivation was carried out at 28 °C and 150 rpm with an aeration rate of 0.75 vvm for 96 h. The experiments were carried out in a conventional cross-flow microfiltration unit. During the experiments, permeate and retentate were recycled back to the broth vessel to simulate a continuous process. The single-channel ceramic membrane (TAMI Deutschland) used had a nominal pore size of 200 nm, a length of 250 mm, and an inner/external diameter of 6/10 mm. The useful membrane channel surface was 4.33×10⁻³ m². Air sparging was provided by pressurized air connected to the feed tube by a three-way valve through a simple T-connector without a diffusor. The different approaches to flux improvement are compared in terms of energy consumption. The reduction of specific energy consumption compared to microfiltration without fouling mitigation is around 49% and 63% for the two-phase flow and the static mixer, respectively. In the case of a combination of these two fouling mitigation methods, ER is 60%, i.e., slightly lower than with the turbulence promoter alone. The reason for this result is that the flux increase is mostly due to the presence of the Kenics static mixer, while sparging increases the energy used during microfiltration. Comparing the combined method with the turbulence-promoter method alone, ER is negative (-7%), which can be explained by the increased power consumption for the air flow with only a moderate contribution to the flux increase. Another confirmation of this fact can be found by comparing the energy consumption of the combined method with the energy consumption in the case of two-phase flow alone; in this instance, the energy reduction (ER) is 22%, which demonstrates that the turbulence promoter is more efficient than two-phase flow. The antimicrobial activity of the Bacillus velezensis biomass against the phytopathogenic isolate Xanthomonas campestris was preserved under the different fouling reduction methods.
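
For readers who want to see how energy-reduction percentages of this kind are typically computed, here is a small sketch: the specific energy consumption combines the hydraulic pumping power (pressure drop times feed flow) and, when sparging is used, the air supply power, divided by the permeate volume collected; ER then compares each fouling-mitigation case with the unmitigated baseline. All numbers below are placeholders, not the measured values of this study.

```python
def specific_energy(dp_feed_pa, feed_flow_m3s, air_power_w, duration_s, permeate_volume_m3):
    """Specific energy consumption [J per m^3 of permeate]."""
    hydraulic_power = dp_feed_pa * feed_flow_m3s          # pump power for the feed loop [W]
    return (hydraulic_power + air_power_w) * duration_s / permeate_volume_m3

def energy_reduction(e_case, e_baseline):
    """ER relative to microfiltration without any fouling-mitigation measure."""
    return 1.0 - e_case / e_baseline

# Hypothetical cases: baseline, two-phase flow (extra air power, more permeate),
# static mixer (higher pressure drop, most permeate).
e_base = specific_energy(0.8e5, 1.0e-4, 0.0, 3600, 0.010)
e_air = specific_energy(0.9e5, 1.0e-4, 5.0, 3600, 0.030)
e_mix = specific_energy(1.2e5, 1.0e-4, 0.0, 3600, 0.040)
print("ER two-phase flow:", round(energy_reduction(e_air, e_base), 2))
print("ER static mixer:  ", round(energy_reduction(e_mix, e_base), 2))
```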

Keywords: Bacillus velezensis, microfiltration, static mixer, two-phase flow

Procedia PDF Downloads 92
1049 Projections of Climate Change in the Rain Regime of the Ibicui River Basin

Authors: Claudineia Brazil, Elison Eduardo Bierhals, Francisco Pereira, José Leandro Néris, Matheus Rippel, Luciane Salvi

Abstract:

Global concern about climate change has been increasing, since the emission of gases from human activities contributes to the greenhouse effect in the atmosphere, indicating significant impacts on the planet in the coming years. The study of the precipitation regime is fundamental for the development of research in several areas, among them hydrology, agriculture, and the electricity sector. Using the climate projections of the models belonging to CMIP5, the main objective of this paper is to present an analysis of the impacts of climate change on rainfall in the Uruguay River basin. The analysis of the results shows that, for the future climate, there is a tendency, relative to the present climate, towards a larger number of dry events, mainly in the winter months, changing the pluviometric regime towards wetter summers and drier winters. Given this projected scenario, it is important to note the importance of adequate management of the existing water sources in the river basin; since rainfall is projected to decrease in the coming years, the dynamics of the ecosystems in the region may be compromised. Facing climate change is a fundamental issue for regions and cities all around the world. Society must improve its resilience to the impacts of the phenomenon, and spreading knowledge among decision makers and citizens is also essential. These research results can therefore support decision-making in the planning and management of mitigation and/or adaptation measures in southern Brazil.

Keywords: climate change, hydrological potential, precipitation, mitigation

Procedia PDF Downloads 317
1048 How to Reach Net Zero Emissions? On the Permissibility of Negative Emission Technologies and the Danger of Moral Hazards

Authors: Hanna Schübel, Ivo Wallimann-Helmer

Abstract:

In order to reach the goal of the Paris Agreement of not overshooting 1.5°C of warming above pre-industrial levels, various countries, including the UK and Switzerland, have committed themselves to net zero emissions by 2050. The employment of negative emission technologies (NETs) is very likely going to be necessary for meeting these national objectives as well as other internationally agreed climate targets. NETs are methods of removing carbon from the atmosphere and are thus a means of addressing climate change. They range from afforestation to technological measures such as direct air capture and carbon storage (DACCS), where CO2 is captured from the air and stored underground. Like all so-called geoengineering technologies, the development and deployment of NETs are often subject to moral hazard arguments. Because these technologies could be perceived as an alternative to mitigation efforts, so the argument goes, they are potentially a dangerous distraction from the main target of mitigating emissions. We think that this is a dangerous argument to make, as it may hinder the development of NETs, which are an essential element of net zero emission targets. In this paper we argue that the moral hazard argument is only problematic if we do not reflect upon which levels of emissions are at stake in meeting net zero emissions. In response to the moral hazard argument, we develop an account of which levels of emissions in given societies should be mitigated and not be the target of NETs, and which levels of emissions can legitimately be a target of NETs. For this purpose, we define four different levels of emissions: the current level of individual emissions, the level individuals emit in order to appear in public without shame, the level of a fair share of individual emissions in the global budget, and finally the baseline of net zero emissions. At each level of emissions there are different subjects to be assigned responsibilities if societies and/or individuals are committed to the target of net zero emissions. We argue that emissions within one's fair share do not demand individual mitigation efforts. The same holds with regard to individuals and the baseline level of emissions necessary to appear in public in their societies without shame. Individuals are only under a duty to reduce their emissions if they exceed this baseline level. This is different for whole societies. Societies demanding more emissions to appear in public without shame than the individual fair share are under a duty to foster emission reductions and may not legitimately achieve these reductions by introducing NETs. NETs are legitimate for reducing emissions only below the level of fair shares and for reaching net zero emissions. Since access to NETs for achieving net zero emissions demands technology not affordable to individuals, there are also no full individual responsibilities to achieve net zero emissions. This is mainly a responsibility of societies as a whole.

Keywords: climate change, mitigation, moral hazard, negative emission technologies, responsibility

Procedia PDF Downloads 94
1047 Space Time Adaptive Algorithm in Bi-Static Passive Radar Systems for Clutter Mitigation

Authors: D. Venu, N. V. Koteswara Rao

Abstract:

Space-time adaptive processing (STAP) is an effective tool for detecting a moving target in spaceborne or airborne radar systems. Airborne passive radar systems utilize broadcast, navigation, and communication signals to perform various surveillance tasks and have attracted significant interest in recent years, since the need of the hour is for cost-effective systems compared to conventional active radar systems. Moreover, the requirement of only a small number of secondary samples for effective clutter suppression in bi-static passive radar offers abundant illuminator resources for passive surveillance radar systems. This paper presents a framework for incorporating knowledge sources directly into the space-time beamformer of airborne adaptive radars. The STAP algorithm for clutter mitigation in passive bi-static radar provides a better quantification of the reduction in sample size, thereby amalgamating the earlier data bank with existing radar data sets. We also propose a novel method to estimate the clutter covariance matrix and perform STAP for efficient clutter suppression based on a small sample size. Furthermore, the effectiveness of the proposed algorithm is verified using MATLAB simulations in order to validate the STAP algorithm for passive bi-static radar. In conclusion, this study highlights the importance of the approach for various applications that augment traditional active radars using cost-effective measures.
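
As background on the core STAP computation mentioned above, the sketch below forms a space-time steering vector, estimates the clutter-plus-noise covariance from a limited number of secondary range cells, and computes the classic minimum-variance STAP weight vector w = R⁻¹v / (vᴴR⁻¹v). The array geometry, number of pulses, and training data are synthetic; the knowledge-aided and reduced-sample aspects proposed in the paper are not reproduced here.

```python
import numpy as np

def space_time_steering(n_elem, n_pulse, spatial_freq, doppler_freq):
    """Kronecker product of temporal (Doppler) and spatial steering vectors."""
    a_s = np.exp(2j * np.pi * spatial_freq * np.arange(n_elem))
    a_t = np.exp(2j * np.pi * doppler_freq * np.arange(n_pulse))
    return np.kron(a_t, a_s)

def stap_weights(secondary_data, v):
    """secondary_data: (NM, K) snapshots from target-free range cells."""
    K = secondary_data.shape[1]
    R = secondary_data @ secondary_data.conj().T / K              # sample covariance estimate
    R += 1e-3 * np.trace(R).real / R.shape[0] * np.eye(R.shape[0])  # diagonal loading
    Rinv_v = np.linalg.solve(R, v)
    return Rinv_v / (v.conj() @ Rinv_v)                           # unit gain on the target steering

# Synthetic example: 8 elements x 10 pulses, 200 secondary cells of noise plus a clutter ridge.
N, M, K = 8, 10, 200
rng = np.random.default_rng(3)
noise = (rng.standard_normal((N * M, K)) + 1j * rng.standard_normal((N * M, K))) / np.sqrt(2)
clutter = sum(np.outer(space_time_steering(N, M, f, f), rng.standard_normal(K))
              for f in np.linspace(-0.4, 0.4, 20))
v = space_time_steering(N, M, 0.2, -0.1)                          # hypothesized target
w = stap_weights(noise + 3 * clutter, v)
print(np.abs(w.conj() @ v))                                       # ~1: distortionless toward the target
```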

Keywords: bistatic radar, clutter, covariance matrix passive radar, STAP

Procedia PDF Downloads 273
1046 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression

Authors: J. S. Saini, P. P. K. Sandhu

Abstract:

The multi-hop nature of Wireless Mesh Networks and the rapid growth of throughput demands result in multi-channel and multi-radio structures in mesh networks, but the problem of co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a vast collection of networking technologies and techniques that guarantee the ability of a network to make desired services available with predictable results. QoS can be directed at a network interface, towards a specific server's or router's performance, or at specific applications. Due to interference among the various transmissions, QoS routing in multi-hop wireless networks is a formidable task; in a multi-channel wireless network, two transmissions using the same channel may interfere with each other. This paper considers the Destination Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel-Ziv-Welch (LZW) based lossless data compression and intra-cluster data aggregation to enhance the communication between the source and the destination. Clustering makes it possible to aggregate multiple packets and to locate a single route using the clusters, improving intra-cluster data aggregation. The LZW-based lossless data compression reduces the data packet size and hence consumes less energy, thus increasing the network QoS. The MATLAB tool has been used to evaluate the effectiveness of the projected technique. The comparative analysis shows that the proposed technique outperforms the existing techniques.
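
Since the routing scheme relies on LZW lossless compression to shrink packet payloads before intra-cluster aggregation, a minimal reference implementation of the LZW encoder may be useful. This is the textbook algorithm operating on byte strings, not the exact encoder used by the authors, and the example payload is hypothetical.

```python
def lzw_compress(data: bytes) -> list[int]:
    """Textbook LZW: emit dictionary codes for the longest known prefixes."""
    dictionary = {bytes([i]): i for i in range(256)}
    next_code = 256
    w = b""
    codes = []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc                      # keep extending the current phrase
        else:
            codes.append(dictionary[w]) # emit code for the longest known phrase
            dictionary[wc] = next_code  # learn the new phrase
            next_code += 1
            w = bytes([byte])
    if w:
        codes.append(dictionary[w])
    return codes

payload = b"sensor-12:23.5C;sensor-12:23.5C;sensor-12:23.6C;" * 4
codes = lzw_compress(payload)
print(len(payload), "bytes ->", len(codes), "codes")
```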

Keywords: WMNS, QOS, flooding, collision avoidance, LZW, congestion control

Procedia PDF Downloads 309
1045 An Integrated Framework for Seismic Risk Mitigation Decision Making

Authors: Mojtaba Sadeghi, Farshid Baniassadi, Hamed Kashani

Abstract:

One of the challenging issues faced by seismic retrofitting consultants and employers is quick decision-making on the demolition or retrofitting of a structure at the current time or in the future. For this reason, the existing models proposed by researchers have only covered one of the aspects of cost, execution method, and structural vulnerability. Given the effect of each factor on the final decision, it is crucial to devise a new comprehensive model capable of simultaneously covering all the factors. This study attempted to provide an integrated framework that can be utilized to select the most appropriate earthquake risk mitigation solution for buildings. This framework can overcome the limitations of current models by taking into account several factors such as cost, execution method, risk-taking and structural failure. In the newly proposed model, the database and essential information about retrofitting projects are developed based on the historical data on a retrofit project. In the next phase, an analysis is conducted in order to assess the vulnerability of the building under study. Then, artificial neural networks technique is employed to calculate the cost of retrofitting. While calculating the current price of the structure, an economic analysis is conducted to compare demolition versus retrofitting costs. At the next stage, the optimal method is identified. Finally, the implementation of the framework was demonstrated by collecting data concerning 155 previous projects.

Keywords: decision making, demolition, construction management, seismic retrofit

Procedia PDF Downloads 211
1044 Modeling of Thermally Induced Acoustic Emission Memory Effects in Heterogeneous Rocks with Consideration for Fracture Development

Authors: Vladimir A. Vinnikov

Abstract:

The paper proposes a model of an inhomogeneous rock mass with initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of thermal field, the medium heated instantaneously to a predetermined temperature. Crack growth occurs according to the concept of fracture mechanics provided that the stress intensity factor K exceeds the critical value of Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect. The influence of humidity on the thermally induced acoustic emission memory effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about the thermal impacts on rocks in the past and determine the degree of rock disturbance by means of non-destructive testing.

Keywords: degree of rock disturbance, non-destructive testing, thermally induced acoustic emission memory effects, structure and texture of rocks

Procedia PDF Downloads 235
1043 The Value of Computerized Corpora in EFL Textbook Design: The Case of Modal Verbs

Authors: Lexi Li

Abstract:

This study aims to contribute to the field of how computer technology can be exploited to enhance EFL textbook design. Specifically, the study demonstrates how computerized native and learner corpora can be used to enhance the treatment of modal verbs in EFL textbooks. The linguistic focus is will, would, can, could, may, might, shall, should, must. The native corpus is the spoken component of BNC2014 (hereafter BNCS2014). The spoken part is chosen because the pedagogical purpose of the textbooks is communication-oriented. Using the standard query option of CQPweb, 5% of each of the nine modals was sampled from BNCS2014. The learner corpus is the POS-tagged Ten-thousand English Compositions of Chinese Learners (TECCL). All the essays under the "secondary school" section were selected. A series of five secondary coursebooks comprise the textbook corpus. All the data in both the learner and the textbook corpora are retrieved through the concordance functions of WordSmith Tools (version 5.0). Data analysis was divided into two parts. The first part compared the patterns of modal verbs in the textbook corpus and BNCS2014 with respect to distributional features, semantic functions, and co-occurring constructions, to examine whether the textbooks reflect the authentic use of English. Secondly, the learner corpus was compared with the textbook corpus in terms of use (distributional features, semantic functions, and co-occurring constructions) in order to examine the degree of influence of the textbooks on learners' use of modal verbs. Moreover, the learner corpus was analyzed for misuse (syntactic errors, e.g., she can sings*) of the nine modal verbs to uncover potential difficulties that confront learners. The results indicate discrepancies between the textbook presentation of modal verbs and authentic modal use in natural discourse in terms of frequency distributions, semantic functions, and co-occurring structures. Furthermore, there are consistent patterns of use between the learner corpus and the textbook corpus with respect to the three above-mentioned aspects, except for could, will, and must, partially confirming the correlation between frequency effects and L2 grammar acquisition. Further analysis reveals that the exceptions are caused by both positive and negative L1 transfer, indicating that frequency effects can be overridden by L1 interference. In addition, error analysis revealed that could, would, should, and must are the most difficult for Chinese learners due to both inter-linguistic and intra-linguistic interference. The discrepancies between the textbook corpus and the native corpus point to a need to adjust the presentation of modal verbs in the textbooks in terms of frequencies, different meanings, and verb-phrase structures. Along with the adjustment of modal verb treatment based on authentic use, it is important for textbook writers to take into consideration L1 interference as well as learners' difficulties in their use of modal verbs. The present study is a methodological showcase of combining native and learner corpora to enhance the authenticity and appropriateness of EFL textbook language for learners.

Keywords: EFL textbooks, learner corpus, modal verbs, native corpus

Procedia PDF Downloads 94
1042 Conceptual Methods of Mitigating Matured Urban Tree Roots Surviving in Conflicts Growth within Built Environment: A Review

Authors: Mohd Suhaizan Shamsuddin

Abstract:

Urbanization degrades environmental quality and puts pressure on the growth and development of matured urban trees in a changing environment. Matured urban tree roots struggle as they spread among existing infrastructure, resulting in damage to structures and declining tree growth. Much of the physiological decline or damage is caused by the presence and installation of infrastructure within and near the root zone. Attempting to retain both the matured urban tree and the infrastructure as service providers can end in damage to the one and death of the other. Consequently, more is spent on repairing both, or on removing matured urban trees at a risk to the future environment, when mitigation methods that would reduce the problems are neglected. This paper aims to explain mitigation practices that reduce the conflicts between settling matured urban tree roots and infrastructure while keeping modified urban soil functioning at an optimum level. Three categories capture the conflicts encountered by matured urban tree roots growing within and near infrastructure: limited soil spaces, poor soil structures, and soil space barrier installations and maintenance. For limited soil space, six methods that mitigate for root survival were identified: soil volume/mounding, soil replacement/amendment for the radial trench, soil spacing-root bridge, root tunneling, walkway/pavement raising/diversion, and suspended pavement. These methods address inadequate soil for roots and the settling of spreading roots, and modify the construction soil medium where barriers exist or have been installed along root trails or zones. They enable tree roots to spread, to find adequate resources (nutrients, water uptake, and oxygen) and space, and to provide stable root anchorage as the mature tree grows larger. For poor soil structures, three methods were identified to mitigate problems of soil materials and insufficient soil voids: skeletal soil, structural soil, and soil cells. These alter the existing structure, or introduce a new one, by modifying the quantities and ratio of materials to allow more voids beneath for root spreading while the structure above still carries foot and vehicle traffic or other load-bearing functions. For soil space barrier installations and maintenance, which sustain both infrastructure and tree roots grown in limited spaces, root barrier installation and root pruning are recommended. In conclusion, these recommended methods attempt to mitigate the problems encountered at a particular place and the conflicts that exist between tree roots and infrastructure. A combined method is the best way to alleviate the conflicts, since the recognized conflicts are between tree roots and man-made structures in modified urban soil. The presented methods are the ones most relevant for sustaining the lifespan and growth of matured urban trees in the urban environment.

Keywords: urban tree-roots, limited soil spaces, poor soil structures, soil space barrier and maintenance

Procedia PDF Downloads 165
1041 Effectiveness of Lowering the Water Table as a Mitigation Measure for Foundation Settlement in Liquefiable Soils Using 1-g Scale Shake Table Test

Authors: Kausar Alam, Mohammad Yazdi, Peiman Zogh, Ramin Motamed

Abstract:

An earthquake is an unpredictable natural disaster. It can induce liquefaction, which causes considerable damage to structures, lifelines, and piping systems because of ground settlement. As a result, there is great concern about how to address this problem. Previous researchers have adopted different ground improvement techniques to reduce the settlement of structures during earthquakes. This study evaluates the effectiveness of lowering the water table as a technique to mitigate foundation settlement in liquefiable soil. The performance is evaluated based on foundation settlement and the reduction of excess pore water pressure. In this study, a scaled model was prepared based on a full-scale shake table experiment conducted at the University of California, San Diego (UCSD). The model ground consists of three soil layers having relative densities of 55%, 45%, and 90%, respectively. A shallow foundation is seated over an unsaturated crust layer. After preparation of the model ground, the water table was positioned at 45, 40, and 35 cm (from the bottom) in the three configurations. Then, the input motions were applied for 10 seconds, with a peak acceleration of 0.25 g and a constant frequency of 2.73 Hz. Based on the experimental results, the effectiveness of lowering the water table in reducing the foundation settlement and excess pore water pressure was evident. The foundation settlement was reduced from 50 mm to 5 mm. In addition, lowering the water table as a mitigation measure is a cost-effective way to decrease liquefaction-induced building settlement.
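A minimal sketch of the pore pressure check implied above: the ratio r_u = Δu / σ'v0 (excess pore water pressure over initial vertical effective stress) is commonly used to judge liquefaction triggering, and lowering the water table increases σ'v0 at a given sensor depth, which reduces r_u. The unit weights, sensor depth, and peak pressures below are illustrative placeholders, not the experiment's measured values.

```python
GAMMA_DRY = 16.0    # assumed dry unit weight of the sand, kN/m^3
GAMMA_SAT = 19.5    # assumed saturated unit weight, kN/m^3
GAMMA_W = 9.81      # unit weight of water, kN/m^3

def effective_stress(depth, water_table_depth):
    """Initial vertical effective stress (kPa) at a sensor depth for a given
    water-table depth, both measured from the model ground surface."""
    if depth <= water_table_depth:
        return GAMMA_DRY * depth
    total = GAMMA_DRY * water_table_depth + GAMMA_SAT * (depth - water_table_depth)
    pore = GAMMA_W * (depth - water_table_depth)
    return total - pore

def excess_pressure_ratio(delta_u, depth, water_table_depth):
    """r_u = excess pore pressure / initial vertical effective stress; r_u near 1 indicates liquefaction."""
    return delta_u / effective_stress(depth, water_table_depth)

# Illustrative sensor depth and peak excess pore pressures (kPa) for three
# water-table positions; deeper water table -> higher effective stress -> lower r_u.
sensor_depth = 0.35  # m below the model surface
for wt_depth, peak_du in [(0.10, 3.0), (0.15, 2.6), (0.20, 2.1)]:
    r_u = excess_pressure_ratio(peak_du, sensor_depth, wt_depth)
    print(f"water table at {wt_depth:.2f} m depth -> r_u = {r_u:.2f}")
```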

Keywords: foundation settlement, ground water table, liquefaction, shake table test

Procedia PDF Downloads 81
1040 Responsibility to Protect and State Sovereignty: The Case of Syria

Authors: Renu Kumari

Abstract:

State sovereignty refers to the ability and power of a state to be independent and to be free of interference by external actors in its internal affairs. This principle is accepted by international law, which gives states the right to maintain their autonomy and territorial integrity without the interference of other actors. In the 1980s and 1990s, the world witnessed some of the worst cases of human rights violations, for instance, the Rwandan genocide and the conflicts in the former Yugoslavia, Kosovo, Burundi, and Chad. Though such violence is not a new phenomenon and has occurred all over the world across time and space, in the 1990s, after the devastation of these conflicts, the world community developed the notion of humanitarian intervention, under which some states assumed the responsibility of protecting against human rights violations and, in order to do so, could intervene in the internal affairs of a state, specifically during civil wars in which the state is unable to protect its people. Later, this so-called world community realized that intervention itself is a negative and much-criticized term, and it therefore adopted a different notion with more positive connotations, known as the responsibility to protect. From 2005 onwards, the notion of the responsibility to protect was accepted and recognized by the United Nations and by states more broadly. In the case of Syria, foreign interventions took place in the name of the responsibility to protect; the Syrian people were already facing many problems due to the internal war, and the government was unable to protect them. External invasion brought many devastating outcomes for the country. This paper is an attempt to analyze various dimensions of external intervention in the internal affairs of a particular state and the status of its sovereignty. Firstly, it lays out the notion of humanitarian intervention and then the responsibility to protect. Secondly, it examines the case of Syria and the Syrian conflict since 2011. Thirdly, it focuses on various efforts made by international organizations and other actors. Lastly, it examines why and how other actors have intervened in the internal affairs of Syria.

Keywords: state sovereignty, external actors, intervention, responsibility to protect

Procedia PDF Downloads 127