Search results for: minimum data set
22179 Lipid Extraction from Microbial Cell by Electroporation Technique and Its Influence on Direct Transesterification for Biodiesel Synthesis
Authors: Abu Yousuf, Maksudur Rahman Khan, Ahasanul Karim, Amirul Islam, Minhaj Uddin Monir, Sharmin Sultana, Domenico Pirozzi
Abstract:
Traditional biodiesel feedstocks such as edible or plant oils, animal fats, and waste cooking oil have been replaced by microbial oil in recent biodiesel research. The well-known community of microbial oil producers includes microalgae, oleaginous yeasts and seaweeds. Conventional transesterification of microbial oil to produce biodiesel is slow, energy-consuming, cost-ineffective and environmentally unhealthy. This process follows several steps: microbial biomass drying, cell disruption, oil extraction, solvent recovery, oil separation and transesterification. Therefore, direct transesterification for biodiesel synthesis has been studied in recent years. It combines all the steps in a single reactor and eliminates the steps of biomass drying, oil extraction and separation from solvent. It appears to be a faster, more cost-effective process, but a number of difficulties must be solved to make it applicable at large scale. The main challenges are disrupting microbial cells in bulk volume and accelerating the esterification reaction, because the water content of the medium slows the reaction rate. Several methods have been proposed, but none is mature enough for large-scale implementation. It is still a great challenge to extract maximum lipid from microbial cells (yeast, fungi, algae) while investing minimum energy. Electroporation produces a significant increase in cell conductivity and permeability through the application of an external electric field. Electroporation alters the size and structure of the cells to increase their porosity, and disrupts the microbial cell walls within a few seconds so that intracellular lipid leaks out into the solution. Therefore, incorporating electroporation techniques contributes to direct transesterification of microbial lipids by increasing the efficiency of biodiesel production. Keywords: biodiesel, electroporation, microbial lipids, transesterification
Procedia PDF Downloads 281
22178 Determination of Medians of Biochemical Maternal Serum Markers in Healthy Women Giving Birth to Normal Babies
Authors: Noreen Noreen, Aamir Ijaz, Hamza Akhtar
Abstract:
Background: Screening plays a major role in detecting chromosomal abnormalities such as Down syndrome, neural tube defects and other inborn diseases of the newborn. Serum biomarkers in the second trimester are useful in determining the risk of the most common chromosomal anomalies; these tests include alpha-fetoprotein (AFP), human chorionic gonadotropin (hCG), unconjugated estriol (uE3) and inhibin-A. The quadruple marker test is valuable in diagnosing congenital pathology during pregnancy, but these procedures do not form part of the routine health care of pregnant women in Pakistan, so median values are lacking for the Pakistani population. Objective: To determine median values of biochemical maternal serum markers in the local population during second-trimester maternal screening. Study settings: Department of Chemical Pathology and Endocrinology, Armed Forces Institute of Pathology (AFIP), Rawalpindi. Methods: Cross-sectional study for estimation of reference values, using non-probability consecutive sampling; 155 healthy pregnant women, 30-40 years of age, were included. As non-parametric statistics were used, the minimum sample size was 120. Results: A total of 155 women were enrolled in this study. The age of the enrolled women ranged from 30 to 39 years; 39 percent were less than 34 years old. Mean maternal age was 33.46±2.35 years and mean maternal body weight was 54.98±2.88. Median values of the quadruple markers were calculated from the 15th to 18th weeks of gestation and will be used to calculate multiples of the median (MoM) for trisomy 21 screening at these gestational ages.
The median values observed at 15 weeks of gestation were hCG 36650 mIU/ml, AFP 23.3 IU/ml, uE3 3.5 nmol/L and inhibin-A 198 ng/L; at 16 weeks, hCG 29050 mIU/ml, AFP 35.4 IU/ml, uE3 4.1 nmol/L and inhibin-A 179 ng/L; at 17 weeks, hCG 28450 mIU/ml, AFP 36.0 IU/ml, uE3 6.7 nmol/L and inhibin-A 176 ng/L; and at 18 weeks, hCG 25200 mIU/ml, AFP 38.2 IU/ml, uE3 8.2 nmol/L and inhibin-A 190 ng/L. All comparisons were significant (p-value < 0.005) with a 95% confidence interval (CI); the level of significance was set at 5% based on the literature. Conclusion: The median values for these four biomarkers in Pakistani pregnant women can be used to calculate MoM. Keywords: screening, down syndrome, quadruple test, second trimester, serum biomarkers
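A minimal sketch of how multiples of the median (MoM) would be computed from gestational-age-specific medians such as those reported above; the median values are taken from the abstract, while the function and dictionary names are illustrative:

```python
# Sketch: multiples of the median (MoM) from the gestational-age-specific
# medians reported in the abstract. A measured marker value is divided by
# the median for the same gestational week; a MoM near 1.0 is typical.

# Medians by gestational week (values from the abstract)
MEDIANS = {
    15: {"hCG": 36650, "AFP": 23.3, "uE3": 3.5, "inhibinA": 198},
    16: {"hCG": 29050, "AFP": 35.4, "uE3": 4.1, "inhibinA": 179},
    17: {"hCG": 28450, "AFP": 36.0, "uE3": 6.7, "inhibinA": 176},
    18: {"hCG": 25200, "AFP": 38.2, "uE3": 8.2, "inhibinA": 190},
}

def mom(week, marker, measured):
    """Multiple of the median for one marker at one gestational week."""
    return measured / MEDIANS[week][marker]
```

In screening practice the MoM values (often after adjustment for maternal weight and other covariates) feed into the trisomy 21 risk calculation, which is why population-specific medians matter.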
Procedia PDF Downloads 180
22177 Investigating the Antimicrobial Activity of Essential Oil Derived from Pistacia atlantica Gum against Extensively Drug-Resistant Gram-Negative Acinetobacter baumannii
Authors: Zhala Ahmad, Zainab Lazim, Haider Hamzah
Abstract:
Bacterial resistance is a pressing global health issue, with multidrug-resistant (MDR), extensively drug-resistant (XDR), and pandrug-resistant (PDR) strains posing a serious threat. In this context, researchers are investigating effective, safe, and affordable metabolites to combat these pathogens. This study focuses on gum essential oil (GEO) extracted from Pistacia atlantica, its activity, and its mechanism of action against XDR Gram-negative Acinetobacter baumannii. GEO was extracted by hydrodistillation and analyzed using GC-MS. Eleven A. baumannii isolates were collected from the ward environment of the Burn and Plastic Surgery Hospital in Al Sulaymaniyah City, Iraq. They were identified using the VITEK 2 system and the 16S rRNA gene, and confirmed with the blaOXA-51 gene; A. baumannii ATCC 19606 was used as a reference strain. The isolates were resistant to twelve different antibiotics spanning six distinct antibiotic classes while remaining susceptible to tetracycline and trimethoprim. Over 40 chemical constituents were detected in the gum's essential oil, with α-pinene being the most abundant. GEO inhibited the growth of the A. baumannii isolates; its minimum inhibitory concentration (MIC) was 2.5 µl/ml. GEO induced protein leakage, phosphate and potassium ion efflux, distorted cell morphology, and cell death in the tested bacteria. GEO also exhibited bacterial clearance and anti-adhesion activity when applied to Band-Aids. These findings suggest that GEO could be a potential alternative treatment for infectious diseases caused by XDR pathogens, shedding further light on the importance of GEO in biomedical applications.
Future studies should focus on generating clinically feasible sources of GEO for testing in small animal models before proceeding to human trials, ensuring safe and effective translation from the laboratory to the clinic. Keywords: antibiotic resistance, Acinetobacter baumannii, essential oils, Pistacia atlantica, alpha-pinene
Procedia PDF Downloads 71
22176 A Study on Method for Identifying Capacity Factor Declination of Wind Turbines
Authors: Dongheon Shin, Kyungnam Ko, Jongchul Huh
Abstract:
An investigation of wind turbine degradation was carried out using nacelle wind data. Three Vestas V80-2MW wind turbines of the Sungsan wind farm on Jeju Island, South Korea were selected for this work. SCADA data from the wind farm covering five years were analyzed to draw the power curves of the turbines. Assuming a Rayleigh wind-speed distribution, the normalized capacity factor was calculated from the drawn power curve of each of the three wind turbines for each year. The results showed that the power output of the three wind turbines fell every year, with the normalized capacity factor decreasing by 0.12% per year on average. Keywords: wind energy, power curve, capacity factor, annual energy production
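A minimal sketch of the capacity-factor calculation described above: the power curve is weighted by an assumed Rayleigh wind-speed distribution and normalized by rated power. The piecewise power curve below is a generic illustration, not the actual Vestas V80 curve, and the cut-in/rated/cut-out speeds are assumptions:

```python
import math

# Sketch: normalized capacity factor of a turbine under an assumed
# Rayleigh wind-speed distribution with mean speed v_m:
#   f(v) = (pi*v / (2*v_m^2)) * exp(-pi*v^2 / (4*v_m^2))
# CF = integral of P(v)*f(v) dv, divided by rated power.

RATED_KW = 2000.0  # V80-2MW rated power

def power_kw(v):
    """Illustrative piecewise power curve (kW) vs wind speed (m/s)."""
    cut_in, rated_v, cut_out = 4.0, 15.0, 25.0  # assumed speeds
    if v < cut_in or v > cut_out:
        return 0.0
    if v >= rated_v:
        return RATED_KW
    # cubic ramp between cut-in and rated speed
    return RATED_KW * ((v**3 - cut_in**3) / (rated_v**3 - cut_in**3))

def rayleigh_pdf(v, v_mean):
    return (math.pi * v / (2 * v_mean**2)) * math.exp(
        -math.pi * v**2 / (4 * v_mean**2))

def capacity_factor(v_mean, dv=0.01):
    """Numerically integrate P(v)*f(v) and normalize by rated power."""
    total, v = 0.0, 0.0
    while v <= 30.0:
        total += power_kw(v) * rayleigh_pdf(v, v_mean) * dv
        v += dv
    return total / RATED_KW
```

Comparing the capacity factor computed from each year's fitted power curve, at a fixed reference mean wind speed, isolates turbine degradation from year-to-year wind variability, which is the point of normalizing in the study.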
Procedia PDF Downloads 433
22175 Water Quality Calculation and Management System
Authors: H. M. B. N Jayasinghe
Abstract:
Water is found almost everywhere on Earth, yet water resources contain a great deal of pollution, and some diseases can spread to living beings through water. To be drinkable, water must therefore undergo a number of treatment steps, which makes purification technology essential for wastewater and gives wastewater treatment plants a major role in these issues. The procedures that follow the water treatment process have traditionally been based on manual calculations and recordings. Water purification plants involve many manual processes, which makes the process time-consuming and delays the final evaluation and the chemical and biological treatment processes. To prevent these drawbacks, computerized, programmable calculation and analytical techniques are to be introduced to the laboratory staff. An automated system that guarantees rational selection offers a solution to this problem. A decision support system is a way to model data and make quality decisions based upon it, and it is widely used worldwide for various kinds of process automation. Decision support systems that merely collect data and organize it effectively are usually called passive models: they do not suggest a specific decision but only reveal information. This web-based system incorporates global positioning data with map locations. Its most valuable feature is an SMS and e-mail alert service to inform the appropriate person of a critical issue. The system is built with HTML, MySQL, PHP, and other web development technologies. Existing computerized water chemistry analysis tools, such as swimming pool water quality calculators, are limited in scope. The validity of the system has been verified by test runs and comparison with data from an existing plant.
The automated system will make work easier both productively and qualitatively. Keywords: automated system, wastewater, purification technology, map location
Procedia PDF Downloads 247
22174 Study of Radiological and Chemical Effects of Uranium in Ground Water of SW and NE Punjab, India
Authors: Komal Saini, S. K. Sahoo, B. S. Bajwa
Abstract:
The laser fluorimetry technique has been used for the microanalysis of uranium content in water samples collected from different sources, such as hand pumps and tube wells, in the drinking water of SW and NE Punjab, India. The study region in NE Punjab lies between latitude 31.21º-32.05º N and longitude 75.60º-76.14º E, and in SW Punjab between latitude 29.66º-30.48º N and longitude 74.69º-75.54º E. The purpose of this study was mainly to investigate the uranium concentration levels of ground water being used for drinking purposes and to determine its health effects, if any, on the local population of these regions. In the present study, 131 drinking water samples collected from different villages of SW Punjab and 95 samples from NE Punjab, India have been analyzed for chemical and radiological toxicity. Uranium content in the water samples of SW Punjab ranges from 0.13 to 908 μgL−1 with an average of 82.1 μgL−1, whereas in samples collected from NE Punjab it ranges from 0 to 28.2 μgL−1 with an average of 4.84 μgL−1. This reveals that in SW Punjab 54% of drinking water samples have uranium concentrations above the international recommended limit of 30 µgL−1 (WHO, 2011), while 35% of samples exceed the threshold of 60 µgL−1 recommended by the national regulatory authority, the Atomic Energy Regulatory Board (AERB), Department of Atomic Energy, India, 2004. In the NE Punjab region, by contrast, none of the observed water samples has uranium content above the national/international recommendations. The observed radiological risk in terms of excess cancer risk ranges from 3.64x10-7 to 2.54x10-3 for SW Punjab, whereas for the NE region it ranges from 0 to 7.89x10-5. The chemical toxic effect in terms of lifetime average daily dose (LDD) and hazard quotient (HQ) has also been calculated.
The LDD for SW Punjab varies from 0.0098 to 68.46 with an average of 6.18 µg/kg/day, whereas for the NE region it varies from 0 to 2.13 with an average of 0.365 µg/kg/day, indicating chemical toxicity in SW Punjab, as 35% of the observed samples there are above the recommended limit of 4.53 µg/kg/day given by AERB for 60 µgL−1 of uranium. The minimum and maximum hazard quotient values for SW Punjab are 0.002 and 15.11, with an average of 1.36, which is considerably high compared to the safe limit of 1. For NE Punjab, by contrast, HQ varies from 0 to 0.47. The possible sources of the high uranium observed in SW Punjab will also be discussed. Keywords: uranium, groundwater, radiological and chemical toxicity, Punjab, India
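A minimal sketch of the LDD and HQ calculations behind the figures above. The daily water intake (4.53 L/day) and reference body weight (60 kg) are assumptions chosen to be consistent with the abstract's numbers (e.g. HQ = 1 at the 60 µgL−1 AERB limit); they are not stated in the abstract itself:

```python
# Sketch: lifetime average daily dose (LDD) and hazard quotient (HQ)
# for uranium in drinking water.
#   LDD = concentration * daily intake / body weight
#   HQ  = LDD / reference dose (HQ > 1 indicates toxicity concern)
# Intake and body weight below are ASSUMED values, not from the abstract.

DAILY_INTAKE_L = 4.53    # assumed drinking-water intake, L/day
BODY_WEIGHT_KG = 60.0    # assumed reference body weight, kg
REFERENCE_DOSE = 4.53    # AERB limit, ug/kg/day (from the abstract)

def ldd(uranium_ug_per_l):
    """Lifetime average daily dose in ug/kg/day."""
    return uranium_ug_per_l * DAILY_INTAKE_L / BODY_WEIGHT_KG

def hazard_quotient(uranium_ug_per_l):
    """Ratio of LDD to the reference dose."""
    return ldd(uranium_ug_per_l) / REFERENCE_DOSE
```

With these assumptions, the maximum SW Punjab concentration of 908 µgL−1 gives an LDD near the 68.46 µg/kg/day and an HQ near the 15.11 reported in the abstract, which is why those intake/weight values were chosen for the sketch.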
Procedia PDF Downloads 381
22173 Clinical Applications of Amide Proton Transfer Magnetic Resonance Imaging: Detection of Brain Tumor Proliferative Activity
Authors: Fumihiro Imai, Shinichi Watanabe, Shingo Maeda, Haruna Imai, Hiroki Niimi
Abstract:
It is important to know the growth rate of brain tumors before surgery because it influences treatment planning, including not only the surgical resection strategy but also adjuvant therapy after surgery. Amide proton transfer (APT) imaging is an emerging molecular magnetic resonance imaging (MRI) technique based on chemical exchange saturation transfer that requires no contrast medium. The underlying assumption in APT imaging of tumors is that there is a close relationship between the proliferative activity of the tumor and mobile protein synthesis. We aimed to evaluate the diagnostic performance of APT imaging of pre- and post-treatment brain tumors. Ten patients with brain tumors underwent conventional and APT-weighted sequences on a 3.0 Tesla MRI scanner before clinical intervention. The maximum and minimum APT-weighted signals (APTWmax and APTWmin) in each solid tumor region were obtained and compared before and after clinical intervention. All surgical specimens were examined for histopathological diagnosis. Eight of the ten patients underwent adjuvant therapy after surgery. The histopathological diagnoses were glioma in 7 patients (WHO grade 2 in 2 patients, WHO grade 3 in 3 patients, and WHO grade 4 in 2 patients), meningioma WHO grade 1 in 2 patients, and primary lymphoma of the brain in 1 patient. High-grade gliomas showed significantly higher APTW signals than low-grade gliomas. APTWmax in one huge parasagittal meningioma infiltrating the skull bone was higher than that in glioma WHO grade 4; on the other hand, APTWmax in another convexity meningioma was the same as that in glioma WHO grade 3. Diagnosis of primary lymphoma of the brain was possible with APT imaging before pathological confirmation. APTW signals in residual tumors decreased dramatically within one year after adjuvant therapy in all patients.
APT imaging demonstrated excellent diagnostic performance for the planning of surgery and adjuvant therapy of brain tumors. Keywords: amides, magnetic resonance imaging, brain tumors, cell proliferation
Procedia PDF Downloads 87
22172 Analyzing the Effectiveness of a Bank of Parallel Resistors, as a Burden Compensation Technique for Current Transformer's Burden, Using LabVIEW™ Data Acquisition Tool
Authors: Dilson Subedi
Abstract:
Current transformers (CTs) are an integral part of the power system because they provide a proportional, safe amount of current for protection and measurement applications. However, with the upgrading of electromechanical relays to numerical relays and electromechanical energy meters to digital meters, the connected burden, which defines some of the CT characteristics, has been drastically reduced. This has led to the system experiencing high currents that damage the connected relays and meters. Since protection and metering equipment is designed to withstand only a certain amount of current for a certain time, these high currents pose a risk to personnel and equipment. During such instances, the CT saturation characteristics therefore have a large influence on the safety of both personnel and equipment and on the reliability of the protection and metering system. This paper shows the effectiveness of a bank of parallel-connected resistors, as a burden compensation technique, in compensating the burden of under-burdened CTs. The response of the CT in the case of failure of one or more resistors at different levels of overcurrent is captured using LabVIEW™ data acquisition hardware (DAQ), and the analysis is done on the real-time data gathered using LabVIEW™. Variation of current transformer saturation characteristics with changes in burden is discussed. Keywords: accuracy limiting factor, burden, burden compensation, current transformer
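A minimal sketch of the failure scenario the paper measures: the equivalent resistance of a parallel resistor bank, and how the burden seen by the CT rises as resistors fail open. The resistor values are illustrative, not taken from the paper:

```python
# Sketch: equivalent burden resistance of a bank of parallel resistors.
# With n equal resistors R in parallel the equivalent is R/n; each
# open-circuit failure removes a branch and raises the burden the CT
# sees, shifting its saturation behavior. Values are illustrative.

def parallel(resistances):
    """Equivalent resistance (ohms) of resistors in parallel."""
    if not resistances:
        return float("inf")  # all branches failed open
    return 1.0 / sum(1.0 / r for r in resistances)

bank = [10.0, 10.0, 10.0, 10.0]   # four 10-ohm resistors
healthy = parallel(bank)          # 2.5 ohm with all branches intact
one_failed = parallel(bank[:3])   # burden rises when one branch opens
```

The design trade-off this illustrates: a bank of several higher-value resistors degrades gracefully (burden rises in steps on each failure) compared with a single low-value resistor that removes all compensation at once.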
Procedia PDF Downloads 245
22171 Exploring Ways Early Childhood Teachers Integrate Information and Communication Technologies into Children's Play: Two Case Studies from the Australian Context
Authors: Caroline Labib
Abstract:
This paper reports on a qualitative study exploring the approaches teachers used to integrate computers and smart tablets into their program planning, with the aim of integrating ICT into children's play and thereby supporting children's learning and development. Data was collected in preschool settings in Melbourne in 2016. Interviews with teachers, observations of teacher interactions with children, and copies of teachers' planning and observation documents informed the study. The paper looks closely at findings from two early childhood settings and explores the differing approaches two EC teachers adopted when integrating iPads or computers into their settings. Data analysis revealed three key approaches, which have been labelled free digital play, guided digital play, and teacher-led digital use. Importantly, teacher decisions were influenced by the interplay between the opportunities the ICT tools offered, the teachers' prior knowledge and experience of ICT, and children's learning needs and contexts. This paper is a snapshot of two early childhood settings; further research will encompass data from six more early childhood settings in Victoria with the aim of exploring a wide range of motivating factors for early childhood teachers trying to integrate ICT into their programs. Keywords: early childhood education (ECE), digital play, information and communication technologies (ICT), play, teachers' interaction approaches
Procedia PDF Downloads 212
22170 Maximum Likelihood Estimation Methods on a Two-Parameter Rayleigh Distribution under Progressive Type-II Censoring
Authors: Daniel Fundi Murithi
Abstract:
Data from economic, social, clinical, and industrial studies are often incomplete or incorrect due to censoring, and such data may have adverse effects if used in estimation problems. We propose the use of maximum likelihood estimation (MLE) under a progressive type-II censoring scheme to remedy this problem. In particular, maximum likelihood estimates (MLEs) for the location (µ) and scale (λ) parameters of the two-parameter Rayleigh distribution are obtained under a progressive type-II censoring scheme using the Expectation-Maximization (EM) and Newton-Raphson (NR) algorithms. These algorithms are compared because both iteratively produce satisfactory results in the estimation problem. The progressive type-II censoring scheme is used because it allows the removal of test units before the termination of the experiment. Approximate asymptotic variances and confidence intervals for the location and scale parameters are derived and constructed. The efficiency of the EM and NR algorithms is compared in terms of root mean squared error (RMSE), bias, and coverage rate. The simulation study showed that in most simulation cases, the estimates obtained using the Expectation-Maximization algorithm had smaller biases, smaller variances, narrower confidence interval widths, and smaller root mean squared errors than those generated via the Newton-Raphson algorithm. Further, the analysis of a real-life data set (data from simple experimental trials) showed that the Expectation-Maximization algorithm performs better than the Newton-Raphson algorithm in all cases under the progressive type-II censoring scheme. Keywords: expectation-maximization algorithm, maximum likelihood estimation, Newton-Raphson method, two-parameter Rayleigh distribution, progressive type-II censoring
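A minimal sketch of Newton-Raphson MLE for the Rayleigh scale parameter with complete (uncensored) data, as a simplified stand-in for the censored two-parameter setting studied in the abstract. In this simplified case the score equation has a closed-form solution, σ² = Σx²/(2n), so the NR iterates can be checked against it:

```python
import math

# Sketch: Newton-Raphson MLE of the Rayleigh scale sigma, complete data.
# Log-likelihood l(sigma) = sum(ln x_i) - 2n*ln(sigma) - sum(x_i^2)/(2*sigma^2)
#   score:   l'(sigma)  = -2n/sigma + sum(x^2)/sigma^3
#   hessian: l''(sigma) =  2n/sigma^2 - 3*sum(x^2)/sigma^4
# The censored-likelihood version in the paper adds terms for the
# removed units; this complete-data case is only the core iteration.

def rayleigh_mle_nr(data, sigma0=1.0, tol=1e-10, max_iter=100):
    n = len(data)
    s2 = sum(x * x for x in data)
    sigma = sigma0
    for _ in range(max_iter):
        score = -2 * n / sigma + s2 / sigma**3
        hess = 2 * n / sigma**2 - 3 * s2 / sigma**4
        step = score / hess
        sigma -= step                 # Newton-Raphson update
        if abs(step) < tol:
            break
    return sigma

def rayleigh_mle_closed(data):
    """Closed-form MLE: sigma_hat = sqrt(sum(x^2) / (2n))."""
    return math.sqrt(sum(x * x for x in data) / (2 * len(data)))
```

An EM treatment would instead replace each censored unit's sufficient statistic x² by its conditional expectation given survival past the censoring point, then re-solve the complete-data equation; that E-step/M-step alternation is what the study compares against NR.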
Procedia PDF Downloads 163
22169 Evaluation of Double Displacement Process via Gas Dumpflood from Multiple Gas Reservoirs
Authors: B. Rakjarit, S. Athichanagorn
Abstract:
The double displacement process is a method in which gas is injected at an updip well to displace the oil bypassed by a waterflooding operation from a downdip water injector. As gas injection is costly and a large amount of gas is needed, gas dump-flood from multiple gas reservoirs is an attractive alternative. The objective of this paper is to demonstrate the benefits of this novel approach of the double displacement process via gas dump-flood from multiple gas reservoirs. A reservoir simulation model consisting of a dipping oil reservoir and several underlying layered gas reservoirs was constructed in order to investigate the performance of the proposed method. Initially, water was injected via the downdip well to displace oil towards the producer located updip. When the water cut at the producer became high, the updip well was shut in and perforated in the gas zones in order to dump gas into the oil reservoir. At this point, the downdip well was opened for production. In order to optimize oil recovery, the oil production and water injection rates and the perforation strategy for the gas reservoirs were investigated for different numbers of gas reservoirs having various depths and thicknesses. Gas dump-flood from multiple gas reservoirs can increase the oil recovery after waterflooding by up to 10%. Although this additional oil recovery is slightly lower than that obtained in the conventional double displacement process, the proposed process requires only a small completion cost for the gas zones and no operating cost, while the conventional method incurs high capital investment in gas compression facilities and high-pressure gas pipelines as well as additional operating costs. From the simulation study, oil recovery can be optimized by producing oil at a suitable rate and perforating the gas zones with the right strategy, which depends on the depths, thicknesses, and number of the gas reservoirs.
The conventional double displacement process has been studied and successfully implemented in many fields around the world. However, the method of dumping gas into the oil reservoir instead of injecting it from the surface during the second displacement has never been studied. The study of this novel approach will help practicing engineers understand its benefits and implement it at minimum cost. Keywords: gas dump-flood, multi-gas layers, double displacement process, reservoir simulation
Procedia PDF Downloads 408
22168 Deepfake Detection for Compressed Media
Authors: Sushil Kumar Gupta, Atharva Joshi, Ayush Sonawale, Sachin Naik, Rajshree Khande
Abstract:
The use of artificially created videos and audio produced by deep learning is a major problem in the current media landscape, as it spreads misinformation and distrust. The objective of this work is therefore to build a reliable deepfake detection model using deep learning that helps detect forged videos accurately. In this work, CelebDF v1, one of the largest deepfake benchmark datasets in the literature, is adopted to train and test the proposed models. The data includes authentic and synthetic videos of high quality, allowing an assessment of the model's performance against realistic distortions. Keywords: deepfake detection, CelebDF v1, convolutional neural network (CNN), xception model, data augmentation, media manipulation
Procedia PDF Downloads 11
22167 The Impact of Financial Risk on Banks’ Financial Performance: A Comparative Study of Islamic Banks and Conventional Banks in Pakistan
Authors: Mohammad Yousaf Safi Mohibullah Afghan
Abstract:
This study scrutinizes the effects of credit and liquidity risks on the profitability of Islamic and conventional banks operating in Pakistan. Only 4 Islamic and 18 conventional banks were selected, so that the results on Islamic bank performance could be set against those of conventional banks. The banks were selected for the panel on the basis of quarterly unbalanced data ranging from the first quarter of 2007 to the last quarter of 2017, collected from the banks' websites and the State Bank of Pakistan. The Delta-method test is used to obtain the empirical results. In the study, return on assets and return on equity are used as significant proxies in determining the profitability of the banks, while credit and liquidity risks are measured by the ratio of loan loss provision to total loans and the ratio of liquid assets to total liabilities, respectively. Meanwhile, following the previous literature, other variables such as bank size, bank capital, bank branches, and bank employees are used to tentatively control for factors whose direct and indirect effects on profitability are understood. In conclusion, the study finds that credit risk affects return on assets and return on equity positively, and that there is no significant difference between Islamic and conventional banks in terms of credit risk. Similarly, liquidity risk has a significant impact on bank profitability, though the marginal effect of liquidity risk is higher for Islamic banks than for conventional banks. Keywords: islamic & conventional banks, performance, return on equity, return on assets, pakistan banking sector, profitability
Procedia PDF Downloads 165
22166 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would help in taking the necessary actions to prevent dengue outbreaks. An accurate prediction of dengue epidemic seasons allows local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on monthly data collected between 2003-2011 and validated the models using data collected between January and September 2012. The results revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasonality of dengue incidence in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach for predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for allocating resources for disease prevention. Keywords: SARIMA, time series model, dengue cases, Thailand
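A minimal sketch of the seasonal-differencing idea underlying such a model, with a one-step AR(1) forecast on the differenced series. This is a simplified illustration, not the SARIMA(1,1,0)(1,2,1)12 model fitted in the study (which would normally be fitted with a statistical package); the monthly series below is synthetic:

```python
# Sketch: seasonal differencing (lag 12) plus a least-squares AR(1)
# one-step forecast on the differenced series. A full SARIMA model adds
# regular differencing and a seasonal MA term on top of this idea.

def seasonal_difference(series, lag=12):
    """y'_t = y_t - y_{t-lag}: removes a repeating yearly pattern."""
    return [series[t] - series[t - lag] for t in range(lag, len(series))]

def ar1_one_step(diffed):
    """Fit y'_t = phi * y'_{t-1} by least squares; forecast next value."""
    x, y = diffed[:-1], diffed[1:]
    phi = sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)
    return phi * diffed[-1]

# Synthetic monthly dengue counts: a seasonal shape plus a yearly trend
season = [20, 25, 40, 80, 120, 150, 130, 90, 60, 40, 30, 25]
counts = [c + 5 * yr for yr in range(4) for c in season]

diffed = seasonal_difference(counts)   # here a constant +5 per year
next_diff = ar1_one_step(diffed)
forecast = counts[-12] + next_diff     # undo the seasonal differencing
```

The one-step versus twelve-step contrast in the abstract corresponds to re-running this forecast each month with the latest observation, instead of projecting a full year ahead from one origin, which compounds the model error.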
Procedia PDF Downloads 358
22165 Spatial Mapping of Variations in Groundwater of Taluka Islamkot Thar Using GIS and Field Data
Authors: Imran Aziz Tunio
Abstract:
Islamkot is an underdeveloped sub-district (taluka) in the Tharparkar district of Sindh province, Pakistan, located between latitude 24°25'19.79"N to 24°47'59.92"N and longitude 70° 1'13.95"E to 70°32'15.11"E. Islamkot has an arid desert climate, and the region is generally devoid of perennial rivers, canals, and streams. It is highly dependent on rainfall, which is not considered a reliable surface water source, so groundwater has been the only key source of water for many centuries. To assess the groundwater potential, an electrical resistivity survey (ERS) was conducted in Islamkot Taluka. Groundwater investigations at 128 vertical electrical sounding (VES) stations were carried out to determine the groundwater potential and obtain qualitative and quantitative layered resistivity parameters. The PASI Model 16 GL-N resistivity meter was used with a Schlumberger electrode configuration, with half current electrode spacing (AB/2) ranging from 1.5 to 100 m and potential electrode spacing (MN/2) from 0.5 to 10 m. The data was acquired with a maximum current electrode spacing of 200 m. Data processing for the delineation of dune sand aquifers involved data inversion, and the interpretation of the inversion results was aided by forward modeling. The measured geo-electrical parameters were examined in Interpex IX1D software, and apparent resistivity curves and synthetic model layer parameters were mapped in the ArcGIS environment using the Inverse Distance Weighting (IDW) interpolation technique. Qualitative interpretation of the VES data shows that the number of geo-electrical layers in the area varies from three to four, with different resistivity values detected. Out of 128 VES model curves, 42 are three-layered and 86 are four-layered. The resistivity of the first subsurface layer (loose surface sand) varied from 16.13 Ωm to 3353.3 Ωm and its thickness from 0.046 m to 17.52 m.
The resistivity of the second subsurface layer (semi-consolidated sand) varied from 1.10 Ωm to 7442.8 Ωm and its thickness from 0.30 m to 56.27 m. The resistivity of the third subsurface layer (consolidated sand) varied from 0.00001 Ωm to 3190.8 Ωm and its thickness from 3.26 m to 86.66 m. The resistivity of the fourth subsurface layer (silt and clay) varied from 0.0013 Ωm to 16264 Ωm and its thickness from 13.50 m to 87.68 m. The Dar Zarrouk parameters are: longitudinal unit conductance S from 0.00024 to 19.91 mho; transverse unit resistance T from 7.34 to 40080.63 Ωm²; longitudinal resistivity RS from 1.22 to 3137.10 Ωm; and transverse resistivity RT from 5.84 to 3138.54 Ωm. The ERS data and Dar Zarrouk parameters were mapped, revealing that the study area has groundwater potential in the subsurface. Keywords: electrical resistivity survey, GIS & RS, groundwater potential, environmental assessment, VES
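A minimal sketch of how the Dar Zarrouk parameters quoted above are computed from a layered-earth model of thicknesses h_i and resistivities ρ_i. The example layer values are illustrative, not taken from the survey data:

```python
# Sketch: Dar Zarrouk parameters for a layered earth model.
#   S  = sum(h/rho)   longitudinal unit conductance (mho)
#   T  = sum(h*rho)   transverse unit resistance (ohm-m^2)
#   RS = H/S          longitudinal resistivity (ohm-m), H = total thickness
#   RT = T/H          transverse resistivity (ohm-m)
# Example layers are illustrative, not from the Islamkot survey.

def dar_zarrouk(layers):
    """layers: list of (thickness_m, resistivity_ohm_m) tuples."""
    H = sum(h for h, _ in layers)
    S = sum(h / rho for h, rho in layers)
    T = sum(h * rho for h, rho in layers)
    return S, T, H / S, T / H

# e.g. loose sand over semi-consolidated sand over consolidated sand
S, T, RS, RT = dar_zarrouk([(5.0, 100.0), (20.0, 50.0), (40.0, 20.0)])
```

RT is always at least RS for a layered section (the electrical anisotropy λ = √(RT/RS) ≥ 1), which is one consistency check when mapping these parameters.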
Procedia PDF Downloads 110
22164 [Keynote Talk]: Discovering Liouville-Type Problems for p-Energy Minimizing Maps in Closed Half-Ellipsoids by Calculus Variation Method
Authors: Lina Wu, Jia Liu, Ye Li
Abstract:
The goal of this project is to investigate constancy properties (the Liouville-type problem) for a p-stable map as a local or global minimum of a p-energy functional, where the domain is a Euclidean space and the target space is a closed half-ellipsoid. The first and second variation formulas for the p-energy functional have been applied in the calculus of variations method as computational techniques. Stokes' theorem, the Cauchy-Schwarz inequality, Hardy-Sobolev type inequalities, and the Bochner formula have been used as estimation techniques to bound the derived p-harmonic stability inequality from below and above. One challenging point in this project is to construct a family of variation maps such that the images of the variation maps are guaranteed to lie in the closed half-ellipsoid. The other challenging point is to find a contradiction between the lower bound and the upper bound in the analysis of the p-harmonic stability inequality when a p-energy minimizing map is not constant. The possibility of a non-constant p-energy minimizing map is thereby ruled out, and the constancy of a p-energy minimizing map is obtained. Our research finding establishes the constancy of a p-stable map from a Euclidean space into a closed half-ellipsoid for a certain range of p. This range of p is determined by the dimensions of the Euclidean space (the domain) and the ellipsoid (the target space), and is also bounded by the curvature values of the ellipsoid (that is, the ratio of the longest axis to the shortest axis). Regarding Liouville-type results for a p-stable map, our finding on an ellipsoid generalizes mathematicians' results on a sphere.
Our result is also an extension of mathematicians’ Liouville-type results from a special ellipsoid with only one parameter to any ellipsoid with (n+1) parameters in the general setting. Keywords: Bochner formula, calculus of variations, Stokes’ theorem, Cauchy-Schwarz inequality, first and second variation formulas, Liouville-type problem, p-harmonic map
Procedia PDF Downloads 274
22163 Precipitation Intensity: Duration Based Threshold Analysis for Initiation of Landslides in Upper Alaknanda Valley
Authors: Soumiya Bhattacharjee, P. K. Champati Ray, Shovan L. Chattoraj, Mrinmoy Dhara
Abstract:
The entire Himalayan range is globally renowned for rainfall-induced landslides. The prime focus of the study is to determine a rainfall-based threshold for the initiation of landslides that can be used as an important component of an early warning system for alerting stakeholders. This research deals with the temporal dimension of slope failures due to extreme rainfall events along National Highway-58 from Karanprayag to Badrinath in the Garhwal Himalaya, India. Post-processed 3-hourly rainfall intensity data and the corresponding durations derived from daily rainfall data of the Tropical Rainfall Measuring Mission (TRMM) were used as the prime source of rainfall data. Landslide event records from the Border Road Organization (BRO) and some ancillary landslide inventory data for 2013 and 2014 were used to determine an intensity-duration (ID) rainfall threshold. The derived governing threshold equation, I = 4.738D^(-0.025), was validated with an accuracy of 70% against the landslides of August and September 2014 and was considered for further prediction of landslides in the study region. From the obtained results and validation, it can be inferred that this equation can be used for landslide initiation in the study area as part of an early warning system. Results can improve significantly with ground-based rainfall estimates and a better database of landslide records. Thus, the study has demonstrated a very low-cost method to obtain first-hand information on the possibility of an impending landslide in any region, thereby providing alerts and better preparedness for landslide disaster mitigation. Keywords: landslide, intensity-duration, rainfall threshold, TRMM, slope, inventory, early warning system
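The threshold equation reported above, I = 4.738·D^(-0.025), can be applied directly: for an observed rainfall duration D (hours), an alert is warranted when the measured intensity I (mm/h) exceeds the curve. A minimal sketch, where the alert helper and the example values are illustrative additions rather than part of the study:

```python
# Hedged sketch of applying the intensity-duration (ID) threshold
# reported in the abstract, I = 4.738 * D**-0.025, with I the rainfall
# intensity (mm/h) and D the duration (h). The alert helper and the
# example numbers are illustrative, not from the study.

def id_threshold(duration_h):
    """Critical rainfall intensity (mm/h) for a given duration (h)."""
    return 4.738 * duration_h ** -0.025

def landslide_alert(intensity_mm_h, duration_h):
    """True when observed intensity exceeds the ID threshold."""
    return intensity_mm_h > id_threshold(duration_h)

crit_3h = id_threshold(3.0)        # threshold for a 3-hourly TRMM window
alert = landslide_alert(6.0, 3.0)  # 6 mm/h over 3 h exceeds the curve
```

Note the very small exponent (-0.025): the threshold intensity decays only slowly with duration, so even long-duration events need nearly the same intensity to trigger failures.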
Procedia PDF Downloads 273
22162 Governance Challenges of Consolidated Destinations. The Case of Barcelona
Authors: Montserrat Crespi-Vallbona; Oscar Mascarilla-Miró
Abstract:
Mature destinations face different challenges in trying to attract tourism and please their citizens. First, they have to maintain their touristic interest for the standard demand while not disappointing those tourists seeking more advanced experiences. Second, they have to be concerned with the daily life of citizens and avoid the negative effects of touristification. This balance is quite delicate and often depends on the sensitivity and commitment of the party in local government. What is a general consensus, however, is the need for destinations to differentiate themselves from the homogeneous rest of regions and to create new content, consumable resources or marketing events to guarantee their positioning. In this sense, the main responsibility of destinations is to satisfy users, tourists and citizens; their aim thus concerns holistic experiences that collect these wide approaches. Specifically, this research aims to analyze the volume and growth of tourist houses in the central touristic neighborhood of Barcelona (that is, Ciutat Vella) as the starting point to identify the behavior of tourists regarding their interest in local heritage attractiveness and community atmosphere. Then, different cases are analyzed to show how Barcelona struggles to keep its brand attractive for visitors as well as for its inhabitants. Methodologically, the secondary data used in this research come from officially registered tourist houses (Catalunya Government), Open Data (Barcelona municipality), the Airbnb tourist platform, Incasol data and the Municipal Register of Inhabitants. Primary data are collected through in-depth interviews with neighbors, social movement managers and political representatives from Turisme de Barcelona (the local DMO, Destination Management Organization). 
Results show what the opportunities and priorities are for key actors designing policies to find a balance between all the different interests. Keywords: touristification, tourist houses, governance, tourism demand, airbnbfication
Procedia PDF Downloads 65
22161 Predicting Recessions with Bivariate Dynamic Probit Model: The Czech and German Case
Authors: Lukas Reznak, Maria Reznakova
Abstract:
Recession of an economy has a profound negative effect on all involved stakeholders. It follows that timely prediction of recessions has been of utmost interest both in theoretical research and in practical macroeconomic modelling. The current mainstream of recession prediction is based on standard OLS models of continuous GDP using macroeconomic data. This approach is not suitable for two reasons: the standard continuous models are proving to be obsolete, and the macroeconomic data are unreliable, often revised many years retroactively. The aim of the paper is to explore a different branch of recession forecasting research and verify the findings on real data of the Czech Republic and Germany. In the paper, the authors present a family of discrete choice probit models with parameters estimated by the method of maximum likelihood. In the basic form, the probits model a univariate series of recessions and expansions in the economic cycle for a given country. The majority of the paper deals with more complex model structures, namely dynamic and bivariate extensions. The dynamic structure models the autoregressive nature of recessions, taking into consideration previous economic activity to predict the development in subsequent periods. Bivariate extensions utilize information from a foreign economy by incorporating the correlation of error terms, thus modelling the dependencies of the two countries. Bivariate models predict a bivariate time series of economic states in both economies and thereby enhance the predictive performance. A vital enabler of timely and successful recession forecasting is reliable and readily available data. Leading indicators, namely the yield curve and stock market indices, represent an ideal data base, as this information is available in advance and does not undergo any retroactive revisions. 
As importantly, the combination of the yield curve and stock market indices reflects a range of macroeconomic and financial-market trends which influence the economic cycle. These theoretical approaches are applied to real data of the Czech Republic and Germany. Two models were identified for each country, one each for in-sample and out-of-sample predictive purposes. All four followed a bivariate structure, while three contained a dynamic component. Keywords: bivariate probit, leading indicators, recession forecasting, Czech Republic, Germany
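The dynamic probit described above conditions the recession probability on both the leading indicator and the previous period's state. A minimal univariate sketch of its log-likelihood follows; the parameter names and toy data are illustrative assumptions, and the paper's bivariate extension further adds a correlated error term linking the two countries.

```python
# Hedged sketch of the log-likelihood of a univariate dynamic probit
# for recession states, P(y_t = 1) = Phi(b0 + b1*spread_{t-1} + g*y_{t-1}),
# with the yield-curve spread as the leading indicator. Toy data only.

import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def dynamic_probit_loglik(params, y, spread):
    """Log-likelihood; y[t] in {0,1}, conditioning on the initial state y[0]."""
    b0, b1, g = params
    ll = 0.0
    for t in range(1, len(y)):
        p = norm_cdf(b0 + b1 * spread[t - 1] + g * y[t - 1])
        p = min(max(p, 1e-12), 1.0 - 1e-12)   # numerical guard
        ll += y[t] * math.log(p) + (1 - y[t]) * math.log(1.0 - p)
    return ll

# Toy series: recessions tend to follow an inverted yield curve (negative spread)
y      = [0, 0, 1, 1, 0, 0]
spread = [1.2, -0.3, -0.8, 0.1, 1.5, 1.1]
ll = dynamic_probit_loglik((-0.5, -1.0, 1.0), y, spread)
```

Maximum likelihood estimation would then maximize this function over (b0, b1, g); the autoregressive term g captures the persistence of recessions noted in the abstract.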
Procedia PDF Downloads 248
22160 African Folklore for Critical Self-Reflection, Reflective Dialogue, and Resultant Attitudinal and Behaviour Change: University Students’ Experiences
Authors: T. M. Buthelezi, E. O. Olagundoye, R. G. L. Cele
Abstract:
This article argues that whilst African folklore has mainly been used for entertainment, it also has an educational value with the power to change young people’s attitudes and behaviour. The paper is informed by findings from data generated from 154 university students from diverse backgrounds. The qualitative data were thematically analysed. Referring to the six steps of the behaviour change model, we found that African folklore provides relevant cultural knowledge and instils values that enable young people to engage in self-reflection that eventually leads them towards attitudinal change and behaviour modification. Using transformative learning theory, we argue that African folklore is in itself a pedagogical strategy that integrates cultural knowledge and values with entertainment elements concisely enough to take young people through a transformative phase encompassing psychological, convictional and lifestyle adaptation. During the data production stage, all ethical considerations were observed, including obtaining a gatekeeper’s permission letter and an ethical clearance certificate from the Ethics Committee of the University. The paper recommends that the African folklore approach be incorporated into the school curriculum, particularly in life skills education, with the aim of changing behaviour. Keywords: African folklore, young people, attitudinal, behavior change, university students
Procedia PDF Downloads 264
22159 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories
Authors: Mojtaba Taheri, Saied Reza Ameli
Abstract:
In this research, we have calculated the effect of membership in international climate change treaties for 20 developed countries selected on the basis of the Human Development Index (HDI) and compared this effect with the process of pollutant reduction described by the Environmental Kuznets Curve (EKC) theory. For this purpose, data on real GDP per capita at constant 2010 prices are taken from the World Development Indicators (WDI) database. Ecological Footprint (ECOFP) is the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions; it is measured in global hectares (gha), and the data are retrieved from the Global Ecological Footprint (2021) database. We proceed step by step, performing several series of targeted statistical regressions and examining the effects of different control variables. Energy Consumption Structure (ECS) is counted as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy Production (EP) refers to the total production of primary energy by all energy-producing enterprises in one country at a specific time; it is a comprehensive indicator of the country's energy production capacity, and its data, like the Energy Consumption Structure, are obtained from the EIA (2021). Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent based on stock market value, also as a ratio to GDP, and is taken from the WDI (2021). Trade Openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP, for which we use the WDI data (2021). Urbanization (URB) is defined as the share of the urban population in the total population, for which we also used the WDI data source (2021). 
The descriptive statistics of all investigated variables are presented in the results section. Among the theories of sustainable development considered, the Environmental Kuznets Curve (EKC) proves the more significant over the period of study. In this research, we use more than fourteen targeted statistical regressions to isolate the net effects of each of the approaches and examine the results. Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty
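The reduced-form EKC test underlying this kind of regression can be sketched simply: regress the environmental indicator on income and income squared and check for an inverted U (a negative quadratic coefficient) with an interior turning point. The quadratic fit below is solved with plain normal equations, and the data are synthetic, not the study's 20-country panel.

```python
# Hedged sketch of the reduced-form EKC test: fit
# footprint = b0 + b1*g + b2*g^2 and check b2 < 0 (inverted U),
# with turning point g* = -b1 / (2*b2). Synthetic data only.

def fit_quadratic(x, y):
    """Least-squares b0, b1, b2 for y = b0 + b1*x + b2*x^2 (normal equations)."""
    # Power sums for the design matrix [1, x, x^2]
    s = [sum(xi ** k for xi in x) for k in range(5)]
    A = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    b = [sum(yi * xi ** k for xi, yi in zip(x, y)) for k in range(3)]
    # Gauss-Jordan elimination on the 3x3 system A * beta = b
    for i in range(3):
        piv = A[i][i]
        A[i] = [v / piv for v in A[i]]
        b[i] /= piv
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
                b[j] -= f * b[i]
    return b  # [b0, b1, b2]

# Synthetic inverted-U data: the indicator rises then falls with income
gdp = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
fp  = [2.9, 4.1, 4.9, 5.1, 4.8, 4.2]   # peaks near gdp ~ 4
b0, b1, b2 = fit_quadratic(gdp, fp)
turning_point = -b1 / (2.0 * b2)
```

In the paper's setting the regressions additionally include the control variables listed above (ECS, EP, FND, TRD, URB) and a treaty-membership term; this sketch isolates only the income-curvature logic of the EKC.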
Procedia PDF Downloads 71
22158 Value Relevance of Accounting Information: A Study of Steel Sector in India
Authors: Pradyumna Mohanty
Abstract:
The paper aims to explore whether the accounting information of Indian companies in the steel sector is value relevant. Ohlson’s model, which usually takes into consideration book value per share (BV) and earnings per share (EARN), has been used and expanded to include two more variables: cash flow from operations (CFO) and return on equity (ROE). The data were collected from the CMIE-Prowess database in respect of BSE-listed steel companies, and the time frame spans from 2010 to 2014. OLS regression has been used to test the value relevance of these accounting numbers. Results indicate that both CFO and BV have a significant influence on the stock price in two out of five years of the study. But BV emerges as the most significant and most highly value relevant of all four variables during the entire period of study. Keywords: value relevance, accounting information, book value per share, earnings per share
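A value-relevance test of this kind boils down to regressing the share price on accounting numbers and reading the fit statistics. For clarity, the sketch below shows the single-regressor closed form for price on BV; the paper's specification also includes EARN, CFO, and ROE, and the firm-level numbers here are invented, not from CMIE-Prowess.

```python
# Hedged sketch of a value-relevance test in the spirit of Ohlson's
# model: regress share price on book value per share (BV) and read
# the R^2 as the degree of value relevance. Illustrative data only.

def ols_simple(x, y):
    """Closed-form OLS slope, intercept, and R^2 for y = a + b*x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return b, a, 1.0 - ss_res / ss_tot

# Illustrative BV and share price for six steel firms in one year
bv    = [110.0, 85.0, 42.0, 230.0, 150.0, 60.0]
price = [145.0, 98.0, 55.0, 300.0, 190.0, 80.0]
slope, intercept, r2 = ols_simple(bv, price)
```

A significantly positive slope with a high R^2 is the pattern the paper reports for BV; the multi-variable version simply extends the design matrix and compares coefficient significance across years.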
Procedia PDF Downloads 159
22157 Implementation of Achterbahn-128 for Images Encryption and Decryption
Authors: Aissa Belmeguenai, Khaled Mansouri
Abstract:
In this work, an efficient implementation of Achterbahn-128 for image encryption and decryption is introduced. The implementation for this simulated project is written in MATLAB 7.5. First, two different original images are used to validate the proposed design. Then our program transforms the original image data into an image-digits file. Finally, we use the implemented program to encrypt and decrypt the image data. Several tests are done to prove the design performance, including visual tests and security analysis; we discuss the security analysis of the proposed image encryption scheme, including some important aspects such as key sensitivity analysis, key space analysis, and statistical attacks. Keywords: Achterbahn-128, stream cipher, image encryption, security analysis
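The general pattern any stream cipher follows for image data can be sketched independently of Achterbahn's internals: generate a keystream from the key and XOR it byte-wise with the pixel bytes; decryption applies the identical operation. The toy linear-congruential keystream below merely stands in for Achterbahn-128's NLFSR combiner and is NOT cryptographically secure; it illustrates the XOR-involution structure only.

```python
# Hedged sketch of stream-cipher image encryption: keystream XOR.
# The LCG keystream is a toy stand-in for Achterbahn-128's real
# keystream generator; do not use it for actual security.

def keystream(key, n):
    """n pseudo-random bytes from a toy LCG seeded with `key`."""
    state = key & 0xFFFFFFFF
    out = []
    for _ in range(n):
        state = (1664525 * state + 1013904223) & 0xFFFFFFFF
        out.append((state >> 24) & 0xFF)   # take the high byte
    return out

def xor_cipher(data, key):
    """Encrypts or decrypts bytes; the operation is its own inverse."""
    return bytes(b ^ k for b, k in zip(data, keystream(key, len(data))))

pixels = bytes(range(16))               # stand-in for image pixel bytes
ct = xor_cipher(pixels, key=0xA5A5A5A5)
pt = xor_cipher(ct, key=0xA5A5A5A5)     # decryption: same call, same key
```

The key-sensitivity and key-space analyses mentioned in the abstract probe exactly this structure: a one-bit key change must yield a completely different keystream, and the key space must be large enough to defeat brute force.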
Procedia PDF Downloads 532
22156 Groundwater Monitoring Using a Community Science Approach
Authors: Shobha Kumari Yadav, Yubaraj Satyal, Ajaya Dixit
Abstract:
In addressing groundwater depletion, it is important to develop an evidence base to be used in assessing the state of degradation. Groundwater data are limited compared to meteorological data, which impedes groundwater use and management planning. Monitoring of groundwater levels provides an information base to assess the condition of aquifers and their responses to water extraction, land-use change, and climatic variability. It is important to maintain a network of spatially distributed, long-term monitoring wells to support a groundwater management plan. Monitoring involving the local community is a cost-effective approach that generates real-time data to manage groundwater use effectively. This paper presents the relationship between rainfall and spring flow, the main sources of freshwater for drinking, household consumption and agriculture in the hills of Nepal. The supply and withdrawal of water from springs depend upon the local hydrology and meteorological characteristics, such as rainfall, evapotranspiration and interflow. The study offers evidence of the use of scientific methods and a community-based initiative for managing groundwater and springsheds. The approach presents a method to replicate similar initiatives in other parts of the country for maintaining the integrity of springs. Keywords: citizen science, groundwater, water resource management, Nepal
Procedia PDF Downloads 203
22155 A Dynamic Spatial Panel Data Analysis on Renter-Occupied Multifamily Housing DC
Authors: Jose Funes, Jeff Sauer, Laixiang Sun
Abstract:
This research examines determinants of multifamily housing development and spillovers in the District of Columbia. A range of socioeconomic factors related to income distribution, productivity, and land use policies are thought to influence development in contemporary U.S. multifamily housing markets. The analysis leverages data from the American Community Survey to construct panel datasets spanning 2010 to 2019. Using spatial regression, we identify several socioeconomic measures and land use policies both positively and negatively associated with new housing supply. We contextualize housing estimates related to race in relation to uneven development in the contemporary D.C. housing supply. Keywords: neighborhood effect, sorting, spatial spillovers, multifamily housing
Procedia PDF Downloads 102
22154 Judicial Analysis of the Burden of Proof on the Perpetrator of Corruption Criminal Act
Authors: Rahmayanti, Theresia Simatupang, Ronald H. Sianturi
Abstract:
Corruption has developed rapidly because, in the transition era, there were weaknesses in the law. Consequently, a few people had the opportunity to commit fraud and illegal acts and to misuse their positions and formal functions in order to enrich themselves, and these criminal acts were committed systematically and sophisticatedly. Some people believe that the legal provisions which specifically regulate corruption, namely Law No. 31/1999 in conjunction with Law No. 20/2001 on the Eradication of Corruption Criminal Act, are no longer effective, especially regarding onus probandi (the burden of proof) on corruptors. The research was a descriptive analysis, a research method used to obtain a description of a certain situation or condition by explaining the data, with the conclusion drawn through analysis. The research used a judicial normative approach, relying on secondary data as the main data through library research. The system of the burden of proof, which follows the principle of reversal of the burden of proof stipulated in Article 12B paragraph 1 a and b, Article 37A, and Article 38B of Law No. 20/2001 on the Amendment of Law No. 31/1999, is used only as supporting evidence once the principal case is proved. Meanwhile, how to maximize the implementation of the burden of proof on the perpetrators of corruption when the public prosecutor brings a corruption case to court depends upon the nature of the case and the type of indictment. The system of burden of proof can be used to eradicate corruption in the courts if certain policies and general principles of justice, such as independence, impartiality, and legal certainty, are applied. Keywords: burden of proof, perpetrator, corruption criminal act
Procedia PDF Downloads 321
22153 Bayesian Estimation of Hierarchical Models for Genotypic Differentiation of Arabidopsis thaliana
Authors: Gautier Viaud, Paul-Henry Cournède
Abstract:
Plant growth models have been used extensively for the prediction of the phenotypic performance of plants. However, they most often remain calibrated for a given genotype and therefore do not take into account genotype-by-environment interactions. One way of achieving such an objective is to consider Bayesian hierarchical models. Three levels can be identified in such models: the first level describes how a given growth model describes the phenotype of the plant as a function of individual parameters; the second level describes how these individual parameters are distributed within a plant population; the third level corresponds to the attribution of priors on the population parameters. Thanks to the Bayesian framework, choosing appropriate priors for the population parameters makes it possible to derive analytical expressions for the full conditional distributions of these population parameters. As plant growth models are of a nonlinear nature, individual parameters cannot be sampled explicitly, and a Metropolis step must be performed. This allows for the use of a hybrid Gibbs-Metropolis sampler. A generic approach was devised for the implementation of both general state space models and estimation algorithms within a programming platform. It was designed using the Julia language, which combines an elegant syntax and metaprogramming capabilities with high efficiency. Results were obtained for Arabidopsis thaliana on both simulated and real data. An organ-scale GreenLab model for the latter is thus presented, in which the surface area of each individual leaf can be simulated. It is assumed that the error made on the measurement of leaf areas is proportional to the leaf area itself; multiplicative normal noises for the observations are therefore used. 
Real data were obtained via image analysis of zenithal images of Arabidopsis thaliana over a period of 21 days, using a two-step segmentation and tracking algorithm which notably takes advantage of the Arabidopsis thaliana phyllotaxy. Since the model formulation is rather flexible, there is no need for the data of a single individual to be available at all times, nor for the times at which data are available to be the same for all individuals. This makes it possible to discard data from image analysis when they are not considered reliable enough, thereby providing low-biased data in large quantity for leaf areas. The proposed model precisely reproduces the dynamics of Arabidopsis thaliana’s growth while accounting for the variability between genotypes. In addition to the estimation of the population parameters, the level of variability is an interesting indicator of the genotypic stability of model parameters. A promising perspective is to test whether some of the latter should be considered as fixed effects. Keywords: Bayesian, genotypic differentiation, hierarchical models, plant growth models
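The hybrid Gibbs-Metropolis scheme described above can be sketched compactly: each individual parameter gets a random-walk Metropolis update against its nonlinear likelihood, while the population mean gets a conjugate Gibbs draw. This sketch is in Python rather than the authors' Julia platform, the "growth model" is a toy map f(theta) = exp(theta) for leaf area, and all variances and priors are illustrative assumptions.

```python
# Hedged sketch of a Metropolis-within-Gibbs sampler for a two-level
# hierarchical model: theta_i ~ N(mu, sigma^2), y_ij ~ N(exp(theta_i), tau^2).
# Toy model and toy data; not the paper's organ-scale GreenLab model.

import math
import random

random.seed(42)

def loglik(theta, ys, tau):
    """Gaussian log-likelihood of observations around f(theta) = exp(theta)."""
    mu_y = math.exp(theta)
    return sum(-0.5 * ((y - mu_y) / tau) ** 2 for y in ys)

def gibbs_metropolis(data, n_iter=500, sigma=0.5, tau=0.3, step=0.2):
    n = len(data)
    thetas = [0.0] * n
    mu = 0.0
    trace = []
    for _ in range(n_iter):
        # Metropolis update for each individual parameter
        for i in range(n):
            prop = thetas[i] + random.gauss(0.0, step)
            logr = (loglik(prop, data[i], tau) - loglik(thetas[i], data[i], tau)
                    - 0.5 * ((prop - mu) / sigma) ** 2
                    + 0.5 * ((thetas[i] - mu) / sigma) ** 2)
            if math.log(random.random() + 1e-300) < logr:
                thetas[i] = prop
        # Conjugate Gibbs update for the population mean (flat prior on mu)
        mu = random.gauss(sum(thetas) / n, sigma / math.sqrt(n))
        trace.append(mu)
    return trace

# Toy data: three "plants", each with noisy leaf-area observations
data = [[1.1, 0.9, 1.0], [1.4, 1.3], [0.8, 0.7, 0.9]]
trace = gibbs_metropolis(data)
```

Note the division of labour the abstract describes: the nonlinear individual level forces a Metropolis step, while the conjugate population level admits an exact Gibbs draw; the spread of the sampled theta_i is what measures between-genotype variability.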
Procedia PDF Downloads 303
22152 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK
Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves
Abstract:
Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease affecting various species of poultry, as well as pet and migratory birds. Researchers have argued that the accessibility of health information online, in addition to the low-cost data collection methods the internet provides, has revolutionized the way epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter between 01/01/2021 and 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined by location to include only results from within the UK. The text was analysed in a time-series manner to determine keyword frequencies, with topic modeling applied to uncover insights in the text prior to a confirmed outbreak. Further analysis examined mentions of clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreak confirmed by the Animal and Plant Health Agency (APHA). Results: The increase in Google search results and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can prove useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain. 
However, the small sample size of tweets in certain weekly time periods, together with a great amount of textual noise in the data, makes it difficult to obtain statistically robust results. Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media
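The keyword-frequency step described in the Methods can be sketched as counting matching tweets per day over the retrieved records. The keyword list mirrors the filters stated in the abstract; the tweet records below are invented examples, not data from the study.

```python
# Hedged sketch of the keyword-frequency step: count avian-flu-related
# terms per day in a stream of (date, text) tweet records, prior to
# topic modeling. Invented example tweets only.

from collections import Counter

KEYWORDS = ("avian flu", "bird flu", "h5n1", "#avianflu", "#birdflu")

def daily_keyword_counts(tweets):
    """Counter mapping date -> number of tweets matching any keyword."""
    counts = Counter()
    for date, text in tweets:
        low = text.lower()
        if any(k in low for k in KEYWORDS):
            counts[date] += 1
    return counts

tweets = [
    ("2021-11-01", "Swollen head and dullness in the flock #birdflu"),
    ("2021-11-01", "Beautiful sunrise over the farm"),
    ("2021-11-02", "APHA confirms H5N1 case nearby"),
    ("2021-11-02", "Bird flu biosecurity measures now in place"),
]
counts = daily_keyword_counts(tweets)
```

The resulting daily series is what gets compared against the APHA-confirmed outbreak dates; a rise in counts ahead of confirmation is the signal the paper looks for.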
Procedia PDF Downloads 105
22151 A Study on the Effect of Different Climate Conditions on Time of Balance of Bleeding and Evaporation in Plastic Shrinkage Cracking of Concrete Pavements
Authors: Hasan Ziari, Hassan Fazaeli, Seyed Javad Vaziri Kang Olyaei, Asma Sadat Dabiri
Abstract:
Cracks in concrete pavements provide a pathway for the ingress of corrosive substances, acids, oils, and water into the pavement and reduce its long-term durability and level of service. One of the causes of early cracking in concrete pavements is plastic shrinkage. This shrinkage occurs due to the formation of negative capillary pressures after the bleeding and evaporation rates at the pavement surface reach equilibrium. The cracks form if the tensile stresses caused by the restrained shrinkage exceed the tensile strength of the concrete. Different climate conditions change the rate of evaporation and thus the balance time of bleeding and evaporation, which changes the severity of cracking in the concrete. The present study examined the relationship between the balance time of bleeding and evaporation and the cracking area in concrete slabs using the standard method ASTM C1579 under 27 different environmental conditions, using continuous video recording and digital image analysis. The results showed that as the evaporation rate increased and the balance time decreased, the crack severity significantly increased, such that reducing the balance time from its maximum to its minimum value increased the cracking area more than fourfold. It was also observed that the cracking area-balance time curve could be interpreted in three sections. Examination of these three parts showed that the combination of climate conditions has a significant effect on increasing or decreasing these two variables; a single severe factor alone cannot produce critical plastic-cracking conditions. By combining two mild environmental factors with one severe climate factor (in terms of surface evaporation rate), a considerable reduction in balance time and a sharp increase in cracking severity can be prevented. 
The results of this study showed that balance time can be an essential factor in controlling and predicting plastic shrinkage cracking in concrete pavements. It is therefore necessary to control this factor when constructing concrete pavements in different climate conditions. Keywords: bleeding and cracking severity, concrete pavements, climate conditions, plastic shrinkage
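The climate side of the bleeding-evaporation balance is usually quantified through a surface evaporation rate. One common estimate is Uno's (1998) approximation of the ACI 305 nomograph, E = 5([Tc + 18]^2.5 − r[Ta + 18]^2.5)(V + 4)×10⁻⁶ kg/m²/h; this formula is an assumption here, as the abstract does not state which evaporation model the study used, and the two climate cases below are illustrative.

```python
# Hedged sketch of estimating the surface evaporation rate that drives
# the bleeding-evaporation balance, using Uno's (1998) approximation of
# the ACI 305 nomograph (an assumed model, not stated in the abstract).
# Tc, Ta in deg C, relative humidity as a fraction, wind speed in km/h;
# result in kg/m^2/h.

def evaporation_rate(t_concrete_c, t_air_c, rel_humidity, wind_kmh):
    """Uno's approximation of the ACI 305 evaporation nomograph."""
    return 5.0 * ((t_concrete_c + 18.0) ** 2.5
                  - rel_humidity * (t_air_c + 18.0) ** 2.5) * (wind_kmh + 4.0) * 1e-6

mild   = evaporation_rate(25.0, 25.0, 0.70, 5.0)    # temperate, humid, calm
severe = evaporation_rate(35.0, 40.0, 0.20, 25.0)   # hot, dry, windy
```

The contrast between the two cases mirrors the study's finding: it is the combination of severe factors (high temperature, low humidity, high wind) that drives evaporation up, shortens the balance time, and sharply increases cracking severity.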
Procedia PDF Downloads 146
22150 Earthquake Risk Assessment Using Out-of-Sequence Thrust Movement
Authors: Rajkumar Ghosh
Abstract:
Earthquakes are natural disasters that pose a significant risk to human life and infrastructure. Effective earthquake mitigation measures require a thorough understanding of the dynamics of seismic occurrences, including thrust movement. Traditionally, estimating thrust movement has relied on conventional techniques that may not capture the full complexity of these events. Therefore, investigating alternative approaches, such as incorporating out-of-sequence thrust movement data, could enhance earthquake mitigation strategies. This review aims to provide an overview of the applications of out-of-sequence thrust movement in earthquake mitigation. By examining existing research and studies, the objective is to understand how precise estimation of thrust movement can contribute to improving structural design, analyzing infrastructure risk, and developing early warning systems. The study demonstrates how to estimate out-of-sequence thrust movement using multiple data sources, including GPS measurements, satellite imagery, and seismic recordings. By analyzing and synthesizing these diverse datasets, researchers can gain a more comprehensive understanding of thrust movement dynamics during seismic occurrences. The review identifies potential advantages of incorporating out-of-sequence data in earthquake mitigation techniques, including more efficient structural design, enhanced infrastructure risk analysis, and more accurate early warning systems. By considering out-of-sequence thrust movement estimates, researchers and policymakers can make informed decisions to mitigate the impact of earthquakes. This study contributes to the field of seismic monitoring and earthquake risk assessment by highlighting the benefits of incorporating out-of-sequence thrust movement data. 
By broadening the scope of analysis beyond traditional techniques, researchers can enhance their knowledge of earthquake dynamics and improve the effectiveness of mitigation measures. The study collects data from various sources, including GPS measurements, satellite imagery, and seismic recordings; these datasets are then analyzed using appropriate statistical and computational techniques to estimate out-of-sequence thrust movement, and the review integrates findings from multiple studies to provide a comprehensive assessment of the topic. The study concludes that incorporating out-of-sequence thrust movement data can significantly enhance earthquake mitigation measures. By utilizing diverse data sources, researchers and policymakers can gain a more comprehensive understanding of seismic dynamics and make informed decisions. However, challenges exist, such as data quality difficulties, modelling uncertainties, and computational complexity. To address these obstacles and improve the accuracy of estimates, further research and advancements in methodology are recommended. Overall, this review serves as a valuable resource for researchers, engineers, and policymakers involved in earthquake mitigation, as it encourages the development of innovative strategies based on a better understanding of thrust movement dynamics. Keywords: earthquake, out-of-sequence thrust, disaster, human life
Procedia PDF Downloads 77