Search results for: event study methodology
52051 Threat Modeling Methodology for Supporting Industrial Control Systems Device Manufacturers and System Integrators
Authors: Raluca Ana Maria Viziteu, Anna Prudnikova
Abstract:
Industrial control systems (ICS) have received much attention in recent years due to the convergence of information technology (IT) and operational technology (OT), which has increased the interdependence of the safety and security issues to be considered. These issues require ICS-tailored solutions, which led to the need to create a methodology for supporting ICS device manufacturers and system integrators in carrying out threat modeling of embedded ICS devices in a way that guarantees the quality of the identified threats and minimizes subjectivity in the threat identification process. To research the possibility of creating such a methodology, a set of existing standards, regulations, papers, and publications related to threat modeling in the ICS sector and other sectors was reviewed to identify the various existing methodologies and methods used in threat modeling. Furthermore, the most popular ones were tested in an exploratory phase on a specific PLC device. The outcome of this exploratory phase was used as a basis for defining the specific characteristics of ICS embedded devices and their deployment scenarios, identifying the factors that introduce subjectivity into the threat modeling process of such devices, and defining metrics for evaluating the minimum quality requirements of identified threats associated with the deployment of the devices in existing infrastructures. The threat modeling methodology was then created based on the results of the previous steps. The usability of the methodology was evaluated through a set of standardized threat modeling requirements and a standardized comparison method for threat modeling methodologies. The outcomes of these verification methods confirm that the methodology is effective. The full paper includes the outcome of research on different threat modeling methodologies that can be used in OT, their comparison, and the results of implementing each of them in practice on a PLC device.
This research is further used to build a threat modeling methodology tailored to OT environments; a detailed description is included. Moreover, the paper includes the results of an evaluation of the created methodology based on a set of parameters specifically created to rate threat modeling methodologies.
Keywords: device manufacturers, embedded devices, industrial control systems, threat modeling
Procedia PDF Downloads 78
52050 Optimization of Effecting Parameters for the Removal of H₂S Gas in Self Priming Venturi Scrubber Using Response Surface Methodology
Authors: Manisha Bal, B. C. Meikap
Abstract:
Highly toxic and corrosive H₂S gas is recognized as one of the hazardous air pollutants and has a significant effect on human health. Abatement of H₂S gas from air is therefore necessary. H₂S gas is mainly released by industries such as the paper and leather industries, as well as during the production of crude oil, during wastewater treatment, etc. Emission of H₂S gas at high concentrations may cause immediate death, while lower concentrations can cause various respiratory problems. In the present study, a self-priming venturi scrubber is used to remove H₂S gas from the air. Response surface methodology with a central composite design was chosen to observe the effect of process parameters on the removal efficiency of H₂S. Experiments were conducted by varying the throat gas velocity, the liquid level in the outer cylinder, and the inlet H₂S concentration. An ANOVA test confirmed the significant effect of the parameters on the removal efficiency. A quadratic equation was obtained which predicts the removal efficiency very well. The suitability of the developed model was judged by the high R² value obtained from the regression analysis. From the investigation, it was found that the throat gas velocity has the most significant effect and the inlet H₂S concentration the least effect on H₂S removal efficiency.
Keywords: desulfurization, pollution control, response surface methodology, venturi scrubber
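The quadratic response-surface model described above can be sketched as an ordinary least-squares fit of a second-order polynomial in coded factors. A minimal two-factor illustration, assuming synthetic data rather than the study's central composite design (the study varied three factors):

```python
import numpy as np

# Illustrative second-order (quadratic) response-surface fit, as used in RSM.
# Factor values and efficiencies below are synthetic placeholders, not the
# paper's experimental data.
def quadratic_design_matrix(x1, x2):
    """Columns: 1, x1, x2, x1*x2, x1^2, x2^2 (two-factor quadratic model)."""
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, 30)           # coded throat gas velocity
x2 = rng.uniform(-1, 1, 30)           # coded inlet H2S concentration
true = 90 + 5 * x1 - 2 * x2 - 3 * x1**2   # assumed underlying response
y = true + rng.normal(0, 0.1, 30)     # removal efficiency (%) with noise

X = quadratic_design_matrix(x1, x2)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)   # fitted quadratic coefficients
y_hat = X @ beta
r2 = 1 - np.sum((y - y_hat)**2) / np.sum((y - np.mean(y))**2)
print(round(r2, 3))
```

A high R² here plays the same role as in the abstract: it indicates that the fitted quadratic surface reproduces the observed responses well.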
Procedia PDF Downloads 135
52049 Water Ingress into Underground Mine Voids in the Central Rand Goldfields Area, South Africa-Fluid Induced Seismicity
Authors: Artur Cichowicz
Abstract:
The last active mine in the Central Rand Goldfields area (50 km x 15 km) ceased operations in 2008. This resulted in the closure of the pumping stations, which had previously maintained the underground water level in the mining voids. As a direct consequence of the water being allowed to flood the mine voids, seismic activity has increased directly beneath the populated area of Johannesburg. Monitoring of seismicity in the area has been ongoing for over five years using a network of 17 strong ground motion sensors. The objective of the project is to improve strategies for mine closure. The evolution of the seismicity pattern was investigated in detail. Special attention was given to seismic source parameters such as magnitude, scalar seismic moment, and static stress drop. Most events are located within historical mine boundaries. The seismicity pattern shows a strong relationship between the presence of the mining void and high levels of seismicity; no seismicity migration patterns were observed outside the areas of old mining. Seven years after the pumping stopped, the evolution of the seismicity indicates that the area is not yet in equilibrium. The level of seismicity in the area does not appear to be decreasing over time, since the number of strong events, with Mw magnitudes above 2, is still as high as it was when monitoring began over five years ago. The average rate of seismic deformation is 1.6x10^13 Nm/year. Constant seismic deformation was not observed over the last 5 years: the deviation from the average is in the order of 6x10^13 Nm/year, which is significant. The variation of the cumulative seismic moment indicates that a constant deformation rate model is not suitable. Over the most recent five-year period, the total cumulative seismic moment released in the Central Rand Basin was 9.0x10^14 Nm. This is equivalent to one earthquake of magnitude 3.9, which is significantly less than what was experienced during the mining operation.
Characterization of seismicity triggered by a rising water level in the area can be achieved through the estimation of source parameters. Static stress drop heavily influences ground motion amplitude, which plays an important role in risk assessments of potential seismic hazards in inhabited areas. The observed static stress drop in this study varied from 0.05 MPa to 10 MPa. It was found that large static stress drops could be associated with both small and large events. The temporal evolution of the inter-event time provides an understanding of the physical mechanisms of earthquake interaction. Changes in the characteristics of the inter-event time are produced when a stress change is applied to a group of faults in the region. Results from this study indicate that the fluid-induced source has a shorter inter-event time in comparison to a random distribution. This behaviour corresponds to a clustering of events, in which short recurrence times tend to be close to each other, forming clusters of events.
Keywords: inter-event time, fluid induced seismicity, mine closure, spectral parameters of seismic source
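The equivalence quoted above (a cumulative moment of 9.0x10^14 Nm corresponding to one magnitude-3.9 earthquake) can be checked with the standard Hanks-Kanamori moment-magnitude relation; a minimal sketch:

```python
import math

# Moment magnitude from scalar seismic moment (Hanks & Kanamori relation):
# Mw = (2/3) * (log10(M0) - 9.1), with M0 in N·m.
def moment_magnitude(m0_nm):
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# Cumulative moment released over five years, from the abstract.
m_w = moment_magnitude(9.0e14)
print(round(m_w, 1))  # 3.9, matching the single-event equivalent quoted above
```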
Procedia PDF Downloads 284
52048 Reducing Unnecessary CT Aorta Scans in the Emergency Department
Authors: Ibrahim Abouelkhir
Abstract:
Background: Prior to this project, the number of CT aorta requests from our Emergency Department (ED) was reported by the radiology department to be high, with a low positive event rate: only 1-2% of CT aortas performed were positive for acute aortic syndrome. This trend raised concerns about the time required to process and report these scans, potentially impacting the timely reporting of other high-priority imaging, such as trauma-related scans. Other harms identified were unnecessary radiation, patients spending longer in the ED and contributing to overcrowding, and, most importantly, patients not getting the right care the first time. The radiology department also raised the problem of reporting bias, because they expected our CT aortas to be normal. Aim: The main aim of this project was to reduce the number of unnecessary CT aortas requested, as shown by (1) the number of CT aortas requested and (2) the positive event rate. Methodology: This was a quality improvement project carried out in the ED at Frimley Park Hospital, UK. Starting from 1st January 2024, we recorded the number of days required to reach 35 CT aorta requests. We looked at all patients presenting to the ED over the age of 16 for whom a CT aorta was requested by the ED team, and at how many of these scans were positive for acute aortic syndrome. The intervention was a change in practice: all CT aortas were to be approved by an ED consultant or ST4+ registrar (5th April 2024). We then reviewed the number of days it took to reach a total of 35 CT aorta requests following the intervention and again reviewed how many were positive. Results: Prior to the intervention, 35 CT aorta scans were performed over a 20-day period. Following the implementation of the ED senior doctor vetting process, the same number of CT aorta scan requests was observed over 50 days, more than twice the pre-intervention period. This indicates a significant reduction in the rate of CT aorta scans being requested.
During the pre-intervention phase, there were two positive cases of acute aortic syndrome. In the post-intervention period, there were zero. Conclusion: The mandatory review of CT aorta scan requests by an ED senior doctor effectively reduced the number of scans requested. However, this intervention did not lead to an increase in positive scan results. We noted that post-intervention, approximately 50% of scans had been approved by registrar-grade doctors and only 50% by ED consultants, and the majority were not in-person reviews. We wonder whether restricting approval to consultant grade only might improve the results, and furthermore, whether in-person reviews should be the gold standard.
Keywords: quality improvement project, CT aorta scans, emergency department, radiology department, aortic dissection, scan request vetting, clinical outcomes, imaging efficiency
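The reduction reported above can be expressed as daily request rates, since the same count (35 scans) accrued over different periods; a quick check of the arithmetic:

```python
# Same 35 scans accrued over 20 days pre-intervention and 50 days
# post-intervention (figures from the abstract).
pre_rate = 35 / 20      # scans per day before senior vetting
post_rate = 35 / 50     # scans per day after senior vetting
reduction = 1 - post_rate / pre_rate
print(pre_rate, post_rate, round(reduction, 2))  # 1.75 0.7 0.6
```

The senior vetting process thus cut the request rate by roughly 60%.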
Procedia PDF Downloads 8
52047 Efficient Estimation for the Cox Proportional Hazards Cure Model
Authors: Khandoker Akib Mohammad
Abstract:
While analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest; they are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modelled using the cure model, which depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and latency, respectively. In this paper, we show the asymptotic normality of the profile likelihood estimator via an asymptotic expansion of the profile likelihood and obtain the explicit form of the variance estimator with an implicit function in the profile likelihood. We also show that the efficient score function, based on projection theory, and the profile likelihood score function are equal. Our contribution in this paper is that we express the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (SMCURE package) and from the profile likelihood score function (our approach) provide similar and comparable results. The numerical results of our proposed method are also shown using the melanoma data from the SMCURE R-package, and we compare the results with the output obtained from the SMCURE package.
Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood
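The incidence-latency decomposition described above can be sketched directly: the population survival is a mixture of the cured fraction and the uncured subjects' Cox PH survival. The logistic and Cox coefficients and the exponential baseline below are illustrative assumptions, not estimates from the melanoma data:

```python
import math

# Mixture cure model sketch:
#   S_pop(t | x, z) = (1 - pi(z)) + pi(z) * S_u(t | x)
# pi(z): uncured probability from a logistic model (incidence);
# S_u:   latency survival from a Cox PH model with an assumed
#        exponential baseline S0(t) = exp(-lam * t).
def uncured_prob(z, gamma):
    eta = gamma[0] + gamma[1] * z
    return 1.0 / (1.0 + math.exp(-eta))

def latency_survival(t, x, beta, lam=0.1):
    # Cox PH: S_u(t|x) = S0(t) ** exp(beta * x)
    return math.exp(-lam * t) ** math.exp(beta * x)

def population_survival(t, x, z, gamma, beta):
    pi = uncured_prob(z, gamma)
    return (1.0 - pi) + pi * latency_survival(t, x, beta)

s = population_survival(t=50.0, x=0.5, z=1.0, gamma=(0.2, 0.3), beta=0.4)
print(round(s, 3))
```

Note the defining property of cure models: as t grows, the population survival does not decay to zero but plateaus at the cure fraction 1 - pi(z).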
Procedia PDF Downloads 142
52046 Tropical Squall Lines in Brazil: A Methodology for Identification and Analysis Based on ISCCP Tracking Database
Authors: W. A. Gonçalves, E. P. Souza, C. R. Alcântara
Abstract:
The ISCCP-Tracking database offers an opportunity to study the physical and morphological characteristics of Convective Systems based on geostationary meteorological satellites. This database contains 26 years of tracking of Convective Systems for the entire globe; the Tropical Squall Lines which occur in Brazil are therefore certainly within the database. In this study, we propose a methodology for the identification of these systems based on the ISCCP-Tracking database, and a physical and morphological characterization of these systems is also shown. The proposed methodology is first based on the year 2007. The Squall Lines were subjectively identified by visually analyzing infrared images from GOES-12. Based on this identification, the same systems were identified within the ISCCP-Tracking database. It is known, and it was also observed, that the Squall Lines which occur on the north coast of Brazil develop parallel to the coast, influenced by the sea breeze. In addition, it was observed that the eccentricity of the identified systems was greater than 0.7. A methodology based on the inclination (relative to the coast) and eccentricity (greater than 0.7) of the Convective Systems was then applied in order to identify and characterize Tropical Squall Lines in Brazil. These thresholds were applied back to the ISCCP-Tracking database for the year 2007. It was observed that other systems, which were not Squall Lines, were also identified. We therefore decided to call all systems identified by the inclination and eccentricity thresholds Linear Convective Systems, instead of Squall Lines. After this step, the Linear Convective Systems were identified and characterized for the entire database, from 1983 to 2008. The physical and morphological characteristics of these systems were compared to those of systems which did not have the required inclination and eccentricity to be called Linear Convective Systems.
The results showed that the convection associated with the Linear Convective Systems seems to be more intense and organized than in the other systems. This affirmation is based on all the ISCCP-Tracking variables analyzed. This type of methodology, which explores 26 years of satellite data by an objective analysis, had not previously been explored in the literature. The physical and morphological characterization of the Linear Convective Systems based on 26 years of data is of great importance and should be used in many branches of the atmospheric sciences.
Keywords: squall lines, convective systems, linear convective systems, ISCCP-Tracking
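One way the eccentricity threshold above can be computed is by fitting an ellipse to a system's pixel coordinates via its second spatial moments and taking e = sqrt(1 - (b/a)^2); a sketch with a synthetic elongated cluster, not ISCCP data:

```python
import numpy as np

# Eccentricity of a convective system from the covariance of its pixel
# coordinates: the eigenvalues of the 2x2 covariance matrix give the
# squared semi-minor and semi-major axis scales of the fitted ellipse.
def eccentricity(lon, lat):
    cov = np.cov(np.vstack([lon, lat]))
    b2, a2 = np.sort(np.linalg.eigvalsh(cov))   # [minor, major] variances
    return np.sqrt(1.0 - b2 / a2)

rng = np.random.default_rng(1)
lon = rng.normal(0.0, 3.0, 500)   # elongated along the lon axis, like a squall line
lat = rng.normal(0.0, 0.5, 500)
e = eccentricity(lon, lat)
print(e > 0.7)  # an elongated, squall-line-like system passes the 0.7 threshold
```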
Procedia PDF Downloads 299
52045 Methodology of Geometry Simplification for Conjugate Heat Transfer of Electrical Rotating Machines Using Computational Fluid Dynamics
Authors: Sachin Aggarwal, Sarah Kassinger, Nicholas Hoffman
Abstract:
Geometry simplification is a key step in performing conjugate heat transfer analysis using CFD. This paper proposes a standard methodology for the geometry simplification of rotating machines, such as electrical generators and electrical motors (both air- and liquid-cooled). These machines are extensively deployed throughout the aerospace and automotive industries, where optimization of weight, volume, and performance is paramount, especially given the current global transition to renewable energy sources and vehicle hybridization and electrification. Conjugate heat transfer analysis is an essential step in optimizing their complex design. This methodology will help in reducing convergence issues due to poor mesh quality, thus decreasing computational cost and overall analysis time.
Keywords: CFD, electrical machines, geometry simplification, heat transfer
Procedia PDF Downloads 129
52044 Evaluating the Effect of 'Terroir' on Volatile Composition of Red Wines
Authors: María Luisa Gonzalez-SanJose, Mihaela Mihnea, Vicente Gomez-Miguel
Abstract:
The zoning methodology currently recommended by the OIVV as the official methodology for carrying out viticulture zoning studies and for defining and delimiting 'terroirs' has been applied in this study. This methodology has been successfully applied to the most significant and important Spanish oenological D.O. regions, such as Ribera del Duero, Rioja, Rueda, and Toro, and it has also been applied around the world, in Portugal, different countries of South America, and so on. It is a complex methodology that uses edaphoclimatic data but also data corresponding to vineyards and other soil uses. The methodology is used to determine Homogeneous Soil Units (HSU) at different scales depending on the interest of each study, and it has been applied from viticulture regions down to particular vineyards. It seems to be an appropriate method for correctly delimiting the medium in order to enhance its uses and to obtain the best viticulture and oenological products. The present work is focused on the comparison of the volatile composition of wines made from grapes grown in different HSU that coexist in a particular viticulture region of Castile-Leon, located near Burgos. Three different HSU were selected for this study, representing around 50% of the global vineyard area of the studied region. Five vineyards in each HSU under study were chosen. To reduce variability factors, other criteria were also considered, such as grape variety, clone, rootstock, vineyard age, training system, and cultural practices. This study was carried out during three consecutive years; wines from three different vintages were therefore made and analysed. Different red wines were made from grapes harvested in the different vineyards under study. Grapes were harvested at 'technological maturity', which correlates with adequate levels of sugar, acidity, and phenolic content (nowadays named phenolic maturity), a good sanitary stage, and adequate levels of aroma precursors.
Results of the volatile profile of the wines produced from the grapes of each HSU showed significant differences among them, pointing out a direct effect of the edaphoclimatic characteristics of each HSU on the composition of the grapes and hence on the volatile composition of the wines. The variability induced by the HSU coexisted with the well-known inter-annual variability correlated mainly with the specific climatic conditions of each vintage; however, the HSU effect was more intense, so the wines of each HSU were perfectly differentiated. A discriminant analysis allowed the definition of the volatiles with discriminant capacity, which were 21 of the 74 volatiles analysed. The detected discriminant volatiles were chemically different, although most of them were esters, followed by higher alcohols and short-chain fatty acids. Only one lactone and two aldehydes were selected as discriminant variables, and no varietal aroma compounds were selected, which agrees with the fact that all the wines were made from the same grape variety.
Keywords: viticulture zoning, terroir, wine, volatile profile
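As a hedged illustration of the kind of screening that underlies such a discriminant analysis, each volatile can be ranked by its Fisher ratio, the between-HSU to within-HSU variance; this is a stand-in for the full multivariate analysis, and the data below are synthetic:

```python
import numpy as np

# Fisher-ratio screening: a volatile whose mean concentration differs across
# the three HSUs scores high; one with identical group means scores low.
def fisher_ratio(groups):
    """groups: list of 1-D arrays, one per HSU, for a single volatile."""
    grand = np.mean(np.concatenate(groups))
    between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return between / within

rng = np.random.default_rng(2)
# A discriminant volatile (group means 0, 3, 6) vs an uninformative one.
discriminant = [rng.normal(m, 1.0, 15) for m in (0.0, 3.0, 6.0)]
uninformative = [rng.normal(0.0, 1.0, 15) for _ in range(3)]
print(fisher_ratio(discriminant) > fisher_ratio(uninformative))
```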
Procedia PDF Downloads 220
52043 Media Literacy Development: A Methodology to Systematically Integrate Post-Contemporary Challenges in Early Childhood Education
Authors: Ana Mouta, Ana Paulino
Abstract:
The following text presents the ik.model, a theoretical framework that guided the pedagogical implementation of meaningful educational technology-based projects in formal education worldwide. In this paper, we will focus on how this framework has enabled the development of media literacy projects for early childhood education during the last three years. The methodology that guided educators through the challenge of systematically merging analogic and digital means in dialogic high-quality opportunities of world exploration is explained throughout these lines. The effects of this methodology on early age media literacy development are considered. Also considered is the relevance of this skill in terms of post-contemporary challenges posed to learning.
Keywords: early learning, ik.model, media literacy, pedagogy
Procedia PDF Downloads 322
52042 Photo-Fenton Decolorization of Methylene Blue Adsolubilized on Co2+ -Embedded Alumina Surface: Comparison of Process Modeling through Response Surface Methodology and Artificial Neural Network
Authors: Prateeksha Mahamallik, Anjali Pal
Abstract:
In the present study, Co(II)-adsolubilized surfactant-modified alumina (SMA) was prepared, and methylene blue (MB) degradation was carried out on the Co-SMA surface by a visible light photo-Fenton process. The entire reaction proceeded on the solid surface, as MB was embedded on the Co-SMA surface. The reaction followed zero-order kinetics. Response surface methodology (RSM) and an artificial neural network (ANN) were used to model the decolorization of MB by the photo-Fenton process as a function of the dose of Co-SMA (10, 20 and 30 g/L), the initial concentration of MB (10, 20 and 30 mg/L), the concentration of H2O2 (174.4, 348.8 and 523.2 mM), and the reaction time (30, 45 and 60 min). The prediction capabilities of the two methodologies (RSM and ANN) were compared on the basis of the correlation coefficient (R2), root mean square error (RMSE), standard error of prediction (SEP), and relative percent deviation (RPD). Due to the lower values of RMSE (1.27), SEP (2.06) and RPD (1.17) and the higher value of R2 (0.9966), ANN proved to be more accurate than RSM in predicting the decolorization efficiency.
Keywords: adsolubilization, artificial neural network, methylene blue, photo-fenton process, response surface methodology
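The comparison metrics named above can be sketched for a small set of observed versus predicted decolorization efficiencies; note that exact definitions of SEP and RPD vary in the literature, and the values below are illustrative, not the study's:

```python
import math

# Goodness-of-fit metrics used to compare RSM and ANN predictions.
# SEP and RPD are computed under common (but not universal) definitions:
# SEP as RMSE relative to the observed mean (%), RPD as the mean
# absolute relative deviation (%).
def metrics(obs, pred):
    n = len(obs)
    mean_obs = sum(obs) / n
    sse = sum((o - p) ** 2 for o, p in zip(obs, pred))
    sst = sum((o - mean_obs) ** 2 for o in obs)
    r2 = 1 - sse / sst
    rmse = math.sqrt(sse / n)
    sep = 100 * rmse / mean_obs
    rpd = 100 / n * sum(abs(o - p) / o for o, p in zip(obs, pred))
    return r2, rmse, sep, rpd

obs = [82.0, 90.5, 75.3, 88.1, 79.6]    # illustrative decolorization (%)
pred = [81.2, 91.0, 76.0, 87.5, 80.1]   # illustrative model predictions
r2, rmse, sep, rpd = metrics(obs, pred)
print(round(r2, 3), round(rmse, 2))
```

Lower RMSE, SEP, and RPD together with higher R2 are the criteria by which the abstract ranks ANN above RSM.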
Procedia PDF Downloads 252
52041 Vulnerability Risk Assessment of Non-Engineered Houses Based on Damage Data of the 2009 Padang Earthquake 2009 in Padang City, Indonesia
Authors: Rusnardi Rahmat Putra, Junji Kiyono, Aiko Furukawa
Abstract:
Several powerful earthquakes have struck Padang during recent years, one of the largest of which was an M 7.6 event that occurred on September 30, 2009 and caused more than 1000 casualties. Following the event, we conducted a 12-site microtremor array investigation to obtain a representative determination of the soil condition of subsurface structures in Padang. From the dispersion curves of the array observations, the central business district of Padang corresponds to a relatively soft soil condition with Vs30 less than 400 m/s. Because only one accelerometer existed, we simulated the 2009 Padang earthquake to obtain the peak ground acceleration at all sites in Padang city. By considering the damage data of the 2009 Padang earthquake, we produced seismic risk vulnerability estimations of non-engineered houses for rock, medium, and soft soil conditions. We estimated the loss ratio based on the ground response, the seismic hazard of Padang, and the existing damage data for non-engineered houses from the 2009 Padang earthquake, for several return periods of earthquake events.
Keywords: profile, Padang earthquake, microtremor array, seismic vulnerability
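The Vs30 classification mentioned above is the time-averaged shear-wave velocity of the top 30 m of soil; a minimal sketch with a hypothetical soft-soil layer model (not the Padang profiles):

```python
# Vs30 = 30 m divided by the total vertical shear-wave travel time through
# the layers spanning the top 30 m; slow near-surface layers dominate it.
def vs30(thicknesses_m, velocities_ms):
    assert abs(sum(thicknesses_m) - 30.0) < 1e-9, "layers must span 30 m"
    travel_time = sum(h / v for h, v in zip(thicknesses_m, velocities_ms))
    return 30.0 / travel_time

# Hypothetical soft-soil profile: 5 m at 150 m/s, 10 m at 250 m/s, 15 m at 500 m/s.
v = vs30([5.0, 10.0, 15.0], [150.0, 250.0, 500.0])
print(round(v))  # well below 400 m/s, consistent with a soft-soil classification
```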
Procedia PDF Downloads 408
52040 Modelling a Hospital as a Queueing Network: Analysis for Improving Performance
Authors: Emad Alenany, M. Adel El-Baz
Abstract:
In this paper, the flow of different classes of patients into a hospital is modelled and analyzed using the queueing network analyzer (QNA) algorithm and discrete event simulation. The input data for QNA are the rate and variability parameters of the arrival and service times, in addition to the number of servers in each facility. The patient flows closely match the real flows of a hospital in Egypt. Based on the analysis of the waiting times, two approaches are suggested for improving performance: separating patients into service groups, and adopting different service policies for sequencing patients through hospital units. Separating a specific group of patients with a higher performance target, to be served apart from the rest of the patients requiring a lower performance target, requires the same capacity while improving performance for the selected group of patients with the higher target. Besides, it is shown that adopting the shortest processing time and shortest remaining processing time service policies, among the other tested policies, would result in 11.47% and 13.75% reductions in average waiting time, respectively, relative to a first come first served policy.
Keywords: queueing network, discrete-event simulation, health applications, SPT
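The benefit of shortest-processing-time (SPT) sequencing reported above can be illustrated with a minimal single-server example in which all jobs are present at time zero, where SPT provably minimizes average waiting time; the service times are illustrative, not the hospital's data:

```python
# Single-server sequencing sketch: under FCFS, jobs are served in arrival
# order; under SPT, they are sorted by service time, so short jobs stop
# waiting behind long ones.
def average_wait(service_times, policy="fcfs"):
    order = sorted(service_times) if policy == "spt" else list(service_times)
    clock, total_wait = 0.0, 0.0
    for s in order:
        total_wait += clock   # this job waits for everyone scheduled before it
        clock += s
    return total_wait / len(order)

jobs = [12.0, 3.0, 7.0, 1.0, 9.0]   # illustrative service times (minutes)
fcfs = average_wait(jobs, "fcfs")
spt = average_wait(jobs, "spt")
print(fcfs, spt, spt < fcfs)  # 14.4 7.2 True
```

The magnitude of the gain depends on the spread of service times; the abstract's 11.47% figure comes from the full hospital network, not this toy case.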
Procedia PDF Downloads 185
52039 Training Undergraduate Engineering Students in Robotics and Automation through Model-Based Design Training: A Case Study at Assumption University of Thailand
Authors: Sajed A. Habib
Abstract:
Problem-based learning (PBL) is a student-centered pedagogy that originated in the medical field and has also been used extensively in other knowledge disciplines with recognized advantages and limitations. PBL has been used in various undergraduate engineering programs with mixed outcomes. The current fourth industrial revolution (digital era or Industry 4.0) has made it essential for many science and engineering students to receive effective training in advanced courses such as industrial automation and robotics. This paper presents a case study at Assumption University of Thailand, where a PBL-like approach was used to teach some aspects of automation and robotics to selected groups of undergraduate engineering students. These students were given some basic level training in automation prior to participating in a subsequent training session in order to solve technical problems with increased complexity. The participating students’ evaluation of the training sessions in terms of learning effectiveness, skills enhancement, and incremental knowledge following the problem-solving session was captured through a follow-up survey consisting of 14 questions and a 5-point scoring system. From the most recent training event, an overall 70% of the respondents indicated that their skill levels were enhanced to a much greater level than they had had before the training, whereas 60.4% of the respondents from the same event indicated that their incremental knowledge following the session was much greater than what they had prior to the training. The instructor-facilitator involved in the training events suggested that this method of learning was more suitable for senior/advanced level students than those at the freshmen level as certain skills to effectively participate in such problem-solving sessions are acquired over a period of time, and not instantly.
Keywords: automation, industry 4.0, model-based design training, problem-based learning
Procedia PDF Downloads 132
52038 Kinematic of Thrusts and Tectonic Vergence in the Paleogene Orogen of Eastern Iran, Sechangi Area
Authors: Shahriyar Keshtgar, Mahmoud Reza Heyhat, Sasan Bagheri, Ebrahim Gholami, Seyed Naser Raiisosadat
Abstract:
The eastern Iranian ranges form a Z-shaped sigmoidal outcrop with a N-S-trending general strike on satellite images. They have long been known as the Sistan suture zone and have recently been identified as the product of an orogenic event introduced under either the Paleogene or the Sistan orogen name. The flysch sedimentary basin of eastern Iran was filled by a huge volume of fine-grained Eocene turbiditic sediments, smaller amounts of pelagic deposits, and Cretaceous ophiolitic slices, which are entirely remnants of older accretionary prisms that appeared in a fold-thrust belt developed over a subduction zone beneath the Lut/Afghan block, portions of the Cimmerian superterrane. In these ranges, there are Triassic sedimentary and carbonate sequences (equivalent to the Nayband and Shotori Formations), along with scattered outcrops of Permian limestones (equivalent to the Jamal limestone) and greenschist-facies metamorphic rocks, probably belonging to the basement of the Lut block, which have tectonic contacts with younger rocks. Moreover, the younger Eocene detrital-volcanic rocks were also thrust onto the Cretaceous or younger turbiditic deposits. The first-generation folds (parallel folds) and thrusts with slaty cleavage appeared parallel to the NE edge of the Lut block. Structural analysis shows that the dominant vergence of the thrusts is toward the southeast, so that the Permo-Triassic units of Lut have been thrust onto the younger rocks, including older (probably Jurassic) granites. Additional structural studies show that the regional transport direction in this deformation event is from northwest to southeast, that is, from the outside to the inside of the orogen in the Sechangi area. Younger thrusts of the second deformation event were either formed directly as a result of that event, or they were older thrusts that were reactivated and folded, so that often two or more sets of slickenlines can be recognized on the thrust planes.
The recent thrusts have been redistributed in directions nearly perpendicular to the edge of the Lut block and parallel to the axial surfaces of the northwest-trending second-generation large-scale folds (radial folds). Some of these younger thrusts follow the out-of-the-syncline thrust system. Both the axial planes of these folds and the associated penetrative shear cleavage, extending towards the northwest, appear with both northeast and southwest dips parallel to the younger thrusts. Large-scale buckling under a layer-parallel stress field created this deformation event. Such consecutive deformation events, perpendicular to each other, basically cannot be explained by the simple linear orogen models presented for eastern Iran so far and are more consistent with the oroclinal buckling model.
Keywords: thrust, tectonic vergence, orocline buckling, sechangi, eastern iranian ranges
Procedia PDF Downloads 78
52037 Seismic Stratigraphy of the First Deposits of the Kribi-Campo Offshore Sub-basin (Gulf of Guinea): Pre-cretaceous Early Marine Incursion and Source Rocks Modeling
Authors: Mike-Franck Mienlam Essi, Joseph Quentin Yene Atangana, Mbida Yem
Abstract:
The Kribi-Campo sub-basin belongs to the southern domain of the Cameroon Atlantic Margin in the Gulf of Guinea. It is the African homologous segment of the Sergipe-Alagoas Basin, located on the northeast side of the Brazilian margin. The onset of the seafloor spreading period in the Southwest African Margin in general, and in the study area in particular, remains controversial. Various studies locate this event during Cretaceous times (Early Aptian to Late Albian), while others suggest that it occurred during the Pre-Cretaceous period (Palaeozoic or Jurassic). This work analyses two Cameroon Span seismic lines to re-examine the early marine incursion period of the study area for a better understanding of the margin evolution. The methodology of analysis in this study is based on the delineation of the first seismic sequence, using the tracking of reflector terminations and the analysis of its internal reflections associated with the external configuration of the package. The results obtained indicate, from the bottom upwards, that the first deposits overlie a first seismic horizon (H1) associated with "onlap" terminations at its top, and underlie a second horizon (H2) which shows "downlap" terminations at its top. The external configuration of this package features a prograded fill pattern, and it is observed within the depocenter area with discontinuous reflections that pinch out against the basement. From east to west, this sequence shows two seismic facies (SF1 and SF2). SF1 has parallel to subparallel reflections characterized by high amplitude, and SF2 shows parallel and stratified reflections characterized by low amplitude. The distribution of these seismic facies reveals a lateral facies variation. According to the fundamental works on seismic stratigraphy and the literature review of the geological context of the study area, the stratigraphic natures of the identified horizons and seismic facies have been determined.
The seismic horizons H1 and H2 correspond to the top of the basement and a downlap surface, respectively. SF1 indicates continental sediments (sands/sandstone) and SF2 marine deposits (shales, clays). The prograding configuration observed thus suggests a marine regression. The correlation of these results with the lithochronostratigraphic chart of the Sergipe-Alagoas Basin reveals that the first marine deposits in the study area date from Pre-Cretaceous times (Palaeozoic or Jurassic). The first deposits onto the basement represent the end of a cycle of sedimentation, while the hypothesis of Cretaceous seafloor spreading through the study area marks the onset of another cycle of sedimentation. Furthermore, the presence of marine sediments within the first deposits implies that this package could contain marine source rocks. The spatial tracking of these deposits reveals that they could also be found in some onshore parts of the Kribi-Campo area, or even on its northern side.
Keywords: cameroon span seismic, early marine incursion, kribi-campo sub-basin, pre-cretaceous period, sergipe-alagoas basin
Procedia PDF Downloads 107
52036 Recidivism in Brazil: Exploring the Case of the Association of Protection and Assistance to Convicts Methodology
Authors: Robyn Heitzman
Abstract:
The traditional method of punitive justice in Brazil has failed to prevent high levels of recidivism. Combined with overcrowding, a lack of resources, and human rights abuses, the conventional prison approach in Brazil is being questioned; one alternative approach is the Association of Protection and Assistance to Convicts (APAC) method. Justice, according to the principles of the APAC methodology, is served through education, reformation, and human development. The model has reported relatively low levels of recidivism and has been internationally recognised for its progress. Through qualitative research such as interviews and case studies, this paper explains why, applying the theory of restorative justice, the APAC methodology yields lower rates of recidivism than the traditional prison models in Brazil.
Keywords: Brazil, justice, prisons, restorative
Procedia PDF Downloads 108
52035 Learning Vocabulary with SkELL: Developing a Methodology with University Students in Japan Using Action Research
Authors: Henry R. Troy
Abstract:
Corpora are becoming more prevalent in the language classroom, especially in the development of dictionaries and course materials. Nevertheless, many educators still perceive corpora as difficult to use directly in the classroom, a practice also known as 'data-driven learning' (DDL). Action research has been identified as a method by which DDL's efficiency can be increased, but it is also an approach few studies on DDL have employed. Studies into the effectiveness of DDL in language education in Japan are also rare, and investigations focused on student and teacher reactions rather than pre- and post-test scores are rarer still. This study investigates student and teacher reactions to the use of SkELL, a free online corpus designed to be user-friendly, for vocabulary learning at a university in Japan. Action research is utilized to refine the teaching methodology, with changes to the method based on student and teacher feedback received via surveys submitted after each of the four implementations of DDL. After some training, the students used tablets to study the target vocabulary autonomously in pairs and groups, with the teacher acting as facilitator. The results show that the students enjoyed using SkELL and felt it was effective for vocabulary learning, while the teaching methodology grew in efficiency throughout the course. These findings suggest that action research can be a successful method for increasing the efficacy of DDL in the language classroom, especially with teachers and students who are new to the practice.
Keywords: action research, corpus linguistics, data-driven learning, vocabulary learning
Procedia PDF Downloads 243
52034 Towards a Systematic Evaluation of Web Design
Authors: Ivayla Trifonova, Naoum Jamous, Holger Schrödl
Abstract:
A good web design is a prerequisite for a successful business nowadays, especially since the internet is the most common way for people to inform themselves. Web design includes the visual composition, the structure, and the user guidance of websites. The importance of each website raises the question of whether there is a way to measure its usefulness. The aim of this paper is to suggest a methodology for the evaluation of web design. The desired outcome is an evaluation that is concentrated on a specific website and its target group.
Keywords: evaluation methodology, factor analysis, target group, web design
Procedia PDF Downloads 630
52033 Research Methodology of Living Environment of Modern Residential Development in St. Petersburg
Authors: Kalina Alina Aidarovna, Khayrullina Yulia Sergeevna
Abstract:
The question of how to form quality housing and a quality living environment remains a vexed problem in the current situation of high-rise apartment building in the big cities of Russia. At this start-up stage of the modern so-called 'mass housing' market, key quality characteristics need to be identified on different scales, from the apartment to the district. This paper describes a methodology for the qualitative assessment of modern mass housing construction, developed at ITMO University in cooperation with the institute of spatial planning 'Urbanika', based on a case study of St. Petersburg's residential mass housing built in 2011-2014. The methodology for studying housing and the living environment goes back to the native and foreign urbanists of the 1960s-80s, such as Jane Jacobs, Jan Gehl, Oscar Newman, and Krasheninnikov, as well as Sommer, Stools, Kohnen and Sherrod, Krasilnikova, Sychev, Zhdanov, and Tinyaeva, who considered the spatial features of the living environment across a wide range of its characteristics (environmental control, territoriality and personalization, privacy, etc.). Assessment is carried out according to the proposed system of criteria, developed for each residential environment scale: district, quarter, courtyard, building surrounding grounds, house, and flat. The objects of study are thus the planning units of residential development areas (residential area, neighborhood, quarter), residential unit areas (living artist, a house), and households (apartments) consisting of residential units. As a product of the identified methodology, following case studies of more than 700 residential complexes in St. Petersburg, we intend to create an accessible online resource that would allow all participants of the construction market, developers, designers, realtors, and buyers, to conduct a detailed qualitative evaluation or comparison of residential complexes.
In this way, the main objective of the rating may be achieved: to improve knowledge, requirements, and demand for quality housing and living environments among the major stakeholders of the construction market.
Keywords: methodology of living environment, qualitative assessment of mass housing, scale-district, vexed problem
Procedia PDF Downloads 458
52032 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers
Authors: Pietro D'Ambrosio, Roberta D'Ambrosio
Abstract:
The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of the structures. Such interventions involve complexities linked to the need to take into account the urban and social context in which the structure is embedded, as well as its intrinsic characteristics, such as historical and artistic value. To these, of course, must be added the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole reuse and refunctionalization project. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, both because of structural conditions and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably reliable economic evaluations of the interventions to be carried out. In addition, the specific features of the approach, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to yield, as an indirect result, a shared database that can be reused to estimate other such projects. This makes the methodology particularly suitable in cases where massive intervention across entire areas of historic city centers is expected. The methodology has been partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.
Keywords: evaluation, methodology, restoration, reuse
Procedia PDF Downloads 185
52031 A Methodology for Seismic Performance Enhancement of RC Structures Equipped with Friction Energy Dissipation Devices
Authors: Neda Nabid
Abstract:
Friction-based supplemental devices have been extensively used for the seismic protection and strengthening of structures; however, the conventional use of these dampers does not necessarily lead to efficient structural performance. Conventionally designed friction dampers follow a uniform height-wise distribution pattern of slip load values for practical simplicity. This can localize structural damage in certain story levels, while the other stories accommodate a negligible amount of relative displacement demand. A practical performance-based optimization methodology is developed to tackle the localization of structural damage in RC frame buildings with friction energy dissipation devices under severe earthquakes. The proposed methodology is based on the concept of the uniform damage distribution theory. According to this theory, the slip load values of the friction dampers are redistributed, shifting from stories with lower relative displacement demand to stories with higher inter-story drifts, in order to narrow the discrepancy between the structural damage levels in different stories. In this study, the efficacy of the proposed design methodology is evaluated through the seismic performance of five different low- to high-rise RC frames equipped with friction wall dampers under six real spectrum-compatible design earthquakes. The results indicate that, compared to the conventional design, using the suggested methodology to design friction wall systems can lead to, on average, up to a 40% reduction of the maximum inter-story drift, and a considerably more uniform height-wise distribution of relative displacement demands under the design earthquakes.
Keywords: friction damper, nonlinear dynamic analysis, RC structures, seismic performance, structural damage
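The redistribution principle described in this abstract can be sketched in a few lines: slip load is shifted from stories with below-average drift to stories with above-average drift, keeping the total constant. The update rule, the relaxation factor `alpha`, and all numbers below are illustrative assumptions, not the paper's actual optimization algorithm.

```python
def redistribute_slip_loads(slip_loads, drifts, alpha=0.2):
    """One redistribution step: scale each story's slip load by how far
    its inter-story drift sits from the mean drift, then renormalize so
    the total slip load is preserved (an assumed scheme for illustration)."""
    mean_drift = sum(drifts) / len(drifts)
    updated = [s * (1 + alpha * (d / mean_drift - 1))
               for s, d in zip(slip_loads, drifts)]
    scale = sum(slip_loads) / sum(updated)  # keep the total slip load fixed
    return [u * scale for u in updated]

# Uniform initial slip loads, but drift demand concentrated in story 4
loads = [100.0, 100.0, 100.0, 100.0]
drifts = [0.2, 0.4, 0.3, 1.1]   # inter-story drifts (%)
loads = redistribute_slip_loads(loads, drifts)
print([round(x, 1) for x in loads])  # → [88.0, 96.0, 92.0, 124.0]
```

Iterating this step against repeated nonlinear dynamic analyses would flatten the drift profile, which is the intent of the uniform damage distribution concept.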
Procedia PDF Downloads 226
52030 Assessment of the Number of Damaged Buildings from a Flood Event Using Remote Sensing Technique
Authors: Jaturong Som-ard
Abstract:
Heavy rainfall from 3rd to 22nd January 2017 inundated much of Ranot district in southern Thailand, with considerable economic and social losses. The major objective of this study is to detect the flooding extent using Sentinel-1A data and to identify the number of damaged buildings in the district. The data were collected in two stages: pre-flood and during the flood event. Calibration, speckle filtering, geometric correction, and histogram thresholding were performed on the data, based on intensity spectral values, to classify thematic maps. The maps were used to identify the flooding extent using change detection, together with the buildings digitized and collected on the JOSM desktop. The numbers of damaged buildings were counted within the flooding extent with respect to the building data. The total flooded area was observed to be 181.45 sq.km. These areas mostly occurred in the Ban Khao, Ranot, Takhria, and Phang Yang sub-districts, respectively. The Ban Khao sub-district was affected more than the others because it lies at a lower altitude, closer to the Thale Noi and Thale Luang lakes. The numbers of damaged buildings were highest in the Khlong Daen (726 features), Tha Bon (645 features), and Ranot (604 features) sub-districts, respectively. The final flood extent map should be very useful for planning, prevention, and management in areas of flood occurrence, and the map of building damage can be used by the organizations concerned for quick response, recovery, and mitigation in the affected areas.
Keywords: flooding extent, Sentinel-1A data, JOSM desktop, damaged buildings
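The core of the change-detection step, once the SAR scenes are calibrated and co-registered, amounts to thresholding backscatter in both dates and keeping only the water that is new. The sketch below assumes two small arrays of backscatter in dB and an illustrative -15 dB water threshold; the study derived its own thresholds from the image histograms.

```python
import numpy as np

def flood_extent(pre_db, during_db, threshold_db=-15.0):
    """Classify water where backscatter falls below a threshold, then
    flag flooding where water appears only in the during-flood scene.
    The -15 dB threshold is an assumed illustrative value."""
    water_pre = pre_db < threshold_db       # permanent water bodies
    water_during = during_db < threshold_db
    return water_during & ~water_pre        # new water = flooded

pre = np.array([[-8.0, -9.0], [-20.0, -7.0]])       # dB; a lake in lower-left
during = np.array([[-18.0, -9.0], [-21.0, -17.0]])  # flood covers more pixels
mask = flood_extent(pre, during)
print(int(mask.sum()))  # number of newly flooded pixels
```

Counting damaged buildings then reduces to intersecting the digitized building footprints with this mask.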
Procedia PDF Downloads 190
52029 Quantitative Analysis of Presence, Consciousness, Subconsciousness, and Unconsciousness
Authors: Hooshmand Kalayeh
Abstract:
The human brain consists of the reptilian, mammalian, and thinking brains, and the mind consists of conscious, subconscious, and unconscious parallel neural-net programs. The primary objective of this paper is to propose a methodology for the quantitative analysis of the neural nets associated with these mental activities in the neocortex. The secondary objective is to suggest a methodology for the quantitative analysis of presence; the proposed methodologies can be used as a first step to measure, monitor, and understand consciousness and presence. The methodology is based on neural networks (NN), the number of neurons in each NN associated with consciousness, subconsciousness, and unconsciousness, and the number of neurons in the neocortex. It is assumed that the number of neurons in each NN is correlated with the associated area and volume. Therefore, online and offline visualization techniques can be used to identify these neural networks, and online and offline measurement methods can be used to measure the areas and volumes associated with them. So, instead of the number of neurons in each NN, the associated area or volume can also be used in the proposed methodology. This quantitative analysis, and the associated online and offline measurements and visualizations of different neural networks, enable us to rewire the connections in our brain for more balanced living.
Keywords: brain, mind, consciousness, presence, sub-consciousness, unconsciousness, skills, concentrations, attention
Procedia PDF Downloads 314
52028 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. Since model transformation has become widely used, new requirements have emerged: the transformation process should be defined effectively and efficiently, and the manual effort involved should be reduced. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, and focuses particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measurements are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool; manual effort is replaced in this way.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
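The combination of semantic and syntactic checks can be illustrated with a minimal matcher for model element names. The synonym table, weights, and scoring formula below are assumptions standing in for the paper's tool-supported measures, which are considerably richer.

```python
from difflib import SequenceMatcher

# Illustrative stand-in for a semantic resource (e.g., a thesaurus or
# ontology); the actual methodology's semantic measure is tool-supported.
SYNONYMS = {("client", "customer"), ("order", "purchase")}

def syntactic_score(a, b):
    """Character-level similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic_score(a, b):
    pair = tuple(sorted((a.lower(), b.lower())))
    return 1.0 if a.lower() == b.lower() or pair in SYNONYMS else 0.0

def match_score(a, b, w_sem=0.6, w_syn=0.4):
    """Combined measure for deciding whether a source-model element maps
    onto a target-model element (weights are assumed, not prescribed)."""
    return w_sem * semantic_score(a, b) + w_syn * syntactic_score(a, b)

print(match_score("Client", "Customer"))   # semantic match, low syntactic
print(match_score("OrderDate", "OrderData"))  # syntactic near-match only
```

In a real transformation chain such scores would drive the mapping decisions at each granularity level, from whole models down to individual attributes.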
Procedia PDF Downloads 394
52027 Reallocation of Bed Capacity in a Hospital Combining Discrete Event Simulation and Integer Linear Programming
Authors: Muhammed Ordu, Eren Demir, Chris Tofallis
Abstract:
The number of inpatient admissions in the UK has been increasing significantly over the past decade. These increases cause bed occupancy rates to exceed the target level (85%) set by the Department of Health in England. Therefore, hospital service managers are struggling to better manage key resources such as beds. On the other hand, this severe demand pressure can lead to confusion in wards; for example, patients can be admitted to the ward of another inpatient specialty due to a lack of resources (i.e., beds). This study aims to develop a simulation-optimization model to reallocate the available number of beds in a mid-sized hospital in the UK. A hospital simulation model was developed to capture the stochastic behaviour of the hospital by taking into account the accident and emergency department, all outpatient and inpatient services, and the interactions between them. Several outputs of the simulation model (e.g., average length of stay and revenue) were generated as inputs for the optimization model. An integer linear programming model was developed under a number of constraints (financial, demand, target level of bed occupancy rate, and staffing level) with the aim of maximizing the number of admitted patients. In addition, a sensitivity analysis was carried out by taking into account unexpected increases in inpatient demand over the next 12 months. As a result, the approach proposed in this study optimally reallocates the available number of beds for each inpatient specialty and reveals that 74 beds are idle. The findings also indicate that the hospital wards will be able to cope with at most a 14% demand increase in the projected year.
In conclusion, this paper sheds new light on how best to reallocate beds in order to cope with current and future demand for healthcare services.
Keywords: bed occupancy rate, bed reallocation, discrete event simulation, inpatient admissions, integer linear programming, projected usage
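The shape of the integer program can be illustrated with a toy version solved by exhaustive search: allocate a fixed bed stock among three specialties to maximize admissions, where each specialty is capped by its demand and each bed supports a given admission turnover. All numbers, and the brute-force solver itself, are illustrative assumptions; the study's model has many more constraints and would use a proper ILP solver.

```python
from itertools import product

TOTAL_BEDS = 10
demand = [40, 60, 30]    # maximum admissions per specialty (assumed)
turnover = [8, 12, 6]    # admissions one bed supports per year (assumed)

best_alloc, best_admissions = None, -1
for alloc in product(range(TOTAL_BEDS + 1), repeat=3):
    if sum(alloc) > TOTAL_BEDS:
        continue  # bed-stock constraint
    # Admissions per specialty are limited by both beds and demand
    admissions = sum(min(beds * t, d)
                     for beds, t, d in zip(alloc, turnover, demand))
    if admissions > best_admissions:
        best_alloc, best_admissions = alloc, admissions

print(best_alloc, best_admissions)  # → (5, 5, 0) 100
```

Replacing the enumerated search with a solver library and adding occupancy, financial, and staffing constraints recovers the structure of the simulation-optimization approach the abstract describes.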
Procedia PDF Downloads 143
52026 The Relationship between Life Event Stress, Depressive Thoughts, and Working Memory Capacity
Authors: Eid Abo Hamza, Ahmed Helal
Abstract:
Purpose: The objective is to measure working memory capacity, i.e., the maximum number of elements that can be retrieved and processed, by measuring the basic functions of working memory (inhibition/shifting/updating), and to investigate its relationship to life stress and depressive thoughts. Methods: The study sample consisted of 50 students from Egypt. A cognitive task was designed to measure working memory capacity based on determinants found in previous research, which showed that cognitive tasks are the best measurements of the functions and capacity of working memory. Results: The results indicated statistically significant differences by level of life event stress (high/low) on the task measuring working memory capacity. The results also showed no statistically significant differences between males and females, or between academic majors, on the task. Furthermore, there was no statistically significant interaction effect of the level of life stress (high/low) and gender (male/female) on the task. Finally, the results showed significant differences by level of depressive thoughts (high/low) on the task. Conclusions: The current research concludes that neither the interaction of stressful life events, gender, and academic major, nor the interaction of depressive thoughts, gender, and academic major, influences working memory capacity.
Keywords: working memory, depression, stress, life event
Procedia PDF Downloads 157
52025 Applying the Global Trigger Tool in German Hospitals: A Retrospective Study in Surgery and Neurosurgery
Authors: Mareen Brosterhaus, Antje Hammer, Steffen Kalina, Stefan Grau, Anjali A. Roeth, Hany Ashmawy, Thomas Gross, Marcel Binnebosel, Wolfram T. Knoefel, Tanja Manser
Abstract:
Background: The identification of critical incidents in hospitals is an essential component of improving patient safety. To date, various methods have been used to measure and characterize such critical incidents. These methods are often viewed by physicians and nurses as external quality assurance, which creates obstacles to the reporting of events and to the implementation of recommendations in practice. One way to overcome this problem is to use tools that directly involve staff in measuring indicators of the quality and safety of care in the department. One such instrument is the Global Trigger Tool (GTT), which helps physicians and nurses identify adverse events by systematically reviewing randomly selected patient records. Based on so-called 'triggers' (warning signals), indications of adverse events can be identified. While the tool is already used internationally, its implementation in German hospitals has been very limited. Objectives: This study aimed to assess the feasibility and potential of the Global Trigger Tool for identifying adverse events in German hospitals. Methods: A total of 120 patient records were randomly selected from two surgical departments and one neurosurgery department of three university hospitals in Germany over a period of two months per department between January and July 2017. The records were reviewed using an adaptation of the German version of the Institute for Healthcare Improvement Global Trigger Tool to identify triggers and adverse event rates per 1000 patient-days and per 100 admissions. The severity of adverse events was classified using the National Coordinating Council for Medication Error Reporting and Prevention index. Results: A total of 53 adverse events were detected in the three departments. This corresponded to adverse event rates of 25.5 to 72.1 per 1000 patient-days and of 25.0 to 60.0 per 100 admissions across the three departments.
98.1% of the identified adverse events were associated with non-permanent harm, either without (Category E, 71.7%) or with (Category F, 26.4%) the need for prolonged hospitalization. One adverse event (1.9%) was associated with potentially permanent harm to the patient. We also identified practical challenges in the implementation of the tool, such as the need to adapt the Global Trigger Tool to the respective department. Conclusions: The Global Trigger Tool, when adapted to departmental specifics, is a feasible and effective instrument for quality measurement. Based on our experience, we recommend continuous use of the tool, thereby directly involving clinicians in quality improvement.
Keywords: adverse events, global trigger tool, patient safety, record review
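The two denominators reported above are the standard Global Trigger Tool rates, and their computation is straightforward. The event counts, patient-days, and admissions below are invented for illustration, not the study's department data.

```python
def adverse_event_rates(n_events, patient_days, admissions):
    """Standard GTT denominators: adverse events per 1,000 patient-days
    and per 100 admissions."""
    per_1000_days = n_events / patient_days * 1000
    per_100_admissions = n_events / admissions * 100
    return per_1000_days, per_100_admissions

# Hypothetical department: 18 events over 700 patient-days, 40 admissions
rate_days, rate_adm = adverse_event_rates(n_events=18,
                                          patient_days=700,
                                          admissions=40)
print(round(rate_days, 1))  # → 25.7 events per 1,000 patient-days
print(round(rate_adm, 1))   # → 45.0 events per 100 admissions
```

Reporting both rates matters because departments with long lengths of stay can look safe per admission while accumulating events per patient-day, and vice versa.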
Procedia PDF Downloads 248
52024 The Price of Knowledge in the Times of Commodification of Higher Education: A Case Study on the Changing Face of Education
Authors: Joanna Peksa, Faith Dillon-Lee
Abstract:
Current developments in Western economies have turned some universities into corporate institutions driven by practices of production and commodity. Academia is increasingly becoming integrated into national economies as a result of students paying fees and is consequently using business practices in student retention and engagement. With these changes, pedagogy's status as a priority within the institution has been shifting in light of the new demands. New strategies have blurred the boundaries that separate a student from a client. This has changed the dynamic, disrupting the traditional idea of the knowledge market and emphasizing the corporate aspect of universities. In some cases, where students are seen primarily as customers, the purpose of academia is no longer to educate but to sell a commodity and retain fee-paying students. This paper considers opposing viewpoints on the commodification of higher education, reflecting on the reality of maintaining a pedagogic grounding in an increasingly commercialized sector. By analysing a case study of the Student Success Festival, an event that involved academic and marketing teams, it considers the differences between the respective visions of the pedagogic arm of the university and the corporate one. This study argues that the initial concept of the event, based on the principles of gamification, independent learning, and cognitive criticality, was clearly linked to a grounded pedagogic approach. However, when liaising with the marketing team at a crucial step in the creative process, it became apparent that these principles were not considered a priority within their remit. While the study acknowledges the power of pedagogy, the findings show that a pact of concord is necessary between the different stakeholders in order for students to benefit fully from their learning experience.
Nevertheless, issues of power prevail, and wherever power is unevenly distributed, reaching a consensus becomes increasingly challenging; further research should therefore closely monitor the developments in pedagogy in UK higher education.
Keywords: economic pressure, commodification, pedagogy, gamification, public service, marketization
Procedia PDF Downloads 131
52023 Estimation of Elastic Modulus of Soil Surrounding Buried Pipeline Using Multi-Response Surface Methodology
Authors: Won Mog Choi, Seong Kyeong Hong, Seok Young Jeong
Abstract:
The stress on a pipeline buried under pavement is significantly affected by vehicle loads and by the elastic modulus of the soil surrounding the pipeline. The correct elastic modulus of the soil has to be applied to the finite element model in order to investigate the effect of vehicle loads on the buried pipeline using finite element analysis. The purpose of this study is to establish an approach to calculating the correct elastic modulus of the soil using an optimization process. The optimal elastic modulus of the soil, which minimizes the difference between the strain measured in a vehicle driving test at a velocity of 35 km/h and the strain calculated from finite element analyses, was calculated through an optimization process using multi-response surface methodology. Three elastic moduli of the soil surrounding the pipeline (road layer, original soil, and dense sand) were defined as the variables for the optimization. Further analyses with the optimal elastic moduli at velocities of 4.27 km/h, 15.47 km/h, and 24.18 km/h were performed and compared to the test results to verify the applicability of multi-response surface methodology. The results indicated that the strain of the buried pipeline was mostly affected by the elastic modulus of the original soil, followed by the dense sand and the road layer, and that the results of the further analyses with the optimal elastic moduli show good agreement with the test.
Keywords: pipeline, optimization, elastic modulus of soil, response surface methodology
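The inverse problem at the heart of this abstract, finding the soil moduli that make the predicted strain match the measured strain, can be sketched with a coarse grid search. The surrogate strain formula, the layer weights, and every number below are invented for illustration; in the study the prediction comes from finite element analysis and the search uses multi-response surface methodology, not a closed-form expression.

```python
measured_strain = 140.0  # microstrain at 35 km/h (assumed value)

def predicted_strain(e_road, e_soil, e_sand):
    """Toy surrogate: strain decreases as the weighted stiffness (MPa)
    of the surrounding layers increases. The 0.2/0.6/0.2 weights mimic
    the dominant influence of the original soil reported in the study."""
    return 5.0e4 / (0.2 * e_road + 0.6 * e_soil + 0.2 * e_sand)

grid = [50.0, 100.0, 200.0, 400.0]  # candidate moduli per layer, MPa
best = min(
    ((er, es, ed) for er in grid for es in grid for ed in grid),
    key=lambda m: abs(predicted_strain(*m) - measured_strain),
)
print(best, round(predicted_strain(*best), 1))
```

A response-surface approach replaces this brute-force loop with a fitted polynomial model of the simulator, which is what makes the optimization affordable when each evaluation is a full finite element run.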
Procedia PDF Downloads 382
52022 Need for Shariah Screening of Companies in Nigeria: Lessons from Other Jurisdictions
Authors: Aishat Abdul-Qadir Zubair
Abstract:
Background: The absence of a Shari'ah screening methodology for companies in Nigeria has deepened the uncertainty surrounding the acceptability of investing in certain companies by people professing the religion of Islam, due to the nature of the activities carried out by these companies. There are existing Shari'ah screening indices in other jurisdictions whose criteria can be used to check whether a company or business is Shari'ah-compliant, such as the FTSE, DJIM, and Standard & Poor's indices, to mention just a few. What these indices have done is ensure that there are benchmarks to check against before investing in companies that carry out mixed activities in their business, wherein some are halal and others may be haram. Purpose: There have been numerous studies on the need to adopt certain screening methodologies, as well as calls for new methods of screening companies for Shari'ah compliance, in order to suit the investment needs of Muslims in other jurisdictions. It is, however, unclear how suitable these methodologies would be for Nigeria. This paper therefore seeks to address this gap and to consider an appropriate screening methodology to be employed in Nigeria, drawing from the experience of other jurisdictions. Methods: This study employs a triangulation of quantitative and qualitative methods to analyze the need for Shari'ah screening of companies in Nigeria. The qualitative method proceeds by way of ijtihad, and the study applies Islamic principles of Maqasid al-Shari'ah as well as Qawaid al-Fiqhiyyah to analyze the activities of companies in order to ensure that they are indeed Shari'ah-compliant. In addition, using the quantitative data gathered from the interview survey, the perspective of investors with regard to the need for Shari'ah screening of companies in Nigeria is further analyzed.
Results: The result of the study shows a lack of awareness among the teeming Muslim population in Nigeria of the need for Shari'ah screening of companies. The result further shows the need to take into account the peculiar nature of company activities in Nigeria, and to set the necessary benchmarks, before any particular Shari'ah screening methodology is adopted. Conclusion and Implications: The study concludes that conscientious Muslims in Nigeria need to screen companies for Shari'ah compliance so that they can easily identify the companies to invest in. The paper therefore recommends that the Nigerian government come up with a screening methodology that suits the peculiar nature of companies in Nigeria. The study thus has direct implications for investment regulatory bodies in Nigeria, such as the Securities and Exchange Commission (SEC) and the Central Bank of Nigeria (CBN), as well as for Muslim investors.
Keywords: Shari'ah screening, Muslims, investors, companies
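The financial-ratio side of the screening indices mentioned above can be sketched as a simple filter. The ~33% and ~5% thresholds below are the figures commonly cited for DJIM-style screens; they vary between screening bodies, and the paper argues precisely that Nigeria may need its own benchmarks, so the numbers here are illustrative, not a prescription.

```python
def is_shariah_compliant(debt, cash_and_interest_securities,
                         receivables, market_cap,
                         impermissible_income, total_revenue,
                         threshold=0.33, income_cap=0.05):
    """Screen a company on DJIM-style financial ratios: debt, liquid
    interest-bearing assets, and receivables each below ~33% of market
    capitalization, and impermissible income below ~5% of revenue.
    Sector screening (excluding haram core activities) would come first
    and is omitted here."""
    ratios = [debt / market_cap,
              cash_and_interest_securities / market_cap,
              receivables / market_cap]
    income_ratio = impermissible_income / total_revenue
    return all(r < threshold for r in ratios) and income_ratio < income_cap

# Illustrative company figures (all invented)
print(is_shariah_compliant(debt=20, cash_and_interest_securities=10,
                           receivables=25, market_cap=100,
                           impermissible_income=2, total_revenue=80))
```

A Nigeria-specific methodology of the kind the paper calls for would tune both the thresholds and the choice of denominators (market capitalization versus total assets) to local market conditions.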
Procedia PDF Downloads 165