Search results for: simulated rainfall events
3440 זכור (Remember): An Analysis of Art as a Reflection of Sexual and Gendered Violence against Jewish Women during the Pogroms (1919-1920s) and the Nazi Era (1933-1943)
Authors: Isabella B. Davidman
Abstract:
Violence used against Jewish women in both the Eastern European pogroms and during the Nazi era was specifically gendered, targeting their female identity and dignity of womanhood. Not only did these acts of gendered violence dehumanize Jewish women, but they also hurt the Jewish community as a whole. The devastating sexual violence that women endured during the pogroms and the Nazi era caused profound trauma. Out of shame and fear, silence about women's experiences of sexual abuse manifests in forms that words cannot translate. Women have turned to art and other means of storytelling to convey their female experiences in visual and non-verbal ways. Therefore, this paper aims to address the historical accounts of gendered violence against Jewish women during the pogroms and the Nazi era, as well as art that reflects upon the female experience, in order to understand the emotional impact resulting from these events. To analyze the artwork, a feminist analysis was used to understand the intersection of gender with other systems of inequality, such as systemic antisemitism, in women's lives; this ultimately explained the ways in which cultural productions undermine and reinforce the political and social oppression of women by exploring how art confronts the exploitation of women's bodies. By analyzing the art in the context of specific acts of violence, such as public rape used as a strategic weapon, we are able to understand women's experiences and how these experiences, in turn, challenged their womanhood. Additionally, these atrocities, which often occurred in the public space, were dismissed and forgotten due to the social stigma of rape. In this sense, the experiences of women in the pogroms and the Nazi era were both highly unacknowledged and forgotten. Therefore, the art that was produced during those time periods, as well as art produced after those events, gives voice to the profound silence surrounding the narratives of Jewish women. Sexual violence is not merely a product of war but a weapon of war, used to cause physical and psychological destruction. In both the early twentieth-century pogroms and the Holocaust, the sexual violence that Jewish women endured was fundamentally the same: the rape of Jewish women became a focal target in the theater of violence; women were not raped simply because they were women, but specifically because they were Jewish women. Although the events of the pogroms and the Holocaust are in the past, the art that serves as testimony to the experience of Jewish women remains an everlasting reminder of the gendered violence that occurred. Even through covert expressions, such as an embroidered image of a bird eating an apple, the artwork gives voice to the many silenced victims of sexualized and gendered violence.
Keywords: gendered violence, holocaust, Nazi era, pogroms
Procedia PDF Downloads 107
3439 Combining the Deep Neural Network with the K-Means for Traffic Accident Prediction
Authors: Celso L. Fernando, Toshio Yoshii, Takahiro Tsubota
Abstract:
Understanding the causes of road accidents and predicting their occurrence is key to preventing deaths and serious injuries from road accident events. Traditional statistical methods such as Poisson and logistic regressions have been used to find the association of traffic environmental factors with accident occurrence; more recently, the artificial neural network (ANN), a computational technique that learns from historical data to make more accurate predictions, has emerged. Despite its ability to make accurate predictions, the ANN has difficulty dealing with a highly unbalanced distribution of attribute patterns in the training dataset; in such circumstances, the ANN treats the minority group as noise. However, in real-world data, the minority group is often the group of interest; e.g., in road traffic accident data, accident events are the group of interest. This study proposes a combination of the k-means algorithm with the ANN to improve the predictive ability of the neural network model by alleviating the effect of the unbalanced distribution of attribute patterns in the training dataset. The results show that the proposed method improves the ability of the neural network to make predictions on a dataset with a highly unbalanced distribution of attribute patterns; however, on an evenly distributed dataset, the proposed method performs almost like a standard neural network.
Keywords: accident risks estimation, artificial neural network, deep learning, k-means, road safety
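The abstract does not spell out how the two methods are coupled, so the sketch below shows one common reading as an assumption, not the authors' exact pipeline: k-means compresses the majority (no-accident) class into centroids so that the network trains on a balanced set. All data, layer sizes, and parameters are illustrative.

```python
# A minimal sketch: rebalance a skewed binary dataset with k-means centroids
# before training a neural network. Everything below is illustrative.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X_maj = rng.normal(0.0, 1.0, size=(9800, 6))   # no-accident records (98%)
X_min = rng.normal(1.5, 1.0, size=(200, 6))    # accident records (2%)
X = np.vstack([X_maj, X_min])
y = np.r_[np.zeros(len(X_maj)), np.ones(len(X_min))]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Replace majority-class records by as many k-means centroids as there are
# minority samples, so both classes contribute equal numbers of patterns.
n_min = int(y_tr.sum())
km = KMeans(n_clusters=n_min, n_init=10, random_state=0).fit(X_tr[y_tr == 0])
X_bal = np.vstack([km.cluster_centers_, X_tr[y_tr == 1]])
y_bal = np.r_[np.zeros(n_min), np.ones(n_min)]

clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
clf.fit(X_bal, y_bal)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```

On an already balanced dataset the centroid step changes little, which is consistent with the abstract's observation that the combined method then behaves like a standard network.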
Procedia PDF Downloads 164
3438 World’s Fair (EXPO) Induced Heritage
Authors: Işılay Tiarnagh Sheridan
Abstract:
World EXPO, short for “exposition”, is a large universal public exhibition held since 1851. In the 164 years since, it has been organized 34 times in 22 cities and, as a result, has given birth to its very own culture, unlike most other international events. It has an outstanding power to transform the places in which it is held into trademarks via changes in their urban tissue. For that reason, it is widely remembered by its host cities rather than its host countries. Within the scope of this change, some constructions were planned to be temporary, some were planned to be permanent, and some were intended to be temporary but were kept afterwards, becoming important monuments, such as the Crystal Palace of London (though it was later destroyed by a fire) and the Eiffel Tower of Paris. These are the most prominent examples that come to mind when considering World EXPOs. Yet there are many other legacies of these events within the modern city fabric today that we do not usually associate with their EXPO history. Some of them are leading figures not only for the host city but for other cities as well, such as the first metro line of Paris, built for the 1900 World EXPO; some are listed as monuments of their cities, such as the Saint Louis Art Museum of the 1904 World EXPO; some, like the Melbourne Royal Exhibition Building of the 1880 World’s EXPO, are among the UNESCO World Heritage Sites; and some are masterpieces of modern architecture, such as the famous Barcelona Pavilion, the German pavilion of the 1929 World’s EXPO by Ludwig Mies van der Rohe. Thus, the aim of this paper is to analyze the history of the World’s EXPO and its eventual results in the birth of its own cultural heritage. Upon organizing these results, the paper aims to create a brief list of EXPO heritage monuments and sites so as to form a database for their further conservation needs.
Keywords: expo, heritage, world's fair, legacy
Procedia PDF Downloads 442
3437 Management Strategies for Risk Events in Construction Industries during Economic Situation and COVID-19 Pandemic in Nigeria
Authors: Ezeabasili Chibuike Patrick
Abstract:
The complex situation of construction industries in Nigeria involves many risk events and failures: cost overrun, time overrun, corruption, government influence, subcontractor challenges, political influence and instability, cultural differences, human resources deficiencies, cash flow challenges, foreign exchange issues, inadequate design, safety issues, low productivity, late payment, quality control issues, project management issues, environmental issues, force majeure, and competition, among others. These have made the industry prone to risk and failures. Good project management remains effective in improving decision-making, which minimizes these risk events. This study was done to address these project risks and the good decision-making needed to avert them. A mixed-method approach to research was used for this study. Data collected through questionnaires and interviews with thirty-two (32) construction professionals were analyzed to aid the knowledge and management of the risks that were identified. The study revealed that there is no good risk management expertise in Nigeria, and that the economic/political situation and the recent COVID-19 pandemic have added to the risks and poor management strategies. Contingency theory and contingency costing have therefore surfaced as the most strategic management methods used to reduce these risk issues, and they seem to be very effective.
Keywords: strategies, risk management, contingency theory, Nigeria
Procedia PDF Downloads 132
3436 Harvesting of Kinetic Energy of the Raindrops
Authors: K. C. R. Perera, V. P. C. Dassanayake, B. M. Hapuwatte, B. G. Smapath
Abstract:
This paper presents a methodology to harvest the kinetic energy of raindrops using piezoelectric devices. In the study, a 1 m × 1 m PVDF (polyvinylidene fluoride) piezoelectric membrane, fixed at its four edges, is considered for the numerical simulation of membrane deformation due to the impact of raindrops. The simulation is then performed according to raindrop size, classifying rainfall into three categories: light stratiform rain, moderate stratiform rain, and heavy thundershower. The impact force of a raindrop depends on its terminal velocity, which is a function of raindrop diameter. The results were then analyzed to calculate the harvestable energy from the deformation of the piezoelectric membrane.
Keywords: raindrop, piezoelectricity, deformation, terminal velocity
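The abstract does not state which diameter-velocity relation it uses, so the sketch below assumes the empirical Atlas-Ulbrich power law v = 3.78·D^0.67 (D in mm, v in m/s) to show how terminal velocity and per-drop kinetic energy scale with drop size; the diameters chosen for the three rainfall classes are illustrative.

```python
# Terminal velocity and kinetic energy of a single spherical raindrop,
# E = 0.5 * m * v^2 with m = rho * pi * D^3 / 6 (assumed power-law velocity).
import math

RHO_WATER = 1000.0  # kg/m^3

def terminal_velocity(d_mm: float) -> float:
    """Empirical terminal velocity (m/s) for diameter d_mm in millimetres."""
    return 3.78 * d_mm ** 0.67

def drop_kinetic_energy(d_mm: float) -> float:
    """Kinetic energy (J) of one drop falling at terminal velocity."""
    d_m = d_mm / 1000.0
    mass = RHO_WATER * math.pi * d_m ** 3 / 6.0
    return 0.5 * mass * terminal_velocity(d_mm) ** 2

# Illustrative drop sizes for the three rainfall classes in the abstract.
for label, d in [("light stratiform", 1.0), ("moderate stratiform", 2.0),
                 ("heavy thundershower", 4.0)]:
    print(f"{label:20s} D={d:.1f} mm  v={terminal_velocity(d):5.2f} m/s  "
          f"E={drop_kinetic_energy(d):.2e} J")
```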
Procedia PDF Downloads 324
3435 Risk of Fatal and Non-Fatal Coronary Heart Disease and Stroke Events among Adult Patients with Hypertension: Basic Markov Model Inputs for Evaluating Cost-Effectiveness of Hypertension Treatment: Systematic Review of Cohort Studies
Authors: Mende Mensa Sorato, Majid Davari, Abbas Kebriaeezadeh, Nizal Sarrafzadegan, Tamiru Shibru, Behzad Fatemi
Abstract:
Markov models, such as the cardiovascular disease (CVD) policy model based simulation, are being used for evaluating the cost-effectiveness of hypertension treatment. Stroke, angina, myocardial infarction (MI), cardiac arrest, and all-cause mortality were included in this model. Hypertension is a risk factor for a number of vascular and cardiac complications and CVD outcomes. Objective: This systematic review was conducted to evaluate the comprehensiveness of this model across different regions globally. Methods: We searched articles written in the English language from PubMed/Medline, Ovid/Medline, Embase, Scopus, Web of Science, and Google Scholar with a systematic search query. Results: Thirteen cohort studies involving a total of 2,165,770 participants (1,666,554 hypertensive adults and 499,226 adults with treatment-resistant hypertension) were included in this review. Hypertension is clearly associated with coronary heart disease (CHD) and stroke mortality, unstable angina, stable angina, MI, heart failure (HF), sudden cardiac death, transient ischemic attack, ischemic stroke, subarachnoid hemorrhage, intracranial hemorrhage, peripheral arterial disease (PAD), and abdominal aortic aneurysm (AAA). The association between HF and hypertension is variable across regions. Treatment-resistant hypertension is associated with a higher relative risk of developing major cardiovascular events and all-cause mortality when compared with non-resistant hypertension; however, it is not included in the previous CVD policy model. Conclusion: The CVD policy model can be used in most regions for the evaluation of the cost-effectiveness of hypertension treatment. However, hypertension is highly associated with HF in Latin America, the Caribbean, Eastern Europe, and Sub-Saharan Africa; therefore, it is important to consider HF in the CVD policy model for evaluating the cost-effectiveness of hypertension treatment in these regions. We do not suggest the inclusion of PAD and AAA in the CVD policy model due to a lack of sufficient evidence. Researchers should consider the effect of treatment-resistant hypertension, either by including it in the basic model or when setting the model assumptions.
Keywords: cardiovascular disease policy model, cost-effectiveness analysis, hypertension, systematic review, twelve major cardiovascular events
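For readers unfamiliar with the model class under review, a minimal Markov cohort sketch follows; the states and one-year transition probabilities are illustrative placeholders, not inputs extracted from the thirteen cohort studies.

```python
# A minimal Markov cohort model: a hypertensive cohort moves among four
# states each cycle according to a fixed transition matrix (illustrative).
import numpy as np

states = ["Well", "CHD", "Stroke", "Dead"]
P = np.array([
    [0.94, 0.03, 0.02, 0.01],  # from Well (hypertensive, event-free)
    [0.00, 0.90, 0.02, 0.08],  # from CHD
    [0.00, 0.02, 0.88, 0.10],  # from Stroke
    [0.00, 0.00, 0.00, 1.00],  # Dead is absorbing
])
assert np.allclose(P.sum(axis=1), 1.0)  # rows must be probability vectors

cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts event-free
for year in range(1, 11):
    cohort = cohort @ P                  # one annual cycle
    print(year, dict(zip(states, np.round(cohort, 3))))
```

Cost-effectiveness analyses attach costs and utilities to each state occupancy; the review's point is that which states to include (e.g., HF, PAD, AAA) should depend on the region-specific evidence.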
Procedia PDF Downloads 72
3434 Deep Reinforcement Learning for Advanced Pressure Management in Water Distribution Networks
Authors: Ahmed Negm, George Aggidis, Xiandong Ma
Abstract:
Given the diverse nature of urban cities, customer demand patterns, landscape topologies, and even seasonal weather trends, managing our water distribution networks (WDNs) has proved a complex task. These unpredictable circumstances manifest as pipe failures, intermittent supply, and burst events, thus adding to water loss, energy waste, and increased carbon emissions. Whilst these events are unavoidable, advanced pressure management has proved an effective tool to control and mitigate them. Water utilities have nevertheless struggled to develop a real-time control method that is resilient when confronting the challenges of water distribution. In this paper, we use deep reinforcement learning (DRL) algorithms as a novel pressure control strategy to minimise pressure violations and leakage under both burst and background leakage conditions. Agents based on advantage actor critic (A2C) and recurrent proximal policy optimisation (Recurrent PPO) were trained and compared to benchmarked optimisation algorithms (differential evolution and particle swarm optimisation). A2C managed to minimise leakage by 32.48% under burst conditions and 67.17% under background conditions, the highest performance among the DRL algorithms. A2C and Recurrent PPO performed well in comparison to the benchmarks, with higher processing speed and lower computational effort.
Keywords: deep reinforcement learning, pressure management, water distribution networks, leakage management
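As a concrete illustration of the method class (not the authors' WDN environment or hyperparameters), a toy one-step advantage actor-critic loop on a scalar pressure-control problem might look like this; the valve dynamics, reward weights, and learning rates are all invented for the sketch.

```python
# Toy A2C-style loop: the agent adjusts a valve so pressure stays in a target
# band while a leakage term penalises excess pressure. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)
ACTIONS = np.array([-0.1, 0.0, 0.1])     # valve head adjustment (normalised)
P_MIN, P_MAX, LEAK_K = 0.3, 0.6, 0.2     # target band, leakage coefficient

def features(p):
    return np.array([1.0, p, p * p])

def step(p, a_idx):
    p_next = np.clip(p + ACTIONS[a_idx] + rng.normal(0, 0.02), 0.0, 1.0)
    violation = max(0.0, P_MIN - p_next) + max(0.0, p_next - P_MAX)
    return p_next, -(violation + LEAK_K * p_next)  # leakage grows with pressure

theta = np.zeros((3, 3))   # actor: one logit row per action over features
w = np.zeros(3)            # critic: linear value function
alpha_pi, alpha_v, gamma = 0.05, 0.1, 0.95

for episode in range(300):
    p = rng.uniform(0.0, 1.0)
    for _ in range(50):
        phi = features(p)
        logits = theta @ phi
        probs = np.exp(logits - logits.max()); probs /= probs.sum()
        a = rng.choice(3, p=probs)
        p_next, r = step(p, a)
        delta = r + gamma * (w @ features(p_next)) - w @ phi  # TD advantage
        w += alpha_v * delta * phi
        grad = -np.outer(probs, phi); grad[a] += phi          # grad log pi
        theta += alpha_pi * delta * grad
        p = p_next

phi = features(0.8)
pref = np.exp(theta @ phi - (theta @ phi).max())
print("policy at p = 0.8 (down, hold, up):", np.round(pref / pref.sum(), 2))
```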
Procedia PDF Downloads 93
3433 The Impact of the Coronavirus Outbreak Crisis on Startups
Authors: Mohammad Mehdizadeh, Sara Miri
Abstract:
Due to the recent events surrounding the global health crisis and the spread of the coronavirus (COVID-19), the activities of many businesses and start-up companies have been disrupted. Startups solve many economic problems and can reduce unemployment in countries, because governments can take advantage of their potential without direct investment; with the help of their innovative ideas and new technologies, these companies can develop and grow the economy. However, it is essential to consider that there is no guarantee of their success in the face of unforeseen events: the coronavirus outbreak of the last two years has seriously damaged these companies and, as for other businesses, challenges and stagnation have set in. The startup companies' challenge in the face of the coronavirus begins with its impact on customers: changing customer behavior can affect their products and distribution channels. On the other hand, to prevent countless losses in this crisis, startup companies require creative solutions to address challenges in various areas of human capital, supply chain management, sales and marketing, and so on. Therefore, all business leaders must consider and plan for both the current crisis and the future; after overcoming these conditions, returning to the old business routines will no longer be an option, and new situations will prevail in a competitive environment. Essential strategies for developing and growing startups during the coronavirus outbreak include connecting with the global startup ecosystem, hosting webinars, providing podcasts and free question-and-answer sessions, offering mentoring services to growing teams, and consulting firms on digitalization.
Keywords: business, COVID-19, digitalization, startups
Procedia PDF Downloads 165
3432 An Engineered Epidemic: Big Pharma's Role in the Opioid Crisis
Authors: Donna L. Roberts
Abstract:
2019 marked 23 years since Purdue Pharma launched its flagship drug, OxyContin, which unleashed an unprecedented epidemic touching celebrities and ordinary citizens alike, in metropolitan, suburban, and rural areas, and at all levels of socioeconomic status. From rural Appalachia to East LA, individuals, families, and communities have been devastated by a trajectory of addiction that often began with the legitimate prescription of a painkiller for anything from a tooth extraction to a sports injury to recovery from surgery or chronic arthritis. Far from being an accidental progression of events, the proliferation of this new breed of 'miracle drug' was instead a carefully crafted marketing program aimed at both the medical community and ordinary citizens. This research represents an in-depth investigation of the evolution of the marketing, distribution, and promotion of prescription opioids by pharmaceutical companies and its relationship to the propagation of the opioid crisis. Specifically, key components of Purdue Pharma's aggressive marketing campaign, including its bonus system and sales incentives, were analyzed in the context of the sociopolitical environment that essentially created the proverbial 'perfect storm' for the changing manner in which pain is treated in the U.S. The analysis of this series of events clearly indicates its role in, first, the increase in the prescription of opioids for non-terminal pain relief and, subsequently, the incidence of related addiction, overdose, and death. Through this examination of the conditions that facilitated and maintained this drug crisis, perhaps we can begin to chart a course toward its resolution.
Keywords: addiction, opioid, opioid crisis, Purdue Pharma
Procedia PDF Downloads 123
3431 A Three-Dimensional (3D) Numerical Study of Roof Shape Impact on Air Quality in Urban Street Canyons with Tree Planting
Authors: Bouabdellah Abed, Mohamed Bouzit, Lakhdar Bouarbi
Abstract:
The objective of this study is to investigate numerically the effect of roof shape on wind flow and pollutant dispersion in a street canyon with one row of trees of pore volume Pvol = 96%. A three-dimensional computational fluid dynamics (CFD) model for evaluating air flow and pollutant dispersion within an urban street canyon is used, based on the Reynolds-averaged Navier-Stokes (RANS) equations with the k-epsilon EARSM turbulence model as closure of the equation system. The numerical model is implemented in the ANSYS-CFX code. Vehicle emissions were simulated as double line sources along the street. The numerical model was validated against a wind tunnel experiment. Having established this, the wind flow and pollutant dispersion in urban street canyons with six roof shapes were simulated; the numerical simulation agrees reasonably well with the wind tunnel data. The results obtained in this work indicate that the flow in the 3D domain is more complicated, and this complexity is increased by the presence of trees and the variability of the roof shapes. The results also indicate that the largest pollutant concentration level on the two walls (leeward and windward) is observed with the upwind wedge-shaped roof, while the smallest pollutant concentration level is observed with the dome-shaped roof. Finally, the corner eddies provide additional ventilation and lead to lower traffic pollutant concentrations at the street canyon ends.
Keywords: street canyon, pollutant dispersion, trees, building configuration, numerical simulation, k-epsilon EARSM
Procedia PDF Downloads 366
3430 A Standard Operating Procedure (SOP) for Forensic Soil Analysis: Tested Using a Simulated Crime Scene
Authors: Samara A. Testoni, Vander F. Melo, Lorna A. Dawson, Fabio A. S. Salvador
Abstract:
Soil traces are useful as forensic evidence due to their potential to transfer and adhere to different types of surfaces on a range of objects or persons. The great variability expressed by soil physical, chemical, biological, and mineralogical properties shows that soil traces are complex mixtures. Soils are continuous and variable, and no two soil samples are identical; nevertheless, the complexity of soil characteristics can provide powerful evidence for comparative forensic purposes. This work aimed to establish a Standard Operating Procedure (SOP) for forensic soil analysis in Brazil. We carried out a simulated crime scene with double-blind sampling to calibrate the sampling procedures. Samples were collected at a range of locations covering a range of soil types found in the south of Brazil: Santa Candida and Boa Vista, neighbourhoods of Curitiba (State of Parana), and Guarani and Guaraituba, neighbourhoods of Colombo (Curitiba Metropolitan Region). A previously validated sequence of chemical, physical, and mineralogical analyses was performed on around 2 g of soil. The suggested SOP and the sequential range of analyses were effective in grouping together the samples from the same place and from the same parent material, and successfully discriminated samples from different locations and originating from different rocks. In addition, modifications to the sample treatment and analytical protocol can be made depending on the context of the forensic work.
Keywords: clay mineralogy, forensic soil analysis, sequential analyses, kaolinite, gibbsite
Procedia PDF Downloads 255
3429 West Nile Virus Outbreaks in Canada under Expected Climate Conditions
Authors: Jalila Jbilou, Salaheddine El Adlouni, Pierre Gosselin
Abstract:
Background: West Nile virus (WNV) is increasingly an important public health issue in North America. In Canada, WNV was officially reported in Toronto and Montréal for the first time in 2001. During the last decade, several WNV events have been reported in several Canadian provinces. The main objective of the present study is to update the frequency of the climate conditions favorable to WNV outbreaks in Canada. Method: Statistical frequency analysis was used to estimate the return periods of climate conditions associated with WNV outbreaks for the 1961-2050 period. The best-fitting distribution is selected through the Akaike Information Criterion, and the parameters are estimated using the maximum likelihood approach. Results: Results show that the climate conditions related to the 2002 event, for Montreal and Toronto, are becoming more frequent. For Saskatoon, the highest degree-day (DD20) events recorded in the last few decades were observed in 2003 and 2007; the estimated return periods are 30 years and 70 years, respectively. Conclusion: The emergence of WNV was related to extremely high DD values in the summer, although some exceptions may be related to factors such as virus persistence, vector migration, and improved diagnosis and reporting levels. It is clear that such climate conditions have become much more common in the last decade and will likely continue to do so over future decades.
Keywords: West Nile virus, climate, North America, statistical frequency analysis, risk estimation, public health, modeling, scenario, temperature, precipitation
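A minimal sketch of this kind of frequency analysis follows, with synthetic annual maxima standing in for the degree-day index; the candidate distributions, the event magnitude, and all numbers are assumptions for illustration. The return period of an event of magnitude x is T = 1 / (1 - F(x)).

```python
# Fit candidate distributions by maximum likelihood, select by AIC, and
# convert an event magnitude into a return period. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
annual_max = stats.genextreme.rvs(c=-0.1, loc=300, scale=40, size=60,
                                  random_state=rng)

candidates = {"GEV": stats.genextreme, "Gumbel": stats.gumbel_r,
              "Lognormal": stats.lognorm}
fits = {}
for name, dist in candidates.items():
    params = dist.fit(annual_max)              # maximum likelihood estimation
    ll = np.sum(dist.logpdf(annual_max, *params))
    aic = 2 * len(params) - 2 * ll             # Akaike Information Criterion
    fits[name] = (aic, dist, params)
    print(f"{name:10s} AIC = {aic:.1f}")

best = min(fits, key=lambda k: fits[k][0])
aic, dist, params = fits[best]
event = 420.0                                  # illustrative outbreak-year value
T = 1.0 / (1.0 - dist.cdf(event, *params))
print(f"best fit: {best}; return period of {event:.0f}: {T:.0f} years")
```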
Procedia PDF Downloads 347
3428 Experimental Determination of Water Productivity of Improved Cassava Varieties Propagation under Rain-Fed Condition in Tropical Environment
Authors: Temitayo Abayomi Ewemoje, Isaac Olugbemiga Afolayan, Badmus Alao Tayo
Abstract:
Researchers in developing countries have worked on improving cassava resistance to diseases and pests, high yield, and early maturity. However, water management has received little or no attention, as cassava cultivation in Sub-Saharan Africa has depended on available precipitation (rain-fed conditions). The need for water management in agricultural crop production therefore cannot be overemphasized. As other sectors compete with the agricultural sector for fresh water (which is not readily available), there is a need to increase water productivity in agricultural production. Experimentation was conducted to examine the water use, growth, and yield of improved cassava varieties under rain-fed conditions using a Latin-square design with four replications. Four improved disease-free stem cassava varieties, TMS (30572, 980505, 920326 and 090581), were planted, and growth parameters of the varieties were monitored at 90 and 120 days after planting (DAP). Effective rainfall useful for plant growth was calculated using CROPWAT 8 for Windows. Results indicated that TMS090581 had the highest tuber yield and plant height, while TMS30572 had the highest number of nodes. Tuber, stem, and leaf water productivities at 90 and 120 DAP of TMS (30572, 980505, 920326 and 090581) are (1.27 and 3.58, 1.44 and 2.35, 0.89 and 1.86, 1.64 and 3.77) kg/m3, (1.56 and 2.59, 1.95 and 2.02, 1.98 and 2.05, 1.95 and 2.18) kg/m3, and (1.34 and 2.32, 1.94 and 2.16, 1.57 and 1.40, 1.27 and 1.80) kg/m3, respectively. Based on tuber water productivity, TMS090581 is recommended, while TMS30572 is recommended based on leaf and stem productivity in water-scarce regions.
Keywords: improved TMS varieties, leaf productivity, rain-fed cassava production, stem productivity, tuber productivity
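As a reading aid for the kg/m3 figures, a minimal water-productivity calculation follows, under the assumption that WP is yield divided by the volume of effective rainfall received; the yield and rainfall numbers are invented.

```python
# Water productivity bookkeeping: WP (kg/m3) = yield / effective-rain volume.
def water_productivity(yield_kg_per_ha: float, effective_rain_mm: float) -> float:
    """kg of produce per m3 of effective rainfall (1 mm over 1 ha = 10 m3)."""
    water_m3_per_ha = effective_rain_mm * 10.0
    return yield_kg_per_ha / water_m3_per_ha

# e.g. a tuber yield of 15 t/ha grown on 450 mm of effective rainfall:
print(f"{water_productivity(15000, 450):.2f} kg/m3")  # -> 3.33 kg/m3
```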
Procedia PDF Downloads 346
3427 Application of Biopolymer for Adsorption of Methylene Blue Dye from Simulated Effluent: A Green Method for Textile Industry Wastewater Treatment
Authors: Rabiya, Ramkrishna Sen
Abstract:
The textile industry releases huge volumes of effluent containing reactive dyes into nearby water bodies. These effluents are a significant source of water pollution, since most of the dyes are toxic in nature; moreover, they scavenge the dissolved oxygen essential to aquatic species. Therefore, it is necessary to treat dye effluent before it is discharged into nearby water bodies. The present study focuses on removing the basic dye methylene blue from simulated wastewater using a biopolymer. The biopolymer was partially purified from a culture of Bacillus licheniformis by ultrafiltration. Based on the elution profile of the biopolymer from an ion exchange column, it was found to be a negatively charged molecule. Its net anionic nature allows the biopolymer to adsorb the positively charged molecule methylene blue. The major factors which influence the removal of the dye by the biopolymer, such as incubation time, pH, and initial dye concentration, were evaluated. Methylene blue uptake by the biopolymer is higher near neutral pH (14.84 mg/g) than at acidic pH (12.05 mg/g). At low pH, the lower dissociation of the dye molecule as well as the low negative charge available on the biopolymer reduce the interaction between the biopolymer and the dye. The optimum incubation time for maximum removal of the dye was found to be 60 min. The entire study was done with 25 mL of dye solution in a 100 mL flask at 25 °C with a biopolymer dose of 11 g/L. To study the adsorption isotherm, the dye concentration was varied in the range of 25 mg/L to 205 mg/L. The dye uptake by the biopolymer was plotted against the equilibrium concentration; the plot indicates that the adsorption of the dye by the biopolymer follows the Freundlich adsorption isotherm (R² = 0.99). Hence, these studies indicate the potential use of the biopolymer for the removal of basic dyes from textile wastewater in an eco-friendly and sustainable way.
Keywords: biopolymer, methylene blue dye, textile industry, wastewater
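A minimal sketch of the Freundlich fit reported, q = K_F·Ce^(1/n), linearised as log q = log K_F + (1/n)·log Ce; the equilibrium data points below are synthetic, not the study's measurements.

```python
# Freundlich isotherm fit by log-log linear regression (synthetic data).
import numpy as np

Ce = np.array([5.0, 20.0, 45.0, 80.0, 120.0, 160.0])   # equilibrium conc., mg/L
qe = np.array([3.1, 6.0, 8.4, 10.6, 12.4, 13.9])       # uptake, mg/g

slope, intercept = np.polyfit(np.log10(Ce), np.log10(qe), 1)
K_F, n = 10 ** intercept, 1.0 / slope                  # Freundlich constants

pred = intercept + slope * np.log10(Ce)
ss_res = np.sum((np.log10(qe) - pred) ** 2)
ss_tot = np.sum((np.log10(qe) - np.log10(qe).mean()) ** 2)
print(f"K_F = {K_F:.2f} (mg/g)(L/mg)^(1/n), n = {n:.2f}, "
      f"R^2 = {1 - ss_res / ss_tot:.3f}")
```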
Procedia PDF Downloads 142
3426 Relationships of Plasma Lipids, Lipoproteins and Cardiovascular Outcomes with Climatic Variations: A Large 8-Year Period Brazilian Study
Authors: Vanessa H. S. Zago, Ana Maria H. de Avila, Paula P. Costa, Welington Corozolla, Liriam S. Teixeira, Eliana C. de Faria
Abstract:
Objectives: The outcome of cardiovascular disease is affected by environment and climate. This study evaluated the possible relationships between climatic and environmental changes and the occurrence of biological rhythms in serum lipids and lipoproteins in a large population sample in the city of Campinas, State of Sao Paulo, Brazil. In addition, it determined the temporal variations of death due to atherosclerotic events in Campinas during the time window examined. Methods: A large 8-year retrospective study was carried out to evaluate the lipid profiles of individuals attended at the University of Campinas (Unicamp). The study population comprised 27,543 individuals of both sexes and of all ages. Normolipidemic and dyslipidemic individuals, classified according to the Brazilian guidelines on dyslipidemias, participated in the study. For the same period, temperature, relative humidity, and daily brightness records were obtained from the Centro de Pesquisas Meteorologicas e Climaticas Aplicadas a Agricultura/Unicamp, and frequencies of death due to atherosclerotic events in Campinas were acquired from the Brazilian official database DATASUS, according to the International Classification of Diseases. Statistical analyses were performed using both the Cosinor and ARIMA temporal analysis methods; for cross-correlation analysis between climatic and lipid parameters, cross-correlation functions were used. Results: Preliminary results indicated that rhythmicity was significant for LDL-C and HDL-C in both normolipidemic and dyslipidemic subjects (n = 11,892 and 15,651, respectively), both measures increasing in the winter and decreasing in the summer. On the other hand, in dyslipidemic subjects triglycerides increased in summer and decreased in winter, in contrast to normolipidemic ones, in whom triglycerides did not show rhythmicity. The number of deaths due to atherosclerotic events showed significant rhythmicity, with maximum and minimum frequencies in winter and summer, respectively. Cross-correlation analyses showed that low humidity and temperature, higher thermal amplitude, and dark cycles are associated with increased levels of LDL-C and HDL-C during winter. In contrast, TG showed moderate cross-correlations with temperature and minimum humidity in an inverse way: maximum temperature and humidity increased TG during the summer. Conclusions: This study showed a coincident rhythmicity between low temperatures, high concentrations of LDL-C and HDL-C, and the number of deaths due to atherosclerotic cardiovascular events in individuals from the city of Campinas. The opposite behavior of cholesterol and TG suggests different physiological mechanisms in their metabolic modulation by changes in climate parameters. Thus, new analyses are underway to better elucidate these mechanisms, as well as variations in lipid concentrations in relation to climatic variations and their associations with atherosclerotic disease and death outcomes in Campinas.
Keywords: atherosclerosis, climatic variations, lipids and lipoproteins, associations
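A minimal sketch of the single-component cosinor fit named in the methods, on synthetic fortnightly lipid values with an assumed one-year period: the model y(t) = M + A·cos(2πt/τ + φ) is linearised as y = M + b1·cos(ωt) + b2·sin(ωt) and fitted by least squares.

```python
# Cosinor rhythm fit: recover MESOR, amplitude, and acrophase (synthetic data).
import numpy as np

tau = 365.0                                    # assumed period: one year (days)
t = np.arange(0, 8 * 365, 14, dtype=float)     # fortnightly samples, 8 years
rng = np.random.default_rng(7)
y = 120 + 8 * np.cos(2 * np.pi * t / tau + 0.6) + rng.normal(0, 3, t.size)

w = 2 * np.pi / tau
X = np.column_stack([np.ones_like(t), np.cos(w * t), np.sin(w * t)])
(M, b1, b2), *_ = np.linalg.lstsq(X, y, rcond=None)

A = np.hypot(b1, b2)                # amplitude, since b1 = A cos(phi)
phi = np.arctan2(-b2, b1)           # acrophase, since b2 = -A sin(phi)
print(f"MESOR = {M:.1f}, amplitude = {A:.1f}, acrophase = {phi:.2f} rad")
```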
Procedia PDF Downloads 118
3425 Perceived Restorativeness Scale-6: A Short Version of the Perceived Restorativeness Scale for Mixed (or Mobile) Devices
Authors: Sara Gallo, Margherita Pasini, Margherita Brondino, Daniela Raccanello, Roberto Burro, Elisa Menardo
Abstract:
Most of the studies on the ability of environments to restore people's cognitive resources have been conducted in the laboratory using simulated environments (e.g., photographs, videos, or virtual reality), based on the implicit assumption that exposure to simulated environments has the same effects as exposure to real environments. However, the technical characteristics of simulated environments, such as the dynamic or static character of the stimulus, critically affect their perception. Measuring perceived restorativeness in situ rather than in the laboratory could increase the validity of the obtained measurements. Personal mobile devices could be useful because they allow immediate access to online surveys when people are directly exposed to an environment. At the same time, it becomes important to develop short and reliable measuring instruments that allow a quick assessment of the restorative qualities of environments. One of the frequently used self-report measures to assess perceived restorativeness is the “Perceived Restorativeness Scale” (PRS), based on Attention Restoration Theory. Many different versions have been proposed and used according to different research purposes and needs, without studying their validity. This longitudinal study reports some preliminary validation analyses of a short version of the original scale, the PRS-6, developed to be quick and mobile-friendly. It is composed of 6 items assessing fascination and being-away. 102 Italian university students participated in the study, 84% female, with age ranging from 18 to 47 (M = 20.7; SD = 2.9). Data were obtained through an online survey that asked them to report the perceived restorativeness of the environment they were in (and the kind of environment) and their positive emotion (Positive and Negative Affect Schedule, PANAS) once a day for seven days. Cronbach's alpha and item-total correlations were used to assess reliability and internal consistency. Confirmatory Factor Analysis (CFA) models were run to study the factorial structure (construct validity). Correlation analyses between PRS and PANAS scores were used to check discriminant validity. Finally, multigroup CFA models were used to study measurement invariance (configural, metric, scalar, strict) between different mobile devices and between days of assessment. On the whole, the PRS-6 showed good psychometric properties, similar to those of the original scale, and invariance across devices and days. These results suggest that the PRS-6 could be a valid alternative for assessing perceived restorativeness when researchers need a brief and immediate evaluation of the restorative quality of an environment.
Keywords: restorativeness, validation, short scale development, psychometric properties
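A minimal sketch of the reliability check named above, computing Cronbach's alpha for a 6-item scale from a synthetic respondents-by-items matrix:

```python
# Cronbach's alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(size=(102, 1))                         # shared trait
scores = np.clip(np.round(3 + latent + rng.normal(0, 0.8, (102, 6))), 1, 5)
print(f"alpha = {cronbach_alpha(scores):.2f}")
```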
Procedia PDF Downloads 254
3424 Monte Carlo Simulation Study on Improving the Flattening Filter-Free Radiotherapy Beam Quality Using Filters from Low-Z Material
Authors: H. M. Alfrihidi, H. A. Albarakaty
Abstract:
Flattening-filter-free (FFF) photon beam radiotherapy has increased in the last decade, enabled by advancements in treatment planning systems and radiation delivery techniques like multi-leaf collimators. FFF beams have higher dose rates, which reduces treatment time. On the other hand, FFF beams have a higher surface dose, due to the loss of the beam-hardening effect normally provided by the flattening filter (FF). The possibility of improving FFF beam quality using filters of low-Z materials such as steel and aluminium (Al) was investigated using Monte Carlo (MC) simulations. The attenuation coefficient of low-Z materials is higher for low-energy photons than for high-energy photons, which leads to hardening of the FFF beam and, consequently, a reduction in the surface dose. The BEAMnrc user code, based on the Electron Gamma Shower (EGSnrc) MC code, is used to simulate the beam of a 6 MV TrueBeam linac. A phase-space file provided by Varian Medical Systems was used as the radiation source in the simulation; this phase-space file was scored just above the jaws, at 27.88 cm from the target. The linac from the jaws downward was constructed, and the radiation passing through was simulated and scored at 100 cm from the target. To study the effect of low-Z filters, steel and Al filters with a thickness of 1 cm were added below the jaws, and the phase space was scored at 100 cm from the target. For comparison, the FF beam was simulated using a similar setup. The BEAM Data Processor (BEAMdp) is used to analyse the energy spectra in the phase-space files. Then, the dose distributions resulting from these beams were simulated in a homogeneous water phantom using DOSXYZnrc. The dose profiles were evaluated according to the surface dose, the lateral dose distribution, and the percentage depth dose (PDD). The energy spectra of the beams show that the FFF beam is softer than the FF beam: the energy peaks for the FFF and FF beams are 0.525 MeV and 1.52 MeV, respectively. The FFF beam's energy peak becomes 1.1 MeV using a steel filter, while the Al filter does not affect the peak position. The steel and Al filters reduced the surface dose by 5% and 1.7%, respectively. The dose at a depth of 10 cm (D10) rises by around 2% and 0.5% when using the steel and Al filters, respectively. On the other hand, the steel and Al filters reduce the dose rate of the FFF beam by 34% and 14%, respectively; however, this effect on the dose rate is much lower than that of the tungsten FF, which reduces the dose rate by about 60%. In conclusion, filters of low-Z material decrease the surface dose and increase the D10 dose, allowing high-dose delivery to deep tumors with a low skin dose. Although using these filters affects the dose rate, this effect is much lower than the effect of the FF.
Keywords: flattening filter free, Monte Carlo, radiotherapy, surface dose
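The physics the study exploits can be sketched in a few lines: attenuate a polyenergetic spectrum through a filter with N_out(E) = N_in(E)·exp(-μ(E)·t) and watch the mean energy rise. The spectrum shape and the μ(E) trend below are illustrative placeholders, not EGSnrc output or tabulated attenuation data.

```python
# Toy beam-hardening demonstration: low energies are attenuated more,
# so the mean energy of the transmitted spectrum increases.
import numpy as np

E = np.linspace(0.1, 6.0, 300)                  # photon energy, MeV
spectrum = E * np.exp(-E / 0.9)                 # toy 6 MV-like fluence shape

mu_steel = 0.6 * E ** -0.5                      # 1/cm, illustrative trend only
filtered = spectrum * np.exp(-mu_steel * 1.0)   # through 1 cm of "steel"

mean_in = (E * spectrum).sum() / spectrum.sum()
mean_out = (E * filtered).sum() / filtered.sum()
print(f"mean energy: {mean_in:.2f} MeV -> {mean_out:.2f} MeV (hardened)")
```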
Procedia PDF Downloads 73
3423 Unsteady Flow Simulations for Microchannel Design and Its Fabrication for Nanoparticle Synthesis
Authors: Mrinalini Amritkar, Disha Patil, Swapna Kulkarni, Sukratu Barve, Suresh Gosavi
Abstract:
Micro-mixers play an important role in lab-on-a-chip applications and micro total analysis systems, providing the correct level of mixing for any given process. The mixing process can be classified as active or passive according to the use of external energy. The microfluidics literature reports that most work has been done on models of steady laminar flow; however, the study of unsteady laminar flow is an active area of research at present. There are wide applications of this, out of which we consider nanoparticle synthesis in micro-mixers. In this work, we have developed a model of unsteady flow to study the mixing performance of a passive micro-mixer for the reactants used in such synthesis. The model is developed in the finite volume method (FVM) based software OpenFOAM. The model is tested by carrying out simulations at a Reynolds number (Re) of 0.5. Mixing performance of the micro-mixer is investigated using simulated concentration values of the mixed species across the width of the micro-mixer and calculating the variance across a line profile. Experimental validation is done by passing dyes through a Y-shaped micro-mixer fabricated using polydimethylsiloxane (PDMS) polymer and comparing the variances with the simulated ones. Gold nanoparticles are later synthesized in the micro-mixer and collected at two different times, leading to significantly different size distributions; these times match the time scales over which reactant concentrations vary, as obtained from the simulations. Our simulations could thus be used to create design aids for passive micro-mixers used in nanoparticle synthesis.
Keywords: lab-on-chip, LOC, micro-mixer, OpenFOAM, PDMS
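A minimal sketch of the variance-based mixing measure described, on a synthetic cross-channel concentration profile; normalising by the fully segregated variance c̄(1-c̄) is one common convention, assumed here rather than taken from the paper.

```python
# Mixing index from a line profile of concentration across the channel width:
# M = 1 - sqrt(var / var_max); 0 = fully segregated, 1 = fully mixed.
import numpy as np

c = np.array([0.95, 0.9, 0.8, 0.62, 0.5, 0.38, 0.2, 0.1, 0.05])  # mass fraction
var = np.mean((c - c.mean()) ** 2)
var_max = c.mean() * (1 - c.mean())   # variance of a fully segregated mixture
M = 1 - np.sqrt(var / var_max)
print(f"variance = {var:.4f}, mixing index = {M:.2f}")
```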
Procedia PDF Downloads 162
3422 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure
Authors: T. Nozu, K. Hibi, T. Nishiie
Abstract:
This paper discusses the applicability of a numerical model to a damage prediction method for accidental hydrogen explosions occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. For simulating the unsteady turbulent combustion of leaked hydrogen gas, a combination of Large Eddy Simulation (LES) and a combustion model was used. The combustion model was based on a two-scalar flamelet approach, where a G-equation model and a conserved scalar model expressed the propagation of the premixed flame surface and the diffusion combustion process, respectively. For validation of this numerical model, we simulated two previous types of hydrogen explosion tests. One is an open-space explosion test, in which the source was a prismatic 5.27 m3 volume with a 30% hydrogen-air mixture; a reinforced concrete wall was set 4 m away from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m2 on one side; the test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18% vol. The results from the numerical simulations were compared with the previous experimental data to assess the accuracy of the numerical model, and we verified that the simulated overpressures and flame time-of-arrival data were in good agreement with the results of the two explosion tests.
Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure
Procedia PDF Downloads 244
3421 Dynamic Simulation of Disintegration of Wood Chips Caused by Impact and Collisions during the Steam Explosion Pre-Treatment
Authors: Muhammad Muzamal, Anders Rasmuson
Abstract:
Wood material is extensively considered as a raw material for the production of bio-polymers, bio-fuels, and value-added chemicals. However, the shortcoming of using wood as a raw material is that the enzymatic hydrolysis of wood is difficult, because the accessibility of enzymes to hemicelluloses and cellulose is hindered by the complex chemical and physical structure of the wood. The steam explosion (SE) pre-treatment improves the digestion of wood material by creating both chemical and physical modifications in the wood. In this process, wood chips are first treated with steam at high pressure and temperature for a certain time in a steam treatment vessel. During this time, the chemical linkages between lignin and polysaccharides are cleaved and the stiffness of the material decreases. Then the steam discharge valve is rapidly opened, and the steam and wood chips exit the vessel at very high speed. These fast-moving wood chips collide with each other and with the walls of the equipment and disintegrate into small pieces. More damaged and disintegrated wood has a larger surface area and increased accessibility to hemicelluloses and cellulose. The energy required for the same increase in specific surface area is 70% higher in a conventional mechanical technique, i.e., an attrition mill, than in the steam explosion process. The mechanism of wood disintegration during the SE pre-treatment has been very little studied. In this study, we have simulated the collision and impact of wood chips (dimensions 20 mm × 20 mm × 4 mm) with each other and with the walls of the vessel. The wood chips are simulated as a 3D orthotropic material. Damage and fracture in the wood material have been modelled using the 3D Hashin damage model; this has been accomplished by developing a user-defined subroutine and implementing it in the FE software ABAQUS. The elastic and strength properties used for the simulation are those of spruce wood at 12% and 30% moisture content and at 20 and 160 °C, because the impacted wood chips are pre-treated with steam at high temperature and pressure. We have simulated several cases to study the effects of the elastic and strength properties of the wood, the velocity of the moving chip, and the orientation of the wood chip at the time of impact on the damage in the wood chips. The disintegration patterns captured by the simulations are very similar to those observed in experimentally obtained steam-exploded wood. Simulation results show that wood chips moving with higher velocity disintegrate more. Moisture content and temperature decrease the elastic properties and increase damage. Impact and collision in specific directions cause easy disintegration. This model can be used to efficiently design steam explosion equipment.
Keywords: dynamic simulation, disintegration of wood, impact, steam explosion pretreatment
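The 3D Hashin criteria the study implements as an ABAQUS user subroutine can be sketched in plain Python; the expressions below are the commonly cited 1980 forms, and the stress state and strength values are illustrative placeholders rather than the paper's spruce data.

```python
# 3D Hashin failure indices for an orthotropic material (commonly cited forms).
import numpy as np

def hashin_3d(s, XT, XC, YT, YC, S12, S23):
    """s = [s11, s22, s33, s12, s13, s23]; returns mode -> failure index."""
    s11, s22, s33, s12, s13, s23 = s
    f = {}
    if s11 >= 0:   # fiber tension
        f["fiber_tension"] = (s11 / XT) ** 2 + (s12**2 + s13**2) / S12**2
    else:          # fiber compression
        f["fiber_compression"] = (s11 / XC) ** 2
    st = s22 + s33
    shear = (s23**2 - s22 * s33) / S23**2 + (s12**2 + s13**2) / S12**2
    if st >= 0:    # matrix tension
        f["matrix_tension"] = st**2 / YT**2 + shear
    else:          # matrix compression
        f["matrix_compression"] = (((YC / (2 * S23)) ** 2 - 1) * st / YC
                                   + st**2 / (4 * S23**2) + shear)
    return f  # a mode is predicted to fail when its index reaches 1

stress = np.array([60.0, 4.0, 1.0, 6.0, 2.0, 1.5])  # MPa, illustrative
print(hashin_3d(stress, XT=90, XC=45, YT=5, YC=10, S12=7, S23=4))
```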
Procedia PDF Downloads 401
3420 Use of Simulation in Medical Education: Role and Challenges
Authors: Raneem Osama Salem, Ayesha Nuzhat, Fatimah Nasser Al Shehri, Nasser Al Hamdan
Abstract:
Background: Recently, most medical schools around the globe have been using simulation for teaching and assessing students' clinical skills and competence. There are many obstacles that can face students and faculty when simulation sessions are introduced into an undergraduate curriculum. Objective: The aim of this study is to obtain the opinions of undergraduate medical students and our faculty regarding the role of simulation in the undergraduate curriculum, the simulation modalities used, and the perceived barriers to implementing simulation sessions. Methods: To address the role of simulation, the modalities used, and the perceived challenges to implementation of simulation sessions, a self-administered, pilot-tested questionnaire with 18 items using a 5-point Likert scale was distributed. Participants included undergraduate male medical students (n=125), female students (n=70), and faculty members (n=14). Results: Various learning outcomes are achieved and improved through technology-enhanced simulation sessions, such as communication skills, diagnostic skills, procedural skills, self-confidence, and integration of basic and clinical sciences. The use of high-fidelity simulators, simulated patients, and task trainers was considered more desirable by our students and faculty for teaching and learning, as well as for evaluation. According to most of the students, institutional support in terms of resources, staff, and duration of sessions was adequate; however, motivation to participate in the sessions and provision of adequate feedback by the staff were constraints. Conclusion: The use of a simulation laboratory is of great benefit to the students and a great teaching tool for the staff to ensure students' learning of the various skills.
Keywords: simulators, medical students, skills, simulated patients, performance, challenges, skill laboratory
Procedia PDF Downloads 408
3419 Encapsulation of Probiotic Bacteria in Complex Coacervates
Authors: L. A. Bosnea, T. Moschakis, C. Biliaderis
Abstract:
Two probiotic strains of Lactobacillus paracasei subsp. paracasei (E6) and Lactobacillus paraplantarum (B1), isolated from traditional Greek dairy products, were microencapsulated by complex coacervation using whey protein isolate (WPI, 3% w/v) and gum arabic (GA, 3% w/v) solutions mixed at different polymer ratios (1:1, 2:1 and 4:1). The effect of total biopolymer concentration on cell viability was assessed using WPI and GA solutions of 1, 3, and 6% w/v at a constant ratio of 2:1. Several parameters were also examined for optimization of microcapsule formation, such as inoculum concentration and the effect of ionic strength. The viability of the bacterial cells during heat treatment and under simulated gut conditions was also evaluated. Among the different WPI/GA weight ratios tested (1:1, 2:1, and 4:1), the highest survival rate was observed for the coacervate structures made with the 2:1 ratio. The protection efficiency at low pH values is influenced by both the concentration and the ratio of the added biopolymers. Moreover, the inoculum concentration seems to affect the efficiency of the microcapsules in entrapping the bacterial cells, since an optimum level was noted at less than 8 log cfu/ml. Generally, entrapment of lactobacilli in the complex coacervate structure enhanced the viability of the microorganisms when exposed to a low-pH environment (pH 2.0). Both encapsulated strains retained high viability in simulated gastric juice (>73%), especially in comparison with non-encapsulated (free) cells (<19%). The encapsulated lactobacilli also exhibited enhanced viability after 10-30 min of heat treatment (65 °C), as well as at different NaCl concentrations (pH 4.0). Overall, the results of this study suggest that complex coacervation with WPI/GA has the potential to deliver live probiotics in low-pH food systems and fermented dairy products; the complexes can dissolve at pH 7.0 (gut environment), releasing the microbial cells.
Keywords: probiotic, complex coacervation, whey, encapsulation
Procedia PDF Downloads 297
3418 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced composites (CFRP) used in aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure can be damaged due to excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the material. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology, with both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
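A minimal bootstrap (sequential importance resampling) particle filter sketch of the kind the abstract applies: track a hidden damage state from noisy resistance readings. The degradation and measurement models below are illustrative stand-ins, not the paper's resistor-network physics.

```python
# Bootstrap particle filter: propagate, weight by the measurement likelihood,
# estimate, resample. All models and noise levels are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, T = 2000, 40
x_true, xs, ys = 0.1, [], []
for _ in range(T):                      # simulate "truth" and measurements
    x_true = x_true * 1.05 + rng.normal(0, 0.005)        # damage grows
    xs.append(x_true)
    ys.append(1.0 + 2.0 * x_true + rng.normal(0, 0.05))  # resistance reading

particles = rng.uniform(0.0, 0.3, N)    # prior over initial damage
estimates = []
for y in ys:
    particles = particles * 1.05 + rng.normal(0, 0.01, N)    # propagate
    w = np.exp(-0.5 * ((y - (1.0 + 2.0 * particles)) / 0.05) ** 2) + 1e-12
    w /= w.sum()
    estimates.append(np.sum(w * particles))                  # posterior mean
    particles = particles[rng.choice(N, N, p=w)]             # resample

print("final damage estimate:", round(estimates[-1], 3),
      "true:", round(xs[-1], 3))
```

In the paper's setting, the weights would come from the resistor-network model's predicted transient resistance, and the particle cloud over damage modes yields the failure probability.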
Procedia PDF Downloads 196
3417 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil
Authors: B. Mendonça, D. Sandwell
Abstract:
The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been to drill wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that could pose a serious threat to the population, such as subsidence. Radar interferometry techniques (InSAR) have allowed continuous investigation of such phenomena. The data analyzed in the present study consist of 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was done with the software GMTSAR, using the InSAR technique to create pairs of interferograms capturing ground displacement over different time spans. First results show a correlation between the locations of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest-time-span interferogram obtained dates from October 2007 to March 2010. From that interferogram, it was possible to detect the average displacement velocity in millimeters per year (mm/y) and the areas where strong signals have persisted in the MRSP. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), Greater Sao Paulo, Itaquera, and Sao Caetano do Sul. The signals covered areas between 0.6 km and 1.65 km in length, all located above a sedimentary type of aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering stronger subsidence are those in Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo, where the observed displacement rates go from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being done through 9 deep wells; therefore, subsidence events are likely to occur and to cause serious damage in the area. This study reveals a situation that has not been explored with proper importance in the city, given its social and economic consequences. Since the data were only available until 2011, the question that remains is whether the situation still persists. What can be reaffirmed, however, is a scenario of risk at the International Airport of Sao Paulo that needs further investigation.
Keywords: ground subsidence, Interferometric Synthetic Aperture Radar (InSAR), metropolitan region of Sao Paulo, water extraction
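The basic conversion underlying such velocity maps is worth making explicit: for a repeat-pass interferogram, line-of-sight displacement is d = -(λ/4π)·Δφ (sign conventions vary between processors). A minimal sketch, assuming the ALOS-1 PALSAR L-band wavelength of about 23.6 cm:

```python
# Unwrapped interferometric phase -> line-of-sight displacement (mm).
import numpy as np

WAVELENGTH_M = 0.236  # ALOS PALSAR L-band (approx.)

def los_displacement_mm(dphi_rad: float) -> float:
    """Convert unwrapped phase (rad) to line-of-sight motion (mm)."""
    return -(WAVELENGTH_M / (4 * np.pi)) * dphi_rad * 1000.0

# One full fringe (2*pi) of L-band phase is about half a wavelength of motion:
print(f"{los_displacement_mm(2 * np.pi):.1f} mm per fringe")  # ~ -118 mm
```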
Procedia PDF Downloads 355
3416 Implementation of a Monostatic Microwave Imaging System Using a UWB Vivaldi Antenna
Authors: Babatunde Olatujoye, Binbin Yang
Abstract:
Microwave imaging is a portable, noninvasive, and non-ionizing imaging technique that employs low-power microwave signals to reveal objects in the microwave frequency range. This technique has immense potential for adoption in commercial and scientific applications such as security scanning, material characterization, and nondestructive testing. This work presents a monostatic microwave imaging setup using an ultra-wideband (UWB), low-cost, miniaturized Vivaldi antenna with a bandwidth of 1-6 GHz. The backscattered signals (S-parameters) of the Vivaldi antenna used for scanning targets were measured in the lab using a vector network analyzer (VNA). An automated two-dimensional (2-D) scanner was employed for the 2-D movement of the transceiver to collect the measured scattering data from different positions. The targets consist of four metallic objects, each with a distinct shape. A similar setup was also simulated in Ansys HFSS. A high-resolution Back Propagation Algorithm (BPA) was applied to both the simulated and experimental backscattered signals. The BPA utilizes the phase and amplitude information recorded over a two-dimensional aperture of 50 cm × 50 cm, with a discrete step size of 2 cm, to reconstruct a focused image of the targets. The adoption of the BPA was demonstrated by coherently resolving and reconstructing reflection signals from conventional time-of-flight profiles. For both the simulated and experimental data, the BPA accurately reconstructed a high-resolution 2D image of the targets in terms of shape and location. An improvement of the BPA, in terms of target resolution, was achieved by applying a filtering method in the frequency domain.
Keywords: back propagation, microwave imaging, monostatic, Vivaldi antenna, ultra wideband
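A minimal monostatic back-projection sketch in the spirit of the BPA: compensate the round-trip phase 2kR at each pixel over every aperture position and frequency, then sum coherently. The geometry is reduced to a 1-D aperture and 2-D image for brevity, and the "measured" data are an ideal simulated point scatterer, not the paper's S-parameters.

```python
# Frequency-domain delay-and-sum back-projection for a monostatic scan.
import numpy as np

c = 3e8
freqs = np.linspace(1e9, 6e9, 101)                 # 1-6 GHz sweep
xa = np.arange(-0.25, 0.26, 0.02)                  # 2 cm steps over 50 cm
target = np.array([0.05, 0.30])                    # (x, z) of point scatterer

# Simulated monostatic response S(a, f) = exp(-j * 2k * R_target)
k = 2 * np.pi * freqs / c                          # wavenumbers
Rt = np.hypot(xa[:, None] - target[0], target[1])  # range per aperture position
S = np.exp(-1j * 2 * k[None, :] * Rt)

# Back-project onto an image grid: phase-compensate and sum coherently.
x = np.linspace(-0.25, 0.25, 101)
z = np.linspace(0.05, 0.5, 91)
img = np.zeros((z.size, x.size))
for i, zi in enumerate(z):
    R = np.hypot(xa[:, None] - x[None, :], zi)     # (aperture, x-pixels)
    phase = np.exp(1j * 2 * k[None, None, :] * R[:, :, None])
    img[i] = np.abs((S[:, None, :] * phase).sum(axis=(0, 2)))

peak = np.unravel_index(img.argmax(), img.shape)
print("peak at x =", round(x[peak[1]], 3), "z =", round(z[peak[0]], 3))
```

The image should peak at the scatterer's true position, which is the coherent-focusing behaviour the abstract describes for time-of-flight profiles.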
Procedia PDF Downloads 21
3415 A Physiological Approach for Early Detection of Hemorrhage
Authors: Rabie Fadil, Parshuram Aarotale, Shubha Majumder, Bijay Guargain
Abstract:
Hemorrhage is the loss of blood from the circulatory system and a leading cause of battlefield and postpartum-related deaths. Early detection of hemorrhage remains the most effective strategy to reduce the mortality rate caused by traumatic injuries. In this study, we investigated the physiological changes via non-invasive cardiac signals at rest and under different hemorrhage conditions simulated through graded lower-body negative pressure (LBNP). Simultaneous electrocardiogram (ECG), photoplethysmogram (PPG), blood pressure (BP), impedance cardiogram (ICG), and phonocardiogram (PCG) signals were acquired from 10 participants (age: 28 ± 6 years, weight: 73 ± 11 kg, height: 172 ± 8 cm). The LBNP protocol consisted of applying -20, -30, -40, -50, and -60 mmHg pressure to the lower half of the body. Beat-to-beat heart rate (HR), systolic blood pressure (SBP), diastolic blood pressure (DBP), and mean arterial pressure (MAP) were extracted from the ECG and blood pressure. Systolic amplitude (SA), systolic time (ST), diastolic time (DT), and left ventricular ejection time (LVET) were extracted from the PPG during each stage. Preliminary results showed that the application of -40 mmHg, i.e., a moderate stage of simulated hemorrhage, resulted in significant changes in HR (85 ± 4 bpm vs 68 ± 5 bpm, p < 0.01), ST (191 ± 10 ms vs 253 ± 31 ms, p < 0.05), LVET (350 ± 14 ms vs 479 ± 47 ms, p < 0.05), and DT (551 ± 22 ms vs 683 ± 59 ms, p < 0.05) compared to rest, while no change was observed in SA (p > 0.05) as a consequence of LBNP application. These findings demonstrate the potential of cardiac signals in detecting moderate hemorrhage. In the future, we will analyze all the LBNP stages and investigate the feasibility of other physiological signals to develop a predictive machine learning model for early detection of hemorrhage.
Keywords: blood pressure, hemorrhage, lower-body negative pressure, LBNP, machine learning
Procedia PDF Downloads 167
3414 Enhanced Flight Dynamics Model to Simulate the Aircraft Response to Gust Encounters
Authors: Castells Pau, Poetsch Christophe
Abstract:
The effect of gust and turbulence encounters on aircraft is a wide field of study which allows different approaches, from high-fidelity multidisciplinary simulations to more simplified models adapted to industrial applications. The typical main goal is to predict the gust loads on the aircraft in order to ensure a safe design and achieve certification. Another widely studied topic is gust load reduction through an active control law. The impact of gusts on aircraft handling qualities is also of interest in the analysis of in-service events, so as to evaluate the aircraft response and the performance of the flight control laws. Traditionally, gust loads and handling qualities are addressed separately, with different models adapted to the specific needs of each discipline. In this paper, an assessment of the differences between both models is presented, and a strategy to better account for the physics of gust encounters in a typical flight dynamics model is proposed, based on the model used for gust loads analysis. The applied corrections aim to capture the unsteady aerodynamics and propagation of the gust, as well as the effect of dynamic flexibility at low frequencies. Results from the gust loads model at different flight conditions and measurements from real events are used for validation. An assessment of a possible extension of steady aerodynamic nonlinearities to the low-frequency range is also addressed. The proposed corrections provide meaningful means to evaluate the performance and possible adjustments of the flight control laws.
Keywords: flight dynamics, gust loads, handling qualities, unsteady aerodynamics
Procedia PDF Downloads 147
3413 Techno-Apocalypse in Christian End-Time Literature
Authors: Sean O'Callaghan
Abstract:
Around 2011/2012, a whole new genre of Christian religious writing began to emerge, focused on the role of advanced technologies, particularly the GRIN technologies (genetics, robotics, information technology, and nanotechnology), in bringing about a techno-apocalypse, leading to catastrophic events which would usher in the end of the world. This genre, at first niche, has now begun to grow in significance in many quarters of the more fundamentalist and biblically literalist branches of evangelicalism. It approaches science and technology with more than extreme skepticism: it accuses transhumanists of being in league with satanic powers and a satanic agenda, and contextualizes transhumanist scientific progress in terms of its service to what it believes to be a soon-to-come Antichrist figure. The genre has moved beyond literature; videos about its message can be found on YouTube and other forums, where many of the presentations get well over a quarter of a million views. This paper will examine the genre and its genesis, referring to the key figures involved in spreading its anti-intellectualist and anti-scientific message. It will demonstrate how this genre of writing is similar in many respects to other forms of apocalyptic writing which have emerged in the twentieth and twenty-first centuries, all in response to scientific and political events interpreted in the light of biblical prophecy. It will also set the genre in the context of a contemporary preoccupation with conspiracy theory. The conclusions of the research conducted in this field by the author are that the genre does a grave disservice to both the scientific and Christian audiences it targets, by misrepresenting scientific advances and by creating a hermeneutic of suspicion which makes it impossible for Christians to place their trust in scientific claims.
Keywords: Antichrist, catastrophic, Christian, techno-apocalypse
Procedia PDF Downloads 208
3412 Using a Simulated Learning Environment to Teach Pre-Service Special Educators Behavior Management
Authors: Roberta Gentry
Abstract:
A mixed-methods study that examined candidates' perceptions of the use of computerized simulation as an effective tool for learning classroom management will be presented, together with the development, implementation, and assessment of the simulation and candidate data on the feasibility of the approach in comparison to other methods.
Keywords: behavior management, simulations, teacher preparation, teacher education
Procedia PDF Downloads 403
3411 Simulating the Effect of Chlorine on Dynamics of Main Aquatic Species in Urban Lake with a Mini System Dynamic Model
Authors: Zhiqiang Yan, Chen Fan, Beicheng Xia
Abstract:
Urban lakes play an invaluable role in urban water systems, serving functions such as flood control, landscape, entertainment, and energy utilization, yet they have suffered from severe eutrophication over the past few years. To investigate the ecological response of the main aquatic species and system stability to chlorine interference in shallow urban lakes, a mini system dynamics model, based on the competition and predation among the main aquatic species and on total phosphorus (TP) circulation, was developed. The main species of submerged macrophyte, phytoplankton, zooplankton, and benthos, together with TP in water and sediment, were simulated as variables in the model, with the effect of chlorine interference represented by an attenuation equation. The model was validated with data collected in the Lotus Lake in Guangzhou from October 1, 2015 to January 31, 2016. Furthermore, eco-exergy was used to analyze the change in complexity of the shallow urban lake. The results showed that the correlation coefficients between observed and simulated values were significant for all components. Chlorine showed a significant inhibitory effect on Microcystis aeruginosa, Brachionus plicatilis, Diaphanosoma brachyurum Liévin, and Mesocyclops leuckarti (Claus). The outbreak of Spirogyra spp. inhibited the growth of Vallisneria natans (Lour.) Hara and caused a gradual decrease of eco-exergy, reflecting the breakdown of the ecosystem's internal equilibria. It was concluded that the study gives important insight into using chlorine to achieve eutrophication control and into understanding the underlying mechanisms.
Keywords: system dynamic model, urban lake, chlorine, eco-exergy
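A minimal system-dynamics sketch in this spirit, with two competing producers grazed by zooplankton and chlorine entering through an exponential attenuation term; every coefficient below is an illustrative assumption, not a calibrated Lotus Lake value.

```python
# Mini system-dynamics model: competition, grazing, and a decaying chlorine
# forcing c(t) = c0 * exp(-k*t). All parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

CL0, K_DECAY = 1.0, 0.3              # initial chlorine dose, decay rate (1/day)

def chlorine(t):
    return CL0 * np.exp(-K_DECAY * t)    # attenuation equation

def rhs(t, y):
    M, P, Z = y                      # macrophyte, phytoplankton, zooplankton
    cl = chlorine(t)
    dM = 0.10 * M * (1 - (M + 0.5 * P) / 10) - 0.02 * cl * M
    dP = (0.60 * P * (1 - (P + 0.5 * M) / 10)
          - 0.40 * Z * P / (P + 1) - 0.30 * cl * P)
    dZ = 0.20 * Z * P / (P + 1) - 0.10 * Z - 0.20 * cl * Z
    return [dM, dP, dZ]

sol = solve_ivp(rhs, (0, 120), [2.0, 1.0, 0.5], t_eval=np.linspace(0, 120, 7))
for t, (M, P, Z) in zip(sol.t, sol.y.T):
    print(f"day {t:5.0f}: M={M:5.2f} P={P:5.2f} Z={Z:5.2f}")
```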
Procedia PDF Downloads 209