Search results for: vented gas explosion
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 158

38 Oil Exploration in the Niger Delta and the Right to a Healthy Environment

Authors: Olufunke Ayilara Aje-Famuyide

Abstract:

The centrality of the petroleum industry in world energy is undoubted. The world economy virtually runs on, and depends on, petroleum. The petroleum industry is a multi-trillion-dollar industry; it turns otherwise poor and underdeveloped countries into wealthy nations and thrusts them to the center of international diplomacy. Although these developing nations lack the necessary technology to explore and exploit petroleum resources, they are not without help, as developed nations, represented by their multinational corporations, are ready and willing to provide both the technical and managerial expertise necessary for the development of this natural resource. However, the exploration of these petroleum resources sometimes comes with grave concomitant consequences, especially for the environment. From the British Petroleum oil rig explosion and the resultant oil spillage and pollution in the Gulf of Mexico, United States, to the Mobil oil spillage along the Nigerian coast, the story and its consequences are virtually the same. Nigeria’s Niger Delta Region produces Nigeria’s petroleum, which accounts for more than ninety-five percent of Nigeria’s foreign exchange earnings. Between 1999 and 2007, Nigeria earned more than $400 billion from petroleum exports. Nevertheless, petroleum exploration and exploitation has devastated the Niger Delta environment. From oil spillage, which pollutes the rivers, farms and wetlands, to gas flaring by the multinational corporations, the consequence is the same: a region that has been devastated by petroleum exploitation. This paper thus seeks to examine the consequences and impact of petroleum pollution in the Niger Delta of Nigeria, with particular reference to the right of the people of the Niger Delta to a healthy environment. The paper further examines the relevant international and regional instruments and Nigeria’s municipal laws that are meant to protect the rights of the people of the Niger Delta, and their enforcement by the Nigerian State. It is quite worrisome that the Niger Delta Region and its people have suffered and are still suffering grave violations of their right to a healthy environment as a result of petroleum exploitation in their region. The Nigerian effort is, at best, half-hearted in its protection of the people’s rights.

Keywords: environment, exploration, petroleum, pollution

Procedia PDF Downloads 400
37 Psychopathic Disorders and Judges' Sentencing: Can Neurosciences Change This Aggravating Factor into a Mitigating Factor?

Authors: Kevin Moustapha

Abstract:

Psychopathy is perceived today as 'the most important concept in the criminal justice system' and as 'the most important legal notion of the early 21st century'. The explosion of research related to psychopathy seems to illustrate this trend perfectly. Traditionally, many studies tend to focus on links between the insanity defense and psychopathy. That is why our purpose in this article is to analyze psychopathic disorders in the scope of judges' sentencing in Canada. Indeed, in every Canadian case related to dangerous offenders, judges must strike a balance between fairness and the individual rights of the accused, on the one hand, and the protection of society from dangerous predators who may commit future acts of physical or sexual violence, on the other. Increasingly, psychopathic disorders are playing an important part in judges' sentencing, especially in Canada. This phenomenon can be illustrated by the high proportion of psychopathic offenders incarcerated in North American prisons. Many decisions in Canadian courtrooms seem to point out that psychopathy is often used as a strong argument by judges to preserve public safety. The fact that psychopathy is often associated with violence, recklessness and recidivism could explain why many judges consider psychopathic disorders an aggravating factor. Generally, the judges' reasoning is based on section 753 of the Canadian Criminal Code, related to dangerous offenders, which is applied to individuals who show a pattern of repetitive and persistent aggressive behaviour. However, with cognitive neurosciences, the psychopath's situation in courtrooms could change. Cerebral imaging and new data provided by the neurosciences show that emotional and volitional functions in psychopaths' brains are impaired. Understanding these new issues could enable some judges to recognize psychopathic disorders as a mitigating factor. Two important questions are raised in this article: can exploring psychopaths' brains really change judges' sentencing in Canadian courtrooms? If so, can judges come to consider psychopathy more as a mitigating factor than an aggravating one?

Keywords: criminal law, judges sentencing, neurosciences, psychopathy

Procedia PDF Downloads 898
36 Development of an Atmospheric Radioxenon Detection System for Nuclear Explosion Monitoring

Authors: V. Thomas, O. Delaune, W. Hennig, S. Hoover

Abstract:

Measurement of radioactive isotopes of atmospheric xenon is used to detect, locate and identify confined nuclear tests as part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). In this context, the French Alternative Energies and Atomic Energy Commission (CEA) has developed a fixed device to continuously measure the concentration of these fission products, the SPALAX process. During its atmospheric transport, radioactive xenon undergoes significant dilution between the source point and the measurement station. Given the distances between the fixed stations located all over the globe, the typical activity concentrations measured are near 1 mBq m⁻³. To avoid the constraints induced by atmospheric dilution, the development of a mobile detection system is in progress; this system will allow on-site measurements in order to confirm or refute a suspicious measurement detected by a fixed station. Furthermore, this system will use the beta/gamma coincidence measurement technique in order to drastically reduce the environmental background (which masks such activities). The detector prototype consists of a gas cell surrounded by two large silicon wafers, coupled with two square NaI(Tl) detectors. The gas cell has a sample volume of 30 cm³ and the silicon wafers are 500 µm thick with an active surface area of 3600 mm². In order to minimize leakage current, each wafer has been segmented into four independent silicon pixels. The cell is sandwiched between two low-background NaI(Tl) detectors (70x70x40 mm³ crystals). The expected Minimum Detectable Concentration (MDC) for each radioxenon is on the order of 1-10 mBq m⁻³. Three 4-channel digital acquisition modules (Pixie-NET) are used to process all the signals. Time synchronization is ensured by a dedicated network using the IEEE 1588 Precision Time Protocol (PTP). We present this system from its simulation through to laboratory tests.
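
For orientation, the MDC quoted for systems of this kind is often estimated from a Currie-type detection limit. The short Python sketch below shows the shape of that calculation; the background counts, detection efficiency, branching ratio, counting time and air sample volume are hypothetical placeholders, not the parameters of this instrument.

```python
import numpy as np

def mdc_mbq_per_m3(background_counts, efficiency, branching_ratio,
                   count_time_s, air_sample_m3):
    """Currie-type Minimum Detectable Concentration estimate.

    Illustrative only: assumes Poisson background statistics and the
    classic Currie detection limit L_D = 2.71 + 4.65*sqrt(B).
    """
    l_d = 2.71 + 4.65 * np.sqrt(background_counts)      # detection limit, counts
    activity_bq = l_d / (efficiency * branching_ratio * count_time_s)
    return 1e3 * activity_bq / air_sample_m3            # mBq per m^3 of air

# hypothetical numbers, for shape only
print(mdc_mbq_per_m3(background_counts=50, efficiency=0.6,
                     branching_ratio=0.8, count_time_s=12 * 3600,
                     air_sample_m3=10.0))
```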

Keywords: beta/gamma coincidence technique, low level measurement, radioxenon, silicon pixels

Procedia PDF Downloads 104
35 The Cost of Non-Communicable Diseases in the European Union: A Projection towards the Future

Authors: Desiree Vandenberghe, Johan Albrecht

Abstract:

Non-communicable diseases (NCDs) are responsible for the vast majority of deaths in the European Union (EU) and represent a large share of total health care spending. A future increase in this health and financial burden is likely to be driven by population ageing, lifestyle changes and technological advances in medicine. Without adequate prevention measures, this burden can severely threaten population health and economic development. To tackle this challenge, a correct assessment of the current burden of NCDs is required, as well as a projection of its potential increase. The contribution of this paper is to offer perspective on the evolution of the NCD burden towards the future and to give an indication of the potential of prevention policy. A non-homogeneous semi-Markov model for the EU was constructed, which allowed a projection of the cost burden for the four main NCDs (cancer, cardiovascular disease, chronic respiratory disease and diabetes mellitus) towards 2030 and 2050. This simulation is based on multiple baseline scenarios that vary in demand and supply factors such as health status, population structure and technological advances. Finally, in order to assess the potential of preventive measures to curb the cost explosion of NCDs, a simulation is executed that includes increased efforts for preventive health care. According to the Markov model, by 2030 and 2050, total costs (direct and indirect) in the EU could increase by 30.1% and 44.1% respectively, compared to 2015 levels. An ambitious prevention policy framework for NCDs will be required if the EU wants to meet this challenge of rising costs. To conclude, significant cost increases due to non-communicable diseases are likely to occur due to demographic and lifestyle changes. Nevertheless, an ambitious prevention program throughout the EU can help make this cost burden manageable for future generations.
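
As a toy illustration of the projection idea (using a plain non-homogeneous Markov chain rather than the paper's calibrated semi-Markov model), the following Python sketch propagates a cohort through healthy/NCD/dead states with an incidence that drifts upward to mimic ageing. Every number below (probabilities, cost) is invented for the example.

```python
import numpy as np

def project_costs(years=35, annual_ncd_cost=5000.0):
    state = np.array([0.75, 0.25, 0.0])      # initial shares: healthy, NCD, dead
    total = 0.0
    for t in range(years):
        p_inc = 0.012 + 0.0002 * t           # incidence drifts up over time
        P = np.array([[1 - p_inc - 0.008, p_inc, 0.008],   # from healthy
                      [0.0,               0.98,  0.02],    # from NCD
                      [0.0,               0.0,   1.0]])    # dead (absorbing)
        state = state @ P                    # one-year transition
        total += state[1] * annual_ncd_cost  # cost borne by the NCD share
    return total

print(f"projected per-capita NCD cost over horizon: {project_costs():.0f}")
```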

Keywords: non-communicable diseases, preventive health care, health policy, Markov model, scenario analysis

Procedia PDF Downloads 114
34 Advances in Mathematical Sciences: Unveiling the Power of Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid advancements in data collection, storage and processing capabilities have led to an explosion of data in various domains. In this era of big data, the mathematical sciences play a crucial role in uncovering valuable insights and driving informed decision-making through data analytics. The purpose of this abstract is to present the latest advances in the mathematical sciences and their application in harnessing the power of data analytics. It highlights the interdisciplinary nature of data analytics, showcasing how mathematics intersects with statistics, computer science and other related fields to develop cutting-edge methodologies, and it explores key mathematical techniques, such as optimization, mathematical modeling, network analysis and computational algorithms, that underpin effective data analysis and interpretation. The abstract emphasizes the role of the mathematical sciences in addressing real-world challenges across different sectors, including finance, healthcare, engineering, the social sciences and beyond, showing how mathematical models and statistical methods extract meaningful insights from complex datasets, facilitating evidence-based decision-making and driving innovation. It further stresses the importance of collaboration and knowledge exchange among researchers, practitioners and industry professionals, recognizing the value of interdisciplinary collaboration and the need to bridge the gap between academia and industry to ensure the practical application of mathematical advancements in data analytics. Finally, it highlights the significance of ongoing research in the mathematical sciences and its impact on data analytics, emphasizing the need for continued exploration and innovation in mathematical methodologies to tackle emerging challenges in the era of big data and digital transformation. In summary, this abstract sheds light on the advances in the mathematical sciences and their pivotal role in unveiling the power of data analytics, and it calls for interdisciplinary collaboration, knowledge exchange and ongoing research to further unlock the potential of mathematical methodologies in addressing complex problems and driving data-driven decision-making in various domains.

Keywords: mathematical sciences, data analytics, advances, unveiling

Procedia PDF Downloads 61
33 Simulation of Concrete Wall Subjected to Airblast by Developing an Elastoplastic Spring Model in Modelica Modelling Language

Authors: Leo Laine, Morgan Johansson

Abstract:

To meet civilization's future needs for safe living and a low environmental footprint, the engineers designing the complex systems of tomorrow will need efficient ways to model and optimize these systems for their intended purpose. For example, a civil defence shelter and its subsystem components need to withstand, e.g., airblast and ground shock from a design-level explosion detonating at a certain distance from the structure. In addition, the complex civil defence shelter needs functioning air filter systems to protect from toxic gases and provide clean air, and clean water, heat and electricity also need to be available through shock- and vibration-safe fixtures and connections. Similar complex building systems can be found in any concentrated living or office area. In this paper, the authors use a multidomain modelling language called Modelica to model a concrete wall as a single-degree-of-freedom (SDOF) system with elastoplastic properties and an implemented option for plastic hardening. The elastoplastic model was developed and implemented in the open source tool OpenModelica. The simulation model was tested on a case with a transient equivalent reflected pressure time history representing an airblast from 100 kg TNT detonating 15 meters from the wall. The concrete wall is approximated as a concrete strip of 1.0 m width. This load represents a realistic threat to any building in a city-like area. The OpenModelica model results were compared with an Excel implementation of an SDOF model with an elastic-plastic spring using a simple fixed-timestep central difference solver. The structural displacement results agreed very well with each other in terms of plastic displacement magnitude, elastic oscillation displacement and response times.
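
A minimal sketch of the comparison solver described above, in Python rather than Modelica/Excel: a fixed-timestep central-difference integration of an SDOF system with an elastic-perfectly-plastic spring (no hardening). The mass, stiffness, yield force and triangular pulse below are illustrative values, not the paper's 100 kg TNT load case.

```python
import numpy as np

def sdof_elastoplastic(m, k, r_y, load, dt, n):
    """Central-difference solution of m*u'' + R(u) = p(t) with an
    elastic-perfectly-plastic spring whose force is capped at r_y."""
    u = np.zeros(n)
    u_plastic = 0.0
    for i in range(1, n):
        r = k * (u[i-1] - u_plastic)          # trial resisting force
        if abs(r) > r_y:                      # yielding: cap force,
            r = np.copysign(r_y, r)           # shift the plastic offset
            u_plastic = u[i-1] - r / k
        acc = (load((i-1) * dt) - r) / m      # equation of motion
        u_prev = u[i-2] if i >= 2 else 0.0    # zero initial displacement/velocity
        u[i] = 2 * u[i-1] - u_prev + acc * dt**2
    return u

# illustrative triangular reflected-pressure pulse (placeholder numbers)
p0, td = 800e3, 0.015                          # peak force [N], duration [s]
pulse = lambda t: p0 * max(0.0, 1.0 - t / td)
resp = sdof_elastoplastic(m=3000.0, k=2.0e7, r_y=150e3,
                          load=pulse, dt=1e-5, n=20000)
print(f"max displacement: {resp.max() * 1e3:.1f} mm")
```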

Keywords: airblast from explosives, elastoplastic spring model, Modelica modelling language, SDOF, structural response of concrete structure

Procedia PDF Downloads 107
32 A Selective and Fast Hydrogen Sensor Using Doped-LaCrO₃ as Sensing Electrode

Authors: He Zhang, Jianxin Yi

Abstract:

As a clean energy carrier, hydrogen shows many advantages, such as renewability, high heating value and extensive sources, and may play an important role in future society. However, hydrogen is a dangerously combustible gas because of its low ignition energy (0.02 mJ) and wide explosive limits (4%-74% in air). A fire or explosion is very likely once leakage happens and is not detected in time. Mixed-potential type sensors have attracted much attention for monitoring and detecting hydrogen due to their high response, simple supporting electronics and long-term stability. Typically, this kind of sensor consists of a sensing electrode (SE), a reference electrode (RE) and a solid electrolyte. The SE and RE materials usually display different electrocatalytic abilities towards hydrogen, so hydrogen can be detected by measuring the EMF change between the two electrodes. Previous reports indicate that a high-performance sensing electrode is important for improving the sensing characteristics of the sensor. In this report, a planar mixed-potential hydrogen sensor using La₀.₈Sr₀.₂Cr₀.₅Mn₀.₅O₃₋δ (LSCM) as SE, Pt as RE and yttria-stabilized zirconia (YSZ) as solid electrolyte was developed. The reason for selecting LSCM as the sensing electrode is its high electrocatalytic activity towards hydrogen in solid oxide fuel cells. The sensing performance of the fabricated LSCM/YSZ/Pt sensor was tested systematically. The experimental results show that the sensor displays a high response to hydrogen: the response values for 100 ppm and 1000 ppm hydrogen at 450 °C are -70 mV and -118 mV, respectively. Response time is an important parameter for evaluating a sensor; here, the response time decreases with increasing hydrogen concentration and saturates above 500 ppm. The steady response time at 450 °C is as short as 4 s, indicating that the sensor shows great potential for practical hydrogen monitoring. An excellent response repeatability to 100 ppm hydrogen at 450 °C and good reproducibility among three sensors were also observed. Meanwhile, the sensor exhibits excellent selectivity to hydrogen compared with several interfering gases such as NO₂, CH₄, CO, C₃H₈ and NH₃. Polarization curves were measured to investigate the sensing mechanism, and the results indicate that the sensor obeys the mixed-potential mechanism.
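
Mixed-potential sensors are commonly calibrated with an EMF that is linear in the logarithm of concentration. Assuming that law holds between the two response values reported above, a small Python sketch recovers the sensitivity (about -48 mV per decade) and inverts the calibration; the interpolated reading at the end is purely illustrative.

```python
import numpy as np

# The two response values reported at 450 °C; we assume the usual
# mixed-potential law EMF ≈ a + b*log10(C) between these points.
conc_ppm = np.array([100.0, 1000.0])
emf_mv = np.array([-70.0, -118.0])

b, a = np.polyfit(np.log10(conc_ppm), emf_mv, 1)   # slope, intercept
print(f"sensitivity: {b:.1f} mV per decade")        # about -48 mV/decade

def estimate_concentration(emf):
    """Invert the calibration to estimate H2 concentration in ppm."""
    return 10 ** ((emf - a) / b)

print(f"{estimate_concentration(-95.0):.0f} ppm")   # interpolated example
```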

Keywords: fire hazard, H₂ sensor, mixed-potential, perovskite

Procedia PDF Downloads 150
31 Leveraging Natural Language Processing for Legal Artificial Intelligence: A Longformer Approach for Taiwanese Legal Cases

Authors: Hsin Lee, Hsuan Lee

Abstract:

Legal artificial intelligence (LegalAI) has found increasing application within legal systems, propelled by advances in natural language processing (NLP). Compared with general documents, legal case documents are typically long text sequences with intrinsic logical structures, and most existing language models have difficulty understanding the long-distance dependencies between different structures. Another challenge is that, while the Judiciary of Taiwan has released legal judgments from various levels of courts over the years, there remains a significant obstacle in the lack of labeled datasets. This deficiency makes it difficult to train models with strong generalization capabilities, as well as to accurately evaluate model performance. To date, models in Taiwan have yet to be specifically trained on judgment data. Given these challenges, this research proposes a Longformer-based pre-trained language model explicitly devised for retrieving similar judgments in Taiwanese legal documents. The model is trained on a self-constructed dataset, which this research has independently labeled for judgment similarity, thereby addressing the void left by the lack of an existing labeled dataset for Taiwanese judgments. This research adopts strategies such as early stopping and gradient clipping to prevent overfitting and manage gradient explosion, respectively, thereby enhancing the model's performance. The model is evaluated using both the dataset and the Average Entropy of Offense-charged Clustering (AEOC) metric, which utilizes the notion of similar case scenarios within the same type of legal cases. Our experimental results illustrate the model's significant advances in handling similarity comparisons within extensive legal judgments. By enabling more efficient retrieval and analysis of legal case documents, our model holds the potential to facilitate legal research, aid legal decision-making and contribute to the further development of LegalAI in Taiwan.
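
A minimal sketch of the two training safeguards named above, gradient clipping by global norm and patience-based early stopping, on a toy quadratic objective in Python; the clipping threshold, patience and "validation loss" are stand-ins, not the paper's training setup.

```python
import numpy as np

def clip_by_global_norm(grad, max_norm=1.0):
    """Rescale the gradient when its global norm exceeds max_norm."""
    norm = np.linalg.norm(grad)
    return grad * (max_norm / norm) if norm > max_norm else grad

rng = np.random.default_rng(0)
w, lr = rng.normal(size=4), 0.1
best_val, patience, bad_epochs = np.inf, 3, 0

for epoch in range(100):
    grad = 2 * w + rng.normal(scale=0.1, size=4)  # noisy gradient of ||w||^2
    w -= lr * clip_by_global_norm(grad)           # clipped update
    val_loss = float(w @ w)                       # stand-in validation loss
    if val_loss < best_val - 1e-4:
        best_val, bad_epochs = val_loss, 0        # improvement: reset counter
    else:
        bad_epochs += 1
        if bad_epochs >= patience:                # early stopping
            print(f"stopped at epoch {epoch}, best val {best_val:.4f}")
            break
```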

Keywords: legal artificial intelligence, computation and language, language model, Taiwanese legal cases

Procedia PDF Downloads 45
30 The Effect of Artificial Intelligence on Petroleum Industry and Production

Authors: Mina Shokry Hanna Saleh Tadros

Abstract:

The centrality of the petroleum industry in world energy is undoubted. The world economy virtually runs on, and depends on, petroleum. The petroleum industry is a multi-trillion-dollar industry; it turns otherwise poor and underdeveloped countries into wealthy nations and thrusts them to the center of international diplomacy. Although these developing nations lack the necessary technology to explore and exploit petroleum resources, they are not without help, as developed nations, represented by their multinational corporations, are ready and willing to provide both the technical and managerial expertise necessary for the development of this natural resource. However, the exploration of these petroleum resources sometimes comes with grave concomitant consequences, especially for the environment. From the British Petroleum oil rig explosion and the resultant oil spillage and pollution in the Gulf of Mexico, United States, to the Mobil oil spillage along the Egyptian coast, the story and its consequences are virtually the same. Egypt's Delta region produces Egypt's petroleum, which accounts for a major share of the country's foreign exchange earnings; between 1999 and 2007, Egypt earned more than $400 billion from petroleum exports. Nevertheless, petroleum exploration and exploitation has devastated the Delta environment. From oil spillage, which pollutes the rivers, farms and wetlands, to gas flaring by the multinational corporations, the consequence is the same: a region that has been devastated by petroleum exploitation. This paper thus seeks to examine the consequences and impact of petroleum pollution in the Egypt Delta, with particular reference to the right of the people of the Delta to a healthy environment. The paper further examines the relevant international and regional instruments and Egypt's municipal laws that are meant to protect the rights of the people of the Egypt Delta, and their enforcement by the Egyptian State. It is quite worrisome that the Egypt Delta region and its people have suffered and are still suffering grave violations of their right to a healthy environment as a result of petroleum exploitation in their region. The Egyptian effort is, at best, half-hearted in its protection of the people's rights.

Keywords: crude oil, fire, floating roof tank, lightning protection system, environment, exploration, petroleum, pollution, Duvernay petroleum system, oil generation, oil-source correlation, Re-Os

Procedia PDF Downloads 25
29 Studies on Biojetfuel Obtained from Vegetable Oil: Process Characteristics, Engine Performance and Their Comparison with Mineral Jetfuel

Authors: F. Murilo T. Luna, Vanessa F. Oliveira, Alysson Rocha, Expedito J. S. Parente, Andre V. Bueno, Matheus C. M. Farias, Celio L. Cavalcante Jr.

Abstract:

Aviation jetfuel used in aircraft gas-turbine engines is customarily obtained from the kerosene distillation fraction of petroleum (150-275 °C). Mineral jetfuel consists of a hydrocarbon mixture containing paraffins, naphthenes and aromatics, with low olefin content. In order to ensure safety, jetfuels must meet several stringent requirements, such as high energy density, low risk of explosion, physicochemical stability and low pour point. In this context, aviation fuels obtained from biofeedstocks (which have been coined 'biojetfuels') must be usable as 'drop-in' fuels, since adaptations to aircraft engines are not desirable, to avoid problems with operational reliability. Thus, potential aviation biofuels must present the same composition and physicochemical properties as conventional jetfuel. Among the potential feedstocks for aviation biofuel, babaçu oil, extracted from a palm tree found extensively in some regions of Brazil, contains substantial quantities of short-chain saturated fatty acids and may be an interesting choice for biojetfuel production. In this study, biojetfuel was synthesized through homogeneous transesterification of babaçu oil using methanol, and its properties were compared with those of petroleum-based jetfuel through measurements of oxidative stability, physicochemical properties and low-temperature properties. After the transesterification reactions and decantation/washing procedures, the methyl esters were purified by molecular distillation under high vacuum at different temperatures. The results indicate significant improvement in the oxidative stability and pour point of the products when compared to the fresh oil. After optimization of the operational conditions, potential biojetfuel samples were obtained, consisting mainly of C8 esters and showing low pour point and high oxidative stability. Jet engine tests are being conducted in an automated test bed equipped with pollutant emission analysers to study the operational performance of the biojetfuel obtained and compare it with a commercial mineral jetfuel.

Keywords: biojetfuel, babaçu oil, oxidative stability, engine tests

Procedia PDF Downloads 233
28 Effect of Perceived Importance of a Task in the Prospective Memory Task

Authors: Kazushige Wada, Mayuko Ueda

Abstract:

In the present study, we reanalyzed lapse errors in the last phase of a job by re-counting near-lapse errors and increasing the number of participants. We also examined the results from the perspective of prospective memory (PM), which concerns future actions. This study was designed to investigate whether the perceived importance of a PM task causes lapse errors in the last phase of a job, and whether such errors can be explained in terms of PM processing. Participants (N = 34) performed a computerized clicking task, in which they clicked on 10 figures that they had learned in advance, in 8 blocks of 10 trials. Participants were requested to click the check box in the start display of a block and to click the check-off box in the finishing display; this constituted the PM task. As a measure of PM performance, we counted the number of omission errors caused by forgetting to check off in the finishing display, which was defined as a lapse error. Perceived importance was manipulated through different instructions: half of the participants (the highly important task condition) were told that checking off was very important because equipment would be overloaded if it were not done, while the other half (the not-important task condition) were instructed only about the location and procedure for checking off. Furthermore, we controlled workload and the emotion of surprise to confirm the effects of demand capacity and attention. To manipulate emotion during the clicking task, we suddenly presented a photo of a traffic accident and the sound of a skidding car followed by an explosion. Workload was manipulated by requesting participants to press the 0 key in response to a beep. Results indicated too few forgetting-induced lapse errors to be analyzed. However, there was a weak main effect of the perceived importance of the check task on near-lapse errors, in which the mouse moved to the "END" button before moving to the check box in the finishing display. In particular, the highly important task group showed more such near-lapse errors than the not-important task group. Neither surprise nor workload affected the occurrence of near-lapse errors. These results imply that high perceived importance of a PM task impairs task performance. On the basis of the multiprocess framework of PM theory, we suggest that PM task performance in this experiment relied not on monitoring the PM task, but on spontaneous retrieval.

Keywords: prospective memory, perceived importance, lapse errors, multiprocess framework of prospective memory

Procedia PDF Downloads 419
27 Deflagration and Detonation Simulation in Hydrogen-Air Mixtures

Authors: Belyayev P. E., Makeyeva I. R., Mastyuk D. A., Pigasov E. E.

Abstract:

Previously, the phrase 'hydrogen safety' was mostly used in the context of NPP safety. With the rise of interest in 'green' and, particularly, hydrogen power engineering, the problem of hydrogen safety at industrial facilities has become ever more urgent. In Russia, the industrial production of hydrogen is planned to be performed by placing a chemical engineering plant near an NPP, which supplies the plant with the necessary energy. In this approach, the production of hydrogen involves a wide range of combustible gases, such as methane, carbon monoxide and hydrogen itself. Considering probable incidents, a sudden combustible gas outburst into open space with subsequent ignition is less dangerous by itself than ignition of a combustible mixture in the presence of numerous pipelines, reactor vessels and fitting frames. Even ignition of 2100 cubic meters of hydrogen-air mixture in open space gives flame velocities and overpressures that are much lower than those at the Chapman-Jouguet condition, not exceeding 80 m/s and 6 kPa, respectively. However, space blockage, significant changes of channel diameter along the flame propagation path and the presence of gas suspensions lead to significant deflagration acceleration and to its transition into detonation or quasi-detonation. At the same time, process parameters acquired from experiments at specific experimental facilities are not general, and their application to different facilities can only have a conventional and qualitative character. Yet, conducting experimental deflagration and detonation investigations for each specific industrial facility project, in order to determine safe placement of infrastructure units, does not seem feasible due to high cost and hazard, while numerical experiments are significantly cheaper and safer. Hence, the development of a numerical method that allows the description of reacting flows in domains with complex geometry seems promising. The basis for this method is a modification of the Kuropatenko method for calculating shock waves, recently developed by the authors, which allows its use in Eulerian coordinates. The current work contains the results of the development process. In addition, a comparison of numerical simulation results with experimental series on flame propagation in shock tubes with orifice plates is presented.
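
For orientation on the Chapman-Jouguet figures mentioned above, classical one-gamma detonation theory gives the rough estimate D_CJ ≈ sqrt(2(γ² − 1)q). The Python sketch below evaluates it with textbook-level values for a stoichiometric hydrogen-air mixture; both γ and q are approximations, so the output is an order-of-magnitude check, not a substitute for the detailed simulation.

```python
import numpy as np

gamma = 1.2          # effective specific-heat ratio of the products (assumed)
q = 3.4e6            # J/kg, heat release per kg of mixture (approximate)
rho0 = 0.85          # kg/m^3, initial mixture density (approximate)

d_cj = np.sqrt(2.0 * (gamma**2 - 1.0) * q)   # one-gamma CJ velocity estimate
p_cj = rho0 * d_cj**2 / (gamma + 1.0)        # strong-wave CJ pressure estimate
print(f"D_CJ ≈ {d_cj:.0f} m/s, p_CJ ≈ {p_cj / 1e3:.0f} kPa")
```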

Keywords: CFD, reacting flow, DDT, gas explosion

Procedia PDF Downloads 58
26 Buddhism: Its Socio-Economic Relevance in the Present Changing World

Authors: Bandana Bhattacharya

Abstract:

'Buddhism', as such, signifies the 'ism' based on the Buddha's life and teachings, or concerned with the gospel of the Buddha as recorded in the literature available in Pali, Sanskrit, Buddhist Sanskrit, Prakrit and even non-Indian languages, wherein it has been described as a very abstruse, complex and lofty philosophy of life, or 'way of life', preached by Him. It has another side too, i.e., the applicability of the tenets of the Buddha to the needs of present society, where human life and outlook have totally changed. Applied Buddhism signifies the applicability of the Buddha's noble tenets. Along with theological exposition and textual criticism of the Buddha's discourses, it has now become almost obligatory for Buddhist scholars to re-interpret Buddhism from modern perspectives. Basically, Applied Buddhism defines a 'way of life' which may transform the quality or essence of life under changed circumstances, places and times. If we observe the present situation of the world, we find that problems such as health, the economy, politics, global warming, population explosion, pollution of all types, scarcity of essential commodities and the indiscriminate use of human, natural and water resources are becoming more and more pronounced day by day. Against such a backdrop, Applied Buddhism, rather Buddhism itself, may be the only instrument left to mankind to address all such human achievements, lapses and problems. The Buddha's doctrine is itself called 'akālika', timeless. On the eve of the Mahāparinibbāṇa at Kusinara, the Blessed One allowed His disciples to change, modify and alter His minor teachings according to the needs of the future, although He made some utterances which would eternally remain fresh. Hence Buddhism has been able to occupy a prominent place in modern life because of its timeless applicability, emanating from a set of eternal values. The logical and scientific outlook of the Buddha may be traced in His very first sermon, the Dhammacakkapavattana-Sutta, where He advised avoiding the two extremes, namely constant attachment to sensual pleasures (Kāmasukhallikānuyoga) and devotion to self-mortification, which is painful as well as unprofitable, and asked instead for the adoption of the Majjhimapaṭipadā, the 'Middle Path', which is very much applicable even today in every sphere of human life, and whose absence is at the root of many present problems. This paper is a humble attempt to highlight the relevance of Buddhism in present society.

Keywords: applied Buddhism, ecology, self-awareness, value

Procedia PDF Downloads 99
25 Self-Energy Sufficiency Assessment of the Biorefinery Annexed to a Typical South African Sugar Mill

Authors: M. Ali Mandegari, S. Farzad, J. F. Görgens

Abstract:

Sugar is one of the main agricultural industries in South Africa, and the livelihoods of approximately one million South Africans are indirectly dependent on the sugar industry, which is struggling economically and should reinvent itself in order to ensure long-term sustainability. A second-generation biorefinery is defined as a process that uses waste fibrous material for the production of biofuel, chemicals, animal feed and electricity. Bioethanol is by far the most widely used biofuel for transportation worldwide, and many of the challenges facing bioethanol production have been solved. A biorefinery annexed to an existing sugar mill for the production of bioethanol and electricity is proposed to the sugar industry and is addressed in this study. Since flowsheet development is the key element of the bioethanol process, in this work a biorefinery (bioethanol and electricity production) annexed to a typical South African sugar mill was simulated, considering 65 ton/h of dry sugarcane bagasse and tops/trash as feedstock. Aspen Plus™ V8.6 was applied as the simulator, and a realistic simulation development approach was followed to reflect the practical behaviour of the plant. The latest results of other research on pretreatment, hydrolysis, fermentation, enzyme production and bioethanol production, and on supplementary units such as evaporation, water treatment, boiler and steam/electricity generation, were adopted to establish a comprehensive biorefinery simulation. Steam explosion with SO₂ was selected for pretreatment due to minimal inhibitor production, and a simultaneous saccharification and fermentation (SSF) configuration was adopted for enzymatic hydrolysis and fermentation of the cellulose and hydrolysate. Bioethanol purification was simulated by two distillation columns with a side stream, and fuel-grade bioethanol (99.5%) was achieved using molecular sieves in order to minimize capital and operating costs. The boiler and steam/electricity generation units were modelled using industrial design data. Results indicate that the annexed biorefinery can be energy self-sufficient when 35% of the feedstock (tops/trash) bypasses the biorefinery process and is loaded directly into the boiler to produce sufficient steam and power for the sugar mill and the biorefinery plant.
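
The bypass figure can be understood with a back-of-the-envelope energy balance: burn a fraction x of the feedstock to cover the mill demand plus the processing demand of the remaining (1 − x). The Python sketch below solves that balance; the heating value, boiler efficiency and demand figures are hypothetical placeholders, not the Aspen Plus results, so the break-even fraction it prints differs from the paper's 35%.

```python
# Toy energy balance for the bypass idea. All figures are assumptions.
lhv = 17.0            # GJ per ton of dry feedstock burned
boiler_eff = 0.75     # boiler efficiency
mill_demand = 2.0     # GJ needed by the sugar mill, per ton of total feed
proc_demand = 7.0     # GJ to process one ton through the biorefinery

# supply(x) = x * lhv * eff ; demand(x) = mill_demand + (1 - x) * proc_demand
# setting supply equal to demand and solving for x:
x = (mill_demand + proc_demand) / (lhv * boiler_eff + proc_demand)
print(f"break-even bypass fraction: {x:.0%}")
```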

Keywords: biorefinery, self-energy sufficiency, tops/trash, bioethanol, electricity

Procedia PDF Downloads 516
24 Biodiesel Production from Edible Oil Wastewater Sludge with Bioethanol Using Nano-Magnetic Catalysis

Authors: Wighens Ngoie Ilunga, Pamela J. Welz, Olewaseun O. Oyekola, Daniel Ikhu-Omoregbe

Abstract:

Currently, most sludge from the wastewater treatment plants of edible oil factories is disposed of in landfills, but landfill sites are finite and are potential sources of environmental pollution. Production of biodiesel from wastewater sludge can contribute to energy production and waste minimization; however, conventional biodiesel production is energy and waste intensive. Generally, biodiesel is produced from the transesterification reaction of oils with an alcohol (e.g., methanol, ethanol) in the presence of a catalyst. Homogeneously catalysed transesterification is the conventional approach for large-scale production of biodiesel, as reaction times are relatively short; nevertheless, homogeneous catalysis presents several challenges, such as a high probability of soap formation. The current study aimed to reuse wastewater sludge from the edible oil industry as a novel feedstock for both monounsaturated fats and bioethanol for the production of biodiesel. Preliminary results have shown that the fatty acid profile of the oilseed wastewater sludge is favourable for biodiesel production, with 48% (w/w) monounsaturated fats, and that the residue left after the extraction of fats from the sludge contains sufficient fermentable sugars, after steam explosion followed by enzymatic hydrolysis, for the successful production of bioethanol [29% (w/w)] using a commercial strain of Saccharomyces cerevisiae. A novel nano-magnetic catalyst was synthesised, using a modified sol-gel method, from alkaline mineral processing tailings mainly containing dolomite originating from cupriferous ores. The catalyst's elemental composition and structural properties were characterised by X-ray diffraction (XRD), scanning electron microscopy (SEM) and Fourier transform infrared spectroscopy (FTIR), while BET analysis gave a surface area of 14.3 m²/g and an average pore diameter of 34.1 nm. The mass magnetization of the nano-magnetic catalyst was 170 emu/g. Both the catalytic properties and the reusability of the catalyst were investigated: a maximum biodiesel yield of 78% was obtained, which dropped to 52% after the fourth transesterification reaction cycle. The proposed approach has the potential to reduce the material costs, energy consumption and water usage associated with conventional biodiesel production technologies. It may also mitigate the impact of conventional biodiesel production on food and land security, while simultaneously reducing waste.

Keywords: biodiesel, bioethanol, edible oil wastewater sludge, nano-magnetism

Procedia PDF Downloads 116
23 Waste Management in a Hot Laboratory of Japan Atomic Energy Agency – 1: Overview and Activities in Chemical Processing Facility

Authors: Kazunori Nomura, Hiromichi Ogi, Masaumi Nakahara, Sou Watanabe, Atsuhiro Shibata

Abstract:

The Chemical Processing Facility of the Japan Atomic Energy Agency is a basic research facility for advanced back-end technology development using actual high-level radioactive materials, such as irradiated fuel from the fast reactor and high-level liquid waste from the reprocessing plant. As is the nature of a research facility, various kinds of chemical reagents have been used for fundamental tests. Most of them were treated properly and stored in the liquid waste vessels of the facility, but some were not treated and remained in the experimental space as a kind of legacy waste, which must be treated safely. Meanwhile, we formulated the Medium- and Long-Term Management Plan of Japan Atomic Energy Agency Facilities; this comprehensive plan considers the Chemical Processing Facility as one of the facilities to be decommissioned. When the plan is executed, prior treatment of the legacy waste will be a necessary step in the decommissioning operation. Under these circumstances, we launched a collaborative research project called the STRAD project, which stands for Systematic Treatment of Radioactive liquid waste for Decommissioning, in order to develop treatment processes for the wastes of nuclear research facilities. In this project, decomposition methods for chemicals causing troublesome phenomena such as corrosion and explosion have been developed, and there is a prospect of decomposing them in the facility by simple methods. Solidification of aqueous and organic liquid wastes after decomposition has been studied by adding cement or coagulants. Furthermore, experimental tools of various materials were treated, with an effort to stabilize and compact them before packing into waste containers. This is expected to decrease the number of solid waste shipments and to widen the operating space. Some achievements of these studies are presented in this paper. The project is expected to contribute beneficial waste management outcomes that can be shared worldwide.

Keywords: chemical processing facility, medium- and long-term management plan of JAEA facilities, STRAD project, treatment of radioactive waste

Procedia PDF Downloads 123
22 Flow Sheet Development and Simulation of a Bio-refinery Annexed to a Typical South African Sugar Mill

Authors: M. Ali Mandegari, S. Farzad, J. F. Görgens

Abstract:

Sugar is one of the main agricultural industries in South Africa, and the livelihoods of approximately one million South Africans are indirectly dependent on the sugar industry, which is struggling economically and should reinvent itself in order to ensure long-term sustainability. A second-generation bio-refinery is defined as a process that uses waste fibrous material for the production of bio-fuel, chemicals, animal feed and electricity. Bio-ethanol is by far the most widely used bio-fuel for transportation worldwide, and many of the challenges facing bio-ethanol production have been solved. A bio-refinery annexed to an existing sugar mill for the production of bio-ethanol and electricity is proposed to the sugar industry and is addressed in this study. Since flow-sheet development is the key element of the bio-ethanol process, in this work a bio-refinery (bio-ethanol and electricity production) annexed to a typical South African sugar mill was simulated, considering 65 ton/h of dry sugarcane bagasse and tops/trash as feedstock. Aspen Plus™ V8.6 was applied as the simulator, and a realistic simulation development approach was followed to reflect the practical behavior of the plant. The latest results of other research on pretreatment, hydrolysis, fermentation, enzyme production and bio-ethanol production, and on supplementary units such as evaporation, water treatment, boiler and steam/electricity generation, were adopted to establish a comprehensive bio-refinery simulation. Steam explosion with SO₂ was selected for pretreatment due to minimal inhibitor production, and a simultaneous saccharification and fermentation (SSF) configuration was adopted for enzymatic hydrolysis and fermentation of the cellulose and hydrolysate. Bio-ethanol purification was simulated by two distillation columns with a side stream, and fuel-grade bio-ethanol (99.5%) was achieved using molecular sieves in order to minimize capital and operating costs. The boiler and steam/power generation units were modelled using industrial design data. Results indicate that 256.6 kg of bio-ethanol per ton of feedstock and 31 MW of surplus power were attained from the bio-refinery, while the process consumes 3.5, 3.38 and 0.164 GJ per ton of feedstock of hot utility, cold utility and electricity, respectively. The developed simulation is a starting point for a variety of analyses and developments in further studies.
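
A quick sanity check on the reported yield, as a Python snippet: converting 256.6 kg of ethanol per ton of feedstock into energy terms with standard literature heating values (assumptions here, not values from the paper) indicates roughly what fraction of the feedstock's energy ends up in the fuel.

```python
ethanol_kg_per_ton = 256.6   # reported yield per ton of dry feedstock
lhv_ethanol = 26.8           # GJ/ton, approximate lower heating value
lhv_feedstock = 17.0         # GJ/ton dry basis, approximate

ethanol_energy = ethanol_kg_per_ton / 1000 * lhv_ethanol
print(f"ethanol energy: {ethanol_energy:.1f} GJ per ton of feedstock")
print(f"share of feedstock LHV recovered as ethanol: "
      f"{ethanol_energy / lhv_feedstock:.0%}")
```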

Keywords: bio-refinery, bagasse, tops, trash, bio-ethanol, electricity

Procedia PDF Downloads 495
21 Developing Improvements to Multi-Hazard Risk Assessments

Authors: A. Fathianpour, M. B. Jelodar, S. Wilkinson

Abstract:

This paper outlines approaches to multi-hazard risk assessment. There is currently confusion in assessing multi-hazard impacts, and so this study aims to determine which of the available options are the most useful. The paper uses an international literature search, analysis of current multi-hazard assessments and a case study to illustrate the effectiveness of the chosen method. Findings from this study will help those wanting to assess multi-hazards to undertake a straightforward approach. The paper is significant as it helps to interpret the various approaches and concludes with the preferred method. Many people in the world live in hazardous environments and are susceptible to disasters. Unfortunately, when a disaster strikes, it is often compounded by additional cascading hazards; thus, people may confront more than one hazard simultaneously. Hazards include natural hazards (earthquakes, floods, etc.) and cascading human-made hazards, for example natural hazards triggering technological disasters (Natech), such as fire, explosion and toxic release. Multi-hazards have a more destructive impact on urban areas than any one hazard alone. In addition, climate change is creating links between different disasters, for example by causing landslide dams and debris flows, leading to more destructive incidents. Much of the prevailing literature deals with only one hazard at a time; however, sophisticated multi-hazard assessments have recently started to appear. Given that multi-hazards occur, it is essential to take multi-hazard risk assessment into consideration. This paper aims to review multi-hazard assessment methods through the articles published to date and to categorize the strengths and disadvantages of using these methods in risk assessment. Napier City is selected as a case study to demonstrate the necessity of using multi-hazard risk assessments. In order to assess multi-hazard risk assessments, the current methods were first described; next, their drawbacks were outlined; finally, the improvements made to current multi-hazard risk assessments to date were summarised. Generally, the main problem of multi-hazard risk assessment is making valid assumptions about the risk arising from the interactions of different hazards. Risk assessment studies have started to address multi-hazard situations, but drawbacks such as uncertainty and lack of data show the necessity for more precise risk assessment. It should be noted that ignoring or only partially considering multi-hazards in risk assessment will lead to overestimation or oversight in resilience and recovery action management.

Keywords: cascading hazards, disaster assessment, multi-hazards, risk assessment

Procedia PDF Downloads 84
20 Towards a Sustainable High Population Density Urban Intertextuality – Program Re-Configuration Integrated Urban Design Study in Hangzhou, China

Authors: Xuan Li, Lei Xu

Abstract:

By the end of 2014, China had an urban population of 749 million, corresponding to an urbanization rate of 54.77%. A dense and vertical urban structure has become a common choice for sustainable development in China and most densely populated Asian countries. This paper focuses on the most conspicuous period of urban change in China, from 2000 to 2010, during which China's population shifted fastest from rural regions to cities. On one hand, the 200 million nationwide "new citizens", along with the 456 million "old citizens", explored the new-century city for a new urban lifestyle and a livable built environment; on the other hand, large-scale rapid urban construction remained confined to the methods of traditional two-dimensional architectural thinking. Human-oriented design and systems thinking have been missing in this intricate postmodern urban condition. This phenomenon, especially the gap and spark between the huge, solid urban physical system and the rich, subtle everyday urban life, is studied in depth: how a 20th-century high-rise residential building "spontaneously" turned into an old but expensive multi-functional high-rise complex in the 21st-century city center, and how new 21st-century and old late-20th-century public buildings with the same function integrate their different architectural forms into the new/old city center. Finally, the paper studies cases in Hangzhou: 1) function evolution: the downtown high-rise residential buildings "International Garden" and "Zhongshan Garden" (1999); 2) form comparison: Hangzhou Theater (1998) vs Hangzhou Grand Theatre (2004), and Hangzhou City Railway Station (1999) vs Hangzhou East Railway Station (2013). The research aims to explore the essence of the city through the approach of building form dispel and urban program re-configuration, to gain a better consideration of human behavior through compact urban design for improving urban intertextuality, and to search for a sustainable development path at the crucial time of urban population explosion in China.

Keywords: architecture form dispel, compact urban design, urban intertextuality, urban program re-configuration

Procedia PDF Downloads 465
19 City Buses and Sustainable Urban Mobility in Kano Metropolis 1967-2015: An Historical Perspective

Authors: Yusuf Umar Madugu

Abstract:

Since its creation in 1967, Kano has undergone tremendous political, social and economic transformations. Public urban transportation has been playing a vital role in sustaining the economic growth of the Kano metropolis, especially with the existence of modern buses on a regular network of roads serving all the main centers of trade. This study, therefore, centers on the role of intra-city buses in molding the economy of Kano. Its main focus is post-colonial Kano (i.e., 1967-2015), a period that witnessed rapid expansion of commercial activities and ever-increasing urbanization, with the population explosion that goes along with it. Commuters patronized urban transport, a situation that made the business lucrative. Moreover, the traders who came from within and outside Kano relied heavily on commercial vehicles to transport their merchandise to their various destinations. The commercial road transport system therefore became well organized in Kano, with a significant number of people earning their livelihood from it; it also serves as a source of revenue to governments at different levels. The study of transport and development as an academic discipline is inter-disciplinary in nature. This study, therefore, employs the methodologies of other disciplines such as Geography, History, Urban and Regional Planning, Engineering, Computer Science and Economics to provide a comprehensive picture of the issues under investigation. The source materials for this study included extensive use of written literature and oral information. In view of the crucial importance of intra-city commercial transport services, this study demonstrates their role in the overall economic transformation of the study area, and it also contributes by opening new ground in the history of the commercial transport system. At present, the Kano metropolitan area is located between latitudes 11°50′ and 12°07′ N and longitudes 8°22′ and 8°47′ E, within the semi-arid Sudan savannah zone of West Africa, about 840 kilometers from the edge of the Sahara desert. The metropolitan area has expanded over the years and has become the third largest conurbation in Nigeria, with a population of about 4 million. It is made up of eight local government areas, viz: Kano Municipal, Gwale, Dala, Tarauni, Nasarawa, Fage, Ungogo and Kumbotso.

Keywords: assessment, buses, city, mobility, sustainable

Procedia PDF Downloads 193
18 Combustion Characteristics of Ionized Fuels for Battery System Safety

Authors: Hyeuk Ju Ko, Eui Ju Lee

Abstract:

Many electronic devices today are powered by rechargeable batteries such as lithium-ion, but occasionally the batteries undergo thermal runaway and cause fire, explosion and other hazards. If a battery fire occurs in an electronic device in a vehicle or aircraft cabin, it is important to quickly extinguish the fire and cool the batteries to minimize safety risks. Attempts to minimize these risks have been made by many researchers, but the number of studies on successful extinguishment is limited. Most rechargeable batteries operate in an ionic state, with electrons, during the charge and discharge of electricity, and the reactions of the electrolyte differ greatly from normal combustion. Here, we focus on the effect of ions on reaction stability and pollutant emissions during the combustion process. A further motivation for understanding ionized fuel combustion can be found in highly efficient and environmentally friendly combustion technologies, which tend to be operated at extreme conditions and hence suffer unintended flame instabilities such as extinction and oscillation. The use of electromagnetic energy and non-equilibrium plasma is one way to address these problems, but its application has remained limited because of a lack of understanding of excited-ion effects in the combustion process. Therefore, understanding the role of ions during combustion holds promise for energy safety, including battery safety. In this study, the effects of an ionized fuel on flame stability and pollutant emissions were experimentally investigated in hydrocarbon jet diffusion flames. The burner used in this experiment consisted of a 7.5 mm diameter fuel tube, and the gaseous fuels were ionized with an ionizer (SUNJE, SPN-11). Methane (99.9% purity) and propane (commercial grade) were used as fuels, and open ambient air was used as the oxidizer. The performance of the ionizer was evaluated first: the ion densities of both propane and methane increased linearly with volume flow rate, with the ion density of propane slightly higher than that of methane. The results show no significant difference in overall flame stability and shape, such as flame length, even at higher ion concentrations. However, fuel ionization does affect pollutant emissions such as NOx and soot: NOx and CO emissions measured in the post-flame region decreased with increasing fuel ionization, especially at high fuel velocity, i.e., high ion density. TGA analysis and soot morphology by TEM indicate that fuel ionization drives the soot towards maturity.

Keywords: battery fires, ionization, jet flames, stability, NOx and soot

Procedia PDF Downloads 159
17 Comparison of Spiking Neuron Models in Terms of Biological Neuron Behaviours

Authors: Fikret Yalcinkaya, Hamza Unsal

Abstract:

To understand how neurons work, it is necessary to combine experimental studies in neural science with numerical simulations of neuron models in a computer environment. In this regard, the simplicity and applicability of spiking neuron model functions have been of great interest in computational neuroscience in recent years. Spiking neuron models can be classified by the various neuronal behaviours they exhibit, such as spiking and bursting; these classifications are important for researchers working in theoretical neuroscience. In this paper, three different spiking neuron models, Izhikevich, Adaptive Exponential Integrate-and-Fire (AEIF) and Hindmarsh-Rose (HR), all based on first-order differential equations, are discussed and compared. First, the physical meanings, derivatives and differential equations of each model are provided and simulated in the Matlab environment. Then, by selecting appropriate parameters, the models were examined visually in the Matlab environment, with the aim of demonstrating which model can simulate well-known biological neuron behaviours such as tonic spiking, tonic bursting, mixed-mode firing, spike frequency adaptation, resonator and integrator. As a result, the Izhikevich model has been shown to reproduce regular spiking, chattering (continuous bursting), intrinsically bursting, thalamo-cortical, low-threshold spiking and resonator behaviours. The Adaptive Exponential Integrate-and-Fire model was able to produce firing patterns such as regular spiking, adaptive spiking, initial bursting, regular bursting, delayed spiking, delayed regular bursting, transient spiking and irregular spiking. The Hindmarsh-Rose model showed three different dynamic neuron behaviours: spiking, bursting and chaotic. From these results, the Izhikevich cell model may be preferred due to its ability to reflect the true behaviour of the nerve cell, its ability to produce different types of spikes, and its suitability for use in larger-scale brain models. The most important reason for choosing the Adaptive Exponential Integrate-and-Fire model is that it can create rich firing patterns with fewer parameters. The chaotic behaviour of the Hindmarsh-Rose neuron model, like that of some other chaotic systems, is thought to be usable in many scientific and engineering applications such as physics, secure communication and signal processing.
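
As a concrete illustration of the first of the three models, here is a minimal Euler integration of the Izhikevich model in Python (the paper itself works in Matlab). The (a, b, c, d) values are the published regular-spiking parameter set; swapping in other rows of Izhikevich's parameter table yields the bursting, chattering and fast-spiking behaviours discussed above.

```python
import numpy as np

# Izhikevich (2003): v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u)
# with the reset rule: if v >= 30 mV then v <- c, u <- u + d.

def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, i_ext=10.0,
               dt=0.25, t_max=1000.0):
    n = int(t_max / dt)
    v, u = -65.0, -65.0 * b                  # resting initial conditions
    spikes, trace = [], np.empty(n)
    for k in range(n):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:                        # spike detected: apply reset
            spikes.append(k * dt)
            v, u = c, u + d
        trace[k] = v
    return np.array(spikes), trace

spikes, trace = izhikevich()
print(f"{len(spikes)} spikes in 1 s of simulated time")
```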

Keywords: Izhikevich, adaptive exponential integrate-and-fire, Hindmarsh-Rose, biological neuron behaviours, spiking neuron models

Procedia PDF Downloads 145
16 Between the House and the City: An Investigation of the Structure of the Family/Society and the Role of the Public Housing in Tokyo and Berlin

Authors: Abudjana Babiker

Abstract:

The middle of the twentieth century witnessed an explosion in public housing. After the Great Depression, both capitalist and communist countries launched policies and programs to produce public housing in urban areas. Concurrently, modernism, the leading architectural style of the time, strongly supported this production and was the principal instrument of the public housing program's success, owing to the modernist manifesto for manufactured architecture as an international style that serves society and connects it to the other design industries, which allowed the industrial production of architectural elements. After the Second World War, public housing flourished, especially in communist countries. Public housing was conceived at the time as living space, while workplaces performed as the place of production and labor. At the end of the twentieth century, Michel Foucault's introduction of biopolitics highlighted the alteration in the inter-function of production and labor. The house does not simply perform as the family's sanctuary from production; it opens up to become, as part of the city, a space of production, not only producing objects but reproducing the family as an integral part of the production mechanism of the city. While public housing kept changing from one country to another after the failure of modernist public housing in the late 1970s, society continued to change in parallel with the socio-economic conditions of each political-economic system, and public housing followed. Family structure in major cities has been changing dramatically; single parenthood and long working hours, for instance, have been escalating loneliness in major cities such as London, Berlin and Tokyo, and public housing designed for families no longer suits the single lifestyle of individuals. This paper investigates the performance of both the single/individual lifestyle and the family/society structure in Tokyo and Berlin in relation to the utilization of public housing under the economic policies and socio-political environments that produced the individuals and the collective. The study is carried out through an examination of the undercurrents of individual/society relations and through case studies of housing utilization. The major finding is that the individual and the collective revolve around the city: the city acts as a system that magnetizes and blurs the line between production and reproduction lifestyles. Mass public housing for families is shifting towards a combination of neo-liberal and socialist housing.

Keywords: loneliness, production and reproduction, work-live, public housing

Procedia PDF Downloads 161
15 Analysis and Design Modeling for Next Generation Network Intrusion Detection and Prevention System

Authors: Nareshkumar Harale, B. B. Meshram

Abstract:

The continued exponential growth of successful cyber intrusions against today's businesses has made it abundantly clear that traditional perimeter security measures are no longer adequate or effective. Network trust architecture has accordingly evolved from trust/untrust models to Zero Trust, in which essential security capabilities are deployed in a way that provides policy enforcement and protection for all users, devices, applications, data resources, and the communications traffic between them, regardless of location. Information exchange over the Internet, despite the inclusion of advanced security controls, remains exposed to innovative and inventive cyberattacks. The TCP/IP protocol stack, the adopted standard for network communication, suffers from inherent design vulnerabilities: its communication and session management protocols, routing protocols and security protocols are the source of major attacks. With the explosion of cyber security threats such as viruses, worms, rootkits, malware and Denial of Service attacks, accomplishing efficient and effective intrusion detection and prevention has become both crucial and challenging. In this paper, we propose a design and analysis model for a next generation network intrusion detection and protection system as part of a layered security strategy. The proposed system design provides intrusion detection for a wide range of attacks with a layered architecture and framework. The proposed network intrusion classification framework deals with cyberattacks on the standard TCP/IP protocol suite, routing protocols and security protocols; it thereby forms the basis for detection of attack classes, applying signature-based matching for known cyberattacks and data-mining-based machine learning approaches for unknown cyberattacks. Our implemented software can effectively detect attacks even when malicious connections are hidden within normal events. The unsupervised learning algorithm applied to network audit data trails results in detection of unknown intrusions. Association rule mining algorithms generate new rules from collected audit trail data, resulting in increased intrusion prevention through integrated firewall systems. Intrusion response mechanisms can be initiated in real time, thereby minimizing the impact of network intrusions. Finally, we show how our approach can be validated and how the analysis results can be used for detection of and protection from new network anomalies.
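As a hedged illustration of the two detection strategies named above, signature-based matching for known attacks and association rule mining over audit trails for unknown ones, here is a minimal Python sketch; the signature set, audit records and thresholds are invented placeholders, not the authors' implementation.

```python
# Toy sketch: signature matching plus a single Apriori-style pass that mines
# association rules from connection records; all data here are placeholders.
from collections import Counter
from itertools import combinations

KNOWN_SIGNATURES = [{"tcp", "port=23", "flag=S0"}]   # hypothetical signature set

def match_signatures(record):
    """Flag a connection record whose feature set contains a known signature."""
    return any(sig <= set(record) for sig in KNOWN_SIGNATURES)

def mine_rules(records, min_support=0.3, min_conf=0.8):
    """Frequent item pairs -> rules 'A implies B' with support and confidence."""
    n = len(records)
    item_counts = Counter(i for r in records for i in set(r))
    pair_counts = Counter(p for r in records
                          for p in combinations(sorted(set(r)), 2))
    rules = []
    for (a, b), cnt in pair_counts.items():
        if cnt / n >= min_support:                   # frequent pair
            for x, y in ((a, b), (b, a)):
                if cnt / item_counts[x] >= min_conf: # confident direction
                    rules.append((x, y, cnt / item_counts[x]))
    return rules

audit = [("tcp", "port=23", "flag=S0"), ("tcp", "port=80", "flag=SF"),
         ("tcp", "port=23", "flag=S0"), ("udp", "port=53", "flag=SF")]
print([r for r in audit if match_signatures(r)])     # known-attack matches
print(mine_rules(audit))                             # candidate new rules
```

Rules mined this way could then be pushed to an integrated firewall as new prevention policies, which is the feedback loop the abstract describes.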

Keywords: network intrusion detection, network intrusion prevention, association rule mining, system analysis and design

Procedia PDF Downloads 200
14 Enzymatic Hydrolysis of Sugar Cane Bagasse Using Recombinant Hemicellulases

Authors: Lorena C. Cintra, Izadora M. De Oliveira, Amanda G. Fernandes, Francieli Colussi, Rosália S. A. Jesuíno, Fabrícia P. Faria, Cirano J. Ulhoa

Abstract:

Xylan is the main component of hemicellulose, and its complete degradation requires the cooperative action of a system of several enzymes, including endo-xylanases (XYN), β-xylosidases (XYL) and α-L-arabinofuranosidases (ABF). Three recombinant hemicellulolytic enzymes, an endo-xylanase (HXYN2), a β-xylosidase (HXYLA) and an α-L-arabinofuranosidase (ABF3), all produced by filamentous fungi and previously expressed heterologously in Pichia pastoris, were used in hydrolysis tests. The aim of this work was to evaluate the effect of these recombinant hemicellulolytic enzymes on the enzymatic hydrolysis of sugarcane bagasse (SCB). The interaction between the three recombinant enzymes during hydrolysis of SCB pre-treated by steam explosion was studied with different concentrations of HXYN2, HXYLA and ABF3, combined in different ratios according to a 2³ central composite rotational design (CCRD) including six axial points and six central points, for a total of 20 assays. The response variable was the concentration of reducing sugars (mg/mL) quantified by the DNS method. The influence of the factors was assessed by analyzing the main effects and the interactions between factors, calculated using Statistica 8.0 software (StatSoft Inc., Tulsa, OK, USA); a Pareto chart constructed with this software showed the Student's t values for each recombinant enzyme. The Pareto chart indicated that ABF3 exerted the most significant effect during SCB hydrolysis, at both the highest and the lowest concentrations of this enzyme. Analysis of variance (ANOVA, Fisher method) was also performed; with the release of reducing sugars (mg/mL) as the response variable, the ABF3 concentration was significant during SCB hydrolysis, in agreement with the Student's t based analysis (Pareto chart). The degradation of the xylan main chain by HXYN2 and HXYLA was thus strongly influenced by the action of ABF3. A model describing the performance of the interaction of all three enzymes in the release of reducing sugars was obtained and can be used to better explain the results of the statistical analysis. The formulation releasing the highest level of reducing sugars contained 600 U/g (substrate) HXYN2, 11.5 U/g HXYLA and 0.32 U/g ABF3. In conclusion, the recombinant enzyme with the most significant effect during SCB hydrolysis was ABF3. It is noteworthy that the xylan present in SCB is arabinoglucuronoxylan; debranching enzymes are therefore important to allow access of the enzymes that act on the main chain.
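For readers unfamiliar with the design, the following Python sketch shows how a 2³ CCRD of the shape described (8 factorial + 6 axial + 6 central = 20 runs) can be constructed and a quadratic response-surface model fitted by least squares; the enzyme levels are in coded units and the response values are random placeholders, not the study's data.

```python
# Sketch of a rotatable 2^3 central composite design (20 runs) and a
# quadratic response-surface fit; responses are synthetic placeholders.
import itertools
import numpy as np

alpha = 2 ** (3 / 4)                            # rotatable axial distance ~1.682
factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([sign * alpha * np.eye(3)[i]
                  for i in range(3) for sign in (1, -1)])
center = np.zeros((6, 3))
X = np.vstack([factorial, axial, center])       # 20 coded runs (HXYN2, HXYLA, ABF3)

def quadratic_terms(X):
    """Design matrix: intercept, linear, squared and two-way interaction terms."""
    cols = [np.ones(len(X))] + [X[:, i] for i in range(3)] \
         + [X[:, i] ** 2 for i in range(3)] \
         + [X[:, i] * X[:, j] for i, j in itertools.combinations(range(3), 2)]
    return np.column_stack(cols)

rng = np.random.default_rng(0)
y = rng.normal(10.0, 1.0, 20)                   # placeholder reducing sugars (mg/mL)
beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print(np.round(beta, 3))                        # fitted model coefficients
```

In the study itself this fitting and the accompanying t and F statistics were done in Statistica 8.0; the sketch only makes the structure of the 20-assay design and the quadratic model explicit.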

Keywords: experimental design, hydrolysis, recombinant enzymes, sugar cane bagasse

Procedia PDF Downloads 198
13 Prediction of Live Birth in a Matched Cohort of Elective Single Embryo Transfers

Authors: Mohsen Bahrami, Banafsheh Nikmehr, Yueqiang Song, Anuradha Koduru, Ayse K. Vuruskan, Hongkun Lu, Tamer M. Yalcinkaya

Abstract:

In recent years, we have witnessed an explosion of studies aimed at using a combination of artificial intelligence (AI) and time-lapse imaging data on embryos to improve IVF outcomes. However, despite promising results, no study has used a matched cohort of transferred embryos that differ only in pregnancy outcome, i.e., embryos from a single clinic that are similar in parameters such as morphokinetic condition, patient age, and overall clinic and lab performance. Here, we used time-lapse data on embryos with known pregnancy outcomes to see whether the rich spatiotemporal information embedded in the data would allow prediction of the pregnancy outcome regardless of such critical parameters. Methodology—We performed a retrospective analysis of time-lapse data from our IVF clinic, which uses the EmbryoScope exclusively for embryo culture to the blastocyst stage, with known clinical outcomes of live birth vs. nonpregnant (embryos with spontaneous abortion outcomes were excluded). We used time-lapse data from 200 elective single-transfer embryos randomly selected from January 2019 to June 2021. Our sample included 100 embryos in each group, with no significant difference in patient age (P=0.9550) or morphokinetic scores (P=0.4032). Data from all patients were combined into a 4th-order tensor, and feature extraction was subsequently carried out by a tensor decomposition methodology. The features were then used in a machine learning classifier to classify the two groups. Major Findings—The performance of the model was evaluated using 100 random subsampling cross-validations (80% train / 20% test). The prediction accuracy, averaged across the 100 permutations, exceeded 80%. We also performed a random-grouping analysis, in which the labels (live birth, nonpregnant) were randomly assigned to embryos, which yielded 50% accuracy. Conclusion—The high accuracy in the main analysis and the chance-level accuracy in the random-grouping analysis suggest a consistent spatiotemporal pattern associated with pregnancy outcome, regardless of patient age and embryo morphokinetic condition, and beyond already known parameters such as early cleavage or early blastulation. Despite the small sample size, this ongoing analysis is the first to show the potential of AI methods in capturing the complex morphokinetic changes embedded in embryo time-lapse data that contribute to successful pregnancy outcomes. Results on a larger sample, with complementary analysis on the prediction of other key outcomes such as euploidy and aneuploidy of embryos, will be presented at the meeting.
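The following Python sketch illustrates the general shape of such a pipeline: unfold a 4th-order tensor of embryo image stacks, extract low-rank features, and score a classifier over 100 random 80/20 subsamplings. It uses synthetic data and a plain SVD as a stand-in for the study's tensor decomposition, so on this random input it yields chance-level accuracy, exactly as in the random-grouping control.

```python
# Schematic pipeline sketch with synthetic data; the real study used a
# tensor decomposition, for which a plain SVD stands in here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
T = rng.normal(size=(200, 8, 8, 30))       # 200 embryos x (height x width x frames)
y = np.repeat([0, 1], 100)                 # nonpregnant / live-birth labels

X = T.reshape(200, -1)                     # mode-1 unfolding: embryos x voxels
# note: a careful pipeline would fit the decomposition on training folds only
U, s, Vt = np.linalg.svd(X, full_matrices=False)
feats = U[:, :10] * s[:10]                 # 10 leading low-rank features

accs = []
for seed in range(100):                    # 100 random 80/20 subsamplings
    Xtr, Xte, ytr, yte = train_test_split(feats, y, test_size=0.2,
                                          random_state=seed, stratify=y)
    accs.append(LogisticRegression(max_iter=1000).fit(Xtr, ytr).score(Xte, yte))
print(f"mean accuracy: {np.mean(accs):.2f}")   # ~0.5 on random data, as expected
```

With real time-lapse tensors the same skeleton would be expected to recover the consistent spatiotemporal pattern reported above, which is what pushes accuracy beyond chance.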

Keywords: IVF, embryo, machine learning, time-lapse imaging data

Procedia PDF Downloads 64
12 Understanding the Dynamics of Human-Snake Negative Interactions: A Study of Indigenous Perceptions in Tamil Nadu, Southern India

Authors: Ramesh Chinnasamy, Srishti Semalty, Vishnu S. Nair, Thirumurugan Vedagiri, Mahesh Ganeshan, Gautam Talukdar, Karthy Sivapushanam, Abhijit Das

Abstract:

Snakes form an integral component of ecological systems. The human population explosion and the associated acceleration of habitat destruction and degradation have led to a rapid increase in human-snake encounters. This study aims at understanding the level of awareness, knowledge, and attitude of people towards negative human-snake interactions, and the role of awareness programmes, in the Moyar river valley, Tamil Nadu. The study area is part of the Mudumalai and Sathyamangalam Tiger Reserves, which are significant wildlife corridors between the Western Ghats and the Eastern Ghats in the Nilgiri Biosphere Reserve. Data were collected between 2018 and 2019 using a questionnaire covering 644 respondents spread across 18 villages. The study revealed that 86.5% of respondents held strong negative perceptions of snakes, propelled by fear, superstition, and the threat of snakebite, which was common and did not vary among villages (F = 4.48; p < 0.05) or age groups (χ² = 1.946; p = 0.962). The cobra (27.8%, n = 294) and the rat snake (21.3%, n = 225) were the most sighted species, and most snake encounters occurred during the monsoon season: July 35.6% (n = 218), June 19.1% (n = 117) and August 18.4% (n = 113). At least one in five respondents reported having been bitten by a snake during their lifetime. The species most commonly responsible for snakebite was the saw-scaled viper (32.6%, n = 42), followed by the cobra (17.1%, n = 22). About 21.3% (n = 137) of the people reported livestock losses due to pythons and other snakes. Most people preferred medical treatment for snakebite (87.3%), whereas 12.7% still believed in traditional methods. The majority (82.3%) took precautionary measures, keeping traditional items such as garlic, kerosene, and snake plant to ward off snakes. About 30% of respondents expressed the need for technical and monetary support from the forest department to help reduce human-snake conflict. It is concluded that the general perception in the study area is driven by fear and negative attitudes towards snakes. Though snakes such as the cobra are widely worshipped in the region, widespread myths and misconceptions still lead to the irrational killing of snakes. Awareness and innovative education programs rooted in the local context and language should be integrated at the village level to minimize the risk and associated threat of snakebite. Results from this study should help policymakers devise appropriate conservation measures to reduce human-snake conflict in India.

Keywords: envenomation, health education, human-wildlife conflict, neglected tropical disease, snakebite mitigation, traditional practitioners

Procedia PDF Downloads 187
11 Towards a Strategic Framework for State-Level Epistemological Functions

Authors: Mark Darius Juszczak

Abstract:

While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At present, US federal policy, like that of virtually all other countries, maintains, at the national level, clearly defined boundaries between various epistemological agencies: agencies that, in one way or another, mediate the functional use of knowledge. These agencies can take the form of patent and trademark offices, national library and archive systems, departments of education, agencies such as the FTC, university systems and regulations, military research agencies such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and legislative committees and subcommittees that attempt to alter the laws governing epistemological functions. All of these agencies are constantly creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions; they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy Administration committed the United States to the Apollo program. While that program had a singular technical objective as its outcome, the objective was so technologically advanced and complex for its day that it required a massive redirection of state-level epistemological functions; in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States, and to analyze how such a high-level framework would affect the epistemological work of the multitude of national agencies within the United States. This paper is an exploratory study of this type of framework. The author's primary hypothesis is that such a function is possible but would require extensive re-framing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that the Department of Homeland Security (DHS) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will equally result in a strategic threat to the United States.

Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program

Procedia PDF Downloads 52
10 Sociology Perspective on Emotional Maltreatment: Retrospective Case Study in a Japanese Elementary School

Authors: Nozomi Fujisaka

Abstract:

This sociological case study analyzes a sequence of student maltreatment in a Japanese elementary school, based on narratives from former students. Among the various forms of student maltreatment, emotional maltreatment has received less attention; one reason is that emotional maltreatment is often considered part of education and is difficult to capture in surveys. To discuss the challenge of recognizing emotional maltreatment, it is necessary to consider the social background in which student maltreatment occurs. Therefore, from the perspective of the sociology of education, this study aims to clarify the process through which emotional maltreatment came to be accepted by students within a Japanese classroom. The focus of the study is a series of educational interactions by a homeroom teacher with 11- or 12-year-old students at a small public elementary school approximately 10 years ago. The research employs retrospective narrative data collected through interviews and autoethnography. The semi-structured interviews, lasting one to three hours each, were conducted with 11 young people who had been enrolled in the same class as the researcher during elementary school. Autoethnography, as a critical research method, contributes to existing theories and studies by providing a critical representation of the researcher's own experiences, and it enables researchers to collect detailed data that are often difficult to verbalize in interviews. These methods are well suited to this study, which aims to shift the focus from teachers' educational intentions to students' perspectives and gain a deeper understanding of student maltreatment. The results suggest a pattern of emotional maltreatment that is challenging to differentiate from education. In this case, the teacher displayed calm and kind behavior toward students after a threat and an explosion of anger. Former students frequently mentioned this behavior and perceived the emotional maltreatment as part of education. It was not uncommon for former students to evaluate the teacher positively despite having experienced emotional distress. These findings are analyzed and discussed in conjunction with deschooling theory and the cycle-of-violence theory. Deschooling theory provides a sociological explanation for how emotional maltreatment can be overlooked in society. The cycle-of-violence theory, originally developed in the context of domestic violence, explains how violence between romantic partners can be tolerated due to prevailing social norms. Analyzing the case in relation to these two theories highlights the characteristics of teacher behavior that rationalize maltreatment as education and hinder students from escaping emotional maltreatment. This study deepens our understanding of the causes of student maltreatment and provides a new perspective for future qualitative and quantitative research. Furthermore, since this research is grounded in the sociology of education, it has the potential to expand research in the fields of pedagogy and sociology, in addition to psychology and social welfare.

Keywords: emotional maltreatment, education, student maltreatment, Japan

Procedia PDF Downloads 42
9 Numerical Simulation of Filtration Gas Combustion: Front Propagation Velocity

Authors: Yuri Laevsky, Tatyana Nosova

Abstract:

The phenomenon of filtration gas combustion (FGC) was discovered experimentally in the early 1980s. It has a number of important applications in areas such as chemical technology, fire and explosion safety, energy-saving technologies, and oil production. From the physical point of view, FGC may be defined as the propagation of a region of gaseous exothermic reaction through a chemically inert porous medium, as the gaseous reactants seep into the region of chemical transformation. The movement of the combustion front has different modes; this investigation focuses on the low-velocity regime. The main characteristic of the process is the velocity of combustion front propagation, and its computation encounters substantial difficulties because of the strong heterogeneity of the process. The mathematical model of FGC is formed by the energy conservation laws for the temperature of the porous medium and the temperature of the gas, and the mass conservation law for the relative concentration of the reacting component of the gas mixture. The model is homogenized using the two-temperature approach, in which solid and gas phases are specified at each point of the continuous medium, with Newtonian heat exchange between them. The computational scheme is based on the principles of the mixed finite element method on a regular mesh, with an explicit-implicit difference scheme for the approximation in time. Special attention was given to determining the combustion front propagation velocity: straightforward computation of the velocity as a grid derivative leads to an extremely unstable algorithm. It is worth noting that the term 'front propagation velocity' makes sense for settled motion, when certain analytical formulae linking the velocity to the equilibrium temperature hold. A numerical implementation of one such formula, leading to stable computation of the instantaneous front velocity, is proposed and applied in the subsequent numerical investigation of the FGC process. In this way, the dependence of the main characteristics of the process on various physical parameters has been studied; in particular, the influence of the combustible gas mixture consumption on the front propagation velocity. It has also been reaffirmed numerically that there is an interval of critical values of the interfacial heat transfer coefficient at which a breakdown occurs from slow to rapid combustion front propagation; approximate boundaries of this interval have been calculated for specific parameters. All results obtained are in full agreement with both experimental and theoretical data, confirming the adequacy of the model and of the algorithm constructed. The availability of stable techniques for calculating the instantaneous velocity of the combustion wave also makes a semi-Lagrangian approach to the problem feasible.
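As a toy illustration of the velocity issue discussed above, the following Python sketch tracks a synthetic traveling temperature front by interpolating an isotherm crossing and compares the noisy pointwise difference quotient with a least-squares velocity fit; it is a schematic stand-in under invented parameters, not the authors' mixed finite element scheme or their equilibrium-temperature formula.

```python
# Toy front tracking: a raw grid derivative of the front position is noisy,
# while a least-squares slope over the trajectory is stable. Synthetic data.
import numpy as np

x = np.linspace(0.0, 1.0, 400)
dt, c_true, T_iso = 1e-3, 0.05, 0.5          # time step, true speed, isotherm level

def front_position(profile, x, level):
    """Front location via linear interpolation of the first crossing below level."""
    i = int(np.argmax(profile < level))       # first node below the isotherm
    t0, t1 = profile[i - 1], profile[i]       # bracket the crossing
    return x[i - 1] + (level - t0) / (t1 - t0) * (x[i] - x[i - 1])

rng = np.random.default_rng(1)
pos = []
for k in range(200):                          # noisy traveling sigmoid profile
    prof = 1.0 / (1.0 + np.exp((x - c_true * k * dt - 0.1) / 0.01))
    prof += rng.normal(0.0, 1e-3, x.size)
    pos.append(front_position(prof, x, T_iso))
pos = np.array(pos)

naive = np.diff(pos) / dt                     # pointwise difference quotient
t = np.arange(pos.size) * dt
v_fit = np.polyfit(t, pos, 1)[0]              # least-squares slope estimate
print(f"naive estimates std: {naive.std():.3f}")
print(f"fitted velocity: {v_fit:.4f} (true {c_true})")
```

The same contrast motivates the use of an analytical velocity formula for settled motion instead of differentiating the grid front position directly.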

Keywords: filtration gas combustion, low-velocity regime, mixed finite element method, numerical simulation

Procedia PDF Downloads 277