Search results for: failure detection and prediction
581 Novel Framework for MIMO-Enhanced Robust Selection of Critical Control Factors in Auto Plastic Injection Moulding Quality Optimization
Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari
Abstract:
Apparent quality defects such as warpage, shrinkage and weld lines are an almost unavoidable phenomenon in the mass production of auto plastic appearance parts (APAP). These frequently occurring manufacturing defects must be addressed concurrently in order to achieve a final product with acceptable quality standards. Determining the significant control factors that simultaneously affect multiple quality characteristics can markedly improve the optimization results by eliminating the deviating effect of so-called ineffective outliers. A robust quantitative approach is therefore needed through which the major control factors and their levels can be effectively determined, thereby improving the reliability of the optimal processing parameter design. Hence, the primary objective of the current study was to develop a systematic methodology for the selection of significant control factors (SCF) relevant to the multi-quality optimization of an auto plastic appearance part. An auto bumper was used as the specimen, having quality and production characteristics most representative of the APAP group. A preliminary failure modes and effects analysis (FMEA) was conducted to nominate a database of pseudo-significant control factors prior to the optimization phase. A CAE simulation (Moldflow) analysis was then implemented to manipulate four prevalent plastic injection quality defects relevant to the APAP group: warpage deflection, volumetric shrinkage, sink mark and weld line. Furthermore, a step-backward elimination searching method (SESME) was developed for systematic pre-optimization selection of SCF based on a hierarchical orthogonal array design and priority-based one-way analysis of variance (ANOVA). The development of the robust parameter design in the second phase was based on the DOE module of Minitab v.16 statistical software. Based on the F-test (F 0.05, 2, 14) one-way ANOVA results, material mixture percentage was the most significant control factor for warpage deflection, with a 58.34% contribution, while for the other three quality defects melt temperature was the most significant control factor, contributing 25.32%, 84.25%, and 34.57% to sink mark, shrinkage and weld line strength control, respectively. The results on the least significant control factors revealed injection fill time as the least significant factor for both warpage and sink mark, with respective contributions of 1.69% and 6.12%. For shrinkage and weld line defects, the least significant control factors were holding pressure and mold temperature, with overall contributions of 0.23% and 4.05%, respectively.
Keywords: plastic injection moulding, quality optimization, FMEA, ANOVA, SESME, APAP
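As an illustration of the priority-based one-way ANOVA step described in the abstract above, the following minimal Python sketch computes a factor's F-value and percentage contribution from an orthogonal-array experiment. The factor levels and response values are hypothetical and are not the study's data.

```python
import numpy as np
from scipy import stats

def anova_contribution(levels, response):
    """One-way ANOVA: F statistic, p-value and percentage contribution of a factor.
    levels   -- factor-level label for each experimental run
    response -- measured quality response (e.g. warpage deflection) per run
    """
    levels = np.asarray(levels)
    response = np.asarray(response, dtype=float)
    grand_mean = response.mean()
    ss_total = ((response - grand_mean) ** 2).sum()
    # Between-level (factor) sum of squares
    ss_factor = sum(
        (response[levels == lv].mean() - grand_mean) ** 2 * (levels == lv).sum()
        for lv in np.unique(levels)
    )
    df_factor = len(np.unique(levels)) - 1
    df_error = len(response) - len(np.unique(levels))
    ss_error = ss_total - ss_factor
    f_value = (ss_factor / df_factor) / (ss_error / df_error)
    p_value = stats.f.sf(f_value, df_factor, df_error)
    contribution = 100.0 * ss_factor / ss_total  # percentage contribution
    return f_value, p_value, contribution

# Hypothetical runs: melt temperature at three levels, simulated warpage responses
temp_levels = ["low", "mid", "high", "low", "mid", "high", "low", "mid", "high"]
warpage = [2.1, 1.8, 1.5, 2.2, 1.7, 1.4, 2.0, 1.9, 1.6]
print(anova_contribution(temp_levels, warpage))
```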
Procedia PDF Downloads 349
580 Fostering Diversity, Equity, and Inclusion: Case of Higher Education Institutions in Kazakhstan
Authors: Gainiya Tazhina
Abstract:
Higher education systems in many countries have increased diversity and ensured equal rights and opportunities for inclusive students over the last decades. Issues of diversity-equity-inclusion (DEI) in Kazakhstani higher education began to be addressed in legislation in 2021-2023. The adoption of the Ministry of Education and Science's Road Map for universities' inclusivity indicated strategies for change. The paper traces how this government initiative is being implemented in universities across the country. Content analysis of legislative documents and media publications, surveys of students and staff, and interviews with leaders have demonstrated the inconsistency of these strategic decisions. The Road Map required that, by 2023, conditions for promoting and ensuring inclusive education and barrier-free environments be created in 60%-100% of Kazakhstani universities, including spaces inside academic buildings and dormitories, within a short period of time (March 2023-August 2025). Educational programs and curricula have not been adapted to the needs of students with special education needs (SEN); teachers lack the skills and methods to work with students with SEN, students from minority groups, and international students. 60% of universities have not created a barrier-free environment on campuses due to the high cost of elevators, tactile tiles and assistive devices. Only 1% of school graduates with disabilities enter universities, owing to the unwillingness of universities to educate people with disabilities. At the same time, universities do not adapt their educational programs and services to the needs of inclusive students; their needs are not identified, and they study under the same conditions as regular students. Accordingly, teaching staff do not have the knowledge and skills to teach inclusive students, and university lecturers misunderstand or oversimplify the social phenomena of ‘inclusion’ and ‘diversity’. The situation is more acute regarding the creation of a barrier-free architectural environment on university campuses. Recent reports indicate that these reforms have not been implemented to date and have proven controversial in practice due to the inconsistency of national research on inclusion in higher education. Widely announced reforms have not produced the expected results, leading to distortions at the local level. Inconsistent policies and contradictory legislative acts, adopted without needs assessment, specific implementation criteria, trained specialists, or indicators for achieving the reforms, are doomed to failure and to public mistrust. Based on the results of this research, recommendations have been developed: (1) to overcome inconsistencies in legislation regarding DEI in higher education; (2) to encourage initiatives in universities' inclusive environments; (3) to develop projects that will promote public awareness of DEI.
Keywords: diversity-equity-inclusion, Kazakhstani universities, reforms, legislation, accessibility
Procedia PDF Downloads 16
579 Integrating Multiple Types of Value in Natural Capital Accounting Systems: Environmental Value Functions
Authors: Pirta Palola, Richard Bailey, Lisa Wedding
Abstract:
Societies and economies worldwide fundamentally depend on natural capital. Alarmingly, natural capital assets are quickly depreciating, posing an existential challenge for humanity. The development of robust natural capital accounting systems is essential for transitioning towards sustainable economic systems and ensuring sound management of capital assets. However, the accurate, equitable and comprehensive estimation of natural capital asset stocks and their accounting values still faces multiple challenges. In particular, the representation of socio-cultural values held by groups or communities has arguably been limited, as to date, the valuation of natural capital assets has primarily been based on monetary valuation methods and assumptions of individual rationality. People relate to and value the natural environment in multiple ways, and no single valuation method can provide a sufficiently comprehensive image of the range of values associated with the environment. Indeed, calls have been made to improve the representation of multiple types of value (instrumental, intrinsic, and relational) and diverse ontological and epistemological perspectives in environmental valuation. This study addresses this need by establishing a novel valuation framework, Environmental Value Functions (EVF), that allows for the integration of multiple types of value in natural capital accounting systems. The EVF framework is based on the estimation and application of value functions, each of which describes the relationship between the value and quantity (or quality) of an ecosystem component of interest. In this framework, values are estimated in terms of change relative to the current level instead of calculating absolute values. Furthermore, EVF was developed to also support non-marginalist conceptualizations of value: it is likely that some environmental values cannot be conceptualized in terms of marginal changes. For example, ecological resilience value may, in some cases, be best understood as a binary: it either exists (1) or is lost (0). In such cases, a logistic value function may be used as the discriminator. Uncertainty in the value function parameterization can be considered through, for example, Monte Carlo sampling analysis. The use of EVF is illustrated with two conceptual examples. For the first time, EVF offers a clear framework and concrete methodology for the representation of multiple types of value in natural capital accounting systems, simultaneously enabling 1) the complementary use and integration of multiple valuation methods (monetary and non-monetary); 2) the synthesis of information from diverse knowledge systems; 3) the recognition of value incommensurability; 4) marginalist and non-marginalist value analysis. Furthermore, with this advancement, the coupling of EVF and ecosystem modeling can offer novel insights to the study of spatial-temporal dynamics in natural capital asset values. For example, value time series can be produced, allowing for the prediction and analysis of volatility, long-term trends, and temporal trade-offs. This approach can provide essential information to help guide the transition to a sustainable economy.Keywords: economics of biodiversity, environmental valuation, natural capital, value function
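As a minimal illustration of the value-function idea described in the abstract above, the sketch below defines a logistic environmental value function for an ecosystem component and propagates parameter uncertainty with Monte Carlo sampling. The parameter values and the "reef area" variable are hypothetical and are not drawn from the study.

```python
import numpy as np

def logistic_value(q, midpoint, steepness):
    """Logistic environmental value function: value rises from ~0 to ~1
    around a threshold quantity (e.g. resilience lost vs. retained)."""
    return 1.0 / (1.0 + np.exp(-steepness * (q - midpoint)))

rng = np.random.default_rng(42)
n_samples = 10_000

# Hypothetical uncertainty in the value-function parameters
midpoints = rng.normal(loc=50.0, scale=5.0, size=n_samples)        # threshold quantity
steepnesses = rng.lognormal(mean=-1.0, sigma=0.3, size=n_samples)  # curve steepness

reef_area = 47.0  # hypothetical current quantity of the ecosystem component
values = logistic_value(reef_area, midpoints, steepnesses)

print(f"mean value: {values.mean():.3f}")
print(f"95% interval: {np.percentile(values, [2.5, 97.5])}")
```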
Procedia PDF Downloads 194
578 Evaluation of Cyclic Steam Injection in Multi-Layered Heterogeneous Reservoir
Authors: Worawanna Panyakotkaew, Falan Srisuriyachai
Abstract:
Cyclic steam injection (CSI) is a thermal recovery technique performed by periodically injecting heated steam into a heavy oil reservoir. Oil viscosity is substantially reduced by means of heat transferred from the steam. Together with gas pressurization, oil recovery is greatly improved. Nevertheless, prediction of the effectiveness of the process is difficult when the reservoir contains a degree of heterogeneity. Therefore, heterogeneity, together with the reservoir properties of interest, must be evaluated prior to field implementation. In this study, a thermal reservoir simulation program is utilized. The reservoir model is first constructed as multi-layered with a coarsening-upward sequence. The highest permeability is located in the top layer, with permeability values descending in the lower layers. Steam is injected from two wells located diagonally in a quarter five-spot pattern. Heavy oil is produced by adjusting operating parameters including soaking period and steam quality. After selecting the best conditions for both parameters, yielding the highest oil recovery, the effects of the degree of heterogeneity (represented by the Lorenz coefficient), vertical permeability and permeability sequence are evaluated. Surprisingly, the simulation results show that reservoir heterogeneity yields benefits for the CSI technique. Increasing reservoir heterogeneity impoverishes the permeability distribution. High permeability contrast results in steam intruding into the upper layers. Once the temperature cools down during the back-flow period, condensed water percolates downward, resulting in high oil saturation in the top layers. Gas saturation appears at the top after a while, causing better propagation of steam in the following cycle due to the high compressibility of gas. A large steam chamber therefore covers most of the area in the upper zone. Oil recovery reaches approximately 60%, which is about 20% higher than the heterogeneous reservoir case. Vertical permeability also exhibits benefits for CSI: expansion of the steam chamber occurs within a shorter time from the upper to the lower zone. For the fining-upward permeability sequence, where permeability values are reversed from the previous case, steam does not override into the top layers due to low permeability. Propagation of the steam chamber occurs in the middle of the reservoir where permeability is high enough. The rate of oil recovery is slower compared to the coarsening-upward case due to lower permeability at the location where the steam chamber propagates. Even though the CSI technique produces oil quite slowly in early cycles, once the steam chamber is formed deep in the reservoir, heat is delivered to the formation quickly in later cycles. Since reservoir heterogeneity is unavoidable, a thorough understanding of its effect must be considered. This study shows that the CSI technique might be one of the compatible solutions for highly heterogeneous reservoirs. This competitive technique also shows a benefit in terms of heat consumption, as steam is injected periodically.
Keywords: cyclic steam injection, heterogeneity, reservoir simulation, thermal recovery
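Since the degree of heterogeneity in the abstract above is represented by the Lorenz coefficient, the following Python sketch shows one common way of computing it from a layered permeability-thickness profile. The layer values are hypothetical and are not the simulation inputs used in the study.

```python
import numpy as np

def lorenz_coefficient(permeability, thickness):
    """Lorenz coefficient of heterogeneity (0 = homogeneous, approaching 1 = highly heterogeneous).
    Twice the area between the flow-capacity vs. storage-capacity curve and the diagonal."""
    k = np.asarray(permeability, dtype=float)
    h = np.asarray(thickness, dtype=float)
    order = np.argsort(k / h)[::-1]                    # sort layers by decreasing k/h
    flow = np.cumsum((k * h)[order]) / np.sum(k * h)   # fractional flow capacity
    storage = np.cumsum(h[order]) / np.sum(h)          # fractional storage capacity
    flow = np.insert(flow, 0, 0.0)
    storage = np.insert(storage, 0, 0.0)
    area_under_curve = np.trapz(flow, storage)
    return 2.0 * (area_under_curve - 0.5)

# Hypothetical coarsening-upward model: high permeability on top, descending below
perm_md = [800.0, 400.0, 150.0, 50.0, 10.0]  # mD, top layer first
thick_m = [5.0, 5.0, 5.0, 5.0, 5.0]          # m
print(f"Lorenz coefficient: {lorenz_coefficient(perm_md, thick_m):.3f}")
```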
Procedia PDF Downloads 459
577 Using True Life Situations in a Systems Theory Perspective as Sources of Creativity: A Case Study of how to use Everyday Happenings to produce Creative Outcomes in Novel and Screenplay Writing
Authors: Rune Bjerke
Abstract:
Psychologists are inclined to see creativity as a mental and psychological process. However, creativity is also the result of cultural and social interactions. Therefore, creativity is not a product of individuals in isolation, but of social systems. Creative people get ideas from the influence of others and the immediate cultural environment – a space of knowledge, situations, and practices. Therefore, in this study we apply systems theory in practice to activate creative processes in the production of our novel and screenplay writing. We, as storytellers, actively seek to get into situations in our everyday lives, our systems, to generate ideas. Within our personal systems, we have the potential to induce situations to realise ideas for our texts, which may be accepted by our gate-keepers and can become socially validated. This is our method of writing – get into situations, get ideas into texts, and test them with family and friends in our social systems. An example of novel text produced by our method is as follows: “Is it a matter of obviousness or had I read it somewhere, that the one who increases his knowledge increases his pain? And also, the other way around, with increased pain, knowledge increases, I thought. Perhaps such a chain of effects explains why the rebel August Strindberg wrote seven plays in ten months after the divorce with Siri von Essen. Shortly after, he tried painting. Neither the seven theatre plays were shown, nor the paintings exhibited. I was standing in front of Munch's painting Women in Three Stages with chaotic mental images of myself crumpled in a church and a laughing ex-girlfriend watching my suffering. My stomach was turning at unpredictable intervals and the subsequent vomiting almost suffocated me. Love grief at its worst. Was it this pain Strindberg felt? Despite the failure of his first plays, the pain must have triggered a form of creative energy that turned pain into ideas. Suffering, thoughts, feelings, words, text, and then, the reader experience. Maybe this negative force can be transformed into something positive, I asked myself. The question eased my pain. At that moment, I forgot the damp, humid air in the Munch Museum. Is it a similar type of Strindberg-pain that could explain the recurring, depressive themes in Munch's paintings? Illness, death, love and jealousy. As a beginning art student at the master's level, I had decided to find the answer. Was it the same with Munch's pain as with Strindberg – a woman behind it? There had to be women in the case of Munch – therefore, the painting “Women in Three Stages”? Who are they, what personality types are they – the women in red, black and white dresses, from left to right?” We, the writers, use persons, situations and elements in our systems, in a systems theory perspective, to prompt creative ideas. A conceptual model is provided to advance creativity theory.
Keywords: creativity theory, systems theory, novel writing, screenplay writing, sources of creativity in social systems
Procedia PDF Downloads 121
576 Prevalence of Chlamydia Trachomatis Infection in Multiple Anatomical Sites among Patients at STIs Center, Thailand
Authors: Siwimol Phoomniyom, Pathom Karaipoom, Rossaphorn Kittyaowaman
Abstract:
Background: C. trachomatis is the most common bacterial sexually transmitted infection. Although infection with C. trachomatis can be treated with antibiotics, it is frequently asymptomatic, especially at extragenital sites. Hence, if screening tests are not performed, undetected and untreated infection remains a crucial problem for C. trachomatis, especially in Thailand, where it is less well studied. We sought to assess the prevalence of C. trachomatis infection in multiple anatomical sites among patients attending Bangrak STIs Center. Methods: We examined the laboratory results of all patients at the baseline visit from 3 January 2018 to 27 December 2019. These samples were tested by a validated in-house real-time PCR specific for the cryptic plasmid gene of C. trachomatis. The prevalence of C. trachomatis was analyzed by anatomical site, sex, and age. Urogenital samples were obtained from urethral swabs of men and cervical swabs of women. The median age of the patients was 32 years (range 13-89 years). The chi-square test in IBM SPSS Statistics version 20 was used to assess differences in the distribution of variables between groups. Results: Among 3,789 patients, the prevalence of C. trachomatis infection was highest at the rectal site (16.1%), followed by the urogenital (11.2%) and pharyngeal (3.5%) sites. Rectal and urogenital infection in men was higher than in women, with the highest prevalence of 16.6% at the rectal site. Both rectal and urogenital sites also showed statistically significant differences between sexes (P<0.001). Meanwhile, the pharyngeal C. trachomatis infection rate was higher in women than in men. Interestingly, the chlamydia prevalence was highest in the 13-19 years age group at all three sites (18.5%, urogenital; 17.7%, rectal; 6.5%, pharyngeal), with a statistically significant difference between age groups (P<0.001). Of a total of 45 C. trachomatis infections, 20.0%, 51.1%, and 6.7% were isolated from urogenital, rectal, and pharyngeal sites, respectively. In total, 75.6%, 26.7%, and 80.0% of chlamydia infections would have been missed if only urogenital, rectal, or pharyngeal screening had been performed. Conclusions: The highest source of C. trachomatis infection was the rectal site. While the highest prevalence in men was at the rectal site, that in women was at the urogenital site. The highest chlamydia prevalence was found in the adolescent age group, indicating that the pediatric population was a high-risk group. This finding also shows that a high proportion of C. trachomatis infections would be missed if only single anatomical-site screening were performed, especially at extragenital sites. Hence, extragenital screening is also required for comprehensive C. trachomatis detection.
Keywords: chlamydia trachomatis, anatomical sites, sexes, ages
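As a sketch of the chi-square comparison reported in the abstract above (differences in prevalence between sexes or age groups), the following Python snippet builds a 2x2 contingency table and runs the test. The counts are hypothetical and are not the study's raw data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical counts: rectal C. trachomatis positives/negatives by sex
#                  positive  negative
table = np.array([[150,      750],    # men
                  [ 20,      400]])   # women

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")

# Prevalence per group for context
prevalence = table[:, 0] / table.sum(axis=1)
print(f"prevalence (men, women): {prevalence.round(3)}")
```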
Procedia PDF Downloads 70
575 Inherent Difficulties in Countering Islamophobia
Authors: Imbesat Daudi
Abstract:
Islamophobia, which is a billion-dollar industry, is widespread, especially in the United States, Europe, India, Israel, and countries that have Muslim minorities at odds with their governmental policies. Hatred of Islam in the West did not evolve spontaneously; it was methodically created. Islamophobia's current format has been designed to spread on its own, find a space in the Western psyche, and resist its eradication. Hatred has been sustained by neoconservative ideologues and their allies, which are supported by the mainstream media. Social scientists have evaluated how ideas spread, why any idea can go viral, and where new ideas find space in our brains. This was possible because of the advances in the computational power of software and computers. Spreading of ideas, including Islamophobia, follows a sine curve; it has three phases: An initial exploratory phase with a long lag period, an explosive phase if ideas go viral, and the final phase when ideas find space in the human psyche. In the initial phase, the ideas are quickly examined in a center in the prefrontal lobe. When it is deemed relevant, it is sent for evaluation to another center of the prefrontal lobe; there, it is critically examined. Once it takes a final shape, the idea is sent as a final product to a center in the occipital lobe. This center cannot critically evaluate ideas; it can only defend them from its critics. Counterarguments, no matter how scientific, are automatically rejected. Therefore, arguments that could be highly effective in the early phases are counterproductive once they are stored in the occipital lobe. Anti-Islamophobic intellectuals have done a very good job of countering Islamophobic arguments. However, they have not been as effective as neoconservative ideologues who have promoted anti-Muslim rhetoric that was based on half-truths, misinformation, or outright lies. The failure is partly due to the support pro-war activists receive from the mainstream media, state institutions, mega-corporations engaged in violent conflicts, and think tanks that provide Islamophobic arguments. However, there are also scientific reasons why anti-Islamophobic thinkers have been less effective. There are different dynamics of spreading ideas once they are stored in the occipital lobe. The human brain is incapable of evaluating further once it accepts ideas as its own; therefore, a different strategy is required to be effective. This paper examines 1) why anti-Islamophobic intellectuals have failed in changing the minds of non-Muslims and 2) the steps of countering hatred. Simply put, a new strategy is needed that can effectively counteract hatred of Islam and Muslims. Islamophobia is a disease that requires strong measures. Fighting hatred is always a challenge, but if we understand why Islamophobia is taking root in the twenty-first century, one can succeed in challenging Islamophobic arguments. That will need a coordinated effort of Intellectuals, writers and the media.Keywords: islamophobia, Islam and violence, anti-islamophobia, demonization of Islam
Procedia PDF Downloads 48
574 Plotting of an Ideal Logic versus Resource Outflow Graph through Response Analysis on a Strategic Management Case Study Based Questionnaire
Authors: Vinay A. Sharma, Shiva Prasad H. C.
Abstract:
The initial stages of any project are often observed to be in a mixed set of conditions. Setting up the project is a tough task, but taking the initial decisions is rather not complex, as some of the critical factors are yet to be introduced into the scenario. These simple initial decisions potentially shape the timeline and subsequent events that might later be plotted on it. Proceeding towards the solution for a problem is the primary objective in the initial stages. The optimization in the solutions can come later, and hence, the resources deployed towards attaining the solution are higher than what they would have been in the optimized versions. A ‘logic’ that counters the problem is essentially the core of the desired solution. Thus, if the problem is solved, the deployment of resources has led to the required logic being attained. As the project proceeds along, the individuals working on the project face fresh challenges as a team and are better accustomed to their surroundings. The developed, optimized solutions are then considered for implementation, as the individuals are now experienced, and know better of the consequences and causes of possible failure, and thus integrate the adequate tolerances wherever required. Furthermore, as the team graduates in terms of strength, acquires prodigious knowledge, and begins its efficient transfer, the individuals in charge of the project along with the managers focus more on the optimized solutions rather than the traditional ones to minimize the required resources. Hence, as time progresses, the authorities prioritize attainment of the required logic, at a lower amount of dedicated resources. For empirical analysis of the stated theory, leaders and key figures in organizations are surveyed for their ideas on appropriate logic required for tackling a problem. Key-pointers spotted in successfully implemented solutions are noted from the analysis of the responses and a metric for measuring logic is developed. A graph is plotted with the quantifiable logic on the Y-axis, and the dedicated resources for the solutions to various problems on the X-axis. The dedicated resources are plotted over time, and hence the X-axis is also a measure of time. In the initial stages of the project, the graph is rather linear, as the required logic will be attained, but the consumed resources are also high. With time, the authorities begin focusing on optimized solutions, since the logic attained through them is higher, but the resources deployed are comparatively lower. Hence, the difference between consecutive plotted ‘resources’ reduces and as a result, the slope of the graph gradually increases. On an overview, the graph takes a parabolic shape (beginning on the origin), as with each resource investment, ideally, the difference keeps on decreasing, and the logic attained through the solution keeps increasing. Even if the resource investment is higher, the managers and authorities, ideally make sure that the investment is being made on a proportionally high logic for a larger problem, that is, ideally the slope of the graph increases with the plotting of each point.Keywords: decision-making, leadership, logic, strategic management
Procedia PDF Downloads 110
573 Abilitest Battery: Presentation of Tests and Psychometric Properties
Authors: Sylwia Sumińska, Łukasz Kapica, Grzegorz Szczepański
Abstract:
Introduction: Cognitive skills are a crucial part of everyday functioning. Cognitive skills include perception, attention, language, memory, executive functions, and higher cognitive skills. With the aging of societies, there is an increasing percentage of people whose cognitive skills decline. Cognitive skills also affect work performance: the appropriate diagnosis of a worker's cognitive skills reduces the risk of errors and accidents at work, which is also important for senior workers. The study aimed to prepare new cognitive tests for adults aged 20-60 and to assess the psychometric properties of the tests. The project responds to the need for reliable and accurate methods of assessing cognitive performance. Computer tests were developed to assess psychomotor performance, attention, and working memory. Method: Two hundred eighty people aged 20-60 will participate in the study in 4 age groups. Inclusion criteria for the study were: no subjective cognitive impairment, and no history of severe head injuries, chronic diseases, or psychiatric and neurological diseases. The research will be conducted from February to June 2022. Cognitive tests: 1) Measurement of psychomotor performance: Reaction time, Reaction time with selective attention component; 2) Measurement of sustained attention: Visual search (dots), Visual search (numbers); 3) Measurement of working memory: Remembering words, Remembering letters. To assess validity and reliability, subjects will perform the Vienna Test System, i.e., “Reaction Test” (reaction time), “Signal Detection” (sustained attention), “Corsi Block-Tapping Test” (working memory), as well as the Perception and Attention Test (TUS), the Colour Trails Test (CTT), and Digit Span – a subtest from The Wechsler Adult Intelligence Scale. Eighty people will be invited to a session after three months, aimed at assessing consistency over time. Results: As the research is ongoing, the detailed results from 280 people will be shown at the conference separately for each age group. The results of the correlation analysis with the Vienna Test System will be demonstrated as well.
Keywords: aging, attention, cognitive skills, cognitive tests, psychomotor performance, working memory
Procedia PDF Downloads 106
572 A Comparative Study of the Tribological Behavior of Bilayer Coatings for Machine Protection
Authors: Cristina Diaz, Lucia Perez-Gandarillas, Gonzalo Garcia-Fuentes, Simone Visigalli, Roberto Canziani, Giuseppe Di Florio, Paolo Gronchi
Abstract:
During their lifetime, industrial machines are often subjected to chemical, mechanical and thermal extreme conditions. In some cases, the loss of efficiency comes from the degradation of the surface as a result of its exposition to abrasive environments that can cause wear. This is a common problem to be solved in industries of diverse nature such as food, paper or concrete industries, among others. For this reason, a good selection of the material is of high importance. In the machine design context, stainless steels such as AISI 304 and 316 are widely used. However, the severity of the external conditions can require additional protection for the steel and sometimes coating solutions are demanded in order to extend the lifespan of these materials. Therefore, the development of effective coatings with high wear resistance is of utmost technological relevance. In this research, bilayer coatings made of Titanium-Tantalum, Titanium-Niobium, Titanium-Hafnium, and Titanium-Zirconium have been developed using magnetron sputtering configuration by PVD (Physical Vapor Deposition) technology. Their tribological behavior has been measured and evaluated under different environmental conditions. Two kinds of steels were used as substrates: AISI 304, AISI 316. For the comparison with these materials, titanium alloy substrate was also employed. Regarding the characterization, wear rate and friction coefficient were evaluated by a tribo-tester, using a pin-on-ball configuration with different lubricants such as tomato sauce, wine, olive oil, wet compost, a mix of sand and concrete with water and NaCl to approximate the results to real extreme conditions. In addition, topographical images of the wear tracks were obtained in order to get more insight of the wear behavior and scanning electron microscope (SEM) images were taken to evaluate the adhesion and quality of the coating. The characterization was completed with the measurement of nanoindentation hardness and elastic modulus. Concerning the results, thicknesses of the samples varied from 100 nm (Ti-Zr layer) to 1.4 µm (Ti-Hf layer) and SEM images confirmed that the addition of the Ti layer improved the adhesion of the coatings. Moreover, results have pointed out that these coatings have increased the wear resistance in comparison with the original substrates under environments of different severity. Furthermore, nanoindentation hardness results showed an improvement of the elastic strain to failure and a high modulus of elasticity (approximately 200 GPa). As a conclusion, Ti-Ta, Ti-Zr, Ti-Nb, and Ti-Hf are very promising and effective coatings in terms of tribological behavior, improving considerably the wear resistance and friction coefficient of typically used machine materials.Keywords: coating, stainless steel, tribology, wear
Procedia PDF Downloads 151
571 Boussinesq Model for Dam-Break Flow Analysis
Authors: Najibullah M, Soumendra Nath Kuiry
Abstract:
Dams and reservoirs are valued for their estimable contributions to irrigation, water supply, flood control, electricity generation, etc., which improve the prosperity and wealth of society across the world. At the same time, a dam breach can cause a devastating flood that threatens human lives and property. Failures of large dams remain, fortunately, very seldom events. Nevertheless, a number of occurrences have been recorded in the world, corresponding on average to one to two failures worldwide every year. Some of those accidents have caused catastrophic consequences. It is therefore essential to predict the dam-break flow for emergency planning and preparedness, as it poses a high risk to life and property. To mitigate the adverse impact of a dam break, modeling is necessary to gain a good understanding of the temporal and spatial evolution of dam-break floods. This study mainly deals with one-dimensional (1D) dam-break modeling. Less commonly used in the hydraulic research community, another possible option for modeling rapidly varied dam-break flows is the extended Boussinesq equations (BEs), which can describe the dynamics of short waves with reasonable accuracy. Unlike the Shallow Water Equations (SWEs), the BEs take into account wave dispersion and the non-hydrostatic pressure distribution. To capture the dam-break oscillations accurately, a numerical scheme of at least fourth-order accuracy is needed to discretize the third-order dispersion terms present in the extended BEs. The scope of this work is therefore to develop a 1D Boussinesq model for dam-break flow analysis that is fourth-order accurate in both space and time, using a combined finite-volume / finite-difference scheme. The spatial discretization of the flux and dispersion terms is achieved through a combination of finite-volume and finite-difference approximations: the flux term is solved using a finite-volume discretization, whereas the bed source and dispersion terms are discretized using a centered finite-difference scheme. Time integration is achieved in two stages, namely a third-order Adams-Bashforth predictor stage and a fourth-order Adams-Moulton corrector stage. The 1D Boussinesq model is implemented in PYTHON 2.7.5. The performance of the developed model is evaluated by comparison with the volume-of-fluid (VOF) based commercial model ANSYS-CFX. The developed model is used to analyze the risk of cascading dam failures similar to the Panshet dam failure of 1961 that took place in Pune, India. Moreover, this model can be used to predict wave overtopping more accurately than shallow water models for designing coastal protection structures.
Keywords: Boussinesq equation, coastal protection, dam-break flow, one-dimensional model
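The two-stage time integration described above (a third-order Adams-Bashforth predictor followed by a fourth-order Adams-Moulton corrector) can be sketched in Python as below. This is a minimal illustration on a scalar ODE with an assumed RK4 bootstrap for the starting values; it is not the authors' Boussinesq solver.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """Classical RK4 step used to bootstrap the multistep scheme."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def abm_predictor_corrector(f, t0, y0, h, n_steps):
    """Third-order Adams-Bashforth predictor with fourth-order Adams-Moulton corrector."""
    t, y = [t0], [np.asarray(y0, dtype=float)]
    for _ in range(2):  # bootstrap the first two steps with RK4
        y.append(rk4_step(f, t[-1], y[-1], h))
        t.append(t[-1] + h)
    fs = [f(ti, yi) for ti, yi in zip(t, y)]
    for n in range(2, n_steps):
        # Adams-Bashforth 3 predictor
        yp = y[n] + h / 12 * (23 * fs[n] - 16 * fs[n - 1] + 5 * fs[n - 2])
        # Adams-Moulton 4 corrector, evaluated once with the predicted state
        fp = f(t[n] + h, yp)
        yc = y[n] + h / 24 * (9 * fp + 19 * fs[n] - 5 * fs[n - 1] + fs[n - 2])
        y.append(yc)
        t.append(t[n] + h)
        fs.append(f(t[-1], yc))
    return np.array(t), np.array(y)

# Example on dy/dt = -2y (exact solution exp(-2t)); not the Boussinesq system itself
t, y = abm_predictor_corrector(lambda t, y: -2.0 * y, 0.0, 1.0, 0.01, 500)
print(y[-1], np.exp(-2.0 * t[-1]))
```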
Procedia PDF Downloads 232
570 Predicting OpenStreetMap Coverage by Means of Remote Sensing: The Case of Haiti
Authors: Ran Goldblatt, Nicholas Jones, Jennifer Mannix, Brad Bottoms
Abstract:
Accurate, complete, and up-to-date geospatial information is the foundation of successful disaster management. When the 2010 Haiti Earthquake struck, accurate and timely information on the distribution of critical infrastructure was essential for the disaster response community to carry out effective search and rescue operations. Existing geospatial datasets such as Google Maps did not have comprehensive coverage of these features. In the days following the earthquake, many organizations released high-resolution satellite imagery, catalyzing a worldwide effort to map Haiti and support the recovery operations. Of these organizations, OpenStreetMap (OSM), a collaborative project to create a free editable map of the world, used the imagery to support volunteers in digitizing roads, buildings, and other features, creating the most detailed map of Haiti in existence in just a few weeks. However, large portions of the island are still not fully covered by OSM. There is an increasing need for a tool to automatically identify which areas in Haiti, as well as in other countries vulnerable to disasters, are not fully mapped. The objective of this project is to leverage different types of remote sensing measurements, together with machine learning approaches, in order to identify geographical areas where OSM coverage of building footprints is incomplete. Several remote sensing measures and derived products were assessed as potential predictors of OSM building footprint coverage, including: intensity of light emitted at night (based on VIIRS measurements), spectral indices derived from the Sentinel-2 satellite (normalized difference vegetation index (NDVI), normalized difference built-up index (NDBI), soil-adjusted vegetation index (SAVI), urban index (UI)), surface texture (based on Sentinel-1 SAR measurements), elevation and slope. Additional remote sensing derived products, such as Hansen Global Forest Change, DLR`s Global Urban Footprint (GUF), and the World Settlement Footprint (WSF), were also evaluated as predictors, as well as the OSM street and road network (including junctions). A supervised classification with a random forest classifier resulted in the prediction of 89% of the variation of OSM building footprint area in a given cell. These predictions allowed for the identification of cells that are predicted to be covered but are actually not mapped yet. With these results, this methodology could be adapted to any location to assist with preparing for future disastrous events and to ensure that essential geospatial information is available to support the response and recovery efforts during and following major disasters.
Keywords: disaster management, Haiti, machine learning, OpenStreetMap, remote sensing
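As a sketch of the supervised learning step described above (a random forest relating remote-sensing predictors to OSM building-footprint coverage per grid cell), the snippet below fits scikit-learn's random forest on a hypothetical feature table. The feature names mirror those listed in the abstract, but the data, and the choice of a regressor to predict footprint area, are illustrative assumptions rather than the authors' exact setup.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cells = 1000

# Hypothetical per-cell predictors mirroring those named in the abstract
X = pd.DataFrame({
    "ntl_viirs": rng.gamma(2.0, 5.0, n_cells),      # night-time light intensity
    "ndvi": rng.uniform(-0.1, 0.9, n_cells),
    "ndbi": rng.uniform(-0.5, 0.5, n_cells),
    "sar_texture": rng.normal(0.0, 1.0, n_cells),
    "elevation": rng.uniform(0.0, 800.0, n_cells),
    "slope": rng.uniform(0.0, 30.0, n_cells),
    "osm_road_junctions": rng.poisson(5, n_cells),
})
# Hypothetical target: OSM building footprint area per cell (m^2)
y = 2000 * X["ntl_viirs"] + 5000 * X["ndbi"].clip(lower=0) + rng.normal(0, 3000, n_cells)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print("R^2 on held-out cells:", round(model.score(X_test, y_test), 3))

# Cells predicted to be built-up but with little mapped footprint flag likely OSM gaps
predicted = model.predict(X_test)
gaps = X_test[(predicted > 5000) & (y_test.values < 1000)]
print(f"{len(gaps)} cells flagged as potentially under-mapped")
```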
Procedia PDF Downloads 125
569 State Violence: The Brazilian Amnesty Law and the Fight Against Impunity
Authors: Flavia Kroetz
Abstract:
From 1964 to 1985, Brazil was ruled by a dictatorial regime that, under the discourse of fight against terrorism and subversion, implemented cruel and atrocious practices against anyone who opposed the State ideology. At the same time, several Latin American countries faced dictatorial periods and experienced State repression through apparatuses of violence institutionalized in the very governmental structure. Despite the correspondence between repressive methods adopted by authoritarian regimes in States such as Argentina, Chile, El Salvador, Peru and Uruguay, the mechanisms of democratic transition adopted with the end of each dictatorship were significantly different. While some States have found ways to deal with past atrocities through serious and transparent investigations of the crimes perpetrated in the name of repression, in others, as in Brazil, a culture of impunity remains rooted in society, manifesting itself in the widespread disbelief of the population in governmental and democratic institutions. While Argentina, Chile, Peru and Uruguay are convincing examples of the possibility and importance of the prosecution of crimes such as torture, forced disappearance and murder committed by the State, El Salvador demonstrates the complete failure to punish or at least remove from power the perpetrators of serious crimes against civilians and political opponents. In a scenario of widespread violations of human rights, State violence becomes entrenched within society as a daily and even necessary practice. In Brazil, a lack of political and judicial will withstands the impunity of those who, during the military regime, committed serious crimes against human rights under the authority of the State. If the reproduction of violence is a direct consequence of the culture of denial and the rejection of everyone considered to be different, ‘the other’, then the adoption of transitional mechanisms that underpin the historical and political contexts of the time seems essential. Such mechanisms must strengthen democracy through the effective implementation of the rights to memory and to truth, the right to justice and reparations for victims and their families, as well as institutional changes in order to remove from power those who, when in power, could not distinguish between legality and authoritarianism. Against this background, this research analyses the importance of transitional justice for the restoration of democracy, considering the adoption of amnesty laws as a strategy to preclude criminal prosecution of offenses committed during dictatorial regimes. The study investigates the scope of Law No 6.683/79, the Brazilian amnesty law, which, according to a 2010 decision of the Brazilian Constitutional Supreme Court, granted amnesty to those responsible for political crimes and related crimes, committed between September 2, 1961 and August 15, 1979. Was the purpose of this Law to grant amnesty to violent crimes committed by the State? If so, is it possible to recognize the legitimacy of a Congress composed of indirectly elected politicians controlled by the dictatorship?Keywords: amnesty law, criminal justice, dictatorship, state violence
Procedia PDF Downloads 439
568 Nudging the Criminal Justice System into Listening to Crime Victims in Plea Agreements
Authors: Dana Pugach, Michal Tamir
Abstract:
Most criminal cases end with a plea agreement, an issue whose many aspects have been discussed extensively in legal literature. One important feature, however, has gained little notice, and that is crime victims’ place in plea agreements following the federal Crime Victims Rights Act of 2004. This law has provided victims some meaningful and potentially revolutionary rights, including the right to be heard in the proceeding and a right to appeal against a decision made while ignoring the victim’s rights. While victims’ rights literature has always emphasized the importance of such right, references to this provision in the general literature about plea agreements are sparse, if existing at all. Furthermore, there are a few cases only mentioning this right. This article purports to bridge between these two bodies of legal thinking – the vast literature concerning plea agreements and victims’ rights research– by using behavioral economics. The article will, firstly, trace the possible structural reasons for the failure of this right to be materialized. Relevant incentives of all actors involved will be identified as well as their inherent consequential processes that lead to the victims’ rights malfunction. Secondly, the article will use nudge theory in order to suggest solutions that will enhance incentives for the repeat players in the system (prosecution, judges, defense attorneys) and lead to the strengthening of weaker group’s interests – the crime victims. Behavioral psychology literature recognizes that the framework in which an individual confronts a decision can significantly influence his decision. Richard Thaler and Cass Sunstein developed the idea of ‘choice architecture’ - ‘the context in which people make decisions’ - which can be manipulated to make particular decisions more likely. Choice architectures can be changed by adjusting ‘nudges,’ influential factors that help shape human behavior, without negating their free choice. The nudges require decision makers to make choices instead of providing a familiar default option. In accordance with this theory, we suggest a rule, whereby a judge should inquire the victim’s view prior to accepting the plea. This suggestion leaves the judge’s discretion intact; while at the same time nudges her not to go directly to the default decision, i.e. automatically accepting the plea. Creating nudges that force actors to make choices is particularly significant when an actor intends to deviate from routine behaviors but experiences significant time constraints, as in the case of judges and plea bargains. The article finally recognizes some far reaching possible results of the suggestion. These include meaningful changes to the earlier stages of criminal process even before reaching court, in line with the current criticism of the plea agreements machinery.Keywords: plea agreements, victims' rights, nudge theory, criminal justice
Procedia PDF Downloads 323
567 Factors Associated with Cytomegalovirus Infection: A Prospective Single Centre Study
Authors: Marko Jankovic, Aleksandra Knezevic, Maja Cupic, Dragana Vujic, Zeljko Zecevic, Borko Gobeljic, Marija Simic, Tanja Jovanovic
Abstract:
The human cytomegalovirus (CMV) is a notorious pathogen in the pediatric transplant setting. Although studies on factors associated with CMV infection abound, the roles of age, gender, allogeneic hematopoietic stem cell transplantation (alloHSCT) modality, and underlying disease with regard to CMV infection and viral load in children are poorly explored. We examined the significance of various factors related to the risk of CMV infection and viral load in Serbian children and adolescents undergoing alloHSCT. This was a prospective single-centre study of thirty-two pediatric patients who received alloHSCT for various malignant and non-malignant disorders. Screening for active viral infection was performed by regular weekly monitoring. The real-time PCR method was used for CMV DNA detection and quantitation. Statistical analysis was performed using IBM SPSS Statistics v20 software. The chi-square test was used to evaluate categorical variables. Comparison between scalar and nominal data was done by the Wilcoxon-Mann-Whitney test. Pearson correlation was applied to study the association between patient age and viral load. CMV was detected in 23 (71.9%) patients. Infection occurred significantly more often (p=0.015) in patients with haploidentical donors. The opposite was noted for matched sibling grafts (p=0.006). The viral load was higher in females (p=0.041) and in children with malignant diseases in the aftermath of alloHSCT (p=0.019). There was no significant relationship between the viral infection dynamics and overt medical consequences. This is the first study of risk factors for CMV infection in Serbian pediatric alloHSCT patients. Transplanted patients presented with a high incidence of CMV viremia. The HLA compatibility of the donated graft is associated with the frequency of CMV-positive events. Age, gender, underlying disease, and medically relevant events were not associated with occurrences of viremia. Notably, substantial viral burdens were evidenced in females and in patients with neoplastic diseases. Studies comprising larger populations are clearly needed to scrutinize the current results.
Keywords: allogeneic hematopoietic stem cell transplantation, children, cytomegalovirus, risk factors, viral load
Procedia PDF Downloads 162
566 Polypropylene Matrix Enriched With Silver Nanoparticles From Banana Peel Extract For Antimicrobial Control Of E. coli and S. epidermidis To Maintain Fresh Food
Authors: Michail Milas, Aikaterini Dafni Tegiou, Nickolas Rigopoulos, Eustathios Giaouris, Zaharias Loannou
Abstract:
Nanotechnology, a relatively new scientific field, addresses the manipulation of nanoscale materials and devices, which are governed by unique properties, and is applied in a wide range of industries, including food packaging. The incorporation of nanoparticles into polymer matrices used for food packaging is a field that is highly researched today. One such combination is silver nanoparticles with polypropylene. In the present study, the synthesis of the silver nanoparticles was carried out by a natural method. In particular, a ripe banana peel extract (BPE) was used. This method is superior to others as it stands out for its environmental friendliness, high efficiency and low cost. Specifically, a 1.75 mM silver nitrate (AgNO₃) solution was used, with a BPE concentration of 1.7% v/v, an incubation period of 48 hours at 70°C and a pH of 4.3; after its preparation, the polypropylene films were soaked in it. For the PP films, random PP spheres were melted at 170-190°C into molds with a 0.8 cm diameter. This polymer was chosen as it is suitable for plastic parts and reusable plastic containers of various types that are intended to come into contact with food without compromising its quality and safety. The antimicrobial test against Escherichia coli DFSNB1 and Staphylococcus epidermidis DFSNB4 was performed on the films. The films with silver nanoparticles showed at least a 100-fold reduction compared to those without silver nanoparticles, for both strains. The limit of detection is the lower limit of the vertical error lines in the presence of nanoparticles, which is 3.11. The main reasons that led to the adsorption of nanoparticles are the porous nature of polypropylene and the adsorption capacity of nanoparticles on the surface of the films due to hydrophobic-hydrophilic forces. The most significant parameters that contributed to the results of the experiment include the following: the stage of ripening of the banana during the preparation of the plant extract, the temperature and residence time of the nanoparticle solution in the oven, the residence time of the polypropylene films in the nanoparticle solution, the number of nanoparticles inoculated on the films and, finally, the time the films stayed in the refrigerator so that they could dry and be ready for antimicrobial testing.
Keywords: antimicrobial control, banana peel extract, E. coli, natural synthesis, microbe, plant extract, polypropylene films, S. epidermidis, silver nano, random PP
Procedia PDF Downloads 176
565 Accounting for Downtime Effects in Resilience-Based Highway Network Restoration Scheduling
Authors: Zhenyu Zhang, Hsi-Hsien Wei
Abstract:
Highway networks play a vital role in post-disaster recovery for disaster-damaged areas. Damaged bridges in such networks can disrupt recovery activities by impeding the transportation of people, cargo, and reconstruction resources. Therefore, rapid restoration of damaged bridges is of paramount importance to long-term disaster recovery. In the post-disaster recovery phase, the key to restoration scheduling for a highway network is the prioritization of bridge-repair tasks. Resilience is widely used as a measure of the ability with which a network can recover to its pre-disaster level of functionality. In practice, highways will be temporarily blocked during the downtime of bridge restoration, leading to a decrease in highway-network functionality. Failing to take downtime effects into account can lead to overestimation of network resilience. Additionally, post-disaster recovery of highway networks is generally divided into emergency bridge repair (EBR) in the response phase and long-term bridge repair (LBR) in the recovery phase, and EBR and LBR differ in terms of restoration objectives, restoration duration, budget, etc. Distinguishing these two phases is important to precisely quantify highway-network resilience and to generate suitable restoration schedules for highway networks in the recovery phase. To address the above issues, this study proposes a novel resilience quantification method for the optimization of long-term bridge repair schedules (LBRS), taking into account the impact of EBR activities and restoration downtime on a highway network's functionality. A time-dependent integer program with recursive functions is formulated for optimally scheduling LBR activities. Moreover, since uncertainty always exists in the LBRS problem, this paper extends the optimization model from the deterministic case to the stochastic case. A hybrid genetic algorithm that integrates a heuristic approach into a traditional genetic algorithm to accelerate the evolution process is developed. The proposed methods are tested using data from the 2008 Wenchuan earthquake, based on a regional highway network in Sichuan, China, consisting of 168 highway bridges on 36 highways connecting 25 cities/towns. The results show that, in this case, neglecting the bridge restoration downtime can lead to approximately 15% overestimation of highway network resilience. Moreover, accounting for the impact of EBR on network functionality can help to generate a more specific and reasonable LBRS. The theoretical and practical values are as follows. First, the proposed network recovery curve contributes to a comprehensive quantification of highway network resilience by accounting for the impact of both restoration downtime and EBR activities on the recovery curves. Moreover, this study can improve highway network resilience from the organizational dimension by providing bridge managers with optimal LBR strategies.
Keywords: disaster management, highway network, long-term bridge repair schedule, resilience, restoration downtime
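As a simple illustration of the resilience measure discussed above (mean network functionality over the horizon, with functionality dipping further during restoration downtime), the following Python sketch evaluates candidate repair orders for a few damaged bridges. The functionality-loss weights, durations, and the single-crew, sequential-repair assumption are hypothetical simplifications, not the paper's integer-programming model or genetic algorithm.

```python
from itertools import permutations

# Hypothetical damaged bridges: functionality loss while damaged, repair duration (days),
# and the larger loss incurred while the bridge is closed for repair (downtime effect).
bridges = {
    "B1": {"loss_damaged": 0.10, "repair_days": 20, "loss_downtime": 0.15},
    "B2": {"loss_damaged": 0.25, "repair_days": 35, "loss_downtime": 0.30},
    "B3": {"loss_damaged": 0.05, "repair_days": 10, "loss_downtime": 0.08},
}

def resilience(schedule, horizon=120):
    """Resilience = mean network functionality over the planning horizon,
    assuming a single crew repairs bridges sequentially in the given order."""
    start, windows = 0, {}
    for b in schedule:
        windows[b] = (start, start + bridges[b]["repair_days"])
        start += bridges[b]["repair_days"]
    total = 0.0
    for day in range(horizon):
        q = 1.0
        for b, info in bridges.items():
            s, e = windows[b]
            if day >= e:
                continue  # already repaired: no loss
            # downtime loss applies while under repair, damaged loss before repair starts
            q -= info["loss_downtime"] if s <= day else info["loss_damaged"]
        total += max(q, 0.0)
    return total / horizon

best = max(permutations(bridges), key=resilience)
print("best repair order:", best, "resilience:", round(resilience(best), 3))
```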
Procedia PDF Downloads 151
564 Screening Tools and Its Accuracy for Common Soccer Injuries: A Systematic Review
Authors: R. Christopher, C. Brandt, N. Damons
Abstract:
Background: The sequence of prevention model states that through constant assessment of injury, injury mechanisms and risk factors are identified, highlighting that the collection and recording of data is a core approach to preventing injuries. Several screening tools are available for use in the clinical setting. These screening techniques only recently received research attention; hence, data regarding their applicability, validity, and reliability are scarce, inconsistent, and controversial. Several systematic reviews related to common soccer injuries have been conducted; however, none of them addressed the screening tools for common soccer injuries. Objectives: The purpose of this study was to conduct a review of screening tools and their accuracy for common injuries in soccer. Methods: A systematic scoping review was performed based on the Joanna Briggs Institute procedure for conducting systematic reviews. Databases such as SPORT Discus, Cinahl, Medline, Science Direct, PubMed, and grey literature were used to access suitable studies. Some of the key search terms included: injury screening, screening, screening tool accuracy, injury prevalence, injury prediction, accuracy, validity, specificity, reliability, sensitivity. All types of English studies dating back to the year 2000 were included. Two blind independent reviewers selected and appraised articles on a 9-point scale for inclusion as well as for risk of bias with the ACROBAT-NRSI tool. Data were extracted and summarized in tables. Plot data analysis was done, and sensitivity and specificity were analyzed with their respective 95% confidence intervals. The I² statistic was used to determine the proportion of variation across studies. Results: The initial search yielded 95 studies, of which 21 were duplicates and 54 were excluded. A total of 10 observational studies were included for the analysis: 3 studies were analysed quantitatively, while the remaining 7 were analysed qualitatively. Seven studies were graded as low and three studies as high risk of bias. Only studies of high methodological quality (score > 9) were included for analysis. The pooled studies investigated tools such as the Functional Movement Screening (FMS™), the Landing Error Scoring System (LESS), the Tuck Jump Assessment, the Soccer Injury Movement Screening (SIMS), and the conventional hamstrings to quadriceps ratio. The accuracy of the screening tools showed high reliability, sensitivity and specificity (calculated as ICC 0.68, 95% CI: 0.52-0.84; and 0.64, 95% CI: 0.61-0.66, respectively; I² = 13.2%, P=0.316). Conclusion: Based on the pooled results from the included studies, the FMS™ has good inter-rater and intra-rater reliability. The FMS™ is a screening tool capable of screening for common soccer injuries, and individual FMS™ scores are a better determinant of performance in comparison with the overall FMS™ score. Although meta-analysis could not be done for all the included screening tools, qualitative analysis also indicated good sensitivity and specificity of the individual tools. Higher levels of evidence are, however, needed for implementation in evidence-based practice.
Keywords: accuracy, screening tools, sensitivity, soccer injuries, specificity
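To illustrate the pooling step behind the figures reported in the abstract above (a fixed-effect weighted mean with its 95% CI and the I² heterogeneity statistic), the sketch below combines a handful of study-level estimates. The effect sizes and standard errors are hypothetical, not the reviewed studies' data.

```python
import numpy as np
from scipy import stats

# Hypothetical study-level reliability estimates and their standard errors
effects = np.array([0.62, 0.71, 0.66, 0.58, 0.69])
se = np.array([0.06, 0.08, 0.05, 0.07, 0.06])

# Fixed-effect (inverse-variance) pooling
w = 1.0 / se**2
pooled = np.sum(w * effects) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * pooled_se

# Cochran's Q and I^2: proportion of variation across studies beyond chance
q = np.sum(w * (effects - pooled) ** 2)
dof = len(effects) - 1
i2 = max(0.0, (q - dof) / q) * 100.0
p_het = stats.chi2.sf(q, dof)

print(f"pooled = {pooled:.2f}, 95% CI = [{ci[0]:.2f}, {ci[1]:.2f}]")
print(f"I^2 = {i2:.1f}%, heterogeneity p = {p_het:.3f}")
```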
Procedia PDF Downloads 180
563 Remote Sensing Application in Environmental Researches: Case Study of Iran Mangrove Forests Quantitative Assessment
Authors: Neda Orak, Mostafa Zarei
Abstract:
Environmental assessment is an important step in environmental management, and various methods and techniques have been produced and implemented for it. Remote sensing (RS) is widely used in many scientific and research fields such as geology, cartography, geography, agriculture, forestry, land use planning, environment, etc. It can reveal cyclical changes in earth surface objects, and it can delineate the limits of earth phenomena on the basis of records of changes and deviations in electromagnetic reflectance. This research assessed mangrove forests by RS techniques. The aim was a quantitative analysis of the mangrove forests in the Basatin and Bidkhoon estuaries, carried out with Landsat satellite images from 1975-2013 matched to ground control points. These mangroves are the last distribution of this ecosystem in the northern hemisphere, so the work can provide a good background for better management of this important ecosystem. Landsat has provided researchers with valuable images for earth change detection. This research used the MSS, TM, ETM+, and OLI sensors from 1975, 1990, 2000, and 2003-2013. Changes were studied after essential corrections such as error fixing, band combination and georeferencing, with the 2012 images as the base images, using the maximum likelihood method and the IPVI index. Supervised classification was applied. A 2004 Google Earth image and ground points collected by GPS (2010-2012) were used to verify the changes obtained from the satellite images. Results showed that the mangrove area in Bidkhoon in 2012 was 1119072 m² by GPS, 1231200 m² by maximum likelihood supervised classification, and 1317600 m² by IPVI. The Basatin areas are, respectively, 466644 m², 88200 m², and 63000 m². The final results show that the forests have declined naturally; in Basatin, the decline is due to human activities. The loss was offset by planting over many years, although the trend has been declining again in recent years. Thus, satellite images have a high ability to estimate all such environmental processes. This research showed a high correlation between the images and indices such as IPVI and NDVI and the ground control points.
Keywords: IPVI index, Landsat sensor, maximum likelihood supervised classification, Nayband National Park
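As a sketch of the vegetation indices used above, the snippet below computes NDVI and IPVI from red and near-infrared reflectance arrays and thresholds IPVI to flag vegetated (mangrove) pixels. The band values and the 0.6 threshold are hypothetical, not the study's calibrated values.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ipvi(nir, red):
    """Infrared Percentage Vegetation Index: NIR / (NIR + Red), equal to (NDVI + 1) / 2."""
    return nir / (nir + red)

# Hypothetical reflectance values for a small 2x3 patch (Landsat red and NIR bands)
red = np.array([[0.08, 0.10, 0.30],
                [0.07, 0.25, 0.28]])
nir = np.array([[0.45, 0.40, 0.32],
                [0.50, 0.27, 0.30]])

ndvi_img = ndvi(nir, red)
ipvi_img = ipvi(nir, red)
mangrove_mask = ipvi_img > 0.6           # hypothetical threshold for dense vegetation
area_m2 = mangrove_mask.sum() * 30 * 30  # Landsat pixels are 30 m x 30 m

print(np.round(ndvi_img, 2))
print(np.round(ipvi_img, 2))
print("estimated vegetated area:", area_m2, "m^2")
```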
Procedia PDF Downloads 294
562 Detecting Tomato Flowers in Greenhouses Using Computer Vision
Authors: Dor Oppenheim, Yael Edan, Guy Shani
Abstract:
This paper presents an image analysis algorithm to detect and count yellow tomato flowers in a greenhouse with uneven illumination conditions, complex growth conditions and different flower sizes. The algorithm is designed to be employed on a drone that flies in greenhouses to accomplish several tasks such as pollination and yield estimation. Detecting the flowers can provide useful information for the farmer, such as the number of flowers in a row, and the number of flowers that were pollinated since the last visit to the row. The developed algorithm is designed to handle the real-world difficulties in a greenhouse, which include varying lighting conditions, shadowing, and occlusion, while considering the computational limitations of the simple processor on the drone. The algorithm identifies flowers using an adaptive global threshold, segmentation over the HSV color space, and morphological cues. The adaptive threshold divides the images into darker and lighter images. Segmentation on hue, saturation and value is then performed accordingly, and classification is done according to the size and location of the flowers. 1069 images of greenhouse tomato flowers were acquired in a commercial greenhouse in Israel, using two different RGB cameras, an LG G4 smartphone and a Canon PowerShot A590. The images were acquired from multiple angles and distances and were sampled manually at various periods along the day to obtain varying lighting conditions. Ground truth was created by manually tagging approximately 25,000 individual flowers in the images. Sensitivity analyses on the acquisition angle of the images, periods throughout the day, different cameras and thresholding types were performed. Precision, recall and their derived F1 score were calculated. Results indicate better performance for the view angle facing the flowers than for any other angle. Acquiring images in the afternoon resulted in the best precision and recall. Applying a global adaptive threshold improved the median F1 score by 3%. Results showed no difference between the two cameras used. Using hue values of 0.12-0.18 in the segmentation process provided the best results in precision and recall, and the best F1 score. The average precision and recall over all images when using these values were 74% and 75%, respectively, with an F1 score of 0.73. Further analysis showed a 5% increase in precision and recall when analyzing images acquired in the afternoon and from the front viewpoint. Keywords: agricultural engineering, image processing, computer vision, flower detection
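To make the hue-based segmentation step concrete, here is a minimal OpenCV-style sketch; the hue window 0.12-0.18 on a 0-1 scale maps to roughly 22-32 on OpenCV's 0-179 hue scale, while the saturation/value bounds and the morphological kernel size are illustrative assumptions rather than the authors' implementation.

```python
import cv2
import numpy as np

def yellow_flower_mask(bgr_image):
    """Segment candidate yellow-flower pixels by hue, then clean the mask morphologically."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lower = np.array([22, 80, 80], dtype=np.uint8)    # hue ~0.12 on a 0-1 scale
    upper = np.array([32, 255, 255], dtype=np.uint8)  # hue ~0.18 on a 0-1 scale
    mask = cv2.inRange(hsv, lower, upper)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove isolated speckle
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small holes
    return mask

# Usage: mask = yellow_flower_mask(cv2.imread("greenhouse_row.jpg"))
```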
Procedia PDF Downloads 330
561 Verification of Geophysical Investigation during Subsea Tunnelling in Qatar
Authors: Gary Peach, Furqan Hameed
Abstract:
The Musaimeer outfall tunnel is one of the longest storm water tunnels in the world, with a total length of 10.15 km. The tunnel will accommodate surface and rain water received from the drainage networks serving 270 km of urban areas in southern Doha, with a pumping capacity of 19.7 m³/sec. The tunnel is excavated by a Tunnel Boring Machine (TBM) through the Rus Formation, Midra Shales, and Simsima Limestone. Water inflows at high pressure, complex mixed ground, and weaker ground strata prone to karstification, with vertical and lateral fractures connected to the sea bed, were also encountered during mining. In addition to the pre-tender geotechnical investigations, the Contractor carried out a supplementary offshore geophysical investigation in order to fine-tune the existing results of the geophysical and geotechnical investigations. Electrical resistivity tomography (ERT) and a seismic reflection survey were carried out. The offshore geophysical survey was performed, and interpretations of rock mass conditions were made, to provide an overall picture of underground conditions along the tunnel alignment. This allowed the critical tunnelling areas and cutter head interventions to be planned accordingly. Karstification was monitored with a non-intrusive radar system installed on the TBM. The Boring Electric Ahead Monitoring (BEAM) system was installed at the cutter head and was able to predict the rock mass up to 3 tunnel diameters ahead of the cutter head. The BEAM system was provided with an online facility for real-time monitoring of rock mass conditions, which were then correlated with the rock mass conditions predicted during the interpretation phase of the offshore geophysical surveys. Further correlation was carried out using samples of the rock mass taken during tunnel face inspections and from excavated material produced by the TBM. The BEAM data were continuously monitored to check the variations in resistivity and percentage frequency effect (PFE) of the ground. This system provided information about rock mass conditions, potential karst risk, and potential water inflow. The BEAM system was found to be more than 50% accurate in picking up the difficult ground conditions and faults predicted in the geotechnical interpretative report before the start of tunnelling operations. Upon completion of the project, it was concluded that the combined use of different geophysical investigation results allows the execution stage to be carried out with more confidence and less geotechnical risk. The approach used for predicting rock mass conditions in the Geotechnical Interpretative Report (GIR) and in the seismic reflection and electrical resistivity tomography (ERT) surveys was concluded to be reliable, as the same rock mass conditions were encountered during tunnelling operations. Keywords: tunnel boring machine (TBM), subsea, karstification, seismic reflection survey
Procedia PDF Downloads 250
560 Social Business Model: Leveraging Business and Social Value of Social Enterprises
Authors: Miriam Borchardt, Agata M. Ritter, Macaliston G. da Silva, Mauricio N. de Carvalho, Giancarlo M. Pereira
Abstract:
This paper aims to analyze the barriers faced by social enterprises and, based on that, to propose a social business model framework that helps them to leverage their businesses and the social value delivered. A business model for social enterprises should amplify the value perception, including social value for the beneficiaries, while generating enough profit to scale the business. Most of the social value beneficiaries are people from the base of the economic pyramid (BOP) or people with specific needs. Because of this, products and services should be affordable to consumers while solving the social needs of the beneficiaries. Developing products and services with social value requires close relationships among social enterprises, universities, public institutions, accelerators, and investors. Despite being focused on social value and contributing to the beneficiaries' quality of life, as well as supporting governments that cannot properly guarantee public services and infrastructure for the BOP, social enterprises face many barriers to scaling their businesses. This is a work in progress, and five micro- and small-sized social enterprises in Brazil have been studied: (i) one has developed a kit for cervical-uterine cancer detection that allows BOP women to collect their own material and deliver it to a laboratory for US$1.00; (ii) another has developed lactose-free products that are about 70% cheaper than the traditional brands in the market; (iii) the third has developed prostheses and orthoses to meet needs that the public health system has not addressed efficiently; (iv) the fourth has produced and commercialized menstrual panties, aiming to reduce the consumption of disposable ones while saving money for consumers; (v) the fifth develops and commercializes clothes from fabric waste in a partnership with BOP artisans. The preliminary results indicate that the main barriers are the failure of the public system to recognize the public money that could be saved by buying products from these enterprises instead of from multinational pharmaceutical companies; the traditional distribution system (e.g., pharmacies), which avoids these products because of low or non-existent profit; the difficulty of buying raw material in small quantities; the difficulty of leveraging investment from investors; and cultural barriers and taboos. Interesting cost-reduction strategies have been observed: some enterprises have focused on simplifying products, while others have invested in partnerships with local producers and have developed their own machines focused on process efficiency to attract investment from investors. Keywords: base of the pyramid, business model, social business, social business model, social enterprises
Procedia PDF Downloads 102
559 Biodegradation of Carbamazepine and Diclofenac by Bacterial Strain Labrys Portucalensis
Authors: V. S. Bessa, I. S. Moreira, S. Murgolo, C. Piccirillo, G. Mascolo, P. M. L. Castro
Abstract:
The occurrence of pharmaceuticals in the environment has been a topic of increasing concern. Pharmaceuticals are not completely mineralized in the human body and are released into sewage systems, both as the parent pharmaceutical and as “biologically active” metabolites, through excretion as well as through improper elimination and disposal. Conventional wastewater treatment plants (WWTPs) are not designed to remove these emerging pollutants, and they are thus released into the environment. The antiepileptic drug carbamazepine (CBZ) and the non-steroidal anti-inflammatory diclofenac (DCF) are two widely used pharmaceuticals, frequently detected in water bodies, including rivers and groundwater, at concentrations ranging from ng L⁻¹ to mg L⁻¹. These two compounds have been classified as medium- to high-risk pollutants in WWTP effluents and surface waters. CBZ has also been suggested as a molecular marker of wastewater contamination in surface water and groundwater, and the European Union included DCF in its watch list of substances to be monitored. In the present study, biodegradation of CBZ and DCF by the bacterial strain Labrys portucalensis F11, a strain able to degrade other pharmaceutical compounds, was assessed; tests were performed with each pharmaceutical supplied to F11 as the sole carbon and energy source, as well as in the presence of 5.9 mM sodium acetate. In assays supplemented with 2.0 and 4.0 µM of CBZ, the compound was no longer detected in the bulk medium after 24 h and 5 days, respectively. Complete degradation was achieved in 21 days for 11.0 µM and in 23 days for 21.0 µM. For the highest concentration tested (43.0 µM), 95% degradation was achieved in 30 days. Supplementation with acetate increased the degradation rate of CBZ for all tested concentrations. In the case of DCF, when supplied as the sole carbon source, approximately 70% of DCF (1.7, 3.3, 8.4, 17.5 and 34.0 µM) was degraded in 30 days. Complete degradation was achieved in the presence of acetate for all tested concentrations, at higher degradation rates. The detection of intermediates produced during DCF biodegradation was performed by UPLC-QTOF/MS/MS, which allowed the identification of a range of metabolites. Stoichiometric liberation of chlorine occurred, and no metabolites were detected at the end of the biodegradation assays, suggesting complete mineralization of DCF. Strain Labrys portucalensis F11 proved able to degrade these two top-priority environmental contaminants and may be potentially useful for biotechnological applications and environmental remediation. Keywords: biodegradation, carbamazepine, diclofenac, pharmaceuticals
Procedia PDF Downloads 273
558 Suicide Wrongful Death: Standard of Care Problems Involving the Inaccurate Discernment of Lethal Risk When Focusing on the Elicitation of Suicide Ideation
Authors: Bill D. Geis
Abstract:
Suicide wrongful death forensic cases are the fastest rising tort in mental health law. It is estimated that suicide-related cases have accounted for 15% of U.S. malpractice claims since 2006. Most suicide-related personal injury claims fall into the legal category of “wrongful death.” Though mental health experts may be called on to address a range of forensic questions in wrongful death cases, the central consultation that most experts provide concerns the negligence element, specifically whether the clinician met the clinical standard of care in assessing, treating, and managing the deceased person's mental health care. Standards of care, which vary from U.S. state to state, are broad and address what a reasonable clinician might do in a similar circumstance. In each case, this leaves it to forensic experts to put forth a reasoned estimate of what the standard of care should have been in the specific case under litigation. Because the general state guidelines for the standard of care are broad, forensic experts are readily retained to provide scientific and clinical opinions about whether or not a clinician met the standard of care in their suicide assessment, treatment, and management of the case. In the past and in much of current practice, the assessment of suicide has centered on the elicitation of verbalized suicide ideation. Research in recent years, however, has indicated that the majority of persons who end their lives do not say they are suicidal at their last medical or psychiatric contact. Near-term risk assessment that goes beyond verbalized suicide ideation is needed. Our previous research employed structural equation modeling to predict lethal suicide risk: eight negative thought patterns (feeling like a burden on others, hopelessness, self-hatred, etc.) mediated by nine transdiagnostic clinical factors (mental torment, insomnia, substance abuse, PTSD intrusions, etc.) were combined to predict acute lethal suicide risk. This structural equation model, the Lethal Suicide Risk Pattern (LSRP) Acute model, had excellent goodness of fit [χ²(df) = 94.25(47)***, CFI = .98, RMSEA = .05, 90% CI = .03-.06, p(RMSEA = .05) = .63, AIC = 340.25; ***p < .001]. A further structural equation analysis was completed for this paper, adding a measure of acute suicide ideation to the previous model. Acceptable prediction model fit was no longer achieved [χ²(df) = 3.571, CFI > .953, RMSEA = .075, 90% CI = .065-.085, AIC = 529.550]. This finding suggests that, in this additional study, immediate verbalized suicide ideation was unhelpful in the assessment of lethal risk. The LSRP and other dynamic, near-term risk models (such as the Acute Suicide Affective Disorder Model and the Suicide Crisis Syndrome Model), which go beyond elicited suicide ideation, need to be incorporated into current clinical suicide assessment training. Without this training, the standard of care for suicide assessment is out of sync with current research, an emerging dilemma for the forensic evaluation of suicide wrongful death cases. Keywords: forensic evaluation, standard of care, suicide, suicide assessment, wrongful death
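For readers less familiar with the fit statistics reported above, the sketch below shows one common formulation of RMSEA computed from the model chi-square, its degrees of freedom, and the sample size; the sample size used in the example is an assumption for illustration, since it is not reported in the abstract.

```python
import math

def rmsea(chi2, df, n):
    """One common RMSEA formulation: sqrt(max(chi2 - df, 0) / (df * (n - 1)))."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

# Assumed sample size of 400 (hypothetical); chi2 and df are the values quoted above.
print(round(rmsea(chi2=94.25, df=47, n=400), 3))  # ~0.05
```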
Procedia PDF Downloads 69
557 Effect of Compaction Method on the Mechanical and Anisotropic Properties of Asphalt Mixtures
Authors: Mai Sirhan, Arieh Sidess
Abstract:
An asphalt mixture is a heterogeneous material composed of three main components: aggregates, bitumen, and air voids. Professional experience and the scientific literature categorize the asphalt mixture as a viscoelastic material whose behavior is determined by temperature and loading rate. Characterization of the properties of the asphalt mixture under service conditions is done by compacting and testing cylindrical asphalt samples in the laboratory. These samples must closely resemble the internal structure of the mixture achieved in service and the mechanical characteristics of the compacted asphalt layer in the pavement. The laboratory samples are usually compacted at temperatures between 140 and 160 degrees Celsius. In this temperature range, the asphalt has a low degree of strength. The laboratory samples are compacted using dynamic or vibratory compaction methods. In the compaction process, the aggregates tend to align themselves in certain directions, which leads to anisotropic behavior of the asphalt mixture. This issue was studied in the Strategic Highway Research Program (SHRP) research, which recommended using the gyratory compactor based on the assumption that this method is the best at mimicking compaction in service. In Israel, the Netivei Israel company is considering adopting the gyratory method as a replacement for the Marshall method used today. Therefore, the suitability of the gyratory method for use with Israeli asphalt mixtures should be investigated. In this research, we aimed to examine the impact of the compaction method on the mechanical characteristics of asphalt mixtures and to evaluate the degree of anisotropy in relation to the compaction method. To carry out this research, samples were compacted in vibratory and gyratory compactors. These samples were cylindrically cored both vertically (in the compaction direction) and horizontally (perpendicular to the compaction direction), and the specimens were tested in dynamic modulus and permanent deformation tests. The comparative test results showed that: (1) specimens compacted by the vibratory compactor had higher dynamic modulus values than the specimens compacted by the gyratory compactor; (2) both vibratory- and gyratory-compacted specimens had anisotropic behavior, especially at high temperatures, and the degree of anisotropy is higher in specimens compacted by the gyratory method; (3) specimens compacted by the vibratory method and cored vertically had the highest resistance to rutting, whereas specimens compacted by the vibratory method and cored horizontally had the lowest resistance to rutting; (4) these differences between the specimen types arise mainly from the different internal arrangement of aggregates resulting from the compaction method; and (5) based on an initial prediction of the performance of a flexible pavement containing an asphalt layer with the characteristics measured in this research, it can be concluded that the compaction method and the degree of anisotropy have a significant impact on the strains that develop in the pavement and on its resistance to fatigue and rutting. Keywords: anisotropy, asphalt compaction, dynamic modulus, gyratory compactor, mechanical properties, permanent deformation, vibratory compactor
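As a quick numerical illustration of the dynamic modulus referred to above, the sketch below computes |E*| as the ratio of the sinusoidal stress amplitude to the recoverable strain amplitude; the numbers are assumed for illustration and are not measurements from this study.

```python
def dynamic_modulus_mpa(stress_amplitude_kpa, strain_amplitude_microstrain):
    """|E*| = peak stress amplitude / peak recoverable strain amplitude, returned in MPa."""
    strain = strain_amplitude_microstrain * 1e-6          # microstrain -> strain
    return (stress_amplitude_kpa / 1000.0) / strain       # kPa -> MPa

# Assumed values: 600 kPa stress amplitude, 80 microstrain response
print(round(dynamic_modulus_mpa(600, 80)))  # ~7500 MPa
```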
Procedia PDF Downloads 119
556 The Implementation of Inclusive Education in Collaboration between Teachers of Special Education Classes and Regular Classes in a Preschool
Authors: Chiou-Shiue Ko
Abstract:
As is explicitly stipulated in Article 7 of the Enforcement Rules of the Special Education Act as amended in 1998, "in principle, children with disabilities should be integrated with normal children for preschool education". Since then, all cities and counties have been committed to promoting preschool inclusive education. The Education Department, New Taipei City Government, has been actively recruiting advisory groups of professors to assist in the implementation of inclusive education in preschools since 2001. Since 2011, the author of this study has been guiding Preschool Rainbow to implement inclusive education. Through field observations, meetings, and teaching demonstration seminars, this study explored the process of how inclusive education has been successfully implemented in collaboration with teachers of special education classes and regular classes in Preschool Rainbow. The implementation phases for inclusive education in a single academic year include the following: 1) Preparatory stage. Prior to implementation, teachers in special education and regular classes discuss ways of conducting inclusive education and organize reading clubs to read books related to curriculum modifications that integrate the eight education strategies, early treatment and education, and early childhood education programs to enhance their capacity to implement and compose teaching plans for inclusive education. In addition to the general objectives of inclusive education, the objective of inclusive education for special children is also embedded into the Individualized Education Program (IEP). 2) Implementation stage. Initially, a promotional program for special education is implemented for the children to allow all the children in the preschool to understand their own special qualities and those of special children. After the implementation of three weeks of reverse inclusion, the children in the special education classes are put into groups and enter the regular classes twice a week to implement adjustments to their inclusion in the learning area and the curriculum. In 2013, further cooperation was carried out with adjacent hospitals to perform development screening activities for the early detection of children with developmental delays. 3) Review and reflection stage. After the implementation of inclusive education, all teachers in the preschool are divided into two groups to record their teaching plans and the lessons learned during implementation. The effectiveness of implementing the objective of inclusive education is also reviewed. With the collaboration of all teachers, in 2015, Preschool Rainbow won New Taipei City’s “Preschool Light” award as an exceptional model for inclusive education. Its model of implementing inclusive education can be used as a reference for other preschools.Keywords: collaboration, inclusive education, preschool, teachers, special education classes, regular classes
Procedia PDF Downloads 430
555 Potential of Aerodynamic Feature on Monitoring Multilayer Rough Surfaces
Authors: Ibtissem Hosni, Lilia Bennaceur Farah, Saber Mohamed Naceur
Abstract:
In order to assess water availability in the soil, it is crucial to have information on the spatially distributed soil moisture content; this parameter helps in understanding the effect of moisture on the exchanges between soil, plant cover, and atmosphere, as well as in fully understanding surface processes and the hydrological cycle. On the other hand, the aerodynamic roughness length is a surface parameter that scales the vertical profile of the horizontal component of the wind speed and characterizes the surface's ability to absorb the momentum of the airflow. In numerous applications in surface hydrology and meteorology, the aerodynamic roughness length is an important parameter for estimating momentum, heat, and mass exchange between the soil surface and the atmosphere. In this respect, it is important to consider the impact of atmospheric factors in general, and of natural erosion in particular, on soil evolution and on the characterization and prediction of its physical parameters. The study of wind-induced movements over vegetated soil surfaces, whether spaced plants or continuous plant cover, is motivated by significant research efforts in agronomy and biology. The major known problem in this area concerns crop damage by wind, which is a booming field of research. Obviously, most soil surface models require information about the aerodynamic roughness length and its temporal and spatial variability. We used a bi-dimensional multi-scale (2D MLS) roughness description in which the surface is considered a superposition of a finite number of one-dimensional Gaussian processes, each having its own spatial scale, using the wavelet transform and the Mallat algorithm to describe natural surface roughness. We introduced a multilayer description of soil surface moisture to take a volume component into account in the radar backscattering problem. As moisture increases, the dielectric constant of the soil-water mixture increases, and this change is detected by microwave sensors. Nevertheless, many existing models in the field of radar imagery cannot be applied directly to areas covered with vegetation because of the vegetation backscattering. Thus, the radar response corresponds to the combined signature of the vegetation layer and the soil surface layer. Therefore, the key issue in the numerical estimation of soil moisture is to separate the two contributions and calculate the scattering behavior of both layers by defining the scattering of the vegetation and of the soil below. This paper presents a synergistic methodology for estimating roughness and soil moisture from C-band radar measurements. The methodology relies on a microwave/optical model that has been used to calculate the scattering behavior of the vegetation-covered area by defining the scattering of the vegetation and of the soil below. Keywords: aerodynamic, bi-dimensional, vegetation, synergistic
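To make the role of the aerodynamic roughness length concrete, the sketch below evaluates the neutral-stability logarithmic wind profile it parameterizes; the friction velocity, roughness length, and displacement height are illustrative assumptions, not values from the study.

```python
import math

VON_KARMAN = 0.41  # von Karman constant

def wind_speed(z, friction_velocity, z0, d=0.0):
    """Neutral logarithmic wind profile: u(z) = (u*/k) * ln((z - d) / z0),
    where z0 is the aerodynamic roughness length and d the displacement height."""
    return (friction_velocity / VON_KARMAN) * math.log((z - d) / z0)

# Assumed values: u* = 0.3 m/s, z0 = 0.05 m (sparse vegetation), d = 0
for z in (2.0, 5.0, 10.0):
    print(f"u({z} m) = {wind_speed(z, 0.3, 0.05):.2f} m/s")
```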
Procedia PDF Downloads 271
554 Evaluation of the Surveillance System for Rift Valley Fever in Ruminants in Mauritania, 2019
Authors: Mohamed El Kory Yacoub, Ahmed Bezeid El Mamy Beyatt, Djibril Barry, Yanogo Pauline, Nicolas Meda
Abstract:
Introduction: Rift Valley Fever (RVF) is a zoonotic arbovirosis that severely affects ruminants as well as humans. It causes abortions in pregnant females and deaths in young animals. The disease occurs during heavy rains that are followed by large numbers of mosquito vectors. The objective of this work was to evaluate the surveillance system for Rift Valley Fever. Methods: We conducted an evaluation of the Rift Valley Fever surveillance system. Data were collected from the national database of the Mauritanian Network of Animal Disease Epidemiological Surveillance at the Ministry of Rural Development, from RVF cases notified across the whole national territory, and from questionnaires and interviews with all persons involved in RVF surveillance at the central level. The quality of the system was assessed by analyzing the quantitative attributes defined by the Centers for Disease Control and Prevention. Results: In 2019, 443 cases of RVF were notified by the surveillance system, of which 36 were positive. Among the notified cases, the 0- to 3-year-old age group of small ruminants was the most represented, with 49.21% of cases, followed by large ruminants in the 0- to 7-year-old age group with 33.33%; 11.11% of cases were older than seven years. The completeness of the data varied between 14.2% (age) and 100% (species). Most positive cases were recorded between October and November 2019 in seven different regions. Attribute analysis showed that 87% of the respondents were able to use the case definition well, and 78.8% said they were familiar with the reporting and feedback loop of the RVF data. 90.3% of the respondents found the system easy to use, and 95% responded that it was easy for them to transmit their data to the next level. Conclusions: The epidemiological surveillance system for Rift Valley Fever in Mauritania is simple and representative. However, data quality, stability, and responsiveness are average, as the diagnosis of the disease requires laboratory confirmation and the average delay for this confirmation is long (13 days). Consequently, the incompleteness of the recorded data, the incomplete description of cases in terms of time, place, and animal, and the delays between the stages of the surveillance system can make prevention, early detection of epidemics, and the initiation of adequate response measures difficult. Keywords: evaluation, epidemiological surveillance system, rift valley fever, mauritania, ruminants
Procedia PDF Downloads 150
553 High Capacity SnO₂/Graphene Composite Anode Materials for Li-Ion Batteries
Authors: Hilal Köse, Şeyma Dombaycıoğlu, Ali Osman Aydın, Hatem Akbulut
Abstract:
Rechargeable lithium-ion batteries (LIBs) have become promising power sources for a wide range of applications, such as mobile communication devices, portable electronic devices, and electric/hybrid vehicles, due to their long cycle life, high voltage, and high energy density. Graphite has been widely used as an anode material owing to its extraordinary electronic transport properties, large surface area, and high electrocatalytic activity, although its limited specific capacity (372 mAh g⁻¹) cannot fulfil the increasing demand for lithium-ion batteries with higher energy density. To address this problem, many studies have investigated new electrode materials, and metal oxide/graphene composites are a promising class of materials for lithium-ion batteries because their specific capacities are much higher than that of graphene. Among them, SnO₂, an n-type and wide band gap semiconductor, has attracted much attention as an anode material for new-generation lithium-ion batteries owing to its high theoretical capacity (790 mAh g⁻¹). However, it suffers from large volume changes and agglomeration associated with the Li-ion insertion and extraction processes, which bring about failure and loss of electrical contact of the anode. In addition, there is a large irreversible capacity during the first cycle due to the formation of an amorphous Li₂O matrix. To obtain high-capacity anode materials, we studied the synthesis and characterization of SnO₂-graphene nanocomposites and investigated the capacity of this free-standing anode material. For this aim, graphite oxide was first obtained from graphite powder using the Hummers method. To prepare the nanocomposites as a free-standing anode, graphite oxide particles were ultrasonicated in distilled water with SnO₂ nanoparticles (1:1, w/w). After vacuum filtration, the GO-SnO₂ paper was peeled off the PVDF membrane to obtain a flexible, free-standing paper. The GO structure was then reduced in hydrazine solution. The produced SnO₂-graphene nanocomposites were characterized by scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), and X-ray diffraction (XRD) analyses. CR2016 cells were assembled in a glove box (MBraun-Labstar). The cells were charged and discharged at 25°C between fixed voltage limits (2.5 V to 0.2 V) at a constant current density on a BST8-MA MTI battery tester at a 0.2C charge-discharge rate. Cyclic voltammetry (CV) was performed at a scan rate of 0.1 mV s⁻¹, and electrochemical impedance spectroscopy (EIS) measurements were carried out using a Gamry instrument, applying a sine wave of 10 mV amplitude over a frequency range of 1000 kHz to 0.01 Hz. Keywords: SnO₂-graphene, nanocomposite, anode, Li-ion battery
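The theoretical capacities quoted above (372 mAh g⁻¹ for graphite and about 790 mAh g⁻¹ for SnO₂) follow from Faraday's law; the short sketch below reproduces them, assuming the commonly cited lithiation limits of LiC₆ for graphite and Li₄.₄Sn for the SnO₂ alloying reaction.

```python
F = 96485.0  # Faraday constant, C/mol

def theoretical_capacity_mah_per_g(n_li, molar_mass_g_per_mol):
    """Theoretical specific capacity in mAh/g: n * F / (3.6 * M), since 1 mAh = 3.6 C."""
    return n_li * F / (3.6 * molar_mass_g_per_mol)

# Graphite as LiC6: 1 Li per C6 unit (M = 72.06 g/mol) -> ~372 mAh/g
print(round(theoretical_capacity_mah_per_g(1.0, 72.06)))
# SnO2 alloying limit Li4.4Sn (M of SnO2 = 150.71 g/mol) -> ~780-790 mAh/g
print(round(theoretical_capacity_mah_per_g(4.4, 150.71)))
```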
Procedia PDF Downloads 228
552 Missed Opportunities for Immunization of under Five Children in Calabar South County Cross River State, Nigeria, the Way Forward
Authors: Celestine Odigwe, Epoke Lincoln, Rhoda-Dara Ephraim
Abstract:
Background: Immunization against the childhood killer diseases is the cardinal strategy for preventing these diseases in under-five children all over the world. These diseases include tuberculosis, measles, polio, tetanus, diphtheria, pertussis, yellow fever, hepatitis B, and Haemophilus influenzae type B. 6.9 million children die before their fifth birthday; 80% of the world's deaths in children under 5 years occur in 25 countries, most in Africa and Asia, and 2 million children could be saved each year with routine immunization. Failure to achieve full immunization coverage therefore puts many children at risk. Aim: The aim of the study was to ascertain the prevalence of missed immunization opportunities, to investigate the reasons why many under-five children in a suburb of Calabar Municipal County fail to get the required immunizations when due, and to examine the possible consequences, so that efforts can be redirected towards solving the problems identified. Methods: The study was a community-based cross-sectional study. The respondents were the mothers/guardians of the sampled children, who were all aged 0-59 months. To be eligible for recruitment into the study, the parent or guardian was required to give informed consent and to reside within Calabar South County with his/her children aged 0-59 months. We calculated our sample size using the Leslie Kish formula and used a two-stage sampling method, first balloting for the wards to be involved and then selecting four of the most populated ones within the chosen wards. Data were collected using an interviewer-administered structured questionnaire (Appendix I), entered, and analyzed using the Statistical Package for the Social Sciences (SPSS) version 20. Percentages were calculated and presented using charts and tables. Results: The number of children sampled was 159. We found that 150 were fully immunized and 9 were not; the prevalence of missed opportunities in the study was 32%. The reasons for missed opportunities were varied, ranging from false contraindications and logistical problems, such as very poor access roads to health facilities, to poor organization of health centers and negative health worker attitudes. Some of the consequences of these missed opportunities were increased susceptibility to vaccine-preventable diseases, resurgence of the above diseases, and increased morbidity and mortality of children aged less than 5 years. Conclusion: We found that ignorance on the part of both parents/guardians and health care staff, together with infrastructural inadequacies in the county such as poor roads and unreliable electric power supply for vaccine storage, was largely responsible for most missed opportunities for immunization. The details of these findings, suggestions for improvement, and the way forward are discussed. Keywords: missed opportunity, immunization, under five, Calabar south
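As a minimal illustration of the Leslie Kish sample-size formula cited in the Methods, the sketch below uses the standard single-proportion form; the prevalence and precision inputs are assumptions for illustration and are not intended to reproduce the study's own calculation.

```python
import math

def kish_sample_size(p, d, z=1.96):
    """Leslie Kish single-proportion sample size: n = z^2 * p * (1 - p) / d^2."""
    return math.ceil(z ** 2 * p * (1 - p) / d ** 2)

# Assumed inputs: p = 0.5 (conservative default proportion), d = 0.05 (absolute precision)
print(kish_sample_size(p=0.5, d=0.05))  # 385
```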
Procedia PDF Downloads 326