Search results for: event quantification
1288 A Model of Human Security: A Comparison of Vulnerabilities and Timespace
Authors: Anders Troedsson
Abstract:
For us humans, risks are intimately linked to human vulnerabilities: where there is vulnerability, there is potentially insecurity, and risk. Reducing vulnerability through compensatory measures means increasing security and decreasing risk. The paper suggests that a meaningful way to approach the study of risks (including threats, assaults, crises, etc.) is to understand the vulnerabilities these external phenomena evoke in humans. As is argued, the basis of risk evaluation, as well as of responses, is the more or less subjective perception by the individual person, or group of persons, exposed to the external event or phenomenon in question. This will be determined primarily by the vulnerability or vulnerabilities that the external factor is perceived to evoke. In this way, risk perception is primarily an inward dynamic rather than an outward one. Therefore, a route towards an understanding of the perception of risks is a closer scrutiny of the vulnerabilities they can evoke, thereby approaching an understanding of what the paper calls the essence of risk (including threat, assault, etc.), or that which a certain perceived risk means to an individual or group of individuals. As a necessary basis for gauging the wide spectrum of potential risks and their meaning, the paper proposes a model of human vulnerabilities, drawing from, inter alia, a long tradition of needs theory. In order to account for the subjectivity factor, which mediates between the innate vulnerabilities on the one hand and the event or phenomenon out there on the other, an ensuing ontological discussion of the timespace characteristics of risk/threat/assault as perceived by humans leads to the positing of two dimensions. These two dimensions are applied to the vulnerabilities, resulting in a modelling effort featuring four realms of vulnerabilities which are related to each other and together represent a dynamic whole.
In approaching the problem of risk perception, the paper thus defines the relevant realms of vulnerabilities, depicting them as a dynamic whole. With reference to a substantial body of literature and a growing international policy trend since the 1990s, this model is put in the language of human security - a concept relevant not only for international security studies and policy, but also for other academic disciplines and spheres of human endeavor.
Keywords: human security, timespace, vulnerabilities, risk perception
Procedia PDF Downloads 339
1287 Scale up of Isoniazid Preventive Therapy: A Quality Management Approach in Nairobi County, Kenya
Authors: E. Omanya, E. Mueni, G. Makau, M. Kariuki
Abstract:
HIV infection is the strongest risk factor for a person to develop TB. Isoniazid preventive therapy (IPT) for people living with HIV (PLHIV) not only reduces the individual patient's risk of developing active TB but mitigates cross infection. In Kenya, six months of IPT was recommended through the National TB, Leprosy and Lung Disease Program to treat latent TB. In spite of this recommendation by the national government, uptake of IPT among PLHIV remained low in Kenya by the end of 2015. The USAID/Kenya and East Africa Afya Jijini project, which supports 42 TB/HIV health facilities in Nairobi County, began addressing low uptake of IPT through Quality Improvement (QI) teams set up at the facility level. Quality is characterized by WHO as one of the four main connectors between health systems building blocks and health systems outputs. Afya Jijini implements the Kenya Quality Model for Health, in which QI teams are formed at the county, sub-county and facility levels. The teams review facility performance to identify gaps in service delivery and use QI tools to monitor and improve performance. Afya Jijini supported the formation of these teams in 42 facilities and built the teams' capacity to review data and use QI principles to identify and address performance gaps. When the QI teams began working on improving IPT uptake among PLHIV, uptake was at 31.8%. The teams first conducted a root cause analysis using cause-and-effect diagrams, which help the teams brainstorm on and identify barriers to IPT uptake among PLHIV at the facility level. This is a participatory process in which program staff provide technical support to the QI teams in problem identification and problem-solving. The gaps identified were inadequate knowledge and skills on the use of IPT among health care workers, lack of awareness of IPT among patients, inadequate monitoring and evaluation tools, and poor quantification and forecasting of IPT commodities.
In response, Afya Jijini trained over 300 health care workers on the administration of IPT, supported patient education, supported quantification and forecasting of IPT commodities, and provided IPT data collection tools to help facilities monitor their performance. The facility QI teams held monthly meetings to monitor progress on the implementation of IPT and took corrective action when necessary. IPT uptake improved from 31.8% to 61.2% during the second year of the Afya Jijini project and to 80.1% during the third year of the project's support. Use of QI teams and root cause analysis to identify and address service delivery gaps, in addition to targeted program interventions and continual performance reviews, can be successful in increasing the uptake of TB-related services at health facilities.
Keywords: isoniazid, quality, health care workers, people living with HIV
Procedia PDF Downloads 100
1286 Instructional Information Resources
Authors: Parveen Kumar
Abstract:
This article discusses instructional information resources. Information, in its most restricted technical sense, is a sequence of symbols that can be interpreted as a message; information can be recorded as signs, or transmitted as signals. Information is any kind of event that affects the state of a dynamic system. Conceptually, information is the message being conveyed. This concept has numerous other meanings in different contexts. Moreover, the concept of information is closely related to notions of constraint, communication, control, data, form, instruction, knowledge, meaning, mental stimulus, pattern, perception, representation, and especially entropy.
Keywords: institutions, information institutions, information services for mission-oriented institutes, pattern
Procedia PDF Downloads 378
1285 Managing Crowds at Sports Mega Events: Examining the Impact of ‘Fan Parks’ at International Football Tournaments between 2002 and 2016
Authors: Joel Rookwood
Abstract:
Sports mega events have become increasingly significant in sporting, political and economic terms, with analysis often focusing on issues including resource expenditure, development, legacy and sustainability. Transnational tournaments can inspire interest from a variety of demographics, and the operational management of such events can involve contributions from a range of personnel. In addition to television audiences, events also attract attending spectators, and in football contexts the temporary migration of fans from potentially rival nations and teams can present event organising committees and security personnel with various challenges in relation to crowd management. The behaviour, interaction and control of supporters have previously led to incidents of disorder and hooliganism, with damage to property as well as injuries and deaths proving significant consequences. The Heysel tragedy at the 1985 European Cup final in Brussels is a notable example, where 39 fans died following crowd disorder and mismanagement. Football disasters and disorder, particularly in the context of international competition, have inspired responses from police, law makers, event organisers, clubs and associations, including stadium improvements, legislative developments and crowd management practice to improve the effectiveness of spectator safety. The growth and internationalisation of fandom and developments in event management and tourism have seen various responses to the evolving challenges associated with hosting large numbers of visiting spectators at mega events. In football contexts, ‘fan parks’ are a notable example. Since their first widespread introduction in European football competitions at the 2006 World Cup finals in Germany, these facilities have become a staple element of such mega events. This qualitative, longitudinal, multi-continent research draws on extensive semi-structured interview and observation data.
As a frame of reference, this work considers football events staged before and after the development of fan parks. Research was undertaken at four World Cup finals (Japan 2002, Germany 2006, South Africa 2010 and Brazil 2014), four European Championships (Portugal 2004, Switzerland/Austria 2008, Poland/Ukraine 2012 and France 2016), four other confederation tournaments (Ghana 2008, Qatar 2011, USA 2011 and Chile 2015), and four European club finals (Istanbul 2005, Athens 2007, Rome 2009 and Basle 2016). This work found that these parks are typically temporarily erected, specifically located zones where supporters congregate together, irrespective of allegiances, to watch matches on large screens and partake in other forms of organised on-site entertainment. Such facilities can also allow organisers to control the behaviour, confine the movement and monitor the alcohol consumption of supporters. This represents a notable shift in policy from previous football tournaments, when the widely assumed causal link between alcohol and hooliganism, which frequently shaped legislative and police responses to disorder, also dissuaded some authorities from permitting fans to consume alcohol in and around stadia. It also reflects changing attitudes towards modern football fans. The work also found that in certain contexts supporters have increasingly engaged with such provision, which impacts fan behaviour, but that this is relative to factors including location, facilities, management and security.
Keywords: event, facility, fan, management, park
Procedia PDF Downloads 313
1284 Financial Ethics: A Review of 2010 Flash Crash
Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid
Abstract:
Modern-day stock markets have become almost entirely automated. Even though this means increased profits for investors, with algorithms acting upon the slightest price change within microseconds, it has also given birth to many ethical dilemmas, in the sense that the slightest mistake can cause people to lose their entire livelihoods. This paper reviews one such event, which happened on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.
Keywords: flash crash, market crash, stock market, stock market crash
Procedia PDF Downloads 521
1283 The Price of Knowledge in the Times of Commodification of Higher Education: A Case Study on the Changing Face of Education
Authors: Joanna Peksa, Faith Dillon-Lee
Abstract:
Current developments in the Western economies have turned some universities into corporate institutions driven by practices of production and commodity. Academia is increasingly becoming integrated into national economies as a result of students paying fees and is consequently using business practices in student retention and engagement. With these changes, the status of pedagogy as a priority within the institution has been shifting in light of new demands. New strategies have blurred the boundaries that separate a student from a client. This has changed the dynamic, disrupting the traditional idea of the knowledge market and emphasizing the corporate aspect of universities. In some cases, where students are seen primarily as customers, the purpose of academia is no longer to educate but to sell a commodity and retain fee-paying students. This paper considers opposing viewpoints on the commodification of higher education, reflecting on the reality of maintaining a pedagogic grounding in an increasingly commercialized sector. By analysing a case study of the Student Success Festival, an event that involved academic and marketing teams, the differences are considered between the respective visions of the pedagogic arm of the university and the corporate one. This study argues that the initial concept of the event, based on the principles of gamification, independent learning, and cognitive criticality, was more clearly linked to a grounded pedagogic approach. However, when liaising with the marketing team at a crucial step in the creative process, it became apparent that these principles were not considered a priority in terms of their remit. While the study acknowledges the power of pedagogy, the findings show that a pact of concord is necessary between different stakeholders in order for students to benefit fully from their learning experience.
Nevertheless, where issues of power prevail and power is unevenly distributed, reaching a consensus becomes increasingly challenging, and further research should closely monitor developments in pedagogy in UK higher education.
Keywords: economic pressure, commodification, pedagogy, gamification, public service, marketization
Procedia PDF Downloads 133
1282 Quantification of Methane Emissions from Solid Waste in Oman Using IPCC Default Methodology
Authors: Wajeeha A. Qazi, Mohammed-Hasham Azam, Umais A. Mehmood, Ghithaa A. Al-Mufragi, Noor-Alhuda Alrawahi, Mohammed F. M. Abushammala
Abstract:
Municipal Solid Waste (MSW) disposed of in landfill sites decomposes under anaerobic conditions and produces gases which mainly contain carbon dioxide (CO₂) and methane (CH₄). Methane has a global warming potential 25 times that of CO₂ and can potentially affect human life and the environment. Thus, this research aims to determine MSW generation and the annual CH₄ emissions from the generated waste in Oman over the years 1971-2030. The estimation of total waste generation was performed using existing models, while the CH₄ emissions were estimated using the Intergovernmental Panel on Climate Change (IPCC) default method. It is found that total MSW generation in Oman might reach 3,089 Gg in the year 2030, producing approximately 85 Gg of CH₄ emissions in that year.
Keywords: methane, emissions, landfills, solid waste
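The IPCC default method referred to in this abstract reduces to a single product of factors. A minimal sketch in Python follows; all parameter values below are illustrative defaults, not the Oman-specific inputs used in the study:

```python
def ch4_from_msw(msw_gg, frac_to_landfill=0.8, mcf=0.6, doc=0.15,
                 doc_f=0.77, f=0.5, recovered_gg=0.0, ox=0.0):
    """IPCC default-method estimate of CH4 (Gg/yr) from disposed MSW:

        CH4 = (MSW_T * MSW_F * MCF * DOC * DOC_F * F * 16/12 - R) * (1 - OX)

    where MCF is the methane correction factor, DOC the degradable organic
    carbon fraction, F the CH4 fraction in landfill gas, R recovered CH4
    and OX the oxidation factor. Defaults here are illustrative only.
    """
    potential = (msw_gg * frac_to_landfill * mcf * doc * doc_f
                 * f * (16.0 / 12.0))  # 16/12 converts C mass to CH4 mass
    return (potential - recovered_gg) * (1.0 - ox)
```

Plugging in a country's total disposed MSW for a given year (in Gg) yields that year's CH₄ estimate under the chosen factor set.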
Procedia PDF Downloads 510
1281 Drying and Transport Processes in Distributed Hydrological Modelling Based on Finite Volume Schemes (Iber Model)
Authors: Carlos Caro, Ernest Bladé, Pedro Acosta, Camilo Lesmes
Abstract:
The drying-wetting process is one of the topics requiring particular care in distributed hydrological modelling using finite volume schemes as a means of solving the Saint-Venant equations. In a hydrologic and hydraulic computer model, surface flow phenomena depend mainly on the different flow accumulations and the subsequent runoff generation. These accumulations are generated by routing, cell by cell, the water depths that begin to appear due to the rain at each instant of time. Determining when a cell is considered dry and when it is considered wet, so as to include it in the full calculation, is an issue that directly affects the quantification of direct runoff, that is, the flow generated at the outlet of a contributing zone by the accumulations produced from the cells or finite volumes.
Keywords: hydrology, transport processes, hydrological modelling, finite volume schemes
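The wet/dry decision described above is typically implemented as a simple depth threshold applied per finite-volume cell. A hedged sketch follows; the threshold value and the update rule are illustrative assumptions, not the Iber model's actual scheme:

```python
import numpy as np

H_DRY = 1e-3  # wet/dry depth threshold in metres (illustrative value)

def rain_and_flag(h, rain_rate, dt):
    """Add rainfall depth to each cell, then flag cells as wet or dry.

    h: current water depth per cell (m); rain_rate: rainfall per cell (m/s);
    dt: time step (s). Only cells flagged wet would enter the full
    finite-volume flux calculation; dry cells are skipped.
    """
    h = h + rain_rate * dt                   # depth gained from rain this step
    wet = h >= H_DRY                         # cells below threshold stay dry
    excess = np.where(wet, h - H_DRY, 0.0)   # depth available for routing
    return h, wet, excess
```

In a real scheme the threshold also guards against negative depths and spurious velocities in nearly dry cells, which is why its choice directly affects the computed direct runoff.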
Procedia PDF Downloads 386
1280 Patient Safety Culture in Brazilian Hospitals from Nurse's Team Perspective
Authors: Carmen Silvia Gabriel, Daniele Bernardi da Costa, Andrea Bernardes, Sabrina Elias Mikael, Daniele da Silva Ramos
Abstract:
The goal of this quantitative study is to investigate patient safety culture from the perspective of professionals from the hospital nursing team. It was conducted in two Brazilian hospitals. The sample included 282 nurses. Data collection occurred in 2013, through the questionnaire Hospital Survey on Patient Safety Culture. Based on the assessment of the dimensions, it is stressed that, in the dimension teamwork across hospital units, 69.4% of professionals agree that when a lot of work needs to be done quickly, they work together as a team; regarding the dimension supervisor/manager expectations and actions promoting safety, 70.2% agree that their supervisor overlooks patient safety problems. Related to organizational learning and continuous improvement, 56.5% agree that there is evaluation of the effectiveness of changes after their implementation. On hospital management support for patient safety, 52.8% report that the actions of hospital management show that patient safety is a top priority. On the overall perception of patient safety, 57.2% disagree that patient safety is never compromised due to a higher amount of work to be completed. As regards feedback and communication about error, 57.7% report that they always or usually receive such information. Relative to communication openness, 42.9% said they never or rarely feel free to question the decisions/actions of their superiors. On frequency of event reporting, 64.7% said they often or always notify events with no damage to patients. About teamwork across hospital units, similarity is noted between the percentages of agreement and disagreement on the item "there is good cooperation among hospital units that need to work together", at 41.4% and 40.5% respectively. Related to the adequacy of professionals, 77.8% disagree on the existence of a sufficient number of employees to do the job, and 52.4% agree that shift changes are problematic for patients.
On nonpunitive response to errors, 71.7% indicate that when an event is reported, it seems that the focus is on the person. On the patient safety grade of the institution, 41.6% classified it as very good. It is concluded that there are positive points in the safety culture, as well as weaknesses such as a punitive culture and patient safety impaired by work overload.
Keywords: quality of health care, health services evaluation, safety culture, patient safety, nursing team
Procedia PDF Downloads 299
1279 Identification and Quantification of Lisinopril from Pure, Formulated and Urine Samples by Micellar Thin Layer Chromatography
Authors: Sudhanshu Sharma
Abstract:
Lisinopril, 1-[N²-{(S)-1-carboxy-3-phenylpropyl}-L-lysyl]-L-proline dihydrate, is a lysine analog of enalaprilat, the active metabolite of enalapril. It is a long-acting, non-sulfhydryl angiotensin-converting enzyme (ACE) inhibitor that is used for the treatment of hypertension and congestive heart failure in a daily dosage of 10-80 mg. The pharmacological activity of lisinopril has been proved in various experimental and clinical studies. Owing to its importance and widespread use, efforts have been made towards the development of simple and reliable analytical methods. As per our literature survey, lisinopril in pharmaceutical formulations has been determined by various analytical methodologies such as polarography, potentiometry, and spectrophotometry, but most of these analytical methods are not well suited for the identification of lisinopril in clinical samples because of the interference caused by the amino acids and amino-group-containing metabolites present in biological samples. This report is an attempt in the direction of developing a simple and reliable method for on-plate identification and quantification of lisinopril in pharmaceutical formulations as well as in human urine samples, using silica gel H layers developed with a new mobile phase comprising micellar solutions of N-cetyl-N,N,N-trimethylammonium bromide (CTAB). Micellar solutions have found numerous practical applications in many areas of separation science. Micellar liquid chromatography (MLC) has gained immense popularity and wide applicability due to its operational simplicity, cost effectiveness, relative non-toxicity, low aggressiveness and enhanced separation efficiency. The incorporation of aqueous micellar solutions as mobile phases was pioneered by Armstrong and Terrill, who accentuated the importance of TLC where simultaneous separation of ionic or non-ionic species in a variety of matrices is required.
A peculiarity of micellar mobile phases (MMPs) is that they have no macroscopic analogues; as a result, typical separations can be achieved more easily using MMPs than aqueous-organic mobile phases. Previously, MMPs were successfully employed in TLC-based critical separations of aromatic hydrocarbons, nucleotides, vitamins K1 and K5, o-, m- and p-aminophenol, amino acids, and penicillins. Human urine analysis for the identification of selected drugs and their metabolites has emerged as an important investigative tool in forensic drug analysis. Among all available chromatographic methods, only thin layer chromatography (TLC) enables a simple, fast and effective separation of the complex mixtures present in various biological samples, and it is recommended as an approved test for forensic drug analysis by federal law. TLC has proved its applicability in the successful separation of bioactive amines, carbohydrates, enzymes, porphyrins and their precursors, alkaloids and drugs from urine samples.
Keywords: lisinopril, surfactant, chromatography, micellar solutions
Procedia PDF Downloads 367
1278 A Comparative Assessment of Industrial Composites Using Thermography and Ultrasound
Authors: Mosab Alrashed, Wei Xu, Stephen Abineri, Yifan Zhao, Jörn Mehnen
Abstract:
Thermographic inspection is a relatively new technique for Non-Destructive Testing (NDT) which has been gathering increasing interest due to its relatively low-cost hardware and extremely fast data acquisition. This technique is especially promising in the area of rapid automated damage detection and quantification. In collaboration with a major industry partner from the aerospace sector, advanced thermography-based NDT software for impact-damaged composites is introduced. The software is based on correlation analysis of time-temperature profiles in combination with an image enhancement process. The prototype software aims to a) better visualise the damage in a relatively easy-to-use way and b) automatically and quantitatively measure the properties of the degradation. Since degradation properties play an important role in the identification of degradation types, tests on artificially damaged specimens have been performed and the results analyzed.
Keywords: NDT, correlation analysis, image processing, damage, inspection
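The correlation analysis of time-temperature profiles mentioned above can be sketched as follows: each pixel's cooling profile is correlated against a reference profile from sound material, and low correlation flags suspect regions. This is a minimal sketch, not the partner software's actual pipeline, and the 0.95 threshold is an assumed illustrative value:

```python
import numpy as np

def damage_map(frames, reference_profile, threshold=0.95):
    """Flag pixels whose time-temperature profile correlates poorly
    with a sound-region reference profile.

    frames: (T, H, W) thermogram sequence; reference_profile: (T,).
    Returns a boolean (H, W) map, True where the material is suspect.
    """
    T, H, W = frames.shape
    profiles = frames.reshape(T, -1)               # one column per pixel
    p = profiles - profiles.mean(axis=0)           # centre each profile
    r = reference_profile - reference_profile.mean()
    corr = (p * r[:, None]).sum(axis=0) / (
        np.linalg.norm(p, axis=0) * np.linalg.norm(r) + 1e-12)
    return (corr < threshold).reshape(H, W)
```

An image enhancement step (as in the prototype) would typically follow, cleaning the binary map before damage properties are measured.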
Procedia PDF Downloads 549
1277 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI
Authors: Hae-Yeoun Lee
Abstract:
Quantification of cardiac function is performed by calculating blood volume and ejection fraction in routine clinical practice. However, this has been done by manual contouring, which is laborious and varies with the observer. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region on a coil-sensitivity-corrected image. Then, a graph searching technique is used to correct segmentation errors caused by coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm is tested and compared with manual contouring by experts, showing outstanding performance.
Keywords: cardiac MRI, graph searching, left ventricle segmentation, K-means clustering
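The intensity-based K-means step can be sketched as below; the blood pool appears bright on cine MRI, so the brightest cluster is taken as the blood mask. This is a minimal sketch only, and the paper's coil-sensitivity correction and graph-searching refinement are omitted:

```python
import numpy as np

def kmeans_blood_mask(image, k=3, iters=20):
    """Cluster pixel intensities into k groups with a 1-D K-means and
    return a mask of the brightest cluster (assumed blood pool)."""
    x = image.ravel().astype(float)
    centers = np.quantile(x, np.linspace(0.0, 1.0, k))  # spread initial centers
    for _ in range(iters):
        # assign each pixel to the nearest center, then re-estimate centers
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    return (labels == np.argmax(centers)).reshape(image.shape)
```

Quantile initialisation keeps the sketch deterministic; a production segmenter would also enforce spatial connectivity before the graph-searching correction.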
Procedia PDF Downloads 400
1276 A Quantification Method of Attractiveness of Stations and an Estimation Method of Number of Passengers Taking into Consideration the Attractiveness of the Station
Authors: Naoya Ozaki, Takuya Watanabe, Ryosuke Matsumoto, Noriko Fukasawa
Abstract:
In the metropolitan areas of Japan, shopping areas are set up in many stations, and escalators and elevators are installed to make the stations barrier-free. Further, many areas around the stations are being redeveloped. Railway business operators want to know how much effect these circumstances have on the attractiveness of a station or the number of passengers using it. So, we performed a questionnaire survey of station users in the metropolitan areas to find the factors that affect the attractiveness of stations. Then, based on the analysis of the survey, we developed a method to quantitatively evaluate the attractiveness of stations. We also developed an estimation method for the number of passengers based on the combination of the quantitatively evaluated attractiveness of the station and the residential and labor population around the station. Then, we derived precise linear regression models estimating the attractiveness of the station and the number of passengers of the station.
Keywords: attractiveness of the station, estimation method, number of passengers of the station, redevelopment around the station, renovation of the station
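The final regression step can be sketched as follows. The covariate names are assumptions, since the abstract does not list the exact model terms; this is a least-squares fit of passengers against attractiveness and surrounding population:

```python
import numpy as np

def fit_passenger_model(attractiveness, population, passengers):
    """Fit passengers ~ b0 + b1 * attractiveness + b2 * population
    by ordinary least squares; returns (coefficients, predict_fn).

    population stands in for the combined residential and labor
    population around the station.
    """
    X = np.column_stack([np.ones_like(attractiveness),
                         attractiveness, population])
    beta, *_ = np.linalg.lstsq(X, passengers, rcond=None)
    return beta, lambda a, p: beta[0] + beta[1] * a + beta[2] * p
```

Given survey-derived attractiveness scores and census populations per station, the returned prediction function estimates passenger counts for new or redeveloped stations.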
Procedia PDF Downloads 287
1275 Seismic Isolation of Existing Masonry Buildings: Recent Case Studies in Italy
Authors: Stefano Barone
Abstract:
Seismic retrofit of buildings through base isolation represents a consolidated protection strategy against earthquakes. It consists in decoupling the ground motion from that of the structure by introducing anti-seismic devices at the base of the building, characterized by high horizontal flexibility and medium/high dissipative capacity. This protects structural elements and limits damage to non-structural ones. For these reasons, full functionality is guaranteed after an earthquake event. Base isolation is applied extensively to both new and existing buildings. For the latter, it usually does not require any interruption of the structure's use or evacuation of the occupants, a special advantage for strategic buildings such as schools, hospitals, and military buildings. This paper describes the application of seismic isolation to three existing masonry buildings in Italy: Villa “La Maddalena” in Macerata (Marche region) and the “Giacomo Matteotti” and “Plinio Il Giovane” school buildings in Perugia (Umbria region). The seismic hazard of the sites is characterized by a Peak Ground Acceleration (PGA) of 0.213g-0.287g for the Life Safety Limit State and 0.271g-0.359g for the Collapse Limit State. All the buildings are isolated with a combination of free sliders type TETRON® CD with confined elastomeric disk and anti-seismic rubber isolators type ISOSISM® HDRB, to reduce the eccentricity between the center of mass and the center of stiffness, thus limiting torsional effects during a seismic event. The isolation systems are designed to lengthen the original period of vibration (i.e., without isolators) by at least three times and to guarantee medium/high levels of energy dissipation capacity (equivalent viscous damping between 12.5% and 16%). This allows the structures to resist 100% of the seismic design action. This article shows the performances of the supplied anti-seismic devices, with particular attention to the experimental dynamic response.
Finally, a special focus is given to the main site activities required to isolate a masonry building.
Keywords: retrofit, masonry buildings, seismic isolation, energy dissipation, anti-seismic devices
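The period-lengthening target quoted above has a simple single-degree-of-freedom reading: for T = 2π√(m/k), tripling the period means the effective horizontal stiffness at the isolation level is one ninth of the fixed-base value for the same mass. A sketch with hypothetical numbers, not taken from these retrofits:

```python
import math

def stiffness_for_period(mass_kg, period_s):
    """Effective horizontal stiffness k = 4*pi^2*m/T^2 of an SDOF
    oscillator with natural period T = 2*pi*sqrt(m/k)."""
    return 4.0 * math.pi ** 2 * mass_kg / period_s ** 2

# Hypothetical 1000-tonne masonry building with a fixed-base period of 0.8 s:
k_fixed = stiffness_for_period(1.0e6, 0.8)
k_isolated = stiffness_for_period(1.0e6, 3 * 0.8)  # target: 3x the period
```

The nine-fold stiffness reduction is what shifts the structure into the low-acceleration region of the design spectrum, while the isolators' damping (12.5%-16% here) limits the resulting displacement demand.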
Procedia PDF Downloads 74
1274 Bayesian Networks Scoping the Climate Change Impact on Winter Wheat Freezing Injury Disasters in Hebei Province, China
Authors: Xiping Wang, Shuran Yao, Liqin Dai
Abstract:
Many studies report that winters are getting warmer and that the minimum air temperature is clearly rising, important evidence of climate warming. Exacerbated air temperature fluctuation, which tends to bring more severe weather variation, is another important consequence of recent climate change, and it has induced more disasters for crop growth in certain regions. Hebei Province is an important winter-wheat-growing province in the north of China that has recently endured more winter freezing injury, affecting local winter wheat crop management. A winter wheat freezing injury assessment Bayesian network framework was established with the objectives of estimating, assessing and predicting winter wheat freezing disasters in Hebei Province. In this framework, freezing disasters were classified into three severity degrees (SI) across the three types of freezing, i.e., freezing caused by severe cold at any time in the winter, by a long extremely cold spell in the winter, and by freeze-after-thaw early in the season after winter. The factors influencing winter wheat freezing SI include the time of freezing occurrence, the growth status of seedlings, soil moisture, winter wheat variety, the longitude of the target region and, most variable of all, the climate factors. The climate factors included in this framework are the daily mean and range of air temperature, the extreme minimum temperature and the number of days during a severe cold weather process, the number of days with temperature lower than the critical values, and the accumulated negative temperature in a potential freezing event. The Bayesian network model was evaluated using actual weather data and crop records at selected sites in Hebei Province.
With the multi-stage influences of the various factors, the forecast and assessment of the event-based target variables, freezing injury occurrence and its damage to winter wheat production, were shown to be better scoped by the Bayesian network model.
Keywords: Bayesian networks, climatic change, freezing injury, winter wheat
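A toy version of the discrete-network structure described above, reduced to two parent factors and hand-picked placeholder probabilities (the paper's fitted conditional probability tables are not reproduced here), shows how the marginal and posterior queries work:

```python
# Minimal two-parent discrete network: severity depends on the cold event
# and the seedling status. All probabilities are made-up placeholders.
P_cold = {"mild": 0.7, "severe": 0.3}
P_seedling = {"strong": 0.6, "weak": 0.4}
CPT_severity = {  # P(severity = "high" | cold, seedling)
    ("mild", "strong"): 0.05, ("mild", "weak"): 0.20,
    ("severe", "strong"): 0.40, ("severe", "weak"): 0.85,
}

def p_high_severity():
    """Marginal P(severity = high) by enumeration over the parents."""
    return sum(P_cold[c] * P_seedling[s] * CPT_severity[(c, s)]
               for c in P_cold for s in P_seedling)

def p_severe_given_high():
    """Posterior P(cold = severe | severity = high) via Bayes' rule."""
    joint = sum(P_cold["severe"] * P_seedling[s] * CPT_severity[("severe", s)]
                for s in P_seedling)
    return joint / p_high_severity()
```

The full framework enumerates over many more parents (climate indicators, variety, soil moisture, longitude), but the inference pattern is the same.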
Procedia PDF Downloads 410
1273 The Importance of the Fluctuation in Blood Sugar and Blood Pressure of Insulin-Dependent Diabetic Patients with Chronic Kidney Disease
Authors: Hitoshi Minakuchi, Izumi Takei, Shu Wakino, Koichi Hayashi, Hiroshi Itoh
Abstract:
Objectives: Among type 2 diabetic patients with CKD (chronic kidney disease), insulin resistance, impaired glyconeogenesis in the kidney and reduced degradation of insulin are recognized, and we have observed different fluctuation patterns of blood sugar between CKD and non-CKD patients. On the other hand, a non-dipper pattern of blood pressure change is a risk factor for organ damage and mortality. We performed a cross-sectional study to elucidate the characteristics of the fluctuation of blood glucose and blood pressure in insulin-treated diabetic patients with chronic kidney disease. Methods: From March 2011 to April 2013, at the Ichikawa General Hospital of Tokyo Dental College, we recruited 20 outpatients. All participants were insulin-treated type 2 diabetics with CKD. We collected serum and urine samples for several hormone measurements, performed CGMS (continuous glucose monitoring system), ABPM (ambulatory blood pressure monitoring), brain computed tomography, carotid artery thickness, ankle brachial index, PWV and CVR-R, and analyzed these data statistically. Results: Among all 20 participants, hypoglycemia (blood glucose below 70 mg/dl by CGMS) was detected in 9 participants (45.0%). Hypoglycemic events were associated with lower eGFR (29.8±6.2 ml/min vs. 41.3±8.5 ml/min, P<0.05), lower HbA1c (6.44±0.57% vs. 7.53±0.49%), higher PWV (1858±97.3 cm/s vs. 1665±109.2 cm/s), higher serum glucagon (194.2±34.8 pg/ml vs. 117.0±37.1 pg/ml), higher urinary free cortisol (53.8±12.8 μg/day vs. 34.8±7.1 μg/day), and higher urinary metanephrine (0.162±0.031 mg/day vs. 0.076±0.029 mg/day). A non-dipper pattern of blood pressure change on ABPM was detected in 8 of the 9 participants with hypoglycemia (88.9%) and in 4 of the 11 participants without hypoglycemia (36.4%). Multiple logistic regression analysis revealed that hypoglycemic events are an independent factor for the non-dipper pattern of blood pressure change.
Conclusions: Among insulin-treated type 2 diabetic patients with CKD, hypoglycemic events were frequently detected and may be associated with organ derangements mediated by the non-dipper pattern of blood pressure change.
Keywords: chronic kidney disease, hypoglycemia, non-dipper type blood pressure change, diabetic patients
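The dipper/non-dipper classification used in this study follows a standard ABPM convention, assumed here since the abstract does not restate it: a "dipper" shows a nocturnal fall of at least 10% in mean systolic blood pressure relative to the daytime mean.

```python
def dipping_status(day_mean_sbp, night_mean_sbp):
    """Classify nocturnal blood-pressure dipping from ABPM daytime and
    night-time mean systolic values (mmHg).

    The 10% cut-off is the usual convention, assumed for illustration.
    """
    fall = (day_mean_sbp - night_mean_sbp) / day_mean_sbp
    return "dipper" if fall >= 0.10 else "non-dipper"
```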
Procedia PDF Downloads 415
1272 SynKit: An Event-Driven and Scalable Microservices-Based Kitting System
Authors: Bruno Nascimento, Cristina Wanzeller, Jorge Silva, João A. Dias, André Barbosa, José Ribeiro
Abstract:
The increasing complexity of logistics operations stems from evolving business needs, such as the shift from mass production to mass customization, which demands greater efficiency and flexibility. In response, Industry 4.0 and 5.0 technologies provide improved solutions to enhance operational agility and better meet market demands. The management of kitting zones, combined with the use of Autonomous Mobile Robots, faces challenges related to coordination, resource optimization, and rapid response to customer demand fluctuations. Additionally, implementing lean manufacturing practices in this context must be carefully orchestrated by intelligent systems and human operators to maximize efficiency without sacrificing the agility required in an advanced production environment. This paper proposes and implements a microservices-based architecture integrating principles from Industry 4.0 and 5.0 with lean manufacturing practices. The architecture enhances communication and coordination between autonomous vehicles and kitting management systems, allowing more efficient resource utilization and increased scalability. The proposed architecture focuses on the modularity and flexibility of operations, enabling seamless adaptation to changing demands and the efficient allocation of resources in real time. This approach is expected to significantly improve the efficiency and scalability of logistics operations by reducing waste and optimizing resource use while improving responsiveness to demand changes. The implementation of this architecture provides a robust foundation for the continuous evolution of kitting management and process optimization. It is designed to adapt to dynamic environments marked by rapid shifts in production demands and real-time decision-making.
It also ensures seamless integration with automated systems, aligning with Industry 4.0 and 5.0 needs while reinforcing Lean Manufacturing principles.
Keywords: microservices, event-driven, kitting, AMR, lean manufacturing, industry 4.0, industry 5.0
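The event-driven coordination described above can be sketched as a minimal in-process publish/subscribe bus, where kitting services react to events emitted by other services (for example, a kit request that triggers AMR dispatch). The topic names and handlers below are assumptions for illustration, not the SynKit API:

```python
# Minimal publish/subscribe event bus: services register handlers per topic
# and react to events published by other services, decoupling producers
# from consumers as in the event-driven architecture described above.
from collections import defaultdict
from typing import Callable

class EventBus:
    def __init__(self):
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: dict) -> None:
        # Deliver the event to every handler registered for this topic.
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
dispatched = []

# Hypothetical kitting service: reacts to kit requests by dispatching an AMR.
bus.subscribe("kit.requested", lambda e: dispatched.append(e["kit_id"]))
bus.publish("kit.requested", {"kit_id": "K-042", "station": "S3"})
print(dispatched)  # ['K-042']
```

In a production microservices deployment the bus would be a message broker (e.g. a queue or streaming platform) rather than an in-process object, but the subscribe/publish contract is the same.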
Procedia PDF Downloads 28
1271 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region
Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan
Abstract:
Rainfall-runoff models play an important role in hydrological predictions. However, the model itself is only one part of the process of creating a flood prediction. The aim of this paper is to show the process of a successful prediction for a flood event (May 15–18, 2014). The prediction was performed by the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of automatic hydrologic prediction on the Olše river catchment and its gauges Český Těšín and Věřňovice.
Keywords: flood, HEC-HMS, prediction, rainfall, runoff
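Conceptual rainfall-runoff models of the kind evaluated here transform a rainfall series into a discharge series. A toy linear-reservoir sketch (not HEC-HMS itself, whose loss and routing methods are far richer) illustrates the basic storage-outflow idea; the parameter values are illustrative only:

```python
# Linear-reservoir rainfall-runoff sketch: storage S receives rainfall P
# each time step and drains at a rate proportional to storage, Q = k * S.
def simulate_runoff(rainfall, k=0.2, s0=0.0):
    """Return the discharge series for a linear reservoir with outflow coefficient k."""
    s, q = s0, []
    for p in rainfall:
        s += p          # rainfall adds to storage
        out = k * s     # outflow is proportional to current storage
        s -= out
        q.append(out)
    return q

# A single rainfall pulse produces a discharge peak followed by a recession.
q = simulate_runoff([10, 0, 0, 0])
print(q)
```

The characteristic exponential recession after the rainfall pulse is the signature of the linear-reservoir assumption; calibrated models chain several such elements with loss and routing components.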
Procedia PDF Downloads 395
1270 Vulnerability Assessment for Protection of Ghardaia City to the Inundation of M'zab Wadi
Authors: Mustapha Kamel Mihoubi, Reda Madi
Abstract:
The problem of natural disasters in general, and flooding in particular, is a topic of lasting concern worldwide, especially in cities and large urban areas. Torrential floods and fast flows pose a major problem in urban areas. Indeed, better management of flood risk is a growing necessity that must mobilize technical and scientific means to curb the adverse consequences of this phenomenon, especially in Saharan cities with an arid climate. The aim of this study is to deploy a basic calculation approach, based on hydrologic and hydraulic quantification, for locating the black spots generated by flooding in urban areas and for identifying the areas vulnerable to flooding. The method is applied to the city of Ghardaia to identify areas vulnerable to inundation and to establish management and prevention maps against flood risks.
Keywords: Alea, Beni Mzab, cartography, HEC-RAS, inundation, torrential, vulnerability, wadi
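Hydraulic quantification of channel capacity typically rests on standard relations such as Manning's equation, which HEC-RAS also uses for friction losses. A sketch with illustrative values, not M'zab Wadi data:

```python
# Manning's equation for open-channel discharge, in SI units:
# Q = (1/n) * A * R^(2/3) * S^(1/2), where n is the roughness coefficient,
# A the flow area (m^2), R the hydraulic radius (m), S the energy slope.
import math

def manning_discharge(n, area, hydraulic_radius, slope):
    """Discharge Q in m^3/s for a channel section under uniform-flow assumptions."""
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * math.sqrt(slope)

# Illustrative section: earthen channel (n = 0.03), A = 10 m^2, R = 1 m, S = 0.001.
q = manning_discharge(n=0.03, area=10.0, hydraulic_radius=1.0, slope=0.001)
print(round(q, 2))  # ~10.54 m^3/s
```

Comparing such a capacity estimate against a design flood discharge is one simple way to flag the "black spots" where the channel is overtopped.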
Procedia PDF Downloads 312
1269 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand
Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk
Abstract:
Background: Decision-analytic models for Alzheimer’s disease (AD) have been advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra, and the incorporation of key parameters such as treatment persistence, become feasible. This study aimed to apply DES to perform a cost-effectiveness analysis of treatment for AD in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation and death. Correlated changes in cognitive and behavioral status over time were modelled using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality and predictive equations for functional status, costs (Thai baht (THB) in 2017) and quality-adjusted life years (QALYs) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated against a willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil to AD patients of all disease-severity levels was found to be cost-effective. Compared to untreated patients, although patients receiving donepezil incurred discounted additional costs of 2,161 THB, they experienced a discounted gain of 0.021 QALY, resulting in an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY).
Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the dominance of donepezil appeared to wane when delayed treatment was given to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule for a mild AD cohort when the Mini-Mental State Exam (MMSE) score falls below 10 did not deteriorate the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective under a healthcare perspective. Conclusions: DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients’ quality of life and is considered cost-effective when used to treat AD patients of all disease-severity levels in Thailand. The optimal treatment benefits are observed when donepezil is prescribed early in the course of AD. Given healthcare budget constraints in Thailand, implementation of donepezil coverage is most likely feasible when starting with mild AD patients, along with the stopping rule introduced.
Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment
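The cost-effectiveness decision rule applied above compares the ICER against the willingness-to-pay threshold. A sketch using the rounded figures from the abstract; note that 2,161 THB / 0.021 QALY does not exactly reproduce the reported 138,524 THB/QALY, which was presumably computed from unrounded values:

```python
# ICER decision rule: incremental cost divided by incremental QALYs,
# judged cost-effective when at or below the willingness-to-pay threshold.
WTP_THB_PER_QALY = 160_000  # Thai threshold cited in the abstract

def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio, in currency units per QALY gained."""
    return delta_cost / delta_qaly

def is_cost_effective(delta_cost, delta_qaly, threshold=WTP_THB_PER_QALY):
    return icer(delta_cost, delta_qaly) <= threshold

# Rounded donepezil-vs-no-treatment figures from the abstract.
print(is_cost_effective(2161, 0.021))  # True: well below 160,000 THB/QALY
```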
Procedia PDF Downloads 129
1268 Effect of Acetic Acid Fermentation on Bioactive Components and Anti-Xanthine Oxidase Activities in Vinegar Brewed from Monascus-Fermented Soybeans
Authors: Kyung-Soon Choi, Ji-Young Hwang, Young-Hee Pyo
Abstract:
Vinegars have been used as an alternative remedy for treating gout, but the scientific basis remains to be elucidated. In this study, acetic acid fermentation was applied for the first time to Monascus-fermented soybeans to examine its effect on the bioactive components, together with the xanthine oxidase inhibitory (XOI) activity, of the resulting soy vinegar. The contents of total phenols (0.47~0.97 mg gallic acid equivalents/mL) and flavonoids (0.18~0.39 mg quercetin equivalents/mL) were determined spectrophotometrically, and the contents of organic acids (10.22~59.76 mg/mL) and isoflavones (6.79~7.46 mg/mL) were determined using HPLC-UV. The analytical method for ubiquinones (0.079~0.276 μg/mL) employed saponification before solvent extraction and quantification using LC-MS. The soy vinegar also showed significant XOI activity (95.3%) after 20 days of acetic acid fermentation at 30 °C. The results suggest that soy vinegar has potential as a novel medicinal food.
Keywords: acetic acid fermentation, bioactive component, soy vinegar, xanthine oxidase inhibitory activity
Procedia PDF Downloads 383
1267 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients
Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi
Abstract:
Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as time until death. A frailty model is a random-effect model for time-to-event data, in which the random effect has a multiplicative influence on the baseline hazard function. This article investigates the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were drawn from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years in Imam Khomeini Hospital in Iran. In order to determine the factors affecting cirrhotic patients’ survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the Exponential and Weibull distributions, were considered for survival time. Data analysis was performed using R software, and a significance level of 0.05 was used for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The mean age of patients was 39.8 years. By the end of the study, 82 (26%) patients had died; among them, 48 (58%) were men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second most common cause. Overall, mean survival over the 7-year follow-up was 28.44 months; for deceased and censored patients it was 19.33 and 31.79 months, respectively. Exponential and Weibull survival models with the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients.
Conclusion: To investigate the factors affecting time to death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution
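In the gamma frailty model described above, integrating out a gamma-distributed frailty with variance theta yields a closed-form marginal survival function, S(t) = (1 + theta * H0(t) * exp(x'beta))^(-1/theta). A sketch with a Weibull baseline and illustrative parameter values, not the fitted estimates from this study:

```python
# Population-averaged survival under a shared gamma frailty (variance theta)
# multiplying a Weibull baseline hazard. As theta -> 0 this reduces to the
# ordinary Weibull proportional-hazards survival exp(-H0(t) * exp(x'beta)).
import math

def weibull_cum_hazard(t, shape=1.5, scale=30.0):
    """Cumulative baseline hazard H0(t) of a Weibull(shape, scale)."""
    return (t / scale) ** shape

def marginal_survival(t, theta=0.5, lin_pred=0.0, shape=1.5, scale=30.0):
    """S(t) = (1 + theta * H0(t) * exp(x'beta)) ** (-1/theta)."""
    h = weibull_cum_hazard(t, shape, scale) * math.exp(lin_pred)
    return (1.0 + theta * h) ** (-1.0 / theta)

s12 = marginal_survival(12.0)  # survival probability at 12 months (illustrative)
print(round(s12, 3))
```

The frailty variance theta quantifies unobserved heterogeneity: larger theta flattens the population-level hazard even when each individual's hazard is increasing.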
Procedia PDF Downloads 261
1266 Optimization Study of Adsorption of Nickel(II) on Bentonite
Authors: B. Medjahed, M. A. Didi, B. Guezzen
Abstract:
This work concerns the experimental study of the adsorption of Ni(II) on bentonite. The effects of various parameters, such as contact time, stirring rate, initial concentration of Ni(II), mass of clay, initial pH of the aqueous solution, and temperature, on the adsorption yield were investigated. The effect of ionic strength on the adsorption yield was examined through the identification and quantification of the chemical species present in the aqueous phase containing the metallic ion Ni(II). The adsorbed species were investigated with a calculation program using CHEAQS V. L20.1 in order to determine the relation between the percentages of the adsorbed species and the adsorption yield. The optimization process was carried out using a 2³ factorial design. The individual and combined effects of three process parameters, i.e. initial Ni(II) concentration in aqueous solution (2×10⁻³ and 5×10⁻³ mol/L), initial pH of the solution (2 and 6.5), and mass of bentonite (0.03 and 0.3 g), on Ni(II) adsorption were studied.
Keywords: adsorption, bentonite, factorial design, nickel(II)
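A 2³ factorial design evaluates every combination of the two levels of the three factors (initial Ni(II) concentration, initial pH, bentonite mass), eight runs in total, and estimates each main effect as the difference of mean responses between the high and low levels. A sketch with coded levels and made-up yields; the numbers are illustrative, not the study's data:

```python
# 2^3 full factorial design in coded units: -1 = low level, +1 = high level
# for (Ni(II) concentration, initial pH, bentonite mass), enumerated in
# standard order with the first factor varying slowest.
from itertools import product

runs = list(product((-1, +1), repeat=3))  # 8 treatment combinations

def main_effect(yields, j):
    """Main effect of factor j: mean yield at +1 minus mean yield at -1."""
    hi = [y for r, y in zip(runs, yields) if r[j] == +1]
    lo = [y for r, y in zip(runs, yields) if r[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

# Hypothetical adsorption yields (%) for the 8 runs, in standard order.
yields = [40, 42, 55, 58, 61, 63, 78, 80]
print(len(runs), main_effect(yields, 0))
```

Interaction effects are computed the same way using the product of the coded columns, which is what makes the factorial layout efficient for screening combined effects.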
Procedia PDF Downloads 161
1265 A Construct to Perform in Situ Deformation Measurement of Material Extrusion-Fabricated Structures
Authors: Daniel Nelson, Valeria La Saponara
Abstract:
Material extrusion is an additive manufacturing modality that continues to show great promise in the ability to create low-cost, highly intricate, and exceedingly useful structural elements. As more capable and versatile filament materials are devised, and the resolution of manufacturing systems continues to increase, the need to understand and predict manufacturing-induced warping will become ever more important. The following study presents an in situ remote sensing and data analysis construct that allows the in situ mapping and quantification of surface displacements induced by residual stresses on a specified test structure. This proof-of-concept experiment shows that it is possible to provide designers and manufacturers with insight into the manufacturing parameters that lead to these deformations, and a greater understanding of the behavior of these warping events over the course of the manufacturing process.
Keywords: additive manufacturing, deformation, digital image correlation, fused filament fabrication, residual stress, warping
Procedia PDF Downloads 90
1264 Solid Waste Management Policy Implementation in Imus, Cavite
Authors: Michael John S. Maceda
Abstract:
Waste has been a global concern aggravated by climate change. Imus, Cavite, which in the past paid little or no regard to waste management, experienced heavy flooding on August 19, 2013. This event led to the full-blown implementation of municipal solid waste management, integrating community participation and the use of low-cost technology to reduce the amount of waste generated. The methodology employed by the city of Imus provided a benchmark in the province of Cavite, reducing both the amount of waste generated and solid waste management costs.
Keywords: SWM, Imus, composting, policy
Procedia PDF Downloads 839
1263 Study of the Stability of the Slope Open-Pit Mines: Case of the Mine of Phosphates – Tebessa, Algeria
Authors: Mohamed Fredj, Abdallah Hafsaoui, Radouane Nakache
Abstract:
The study of the stability of mining works in fractured rock masses is a major concern of the operating engineer. For geotechnical works in mines and quarries, there is today no general methodology for the analysis and quantification of the risks relating to the inherent hazards (falling boulders, landslides, etc.). The reasons for this are the uncertainty that weighs on the available data and the lack of knowledge of the values of the parameters required for this type of analysis. Stability calculations must be based on reliable knowledge of the distribution of the discontinuities that dissect the rock massif, and of the shear resistance of the intact rock and of the discontinuities. This study addresses the stability of a mine slope (Kef Sennoun - Tebessa, Algeria). The problem is analyzed using a numerical model based on finite elements (Plaxis 3D software).
Keywords: stability, discontinuities, finite elements, rock mass, open-pit mine
Procedia PDF Downloads 321
1262 Geo-Visualization of Crimes against Children: An India Level Study 2001-2012
Authors: Ritvik Chauhan, Vijay Kumar Baraik
Abstract:
Crime is a rare event on the earth's surface, yet not a simple one: it is a complex event occurring in a spatio-temporal environment. Crime is one of the most serious security threats to human environments, as it may harm individuals through loss of property and physical and psychological injury. Conventional studies of crimes of different natures have mostly addressed legal, psychological, social, and political themes. Geographical areas are heterogeneous in their environmental conditions and in the associations between structural conditions and social organization that contribute to specific crimes. Crime pattern analysis rests on theories in which criminal events occur in persistent, identifiable patterns in a particular space and time; it combines the analysis of spatial factors and rational factors behind the crime. In this study, we analyze the combined factors behind the origin of crime against children. Children have always been especially vulnerable to victimization because they are silent victims, both physically and mentally, and often do not even realize what is happening to them. Their trusting nature and innocence are misused by criminals to commit crimes. The nature of crime against children has changed in recent years: child rape, kidnapping and abduction, the selling and buying of girls, foeticide, infanticide, prostitution, child marriage, etc., have turned more cruel and inhuman. This study focuses on understanding the space-time pattern of crime against children during the period 2001-2012. It also attempts to explore and ascertain the association of crimes against children, and their rates, with various geographical and socio-demographic factors through causal analysis, using selected indicators (child sex ratio, education, literacy rate, employment, income, etc.) obtained from the Census of India and other government sources.
The outcome of the study will help identify high-crime regions and the specific nature of crimes there. It will also review existing efforts and explore new plausible measures for tracking, monitoring, and minimizing the crime rate, to meet the end goal of protecting children from crimes committed against them.
Keywords: crime against children, geographic profiling, spatio-temporal analysis, hotspot
Procedia PDF Downloads 211
1261 Covid-19 Associated Stress and Coping Strategies
Authors: Bar Shapira-Youngster, Sima Amram-Vaknin, Yuliya Lipshits-Braziler
Abstract:
The study examined how 811 Israelis experienced and coped with the COVID-19 lockdown. Stress, uncertainty, and loss of control were reported as common emotional experiences. Two main difficulties were reported: loneliness, and health and emotional concerns. Frequent explanations for the virus's emergence were scientific or faith-based reasoning. The most prevalent coping strategies were distraction activities and acceptance. Reducing the use of maladaptive coping strategies has important implications for mental health outcomes. Objectives: COVID-19 has been recognized as a collective, continuous traumatic stressor. The present study examined how individuals experienced, perceived, and coped with this traumatic event during the lockdown in Israel in April 2020. Method: 811 Israelis (71.3% women; mean age 43.7, SD=13.3) completed an online semi-structured questionnaire consisting of two sections: in the first section, participants were asked to report background information; in the second, they answered 8 open-ended questions about their experience, perception, and coping with the COVID-19 lockdown. Participation was voluntary, anonymity was assured, and participants were not offered compensation of any kind. The data were subjected to qualitative content analysis, which seeks to classify the participants' answers into an effective number of categories representing similar meanings. Our content analysis of participants' answers extended far beyond simple word counts; our objective was to identify recurrent categories that characterized participants' responses to each question. We sought to ensure that the categories for the different questions were as mutually exclusive and exhaustive as possible. To ensure robust analysis, the data were initially analyzed by the first author, and a second opinion was then sought from research colleagues.
Contribution: The present research expands our knowledge of individuals' experiences, perceptions, and coping mechanisms with continuous traumatic events. Reducing the use of maladaptive coping strategies has important implications for mental health outcomes.
Keywords: Covid-19, emotional distress, coping, continuous traumatic event
Procedia PDF Downloads 131
1260 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time
Authors: Yemane Hailu Fissuh, Zhongzhan Zhang
Abstract:
In biomedical research and randomized clinical trials, the outcomes of most interest are time-to-event, so-called survival, data. The importance of robust models in this context is to compare the effects of randomly controlled experimental groups in a way that carries a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effects of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model is proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received much attention for the estimation of causal effects in modeling left-truncated and right-censored survival data. Despite its wide applicability and popularity, maximum likelihood estimation is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we propose modified estimating equations. After presenting the estimation procedures, the consistency and asymptotic properties of the estimators are derived, and the finite-sample performance of the proposed model is illustrated via simulation studies and the Stanford heart transplant data. To summarize the study, covariate bias is adjusted by estimating the density function of the truncation variable, which is also incorporated into the model as a covariate in order to relax the assumption of independence between failure time and truncation time. Moreover, an expectation-maximization (EM) algorithm is described for the iterative estimation of the unknown parameters and the unspecified transformation function.
In addition, the causal effect is derived as the ratio of the cumulative hazard functions of the active and passive experiments, after adjusting for the bias introduced into the model by the truncation variable.
Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate
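The causal contrast described above, a ratio of cumulative hazards between the active and passive arms, can be illustrated with the nonparametric Nelson-Aalen estimator, used here as a simpler stand-in for the semiparametric transformation model of the paper. The event times below are made up:

```python
# Nelson-Aalen cumulative hazard: at each observed event, add (events at t) /
# (subjects still at risk just before t). The ratio of the two arms' cumulative
# hazards gives a crude hazard-ratio-style contrast.
def nelson_aalen(times, events):
    """Cumulative hazard at the last follow-up time.
    times: follow-up times; events: 1 = event occurred, 0 = censored."""
    n = len(times)
    h = 0.0
    for i, (t, d) in enumerate(sorted(zip(times, events))):
        at_risk = n - i       # subjects remaining at risk at this time
        h += d / at_risk
    return h

# Hypothetical arms: treated (active) vs. control (passive), with censoring.
treated = nelson_aalen([2, 4, 5, 7, 9], [1, 0, 1, 0, 1])
control = nelson_aalen([1, 2, 3, 5, 8], [1, 1, 1, 0, 1])
print(round(control / treated, 2))
```

The paper's estimator additionally adjusts for left truncation and time-varying covariates, which this sketch deliberately omits.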
Procedia PDF Downloads 127
1259 Superparamagnetic Sensor with Lateral Flow Immunoassays as Platforms for Biomarker Quantification
Authors: M. Salvador, J. C. Martinez-Garcia, A. Moyano, M. C. Blanco-Lopez, M. Rivas
Abstract:
Biosensors play a crucial role in the detection of molecules nowadays due to their advantages of user-friendliness, high selectivity, real-time analysis, and in-situ application. Among them, Lateral Flow Immunoassays (LFIAs) stand out among technologies for point-of-care bioassays, with outstanding characteristics such as affordability, portability, and low cost. They have been widely used for the detection of a vast range of biomarkers, including not only proteins but also nucleic acids and even whole cells. Although the LFIA has traditionally been a positive/negative test, tremendous efforts are being made to add a quantifying capability based on the combination of suitable labels and a proper sensor. One of the most successful approaches involves the use of magnetic sensors for the detection of magnetic labels. Bringing together the required characteristics mentioned before, our research group has developed a biosensor to detect biomolecules. Superparamagnetic nanoparticles (SPNPs) together with LFIAs play the fundamental roles. SPNPs are detected by their interaction with a high-frequency current flowing in a printed micro track. By means of the instant variation of the impedance of this track, proportional to the presence of the SPNPs, a quantitative and rapid measurement of the number of particles can be obtained. This mode of detection requires no external magnetic field, which reduces device complexity. On the other hand, the major limitations of LFIAs are that they are only qualitative or semi-quantitative when traditional gold or latex nanoparticles are used as color labels. Moreover, the need for constant ambient conditions to obtain reproducible results, the detection of nanoparticles exclusively on the surface of the membrane, and the short durability of the signal are drawbacks that can be advantageously overcome with the design of magnetically labeled LFIAs.
The approach followed was to coat the SPNPs with a specific monoclonal antibody, chemically bonded to them, which targets the protein under consideration. Then, a sandwich-type immunoassay was prepared by printing onto the nitrocellulose membrane strip a second antibody against a different epitope of the protein (test line) and an IgG antibody (control line). When the sample flows along the strip, the SPNP-labeled proteins are immobilized at the test line, which provides the magnetic signal as described above. Preliminary results using this practical combination for the detection and quantification of the Prostate-Specific Antigen (PSA) show the validity and consistency of the technique in the clinical range, where a PSA level of 4.0 ng/mL is the established upper normal limit. Moreover, an LOD of 0.25 ng/mL was calculated with a factor of 3 according to the IUPAC Gold Book definition. The versatility of the platform has also been proved with the detection of other biomolecules such as troponin I (a cardiac injury biomarker) and histamine.
Keywords: biosensor, lateral flow immunoassays, point-of-care devices, superparamagnetic nanoparticles
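The IUPAC-style detection limit cited above is commonly computed as LOD = k * s_blank / slope with k = 3. A sketch; the blank standard deviation and calibration slope are hypothetical values chosen to reproduce the reported 0.25 ng/mL:

```python
# IUPAC-style limit of detection: the concentration whose signal exceeds the
# blank by k standard deviations of the blank, converted through the slope
# of the calibration curve (signal units per concentration unit).
def limit_of_detection(sd_blank, slope, k=3.0):
    """LOD in concentration units, using the k*sigma/slope convention."""
    return k * sd_blank / slope

# Hypothetical calibration: blank SD in signal units, slope in signal/(ng/mL).
lod = limit_of_detection(sd_blank=0.5, slope=6.0)
print(lod)  # 0.25 ng/mL
```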
Procedia PDF Downloads 232