Search results for: event quantification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1710

1260 Financial Ethics: A Review of 2010 Flash Crash

Authors: Omer Farooq, Salman Ahmed Khan, Sadaf Khalid

Abstract:

Modern-day stock markets have become almost entirely automated. While this means increased profits for investors, with algorithms acting on the slightest price change within microseconds, it has also given rise to many ethical dilemmas, since the slightest mistake can cause people to lose their livelihoods. This paper reviews one such event, which occurred on May 06, 2010, in which $1 trillion disappeared from the Dow Jones Industrial Average. We discuss its various aspects and the ethical dilemmas that have arisen from it.

Keywords: flash crash, market crash, stock market, stock market crash

Procedia PDF Downloads 511
1259 The Price of Knowledge in the Times of Commodification of Higher Education: A Case Study on the Changing Face of Education

Authors: Joanna Peksa, Faith Dillon-Lee

Abstract:

Current developments in the Western economies have turned some universities into corporate institutions driven by practices of production and commodification. Academia is increasingly becoming integrated into national economies as a result of students paying fees and is consequently using business practices in student retention and engagement. With these changes, the status of pedagogy as a priority within the institution has been shifting in light of these new demands. New strategies have blurred the boundaries that separate a student from a client. This has changed the dynamic, disrupting the traditional idea of the knowledge market and emphasizing the corporate aspect of universities. In some cases, where students are seen primarily as customers, the purpose of academia is no longer to educate but to sell a commodity and retain fee-paying students. This paper considers opposing viewpoints on the commodification of higher education, reflecting on the reality of maintaining a pedagogic grounding in an increasingly commercialized sector. By analysing a case study of the Student Success Festival, an event that involved academic and marketing teams, the differences between the respective visions of the pedagogic arm of the university and the corporate arm are considered. This study argues that the initial concept of the event, based on the principles of gamification, independent learning, and cognitive criticality, was more clearly linked to a grounded pedagogic approach. However, when liaising with the marketing team at a crucial step in the creative process, it became apparent that these principles were not considered a priority within their remit. While the study acknowledges the power of pedagogy, the findings show that a pact of concord is necessary between the different stakeholders in order for students to benefit fully from their learning experience. Nevertheless, where issues of power prevail and power is unevenly distributed, reaching a consensus becomes increasingly challenging; further research should closely monitor developments in pedagogy in UK higher education.

Keywords: economic pressure, commodification, pedagogy, gamification, public service, marketization

Procedia PDF Downloads 126
1258 Patient Safety Culture in Brazilian Hospitals from Nurse's Team Perspective

Authors: Carmen Silvia Gabriel, Dsniele Bernardi da Costa, Andrea Bernardes, Sabrina Elias Mikael, Daniele da Silva Ramos

Abstract:

The goal of this quantitative study is to investigate patient safety culture from the perspective of professionals from the hospital nursing team. It was conducted in two Brazilian hospitals. The sample included 282 nurses. Data collection occurred in 2013, through the Hospital Survey on Patient Safety Culture questionnaire. Based on the assessment of the dimensions, it is noted that, in the dimension teamwork within hospital units, 69.4% of professionals agree that when a lot of work needs to be done quickly, they work together as a team; regarding the dimension supervisor/manager expectations and actions promoting safety, 70.2% agree that their supervisor overlooks patient safety problems. Related to organizational learning and continuous improvement, 56.5% agree that the effectiveness of changes is evaluated after their implementation. On hospital management support for patient safety, 52.8% report that the actions of hospital management show that patient safety is a top priority. On the overall perception of patient safety, 57.2% disagree that patient safety is never compromised due to a higher amount of work to be completed. Regarding feedback and communication about error, 57.7% report that they always or usually receive such information. Relative to communication openness, 42.9% said they never or rarely feel free to question the decisions/actions of their superiors. On frequency of event reporting, 64.7% said they often or always notify events that cause no harm to patients. About teamwork across hospital units, similar percentages of agreement and disagreement are noted, as on the item 'there is good cooperation among hospital units that need to work together', with 41.4% and 40.5%, respectively. Related to staffing adequacy, 77.8% disagree that there is a sufficient number of employees to do the job, and 52.4% agree that shift changes are problematic for patients. On nonpunitive response to errors, 71.7% indicate that when an event is reported, the focus seems to be on the person. On the patient safety grade of the institution, 41.6% classified it as very good. It is concluded that there are positive points in the safety culture, as well as weaknesses such as a punitive culture and patient safety impaired by work overload.

Keywords: quality of health care, health services evaluation, safety culture, patient safety, nursing team

Procedia PDF Downloads 294
1257 Quantification of Methane Emissions from Solid Waste in Oman Using IPCC Default Methodology

Authors: Wajeeha A. Qazi, Mohammed-Hasham Azam, Umais A. Mehmood, Ghithaa A. Al-Mufragi, Noor-Alhuda Alrawahi, Mohammed F. M. Abushammala

Abstract:

Municipal Solid Waste (MSW) disposed of in landfill sites decomposes under anaerobic conditions and produces gases which mainly contain carbon dioxide (CO₂) and methane (CH₄). Methane has a global warming potential 25 times that of CO₂ and can potentially affect human life and the environment. Thus, this research aims to determine MSW generation and the annual CH₄ emissions from the generated waste in Oman over the years 1971-2030. The estimation of total waste generation was performed using existing models, while the CH₄ emissions were estimated using the intergovernmental panel on climate change (IPCC) default method. It is found that total MSW generation in Oman might reach 3,089 Gg in the year 2030, producing approximately 85 Gg of CH₄ emissions in that year.
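For orientation, the IPCC default (mass-balance) method referred to in the abstract reduces to a single equation, CH₄ = (MSW_T × MSW_F × MCF × DOC × DOC_F × F × 16/12 − R) × (1 − OX). The sketch below is a minimal illustration of that equation; the parameter values are generic defaults chosen for the example and are not the country-specific values used in the study.

```python
def ch4_emissions_gg(msw_total_gg,      # total MSW generated in the year (Gg)
                     msw_f=0.8,         # fraction of MSW disposed to solid waste disposal sites
                     mcf=0.6,           # methane correction factor for site management
                     doc=0.15,          # degradable organic carbon fraction (by weight)
                     doc_f=0.77,        # fraction of DOC that actually degrades
                     f=0.5,             # fraction of CH4 in the landfill gas
                     recovered_gg=0.0,  # CH4 recovered (Gg)
                     ox=0.0):           # oxidation factor
    """IPCC default method: CH4 (Gg/yr) = (MSW_T*MSW_F*MCF*DOC*DOC_F*F*16/12 - R)*(1 - OX)."""
    gross = msw_total_gg * msw_f * mcf * doc * doc_f * f * (16.0 / 12.0)
    return (gross - recovered_gg) * (1.0 - ox)

# Example with a hypothetical 1,000 Gg of MSW generated in one year.
print(f"{ch4_emissions_gg(1000):.1f} Gg CH4")
```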

Keywords: methane, emissions, landfills, solid waste

Procedia PDF Downloads 503
1256 Drying and Transport Processes in Distributed Hydrological Modelling Based on Finite Volume Schemes (Iber Model)

Authors: Carlos Caro, Ernest Bladé, Pedro Acosta, Camilo Lesmes

Abstract:

The drying-wet process is one of the topics requiring the most care in distributed hydrological modelling using finite volume schemes as a means of solving the Saint-Venant equations. In a hydrologic and hydraulic computer model, surface flow phenomena depend mainly on the different flow accumulations and the subsequent runoff generation. These accumulations are generated by routing, cell by cell, the water depths that begin to appear due to the rain at each instant of time. Determining when a cell is considered dry and when it is considered wet, and thus included in the full calculation, is an issue that directly affects the quantification of direct runoff, i.e., the flow generated at the outlet of a contributing area from the accumulations produced by the cells or finite volumes.

Keywords: hydrology, transport processes, hydrological modelling, finite volume schemes

Procedia PDF Downloads 383
1255 Identification and Quantification of Lisinopril from Pure, Formulated and Urine Samples by Micellar Thin Layer Chromatography

Authors: Sudhanshu Sharma

Abstract:

Lisinopril, 1-[N-{(S)-1-carboxy-3-phenylpropyl}-L-lysyl]-L-proline dihydrate, is a lysine analog of enalaprilat, the active metabolite of enalapril. It is a long-acting, non-sulfhydryl angiotensin-converting enzyme (ACE) inhibitor that is used for the treatment of hypertension and congestive heart failure at a daily dosage of 10-80 mg. The pharmacological activity of lisinopril has been proved in various experimental and clinical studies. Owing to its importance and widespread use, efforts have been made towards the development of simple and reliable analytical methods. As per our literature survey, lisinopril in pharmaceutical formulations has been determined by various analytical methodologies such as polarography, potentiometry, and spectrophotometry, but most of these analytical methods are not well suited to the identification of lisinopril in clinical samples because of the interferences caused by the amino acids and amino-group-containing metabolites present in biological samples. This report is an attempt in the direction of developing a simple and reliable method for on-plate identification and quantification of lisinopril in pharmaceutical formulations as well as in human urine samples using silica gel H layers developed with a new mobile phase comprising micellar solutions of N-cetyl-N,N,N-trimethylammonium bromide (CTAB). Micellar solutions have found numerous practical applications in many areas of separation science. Micellar liquid chromatography (MLC) has gained immense popularity and wider applicability due to its operational simplicity, cost effectiveness, relative non-toxicity, enhanced separation efficiency, and low aggressiveness. The incorporation of aqueous micellar solutions as mobile phases was pioneered by Armstrong and Terrill, who accentuated the importance of TLC where simultaneous separation of ionic or non-ionic species in a variety of matrices is required. A peculiarity of micellar mobile phases (MMPs) is that they have no macroscopic analogues; as a result, typical separations can be achieved more easily with MMPs than with aqueous-organic mobile phases. Previously, MMPs were successfully employed in TLC-based critical separations of aromatic hydrocarbons, nucleotides, vitamins K1 and K5, o-, m- and p-aminophenol, amino acids, and penicillins. Human urine analysis for the identification of selected drugs and their metabolites has emerged as an important investigative tool in forensic drug analysis. Among all available chromatographic methods, only thin layer chromatography (TLC) enables a simple, fast and effective separation of the complex mixtures present in various biological samples and is recommended as an approved test for forensic drug analysis by federal law. TLC has proved its applicability in the successful separation of bioactive amines, carbohydrates, enzymes, porphyrins and their precursors, alkaloids and drugs from urine samples.

Keywords: lisinopril, surfactant, chromatography, micellar solutions

Procedia PDF Downloads 360
1254 A Comparative Assessment of Industrial Composites Using Thermography and Ultrasound

Authors: Mosab Alrashed, Wei Xu, Stephen Abineri, Yifan Zhao, Jörn Mehnen

Abstract:

Thermographic inspection is a relatively new technique for Non-Destructive Testing (NDT) which has been gathering increasing interest due to its relatively low-cost hardware and extremely fast data acquisition. This technique is especially promising in the area of rapid automated damage detection and quantification. In collaboration with a major industry partner from the aerospace sector, advanced thermography-based NDT software for impact-damaged composites is introduced. The software is based on correlation analysis of time-temperature profiles in combination with an image enhancement process. The prototype software aims to a) better visualise the damage in a relatively easy-to-use way and b) automatically and quantitatively measure the properties of the degradation. Since degradation properties play an important role in the identification of degradation types, tests on artificially damaged specimens have been performed and the results analyzed.

Keywords: NDT, correlation analysis, image processing, damage, inspection

Procedia PDF Downloads 542
1253 Automatic LV Segmentation with K-means Clustering and Graph Searching on Cardiac MRI

Authors: Hae-Yeoun Lee

Abstract:

Quantification of cardiac function is performed by calculating blood volume and ejection fraction in routine clinical practice. However, this has typically been done by manual contouring, which is time-consuming and varies with the observer. In this paper, an automatic left ventricle segmentation algorithm for cardiac magnetic resonance images (MRI) is presented. Using knowledge of cardiac MRI, a K-means clustering technique is applied to segment the blood region on a coil-sensitivity corrected image. Then, a graph searching technique is used to correct segmentation errors caused by coil distortion and noise. Finally, blood volume and ejection fraction are calculated. Using cardiac MRI from 15 subjects, the presented algorithm is tested and compared with manual contouring by experts, showing outstanding performance.
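As a minimal sketch of the clustering step, the snippet below runs plain K-means on pixel intensities and picks the brightest cluster as a stand-in for the blood pool; the intensity values are synthetic, and the actual algorithm additionally relies on coil-sensitivity correction and graph searching, which are not reproduced here.

```python
import numpy as np

def kmeans_1d(values, k=3, n_iter=50, seed=0):
    """Plain K-means on pixel intensities (1-D feature)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(n_iter):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = values[labels == j].mean()
    return labels, centers

# Toy "coil-sensitivity corrected" slice: background, myocardium, bright blood pool.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(50, 5, 500),     # background
                      rng.normal(120, 8, 300),    # myocardium
                      rng.normal(220, 10, 200)])  # blood pool
labels, centers = kmeans_1d(img, k=3)
blood_cluster = int(np.argmax(centers))           # brightest cluster approximates the blood region
print("cluster centers:", np.sort(centers), "blood pixels:", int(np.sum(labels == blood_cluster)))
```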

Keywords: cardiac MRI, graph searching, left ventricle segmentation, K-means clustering

Procedia PDF Downloads 394
1252 Seismic Isolation of Existing Masonry Buildings: Recent Case Studies in Italy

Authors: Stefano Barone

Abstract:

Seismic retrofit of buildings through base isolation represents a consolidated protection strategy against earthquakes. It consists of decoupling the ground motion from that of the structure and introducing anti-seismic devices at the base of the building, characterized by high horizontal flexibility and medium/high dissipative capacity. This makes it possible to protect structural elements and to limit damage to non-structural ones. For these reasons, full functionality is guaranteed after an earthquake event. Base isolation is applied extensively to both new and existing buildings. For the latter, it usually does not require any interruption of the structure's use or evacuation of the occupants, a special advantage for strategic buildings such as schools, hospitals, and military buildings. This paper describes the application of seismic isolation to three existing masonry buildings in Italy: Villa “La Maddalena” in Macerata (Marche region) and the “Giacomo Matteotti” and “Plinio Il Giovane” school buildings in Perugia (Umbria region). The seismic hazard of the sites is characterized by a Peak Ground Acceleration (PGA) of 0.213g-0.287g for the Life Safety Limit State and 0.271g-0.359g for the Collapse Limit State. All the buildings are isolated with a combination of free sliders of type TETRON® CD with confined elastomeric disk and anti-seismic rubber isolators of type ISOSISM® HDRB, in order to reduce the eccentricity between the center of mass and the center of stiffness, thus limiting torsional effects during a seismic event. The isolation systems are designed to lengthen the original period of vibration (i.e., without isolators) by at least three times and to guarantee medium/high levels of energy dissipation capacity (equivalent viscous damping between 12.5% and 16%). This allows the structures to resist 100% of the seismic design action. This article shows the performance of the supplied anti-seismic devices with particular attention to the experimental dynamic response. Finally, a special focus is given to the main site activities required to isolate a masonry building.

Keywords: retrofit, masonry buildings, seismic isolation, energy dissipation, anti-seismic devices

Procedia PDF Downloads 64
1251 Bayesian Networks Scoping the Climate Change Impact on Winter Wheat Freezing Injury Disasters in Hebei Province, China

Authors: Xiping Wang, Shuran Yao, Liqin Dai

Abstract:

Many studies report that winters are getting warmer and that the minimum air temperature is clearly rising, both important pieces of evidence of climate warming. Exacerbated air temperature fluctuation, which tends to bring more severe weather variation, is another important consequence of recent climate change, and it has induced more disasters for crop growth in certain regions. Hebei Province is an important winter-wheat-growing province in northern China that has recently endured more winter freezing injury, affecting local winter wheat crop management. A winter wheat freezing injury assessment Bayesian Network framework was established with the objectives of estimating, assessing and predicting winter wheat freezing disasters in Hebei Province. In this framework, freezing disasters were classified into three severity degrees (SI) across the three types of freezing, i.e., freezing caused by severe cold at any time in the winter, by a long extremely cold spell in the winter, and by freeze-after-thaw early in the season after winter. The factors influencing winter wheat freezing SI include the time of freezing occurrence, growth status of seedlings, soil moisture, winter wheat variety, the longitude of the target region and the most variable climate factors. The climate factors included in this framework are the daily mean and range of air temperature, the extreme minimum temperature and number of days during a severe cold weather process, the number of days with the temperature lower than critical temperature values, and the accumulated negative temperature in a potential freezing event. The Bayesian Network model was evaluated using actual weather data and crop records at selected sites in Hebei Province. With the multi-stage influences of the various factors, the forecast and assessment of the event-based target variables, i.e., freezing injury occurrence and its damage to winter wheat production, were shown to be better scoped by the Bayesian Network model.

Keywords: bayesian networks, climatic change, freezing Injury, winter wheat

Procedia PDF Downloads 401
1250 The Importance of the Fluctuation in Blood Sugar and Blood Pressure of Insulin-Dependent Diabetic Patients with Chronic Kidney Disease

Authors: Hitoshi Minakuchi, Izumi Takei, Shu Wakino, Koichi Hayashi, Hiroshi Itoh

Abstract:

Objectives: Among type 2 diabetic patients with CKD (chronic kidney disease), insulin resistance, impaired renal gluconeogenesis and reduced degradation of insulin are recognized, and we have observed different patterns of blood sugar fluctuation between CKD patients and non-CKD patients. On the other hand, non-dipper type blood pressure change is a risk factor for organ damage and mortality. We performed a cross-sectional study to elucidate the characteristics of the fluctuations of blood glucose and blood pressure in insulin-treated diabetic patients with chronic kidney disease. Methods: From March 2011 to April 2013, at the Ichikawa General Hospital of Tokyo Dental College, we recruited 20 outpatients. All participants were insulin-treated type 2 diabetic patients with CKD. We collected serum and urine samples for several hormone measurements, performed CGMS (continuous glucose monitoring system), ABPM (ambulatory blood pressure monitoring), brain computed tomography, carotid artery thickness, ankle brachial index, PWV and CVR-R measurements, and analyzed these data statistically. Results: Among all 20 participants, hypoglycemia (blood glucose below 70 mg/dl on CGMS) was detected in 9 participants (45.0%). The occurrence of hypoglycemia was associated with lower eGFR (29.8±6.2 ml/min vs. 41.3±8.5 ml/min, P<0.05), lower HbA1c (6.44±0.57% vs. 7.53±0.49%), higher PWV (1858±97.3 cm/s vs. 1665±109.2 cm/s), higher serum glucagon (194.2±34.8 pg/ml vs. 117.0±37.1 pg/ml), higher urinary free cortisol (53.8±12.8 μg/day vs. 34.8±7.1 μg/day), and higher urinary metanephrine (0.162±0.031 mg/day vs. 0.076±0.029 mg/day). Non-dipper type blood pressure change on ABPM was detected in 8 of the 9 participants with hypoglycemia (88.9%) and in 4 of the 11 participants without hypoglycemia (36.4%). Multiple logistic regression analysis revealed that the occurrence of hypoglycemia is an independent factor for non-dipper type blood pressure change. Conclusions: Among insulin-treated type 2 diabetic patients with CKD, hypoglycemic events were frequently detected and may be associated with organ derangement through the medium of non-dipper type blood pressure change.

Keywords: chronic kidney disease, hypoglycemia, non-dipper type blood pressure change, diabetic patients

Procedia PDF Downloads 407
1249 Automatic Flood Prediction Using Rainfall Runoff Model in Moravian-Silesian Region

Authors: B. Sir, M. Podhoranyi, S. Kuchar, T. Kocyan

Abstract:

Rainfall-runoff models play an important role in hydrological predictions. However, the model is only one part of the flood prediction process. The aim of this paper is to show the process of a successful prediction for a flood event (May 15–18, 2014). The prediction was performed by the rainfall-runoff model HEC-HMS, one of the models computed within the Floreon+ system. The paper briefly evaluates the results of automatic hydrologic prediction on the river Olše catchment and its gages Český Těšín and Věřňovice.

Keywords: flood, HEC-HMS, prediction, rainfall, runoff

Procedia PDF Downloads 389
1248 A Quantification Method of Attractiveness of Stations and an Estimation Method of Number of Passengers Taking into Consideration the Attractiveness of the Station

Authors: Naoya Ozaki, Takuya Watanabe, Ryosuke Matsumoto, Noriko Fukasawa

Abstract:

In metropolitan areas in Japan, shopping areas are set up in many stations, and escalators and elevators are installed to make the stations barrier-free. Furthermore, many areas around the stations are being redeveloped. Railway business operators want to know how much effect these circumstances have on the attractiveness of a station or on the number of passengers using it. Therefore, we performed a questionnaire survey of station users in the metropolitan areas to find factors that affect the attractiveness of stations. Then, based on the analysis of the survey, we developed a method to quantitatively evaluate the attractiveness of stations. We also developed an estimation method for the number of passengers based on the combination of the quantitatively evaluated attractiveness of the station and the residential and working population around the station. Finally, we derived precise linear regression models estimating the attractiveness of the station and the number of passengers of the station.
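A minimal sketch of the final step is given below: an ordinary-least-squares fit of an attractiveness score on a few station-level features. The feature names and the numbers are hypothetical placeholders, not the factors identified by the survey.

```python
import numpy as np

# Hypothetical station-level features: shops score, barrier-free score, redevelopment score.
X = np.array([[3, 1, 2], [5, 2, 4], [2, 1, 1], [4, 3, 3], [5, 3, 5], [1, 0, 1]], float)
y = np.array([55, 78, 40, 70, 90, 30], float)     # surveyed attractiveness (illustrative)

A = np.column_stack([np.ones(len(X)), X])         # add an intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # ordinary least squares
print("intercept and weights:", coef.round(3))

# A second regression of the same form could then map attractiveness plus the
# residential and working population onto the number of passengers.
print("fitted attractiveness:", (A @ coef).round(1))
```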

Keywords: attractiveness of the station, estimation method, number of passengers of the station, redevelopment around the station, renovation of the station

Procedia PDF Downloads 281
1247 Application of Discrete-Event Simulation in Health Technology Assessment: A Cost-Effectiveness Analysis of Alzheimer’s Disease Treatment Using Real-World Evidence in Thailand

Authors: Khachen Kongpakwattana, Nathorn Chaiyakunapruk

Abstract:

Background: Decision-analytic models for Alzheimer’s disease (AD) have been advanced to discrete-event simulation (DES), in which individual-level modelling of disease progression across continuous severity spectra and the incorporation of key parameters such as treatment persistence into the model become feasible. This study aimed to apply DES to perform a cost-effectiveness analysis of treatment for AD in Thailand. Methods: A dataset of Thai patients with AD, representing unique demographic and clinical characteristics, was bootstrapped to generate a baseline cohort of patients. Each patient was cloned and assigned to donepezil, galantamine, rivastigmine, memantine or no treatment. Throughout the simulation period, the model randomly assigned each patient to discrete events including hospital visits, treatment discontinuation and death. Correlated changes in cognitive and behavioral status over time were developed using patient-level data. Treatment effects were obtained from the most recent network meta-analysis. Treatment persistence, mortality and predictive equations for functional status, costs (Thai baht (THB) in 2017) and quality-adjusted life years (QALYs) were derived from country-specific real-world data. The time horizon was 10 years, with a discount rate of 3% per annum. Cost-effectiveness was evaluated against the willingness-to-pay (WTP) threshold of 160,000 THB/QALY gained (4,994 US$/QALY gained) in Thailand. Results: Under a societal perspective, only the prescription of donepezil to AD patients at all disease-severity levels was found to be cost-effective. Compared to untreated patients, although the patients receiving donepezil incurred discounted additional costs of 2,161 THB, they experienced a discounted gain in QALYs of 0.021, resulting in an incremental cost-effectiveness ratio (ICER) of 138,524 THB/QALY (4,062 US$/QALY). Moreover, providing early treatment with donepezil to mild AD patients further reduced the ICER to 61,652 THB/QALY (1,808 US$/QALY). However, the advantage of donepezil appeared to wane when delayed treatment was given to a subgroup of moderate and severe AD patients [ICER: 284,388 THB/QALY (8,340 US$/QALY)]. Introducing a treatment stopping rule when the Mini-Mental State Exam (MMSE) score falls below 10 for a mild AD cohort did not deteriorate the cost-effectiveness of donepezil at the current treatment persistence level. On the other hand, none of the AD medications was cost-effective when considered under a healthcare perspective. Conclusions: DES greatly enhances the real-world representativeness of decision-analytic models for AD. Under a societal perspective, treatment with donepezil improves patients' quality of life and is considered cost-effective when used to treat AD patients at all disease-severity levels in Thailand. The optimal treatment benefits are observed when donepezil is prescribed from the early course of AD. Given healthcare budget constraints in Thailand, the implementation of donepezil coverage is most likely feasible when starting with mild AD patients, together with the stopping rule introduced.
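The sketch below is a toy, individual-level simulation in the spirit of the DES described here: each simulated patient draws random times to treatment discontinuation and death, costs and QALYs are accumulated with annual discounting, and the two arms are compared via the ICER against the WTP threshold. All rates, costs and utilities are illustrative assumptions, not the Thai real-world inputs of the study.

```python
import random

def simulate_arm(n, drug_cost_per_year, care_cost_per_year, utility,
                 stop_rate, death_rate, horizon=10, disc=0.03, seed=0):
    rng = random.Random(seed)
    total_cost = total_qaly = 0.0
    for _ in range(n):
        t_death = rng.expovariate(death_rate)                       # years to death
        t_stop = rng.expovariate(stop_rate) if stop_rate > 0 else float("inf")
        t_end = min(t_death, horizon)
        for year in range(horizon):                                  # whole-year steps
            if year >= t_end:
                break
            dt = min(1.0, t_end - year)
            df = (1 + disc) ** -(year + dt)                          # simple end-of-interval discounting
            cost = care_cost_per_year * dt
            if year < t_stop:                                        # drug cost accrues only while persistent
                cost += drug_cost_per_year * dt
            total_cost += cost * df
            total_qaly += utility * dt * df
    return total_cost / n, total_qaly / n

# Hypothetical treated vs. untreated arms (illustrative parameters only).
c1, q1 = simulate_arm(5000, drug_cost_per_year=9000, care_cost_per_year=150000,
                      utility=0.62, stop_rate=0.4, death_rate=0.12, seed=1)
c0, q0 = simulate_arm(5000, drug_cost_per_year=0, care_cost_per_year=155000,
                      utility=0.60, stop_rate=0.0, death_rate=0.13, seed=1)
icer = (c1 - c0) / (q1 - q0)
print(f"ICER = {icer:,.0f} THB per QALY gained (compare with the 160,000 THB/QALY threshold)")
```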

Keywords: Alzheimer's disease, cost-effectiveness analysis, discrete event simulation, health technology assessment

Procedia PDF Downloads 121
1246 Vulnerability Assessment for Protection of Ghardaia City to the Inundation of M’zab Wadi

Authors: Mustapha Kamel Mihoubi, Reda Madi

Abstract:

The problem of natural disasters in general, and flooding in particular, has left a lasting mark around the world, especially in cities and large urban areas. Torrential floods and fast flows pose a major problem in urban areas. Indeed, better management of flood risk is a growing necessity that must mobilize technical and scientific means to curb the adverse consequences of this phenomenon, especially in Saharan cities with an arid climate. The aim of this study is to deploy a basic calculation approach based on hydrologic and hydraulic quantification for locating the black spots generated by flooding in urban areas and for identifying the areas vulnerable to flooding. The method is applied to the city of Ghardaia to identify the areas vulnerable to inundation and to establish management and prevention maps against the risk of flooding.
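The abstract does not spell out the formulas behind its hydrologic quantification; purely as an illustration of the kind of basic calculation involved, the sketch below applies the rational method for peak discharge, Q = C·i·A/3.6 (i in mm/h, A in km², Q in m³/s), to a hypothetical urban sub-catchment.

```python
def rational_peak_discharge(c_runoff, intensity_mm_per_h, area_km2):
    """Rational method: Q (m3/s) = C * i (mm/h) * A (km2) / 3.6."""
    return c_runoff * intensity_mm_per_h * area_km2 / 3.6

# Hypothetical values for a small urban sub-catchment of the wadi.
q_peak = rational_peak_discharge(c_runoff=0.7, intensity_mm_per_h=40.0, area_km2=5.0)
print(f"peak discharge = {q_peak:.1f} m3/s")  # about 38.9 m3/s for these inputs
```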

Keywords: Alea, Beni Mzab, cartography, HEC-RAS, inundation, torrential, vulnerability, wadi

Procedia PDF Downloads 305
1245 Application of Gamma Frailty Model in Survival of Liver Cirrhosis Patients

Authors: Elnaz Saeedi, Jamileh Abolaghasemi, Mohsen Nasiri Tousi, Saeedeh Khosravi

Abstract:

Goals and Objectives: A typical analysis of survival data involves the modeling of time-to-event data, such as the time until death. A frailty model is a random-effect model for time-to-event data, where the random effect has a multiplicative influence on the baseline hazard function. This article aims to investigate the use of a gamma frailty model with concomitant variables in order to identify the prognostic factors that influence the survival times of liver cirrhosis patients. Methods: During the one-year study period (May 2008-May 2009), data were taken from the recorded information of patients with liver cirrhosis who were scheduled for liver transplantation and were followed up for at least seven years at Imam Khomeini Hospital in Iran. In order to determine the effective factors for cirrhotic patients' survival in the presence of latent variables, the gamma frailty distribution was applied. Parametric models, namely the exponential and Weibull distributions, were considered for survival time. Data analysis was performed using R software, and an error level of 0.05 was considered for all tests. Results: 305 patients with liver cirrhosis, including 180 (59%) men and 125 (41%) women, were studied. The average age of the patients was 39.8 years. At the end of the study, 82 (26%) patients had died, among them 48 (58%) men and 34 (42%) women. The main cause of liver cirrhosis was hepatitis B (23%), followed by cryptogenic cirrhosis (22.6%) as the second factor. Overall, mean survival over the seven-year follow-up was 28.44 months; it was 19.33 and 31.79 months for deceased and censored patients, respectively. Exponential and Weibull survival models incorporating the gamma frailty distribution were fitted to the cirrhosis data. In both models, factors including age, serum bilirubin, serum albumin, and encephalopathy had a significant effect on the survival time of cirrhotic patients. Conclusion: To investigate the factors affecting the time to death of patients with liver cirrhosis in the presence of latent variables, a gamma frailty model with parametric distributions seems desirable.
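As a minimal sketch of what the gamma frailty assumption means in this setting, the snippet below simulates right-censored survival times from a Weibull baseline hazard multiplied by a gamma-distributed frailty and a covariate effect; all parameter values and covariates are hypothetical and are not the fitted estimates from the study.

```python
import numpy as np

# Gamma frailty with Weibull baseline: h(t | z, x) = z * exp(x @ beta) * (k/lam) * (t/lam)**(k-1),
# with frailty z ~ Gamma(1/theta, scale=theta), so E[z] = 1 and Var[z] = theta.
rng = np.random.default_rng(0)
n, theta, k, lam = 305, 0.5, 1.3, 60.0              # time scale in months (illustrative)
beta = np.array([0.02, -0.8])                        # hypothetical effects of age and serum albumin
x = np.column_stack([rng.normal(40, 10, n), rng.normal(3.5, 0.5, n)])
z = rng.gamma(1 / theta, theta, size=n)              # unobserved (latent) frailty
u = rng.uniform(size=n)
# Invert S(t|z,x) = exp(-z * exp(x beta) * (t/lam)**k) = u  ->  t = lam * (-ln u / (z exp(x beta)))**(1/k)
t_event = lam * (-np.log(u) / (z * np.exp(x @ beta))) ** (1 / k)
censor = rng.uniform(0, 84, n)                       # administrative censoring within 7 years
time = np.minimum(t_event, censor)
status = (t_event <= censor).astype(int)             # 1 = death observed, 0 = censored
print("observed events:", int(status.sum()), "| median follow-up (months):", round(float(np.median(time)), 1))
```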

Keywords: frailty model, latent variables, liver cirrhosis, parametric distribution

Procedia PDF Downloads 258
1244 Effect of Acetic Acid Fermentation on Bioactive Components and Anti-Xanthine Oxidase Activities in Vinegar Brewed from Monascus-Fermented Soybeans

Authors: Kyung-Soon Choi, Ji-Young Hwang, Young-Hee Pyo

Abstract:

Vinegars have been used as an alternative remedy for treating gout, but the scientific basis remains to be elucidated. In this study, acetic acid fermentation was applied for the first time to Monascus-fermented soybeans to examine its effect on the bioactive components and the xanthine oxidase inhibitory (XOI) activity of the soy vinegar. The contents of total phenols (0.47~0.97 mg gallic acid equivalents/mL) and flavonoids (0.18~0.39 mg quercetin equivalents/mL) were determined spectrophotometrically, and the contents of organic acids (10.22~59.76 mg/mL) and isoflavones (6.79~7.46 mg/mL) were determined using HPLC-UV. The analytical method for ubiquinones (0.079~0.276 μg/mL) employed saponification before solvent extraction and quantification using LC-MS. The soy vinegar also showed significant XOI activity (95.3%) after 20 days of acetic acid fermentation at 30 °C. The results suggest that soy vinegar has potential as a novel medicinal food.

Keywords: acetic acid fermentation, bioactive component, soy vinegar, xanthine oxidase inhibitory activity

Procedia PDF Downloads 380
1243 Solid Waste Management Policy Implementation in Imus, Cavite

Authors: Michael John S. Maceda

Abstract:

Waste has been a global concern aggravated by climate change. In the case of Imus, Cavite, which in the past paid little or no regard to waste, heavy flooding was experienced on August 19, 2013. This event led to a full-blown implementation of municipal solid waste management, integrating participation and the use of low-cost technology to reduce the amount of waste generated. The methodology employed by the city of Imus provided a benchmark in the province of Cavite, reducing both the amount of waste generated and the cost of solid waste management.

Keywords: SWM, IMUS, composting, policy

Procedia PDF Downloads 825
1242 Geo-Visualization of Crimes against Children: An India Level Study 2001-2012

Authors: Ritvik Chauhan, Vijay Kumar Baraik

Abstract:

Crime is a rare event on the earth's surface. It is not a simple but a complex event occurring in a spatio-temporal environment. Crime is one of the most serious security threats to human environments, as it may result in harm to individuals through the loss of property and physical and psychological injuries. Conventional studies of crimes of different natures have mostly been related to legal, psychological, social and political themes. Geographical areas are heterogeneous in their environmental conditions and in the associations between structural conditions and social organization that contribute to specific crimes. Crime pattern analysis is made through theories in which criminal events occur in persistent, identifiable patterns in a particular space and time; it combines the analysis of spatial and rational factors of crime. In this study, we analyze the combined factors behind the origin of crime against children. Children have always been more vulnerable to victimization because they are silent victims of crime, both physically and mentally, and often do not even realize what is happening to them. Their trusting nature and innocence are repeatedly misused by criminals to commit crimes. The nature of crime against children has changed in recent years: child rape, kidnapping and abduction, selling and buying of girls, foeticide, infanticide, prostitution, child marriage, etc., have turned more cruel and inhuman. This study focuses on understanding the space-time pattern of crime against children during the period 2001-2012. It also attempts to explore and ascertain the association of the categorized crimes against children and their rates with various geographical and socio-demographic factors through causal analysis using selected indicators (child sex ratio, education, literacy rate, employment, income, etc.) obtained from the Census of India and other government sources. The outcome of the study will help identify high-crime regions and the specific nature of their crimes. It will also review existing efforts and explore new plausible measures for tracking, monitoring and minimizing the crime rate, to meet the end goal of protecting children from the crimes committed against them.

Keywords: crime against children, geographic profiling, spatio-temporal analysis, hotspot

Procedia PDF Downloads 208
1241 Optimization Study of Adsorption of Nickel(II) on Bentonite

Authors: B. Medjahed, M. A. Didi, B. Guezzen

Abstract:

This work concerns the experimental study of the adsorption of Ni(II) on bentonite. The effects of various parameters, such as contact time, stirring rate, initial concentration of Ni(II), mass of clay, initial pH of the aqueous solution and temperature, on the adsorption yield were investigated. The effect of ionic strength on the adsorption yield was examined through the identification and quantification of the chemical species present in the aqueous phase containing the Ni(II) metal ion. The adsorbed species were investigated with a calculation program using CHEAQS V. L20.1 in order to determine the relation between the percentages of the adsorbed species and the adsorption yield. The optimization process was carried out using a 2³ factorial design. The individual and combined effects of three process parameters, i.e. initial Ni(II) concentration in aqueous solution (2×10⁻³ and 5×10⁻³ mol/L), initial pH of the solution (2 and 6.5), and mass of bentonite (0.03 and 0.3 g), on Ni(II) adsorption were studied.
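For illustration, the snippet below shows how main effects (and one interaction) are estimated from a 2³ full factorial design: each effect is the mean response at the high level minus the mean response at the low level of the coded factor. The adsorption-yield values used here are hypothetical, not the experimental results of the study.

```python
import numpy as np
from itertools import product

# Coded factors (-1/+1): A = initial Ni(II) concentration, B = initial pH, C = mass of bentonite.
design = np.array(list(product([-1, 1], repeat=3)))        # the 8 runs of a 2^3 design
yield_pct = np.array([62, 55, 81, 74, 68, 60, 90, 84], float)  # hypothetical adsorption yields (%)

for name, col in zip("ABC", design.T):
    effect = yield_pct[col == 1].mean() - yield_pct[col == -1].mean()
    print(f"main effect of {name}: {effect:+.1f} % adsorption yield")

# A two-factor interaction, e.g. A x B, uses the product of the coded columns.
ab = design[:, 0] * design[:, 1]
print(f"A x B interaction: {yield_pct[ab == 1].mean() - yield_pct[ab == -1].mean():+.1f}")
```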

Keywords: adsorption, bentonite, factorial design, Nickel(II)

Procedia PDF Downloads 156
1240 A Construct to Perform in Situ Deformation Measurement of Material Extrusion-Fabricated Structures

Authors: Daniel Nelson, Valeria La Saponara

Abstract:

Material extrusion is an additive manufacturing modality that continues to show great promise in the ability to create low-cost, highly intricate, and exceedingly useful structural elements. As more capable and versatile filament materials are devised, and the resolution of manufacturing systems continues to increase, the need to understand and predict manufacturing-induced warping will gain ever greater importance. The following study presents an in situ remote sensing and data analysis construct that allows for the in situ mapping and quantification of surface displacements induced by residual stresses on a specified test structure. This proof-of-concept experimental process shows that it is possible to provide designers and manufacturers with insight into the manufacturing parameters that lead to the manifestation of these deformations and a greater understanding of the behavior of these warping events over the course of the manufacturing process.

Keywords: additive manufacturing, deformation, digital image correlation, fused filament fabrication, residual stress, warping

Procedia PDF Downloads 80
1239 Study of the Stability of the Slope Open-Pit Mines: Case of the Mine of Phosphates – Tebessa, Algeria

Authors: Mohamed Fredj, Abdallah Hafsaoui, Radouane Nakache

Abstract:

The study of the stability of mining works in fractured rock masses is a major concern of the operating engineer. For geotechnical works in mines and quarries, there is today no general methodology for the analysis and quantification of the risks relating to the dangers inherent in these types of works (falling boulders, landslides, etc.). The reasons for this are the uncertainty that weighs on the available data and the lack of knowledge of the values of the parameters required for this type of analysis. Stability calculations must be based on reliable knowledge of the distribution of the discontinuities that dissect the rock mass and of the shear resistance of the intact rock and the discontinuities. This study aims to assess the stability of a mine slope (Kef Sennoun - Tebessa, Algeria). The problem is analyzed using a numerical model based on finite elements (Plaxis 3D software).

Keywords: stability, discontinuities, finite elements, rock mass, open-pit mine

Procedia PDF Downloads 312
1238 Covid-19 Associated Stress and Coping Strategies

Authors: Bar Shapira-Youngster, Sima Amram-Vaknin, Yuliya Lipshits-Braziler

Abstract:

The study examined how 811 Israelis experienced and coped with the COVID-19 lockdown. Stress, uncertainty, and loss of control were reported as common emotional experiences. Two main difficulties were reported: loneliness, and health and emotional concerns. Frequent explanations for the virus's emergence were scientific or faith-based reasoning. The most prevalent coping strategies were distraction activities and acceptance. Reducing the use of maladaptive coping strategies has important implications for mental health outcomes. Objectives: COVID-19 has been recognized as a collective, continuous traumatic stressor. The present study examined how individuals experienced, perceived, and coped with this traumatic event during the lockdown in Israel in April 2020. Method: 811 Israelis (71.3% women; mean age 43.7, SD=13.3) completed an online semi-structured questionnaire consisting of two sections. In the first section, participants were asked to report background information. In the second section, they were asked to answer 8 open-ended questions about their experience, perception, and coping with the COVID-19 lockdown. Participation was voluntary, anonymity was assured, and participants were not offered compensation of any kind. The data were subjected to qualitative content analysis, which seeks to classify the participants' answers into an effective number of categories that represent similar meanings. Our content analysis of participants' answers extended far beyond simple word counts; our objective was to identify recurrent categories that characterized participants' responses to each question. We sought to ensure that the categories for the different questions were as mutually exclusive and exhaustive as possible. To ensure robust analysis, the data were initially analyzed by the first author, and a second opinion was then sought from research colleagues. Contribution: The present research expands our knowledge of individuals' experiences, perceptions, and coping mechanisms in continuous traumatic events. Reducing the use of maladaptive coping strategies has important implications for mental health outcomes.

Keywords: Covid-19, emotional distress, coping, continuous traumatic event

Procedia PDF Downloads 124
1237 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

In biomedical research and randomized clinical trials, the outcomes of most common interest are time-to-event, or so-called survival, data. The importance of robust models in this context is to compare the effect of randomly assigned experimental groups in a way that carries a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received much attention for the estimation of a causal effect in modeling left-truncated and right-censored survival data. Despite its wide application and popularity, the maximum likelihood estimation technique is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we proposed modified estimating equations. After intuitive estimation procedures, the consistency and asymptotic properties of the estimators were derived, and the finite-sample performance of the proposed estimators was illustrated via simulation studies and the Stanford heart transplant real-data example. To sum up, the bias due to covariates was adjusted by estimating the density function of the truncation variable, which was also incorporated in the model as a covariate in order to relax the independence assumption between failure time and truncation time. Moreover, the expectation-maximization (EM) algorithm was described for the iterative estimation of the unknown parameters and the unspecified transformation function. In addition, the causal effect was derived as the ratio of the cumulative hazard functions of the active and passive arms after adjusting for the bias introduced into the model by the truncation variable.

Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate

Procedia PDF Downloads 119
1236 Investors’ Misreaction to Subsequent Bad News

Authors: Liang-Chien Lee, Chih-Hsiang Chang, Ying-Shu Tseng

Abstract:

In contrast to prior studies, which mainly focused on the effect of a single event (either the initial announcement of bad news or the repeated announcements of identical bad news) on stock price, the aim of this study is to explore how investors react to subsequent bad news with identical content. Empirical results show that, as a result of behavioral pitfalls, investors underreact to the initial announcement of bad news (i.e., unknown bad news) and overreact to the repeated announcements of the identical bad news (i.e., known bad news).

Keywords: subsequent bad news, behavioral finance, Investors’ misreaction, behavioral pitfalls

Procedia PDF Downloads 324
1235 Superparamagnetic Sensor with Lateral Flow Immunoassays as Platforms for Biomarker Quantification

Authors: M. Salvador, J. C. Martinez-Garcia, A. Moyano, M. C. Blanco-Lopez, M. Rivas

Abstract:

Biosensors play a crucial role in the detection of molecules nowadays due to their user-friendliness, high selectivity, real-time analysis and in-situ applicability. Among them, Lateral Flow Immunoassays (LFIAs) stand out among technologies for point-of-care bioassays, with characteristics such as affordability, portability and low cost. They have been widely used for the detection of a vast range of biomarkers, which include not only proteins but also nucleic acids and even whole cells. Bringing together the required characteristics mentioned above, our research group has developed a biosensor to detect biomolecules. Superparamagnetic nanoparticles (SPNPs) together with LFIAs play the fundamental roles. SPNPs are detected through their interaction with a high-frequency current flowing on a printed micro track. By means of the instant and proportional variation of the impedance of this track provoked by the presence of the SPNPs, a quantitative and rapid measurement of the number of particles can be obtained. This detection scheme requires no external magnetic field, which reduces the device complexity. On the other hand, the major limitation of LFIAs is that they are only qualitative or semiquantitative when traditional gold or latex nanoparticles are used as color labels. Moreover, the necessity of constant ambient conditions for reproducible results, the detection of the nanoparticles only on the surface of the membrane, and the short durability of the signal are drawbacks that can be advantageously overcome with the design of magnetically labeled LFIAs. The approach followed was to coat the SPIONs, via chemical bonds, with a specific monoclonal antibody targeting the protein under consideration. Then, a sandwich-type immunoassay was prepared by printing onto the nitrocellulose membrane strip a second antibody against a different epitope of the protein (test line) and an IgG antibody (control line). When the sample flows along the strip, the SPION-labeled proteins are immobilized at the test line, which provides a magnetic signal as described before. Preliminary results using this practical combination for the detection and quantification of Prostate-Specific Antigen (PSA) show the validity and consistency of the technique in the clinical range, where a PSA level of 4.0 ng/mL is the established upper normal limit. Moreover, an LOD of 0.25 ng/mL was calculated with a confidence level of 3 according to the IUPAC Gold Book definition. Its versatility has also been proved with the detection of other biomolecules such as troponin I (a cardiac injury biomarker) and histamine.
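As a minimal sketch of how an LOD "with a confidence level of 3" is typically computed, the snippet below fits a linear calibration of signal versus concentration and takes LOD = 3·s_blank/slope. The calibration points and blank replicates are hypothetical numbers, not the measurements behind the 0.25 ng/mL figure.

```python
import numpy as np

# Hypothetical calibration: magnetic (impedance) signal vs. PSA concentration.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])          # ng/mL
signal = np.array([0.02, 0.14, 0.27, 0.51, 1.02, 2.01])  # impedance variation (a.u.)
slope, intercept = np.polyfit(conc, signal, 1)            # linear calibration

blank_replicates = np.array([0.020, 0.023, 0.018, 0.022, 0.019])
s_blank = blank_replicates.std(ddof=1)                    # standard deviation of the blank
lod = 3 * s_blank / slope                                 # IUPAC-style LOD with k = 3
print(f"slope = {slope:.3f} a.u. per ng/mL, LOD = {lod:.3f} ng/mL")
```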

Keywords: biosensor, lateral flow immunoassays, point-of-care devices, superparamagnetic nanoparticles

Procedia PDF Downloads 227
1234 Web Map Service for Fragmentary Rockfall Inventory

Authors: M. Amparo Nunez-Andres, Nieves Lantada

Abstract:

One of the most damaging geological risks is rockfall. Rockfalls cause both economic losses, through damage to buildings and infrastructure, and personal ones. Therefore, in order to estimate the risk to the exposed elements, it is necessary to know the mechanism of this kind of event, from the characteristics of the rock walls to the propagation of the fragments generated by the initially detached rock mass. In the framework of the RockModels research project, several inventories of rockfalls were carried out along the northeast of the Spanish peninsula and on the island of Mallorca. These inventories contain general information about the events; the important point is that they also contain detailed information about fragmentation. Specifically, the IBSD (In situ Block Size Distribution) is obtained by photogrammetry from drone or TLS (Terrestrial Laser Scanner) surveys, and the RBSD (Rock Block Size Distribution) from the volumes of the fragments in the deposit measured by hand. In order to share all this information with other scientists, engineers, members of civil protection, and stakeholders, a platform accessible from the internet and following interoperable standards is necessary. Throughout the process, open-source software has been used: PostGIS 2.1, GeoServer, and the OpenLayers library. In the first step, a spatial database was implemented to manage all the information. We used the INSPIRE data specifications for natural risks, adding specific and detailed data about fragmentation distribution. The next step was to develop a WMS with GeoServer. A preliminary phase was the creation of several views in PostGIS to show the information at different visualization scales and with different degrees of detail. In the first view, the sites are identified with a point, and basic information about the rockfall event is provided. At the next zoom level, at medium scale, the convex hull of the rockfall appears with its real shape, and the source of the event and the fragments are represented by symbols. The queries at this level offer greater detail about the movement. Finally, the third level shows all elements (deposit, source, and blocks) at their real size, where possible, and at their real location. The last task was the publication of all the information on a web mapping site (www.rockdb.upc.edu), with data classified by levels using JavaScript libraries such as OpenLayers.
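For orientation, a WMS published with GeoServer can be queried through a standard OGC GetMap request; the sketch below builds such a request in Python. Only www.rockdb.upc.edu is taken from the abstract; the GeoServer endpoint path, the layer name and the bounding box are hypothetical placeholders.

```python
import requests

# Standard OGC WMS 1.1.1 GetMap parameters.
params = {
    "service": "WMS",
    "version": "1.1.1",
    "request": "GetMap",
    "layers": "rockfalls:inventory_sites",      # hypothetical layer name
    "styles": "",
    "srs": "EPSG:4326",
    "bbox": "0.0,40.0,3.5,43.0",                # illustrative lon/lat box over NE Spain
    "width": 768,
    "height": 512,
    "format": "image/png",
    "transparent": "true",
}
# Hypothetical GeoServer endpoint path on the project server.
resp = requests.get("https://www.rockdb.upc.edu/geoserver/wms", params=params, timeout=30)
with open("rockfall_sites.png", "wb") as fh:
    fh.write(resp.content)                      # saves the rendered map tile
```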

Keywords: geological risk, web mapping, WMS, rockfalls

Procedia PDF Downloads 157
1233 Aspectual Verbs in Modern Standard Arabic

Authors: Yasir Alotaibi

Abstract:

The aim of this paper is to discuss the syntactic analysis of aspectual or phasal verbs in Modern Standard Arabic (MSA). Aspectual or phasal verbs refer to a class of verbs that require a verbal complement and denote the inception, duration, termination, etc. of a state or event. This paper will discuss two groups of aspectual verbs in MSA. The first group includes verbs such as ̆gacala, tafiqa, ?akhatha, ?ansha?a, sharaca and bada?a, and these verbs are used to denote the inception of an event. The second group includes verbs such as ?awshaka, kaada and karaba, and the meaning of these verbs is equivalent to 'be near/almost'. The following examples illustrate the use of the verb bada?a ‘begin’, which is from the first group: (1) a. saalim-un bada?a yuthaakiru. Salem-NOM begin.PFV.3SGM study.IPFV.3SGM ‘Salem began to study’ b. *saalim-un bada?a ?an yuthaakiru. Salem-NOM begin.PFV.3SGM COMP study.IPFV.3SGM ‘Salem began to study’ The example in (1a) is grammatical because the aspectual verb is used with a verbal complement that is not introduced by a complementizer. In contrast, example (1b) is not grammatical because the verbal complement is introduced by the complementizer ?an ‘that’. By contrast, the following examples illustrate the use of the verb kaada ‘be almost’, which is from the second group. Both examples are grammatical, which means that the verbal complement of this verb can occur without (as in example (2a)) or with (as in example (2b)) a complementizer. (2) a. saalim-un kaada yuthaakiru. Salem-NOM be.almost.PFV.3SGM study.IPFV.3SGM ‘Salem was almost to study’ b. saalim-un kaada ?an yuthaakiru. Salem-NOM be.almost.PFV.3SGM COMP study.IPFV.3SGM ‘Salem was almost to study’ The salient properties of this class of verbs are that they require a verbal complement, that no complementizer can introduce the complement with the first group while it is possible with the second, and that the aspectual verb and the embedded verb share and agree with the same subject. To the best of our knowledge, aspectual verbs in MSA have been discussed only in traditional grammar and have not been studied within modern syntactic theories. This paper will consider the analysis of aspectual verbs in MSA within the Lexical Functional Grammar (LFG) framework. It will use evidence such as modifiers or negation to find out whether these verbs have PRED values and head their f-structures, or whether they form complex predicates with their complements. If aspectual verbs show the properties of heads, the paper will explore what kind of heads they are; in particular, whether they are raising or control verbs. The paper will use tests such as agreement, selectional restrictions, etc., to find out what kind of verbs they are.

Keywords: aspectual verbs, biclausal, monoclausal, raising

Procedia PDF Downloads 52
1232 Model Updating-Based Approach for Damage Prognosis in Frames via Modal Residual Force

Authors: Gholamreza Ghodrati Amiri, Mojtaba Jafarian Abyaneh, Ali Zare Hosseinzadeh

Abstract:

This paper presents an effective model updating strategy for damage localization and quantification in frames by defining the damage detection problem as an optimization problem. A generalized version of the Modal Residual Force (MRF) is employed to define a new damage-sensitive cost function. Then, the Grey Wolf Optimization (GWO) algorithm is utilized for solving the suggested inverse problem, and the global extrema are reported as damage detection results. The applicability of the presented method is investigated by studying different damage patterns on the IASC-ASCE benchmark problem, as well as on a planar shear frame structure. The obtained results emphasize the good performance of the method not only in noise-free cases, but also when the input data are contaminated with different levels of noise.
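A compact sketch of the optimization step is given below: Grey Wolf Optimization searching a vector of stiffness-reduction factors in [0, 1]. The cost here is a placeholder quadratic standing in for the MRF-based objective, which is not reproduced, and the "true" damage pattern is hypothetical.

```python
import numpy as np

true_damage = np.array([0.0, 0.25, 0.0, 0.10, 0.0])        # hypothetical damage pattern
cost = lambda x: np.sum((x - true_damage) ** 2)             # placeholder for the MRF cost function

def gwo(cost, dim, lo, hi, n_wolves=20, n_iter=300, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_wolves, dim))            # initial pack
    fit = np.array([cost(x) for x in X])
    for t in range(n_iter):
        idx = np.argsort(fit)
        alpha, beta, delta = X[idx[0]], X[idx[1]], X[idx[2]]  # three best wolves lead the search
        a = 2.0 * (1 - t / n_iter)                            # decreases linearly from 2 to 0
        for i in range(n_wolves):
            new = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A, C = 2 * a * r1 - a, 2 * r2
                new += leader - A * np.abs(C * leader - X[i])
            X[i] = np.clip(new / 3.0, lo, hi)                 # average of the three leader-driven moves
            fit[i] = cost(X[i])
    best = int(np.argmin(fit))
    return X[best], fit[best]

x_best, f_best = gwo(cost, dim=5, lo=0.0, hi=1.0)
print("recovered damage pattern:", x_best.round(3), "| cost:", float(f_best))
```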

Keywords: frame, grey wolf optimization algorithm, modal residual force, structural damage detection

Procedia PDF Downloads 381
1231 Fault Tree Analysis and Bayesian Network for Fire and Explosion of Crude Oil Tanks: Case Study

Authors: B. Zerouali, M. Kara, B. Hamaidi, H. Mahdjoub, S. Rouabhia

Abstract:

In this paper, a safety analysis is presented for crude oil tanks to prevent undesirable events that may cause catastrophic accidents. The estimation of the probability of damage to industrial systems is carried out through a series of steps and in accordance with a specific methodology. In this context, this work involves developing an assessment and risk analysis tool at the level of the crude oil tank system, based primarily on the identification of the various potential causes of crude oil tank fire and explosion by the use of Fault Tree Analysis (FTA), followed by improved risk modelling with Bayesian Networks (BNs). The Bayesian approach to the evaluation of failure and the quantification of risks is a dynamic analysis approach, and for this reason it has been selected as the analytical tool in this study. The research concludes that Bayesian networks provide a distinct and effective method for safety analysis because of the flexibility of their structure; they are suitable for a wide variety of accident scenarios.
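As a minimal sketch of the fault tree side of such an analysis, the snippet below combines basic-event probabilities through AND/OR gates, assuming statistically independent basic events; the event names and probabilities are hypothetical placeholders, not values from the study.

```python
# Gate formulas for statistically independent basic events.
def or_gate(*p):    # P(A or B or ...) = 1 - prod(1 - p_i)
    out = 1.0
    for q in p:
        out *= (1.0 - q)
    return 1.0 - out

def and_gate(*p):   # P(A and B and ...) = prod(p_i)
    out = 1.0
    for q in p:
        out *= q
    return out

# Hypothetical basic-event probabilities (per year, illustrative only).
p_shell_leak   = 1e-3                                   # leak from the tank shell
p_overfill     = 5e-4                                   # overfill during transfer
p_ignition     = or_gate(5e-4, 2e-4, 1e-4)              # sparks, hot work, lightning
p_release      = or_gate(p_shell_leak, p_overfill)      # any flammable release
p_top_event    = and_gate(p_release, p_ignition)        # release AND ignition -> fire/explosion
print(f"P(fire/explosion) = {p_top_event:.2e} per year")
```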

Keywords: bayesian networks, crude oil tank, fault tree, prediction, safety

Procedia PDF Downloads 653