Search results for: ruin probability
230 A Method To Assess Collaboration Using Perception of Risk from the Architectural Engineering Construction Industry
Authors: Sujesh F. Sujan, Steve W. Jones, Arto Kiviniemi
Abstract:
The use of Building Information Modelling (BIM) in the Architectural-Engineering-Construction (AEC) industry is a form of systemic innovation. Unlike incremental innovation (such as the technological development of CAD from hand-based drawings to 2D electronically printed drawings), any form of systemic innovation in Project-Based Inter-Organisational Networks requires complete collaboration and yields numerous benefits if adopted and utilised properly. Proper use of BIM involves people collaborating with the use of interoperable BIM-compliant tools. The AEC industry globally has been known for its adversarial and fragmented nature, in which firms take advantage of one another to increase their own profitability. Given the industry’s nature, getting people to collaborate by unifying their goals is critical to successful BIM adoption. However, this form of innovation is often forced artificially into old ways of working that do not suit collaboration. This may be one of the reasons for its low global use even though the technology was developed more than 20 years ago. Therefore, there is a need to develop a metric/method that supports industry players and allows them to gain confidence in their investment in BIM software and workflow methods. This paper departs from defining systemic risk as a risk that affects all the project participants at a given stage of a project and defines categories of systemic risks. Generalisation allows the method to apply to any industry: the categories remain the same, but the example of the risk depends on the industry in which the study is done. The proposed method uses individual perception of an example of systemic risk as its key parameter. The significance of this study lies in relating the variance of individual perceptions of systemic risk to how much the team is collaborating.
The method rests on the claim that a more unified range of individual perceptions implies a higher probability that the team is collaborating well. Since contracts and procurement determine how a project team operates, the method could also break the methodological barrier of highly subjective findings that case studies impose, which has limited the possibility of generalising between global industries. Since human nature applies in all industries, the authors’ intuition is that perception can be a valuable parameter for studying collaboration, which is essential especially in projects that utilise systemic innovation such as BIM.
Keywords: building information modelling, perception of risk, systemic innovation, team collaboration
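The paper's core claim is that a narrower spread of individual risk-perception scores signals better collaboration. A minimal sketch of that metric in Python (the function name, the 1-5 rating scale, and the example teams are illustrative assumptions, not taken from the paper):

```python
def perception_spread(scores):
    """Spread of individual risk-perception scores for one systemic risk.

    Per the paper's claim, a lower coefficient of variation (cv) is read
    as a higher probability that the team is collaborating well.
    """
    n = len(scores)
    mean = sum(scores) / n
    variance = sum((s - mean) ** 2 for s in scores) / n
    cv = (variance ** 0.5) / mean if mean else float("inf")
    return {"mean": mean, "variance": variance, "cv": cv}

# Two hypothetical teams rating the same systemic risk on a 1-5 scale
aligned = perception_spread([4, 4, 5, 4, 4])      # unified perceptions
fragmented = perception_spread([1, 5, 2, 5, 3])   # divergent perceptions
```

Under the paper's hypothesis, the `aligned` team's lower coefficient of variation would indicate a higher probability of genuine collaboration.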
Procedia PDF Downloads 185
229 Spectroscopic Studies and Reddish Luminescence Enhancement with the Increase in Concentration of Europium Ions in Oxy-Fluoroborate Glasses
Authors: Mahamuda Sk, Srinivasa Rao Allam, Vijaya Prakash G.
Abstract:
The different concentrations of Eu3+ ions doped in Oxy-fluoroborate glasses of composition 60 B2O3-10 BaF2-10 CaF2-15 CaF2- (5-x) Al2O3 -x Eu2O3, where x = 0.1, 0.5, 1.0 and 2.0 mol%, have been prepared by the conventional melt quenching technique and are characterized through absorption, photoluminescence (PL), decay, color chromaticity and confocal measurements. The absorption spectra of all the glasses consist of six peaks corresponding to the transitions 7F0→5D2, 7F0→5D1, 7F1→5D1, 7F1→5D0, 7F0→7F6 and 7F1→7F6 respectively. The experimental oscillator strengths with and without thermal corrections have been evaluated using the absorption spectra. Judd-Ofelt (JO) intensity parameters (Ω2 and Ω4) have been evaluated from the photoluminescence spectra of all the glasses. PL spectra of all the glasses have been recorded at excitation wavelengths of 395 nm (conventional excitation source) and 410 nm (diode laser) to observe the intensity variation in the PL spectra. All the spectra consist of five emission peaks corresponding to the transitions 5D0→7FJ (J = 0, 1, 2, 3 and 4). Surprisingly, no concentration quenching is observed in the PL spectra. Among all the glasses, the glass with 2.0 mol% of Eu3+ ion concentration possesses the maximum intensity for the transition 5D0→7F2 (612 nm) in the bright red region. The JO parameters derived from the photoluminescence spectra have been used to evaluate the essential radiative properties such as transition probability (A), radiative lifetime (τR), branching ratio (βR) and peak stimulated emission cross-section (σse) for the 5D0→7FJ (J = 0, 1, 2, 3 and 4) transitions of the Eu3+ ions. The decay rates of the 5D0 fluorescent level of Eu3+ ions in the title glasses are found to be single exponential for all the studied Eu3+ ion concentrations. A marginal increase in the lifetime of the 5D0 level has been noticed with an increase in Eu3+ ion concentration from 0.1 mol% to 2.0 mol%.
Among all the glasses, the glass with 2.0 mol% of Eu3+ ion concentration possesses the maximum values of branching ratio, stimulated emission cross-section and quantum efficiency for the transition 5D0→7F2 (612 nm) in the bright red region. The color chromaticity coordinates were also evaluated to confirm the reddish luminescence from these glasses; these coordinates fall squarely in the bright red region. Confocal images were also recorded to confirm the reddish luminescence. From all the results obtained in the present study, it is suggested that the glass with 2.0 mol% of Eu3+ ion concentration is a suitable candidate for bright red laser emission.
Keywords: Europium, Judd-Ofelt parameters, laser, luminescence
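The radiative properties named in the abstract follow mechanically from the Einstein A coefficients: the branching ratio of a transition is βR(J) = A_J / ΣA, and the radiative lifetime of the emitting level is τR = 1/ΣA. A sketch with illustrative A values (these numbers are invented for demonstration, not the paper's measured data):

```python
def radiative_properties(transition_probs):
    """Branching ratios and radiative lifetime from transition probabilities.

    transition_probs: dict mapping transition label -> A (s^-1).
    beta_R(J) = A_J / sum(A);  tau_R = 1 / sum(A).
    """
    total = sum(transition_probs.values())
    branching = {label: a / total for label, a in transition_probs.items()}
    return branching, 1.0 / total

# Illustrative A values (s^-1) for 5D0 -> 7F_J of Eu3+; the dominant
# hypersensitive 5D0 -> 7F2 line gives the 612 nm red emission
A = {"7F0": 10.0, "7F1": 50.0, "7F2": 400.0, "7F3": 15.0, "7F4": 60.0}
beta, tau_R = radiative_properties(A)
```

With these numbers the 5D0→7F2 branch carries about 75% of the emission, which is the pattern behind the bright red output the paper reports.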
Procedia PDF Downloads 242
228 Prediction of Finned Projectile Aerodynamics Using a Lattice-Boltzmann Method CFD Solution
Authors: Zaki Abiza, Miguel Chavez, David M. Holman, Ruddy Brionnaud
Abstract:
In this paper, the prediction of the aerodynamic behavior of the flow around a Finned Projectile will be validated using a Computational Fluid Dynamics (CFD) solution, XFlow, based on the Lattice-Boltzmann Method (LBM). XFlow is an innovative CFD software developed by Next Limit Dynamics. It is based on a state-of-the-art Lattice-Boltzmann Method which uses a proprietary particle-based kinetic solver and an LES turbulence model coupled with the generalized law of the wall (WMLES). The Lattice-Boltzmann method discretizes the continuous Boltzmann equation, a transport equation for the particle probability distribution function. From the Boltzmann transport equation, and by means of the Chapman-Enskog expansion, the compressible Navier-Stokes equations can be recovered. However, to simulate compressible flows, this method has a Mach number limitation because of the lattice discretization. Thanks to this flexible particle-based approach, the traditional meshing process is avoided, the discretization stage is strongly accelerated, reducing engineering costs, and computations on complex geometries are affordable in a straightforward way. The projectile used in this work is the Army-Navy Basic Finned Missile (ANF) with a caliber of 0.03 m. The analysis consists in varying the Mach number upward from M=0.5, comparing the axial force coefficient, the normal force slope coefficient and the pitch moment slope coefficient of the Finned Projectile obtained by XFlow with the experimental data. The slope coefficients will be obtained using finite difference techniques in the linear range of the polar curve. The aim of such an analysis is to find the limiting Mach number value starting from which the effects of high fluid compressibility (related to the transonic flow regime) lead the XFlow simulations to differ from the experimental results.
This will allow identifying the critical Mach number that limits the validity of the isothermal formulation of XFlow, beyond which a fully compressible solver implementing coupled momentum-energy equations would be required.
Keywords: CFD, computational fluid dynamics, drag, finned projectile, lattice-Boltzmann method, LBM, lift, Mach, pitch
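The discretize-collide-stream mechanics described above can be illustrated compactly. Below is a minimal D2Q9 BGK lattice-Boltzmann step in Python, a generic textbook scheme rather than XFlow's proprietary particle-based solver; the grid size, relaxation time, and velocity perturbation are arbitrary choices for demonstration:

```python
import numpy as np

# D2Q9 lattice: discrete velocities and weights (standard LBM constants)
c = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, u):
    """Discrete Maxwell-Boltzmann equilibrium f_eq for the 9 directions."""
    cu = np.einsum("id,xyd->ixy", c, u)        # c_i . u at every node
    usq = np.einsum("xyd,xyd->xy", u, u)       # |u|^2 at every node
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

def bgk_step(f, tau=0.6):
    """One collide-and-stream step of the BGK lattice-Boltzmann scheme."""
    rho = f.sum(axis=0)                                    # density moment
    u = np.einsum("id,ixy->xyd", c, f) / rho[..., None]    # velocity moment
    f = f - (f - equilibrium(rho, u)) / tau                # BGK collision
    for i, (cx, cy) in enumerate(c):                       # periodic streaming
        f[i] = np.roll(np.roll(f[i], cx, axis=0), cy, axis=1)
    return f

# Uniform density with a small velocity perturbation on a 16x16 grid
rho0 = np.ones((16, 16))
u0 = np.zeros((16, 16, 2))
u0[8, 8, 0] = 0.05
f = equilibrium(rho0, u0)
for _ in range(10):
    f = bgk_step(f)
```

Because collision relaxes toward an equilibrium with the same density and momentum moments, total mass is conserved exactly at every step, which is a quick sanity check on any LBM implementation.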
Procedia PDF Downloads 421
227 Elasticity of Soil Fertility Indicators and pH in Termite Infested Cassava Field as Influenced by Tillage and Organic Manure Sources
Authors: K. O. Ogbedeh, T. T. Epidi, E. U. Onweremadu, E. E. Ihem
Abstract:
Apart from the devastating nature of termites as pests of cassava, nearly all termite species have been implicated in soil fertility modifications. The elasticity of soil fertility indicators and pH in a termite-infested cassava field as influenced by tillage and organic manure sources in Owerri, Southeast Nigeria was investigated in this study. Three years of field trials were conducted in the 2007, 2008 and 2009 cropping seasons at the Teaching and Research Farm of the Federal University of Technology, Owerri. The experiments were laid out in a 3x6 split-plot factorial arrangement fitted into a randomized complete block design (RCBD) with three replications. TMS 4(2)1425 was the cassava cultivar used. Treatments consisted of three tillage methods (zero, flat and mound), two rates of municipal waste (1.5 and 3.0 tonnes/ha), two rates of Azadirachta indica (neem) leaves (20 and 30 tonnes/ha), a control (0.0 tonnes/ha) and a unit dose of carbofuran (chemical check). Data were collected on pre-planting soil physical and chemical properties, post-harvest soil pH (both in water and KCl) and residual total exchangeable bases (Ca, K, Mg and Na). These were analyzed using the mixed-model procedure of Statistical Analysis Software (SAS). Means were separated using the Least Significant Difference (LSD) at the 5% level of probability. Results show that the native soil fertility status of the experimental site was poor. However, soil pH increased substantially in plots treated with mounds, A. indica leaves at 30 t/ha and municipal waste (1.5 and 3.0 t/ha), especially in 2008 and 2009. In the 2007 trial, the highest soil pH was maintained with the flat method (5.41 in water and 4.97 in KCl). The control, on the other hand, recorded the lowest soil pH, especially in 2009, with values of 5.18 and 4.63 in water and KCl respectively. Equally, mound tillage, A. indica leaves at 30 t/ha and municipal waste at 3.0 t/ha increased the organic matter content of the soil more consistently than the other treatments. Finally, mound tillage and A. indica leaves at 30 t/ha linearly and consistently increased the residual total exchangeable bases of the soil.
Keywords: elasticity, fertility, indicators, termites, tillage, cassava and manure sources
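Mean separation by Least Significant Difference, as used in the study, reduces to one formula for a balanced design: LSD = t(α/2, df_error) x sqrt(2·MSE / r), where MSE is the error mean square from the ANOVA and r the number of replications per treatment. A sketch (the t value, MSE, and r below are illustrative assumptions, not the study's values):

```python
def lsd(t_crit, mse, reps):
    """Least Significant Difference for comparing two treatment means
    in a balanced design: LSD = t * sqrt(2 * MSE / r)."""
    return t_crit * (2.0 * mse / reps) ** 0.5

# Illustrative inputs: t(0.025, df~30) ~ 2.042 (hardcoded, not computed),
# a hypothetical error mean square, and r = 3 replications as in the RCBD
value = lsd(t_crit=2.042, mse=0.45, reps=3)
# Two treatment means differing by more than `value` are declared
# significantly different at the 5% probability level.
```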
Procedia PDF Downloads 301
226 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis
Authors: H. Jung, N. Kim, B. Kang, J. Choe
Abstract:
History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to uncertainties in the initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to extract the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Secondly, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models whose well oil production rates (WOPR) are most similar or dissimilar to the true values (10% each). The remaining 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that share the geological trend of the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. The newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results: it fails to find the correct geological features of the true model. History matching with the regenerated ensemble, however, offers reliable characterization results by capturing the proper channel trend. Furthermore, it gives a dependable prediction of future performance with reduced uncertainties.
We propose a novel classification scheme that integrates PCA, MDS, and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that share the reference channel trend in the lowered-dimension space.
Keywords: history matching, principal component analysis, reservoir modelling, support vector machine
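The PCA-then-select workflow above can be sketched with NumPy alone. This sketch covers only the PCA projection and the low-WOPR-error selection; the MDS embedding and SVM classifier are replaced by a simple error-sort stand-in, and the data is synthetic random noise rather than the paper's channel-reservoir ensemble:

```python
import numpy as np

def pca_project(fields, n_components=3):
    """Project flattened permeability fields onto the leading principal components."""
    X = fields - fields.mean(axis=0)             # center each permeability cell
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T               # scores in PC space

rng = np.random.default_rng(0)
models = rng.normal(size=(100, 50))              # 100 synthetic "reservoir models"
scores = pca_project(models)

# Stand-in for the trained SVM stage: keep the models whose WOPR misfit
# is lowest (here the misfit is synthetic, not simulated production data)
wopr_error = rng.random(100)
selected = np.argsort(wopr_error)[:20]           # 20% lowest-error models
prob_map = models[selected].mean(axis=0)         # average field as probability map
```

The `prob_map` average plays the role of the paper's probability map used to condition the regenerated ensemble.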
Procedia PDF Downloads 160
225 Last ca 2500 Yr History of the Harmful Algal Blooms in South China Reconstructed on Organic-Walled Dinoflagellate Cysts
Authors: Anastasia Poliakova
Abstract:
A harmful algal bloom (HAB) is a known negative phenomenon caused both by natural factors and anthropogenic influence. HABs can result in a series of deleterious effects, such as beach fouling, paralytic shellfish poisoning, mass mortality of marine species, and threats to human health, especially if toxins pollute drinking water or occur near public resorts. In South China, the problem of HABs is especially important. For this study, we used a 1.5 m sediment core, LAX-2018-2, collected in 2018 from the Zhanjiang Mangrove National Nature Reserve (109°03´E, 20°30´N), Guangdong Province, South China. A high-resolution coastal environment reconstruction with a specific focus on the HAB history during the last ca 2500 yrs was attempted. Age control was performed with five radiocarbon dates obtained from benthic foraminifera. A total of 71 dinoflagellate cyst types was recorded. The most common types, found consistently throughout the sediment sequence, were the autotrophic Spiniferites spp., Spiniferites hyperacanthus and S. mirabilis, S. ramosus, Operculodinium centrocarpum sensu Wall and Dale 1966, and Polysphaeridium zoharyi, and the heterotrophic Brigantedinium spp., cysts of Gymnodinium catenatum and a mixture of Protoperidinium cysts. Three local dinoflagellate zones, LAX-1 to LAX-3, were established based on the results of the constrained cluster analysis and data ordination; additionally, the middle zone LAX-2 was divided into two subzones, LAX-2a and LAX-2b, based on the dynamics of toxic and heterotrophic cysts as well as on the significant changes (probability, P=0.89) in percentages of eutrophic indicators. The total cyst count varied from 106 to 410 dinocysts per slide, averaging 177 cysts per slide.
Dinocyst assemblages are characterized by high values of the post-depositional degradation index (kt), which varies between 3.6 and 7.6 (averaging 5.4); such values are relatively high and are typical of areas with selective dinoflagellate cyst preservation related to bottom-water oxygen concentrations.
Keywords: reconstruction of palaeoenvironment, harmful algal blooms, anthropogenic influence on coastal zones, South China Sea
Procedia PDF Downloads 89
224 Biodiesel Production from Edible Oil Wastewater Sludge with Bioethanol Using Nano-Magnetic Catalysis
Authors: Wighens Ngoie Ilunga, Pamela J. Welz, Olewaseun O. Oyekola, Daniel Ikhu-Omoregbe
Abstract:
Currently, most sludge from the wastewater treatment plants of edible oil factories is disposed to landfills, but landfill sites are finite and potential sources of environmental pollution. Production of biodiesel from wastewater sludge can contribute to energy production and waste minimization. However, conventional biodiesel production is energy and waste intensive. Generally, biodiesel is produced from the transesterification reaction of oils with an alcohol (i.e., methanol or ethanol) in the presence of a catalyst. Homogeneously catalysed transesterification is the conventional approach for large-scale production of biodiesel, as reaction times are relatively short. Nevertheless, homogeneous catalysis presents several challenges, such as a high probability of soap formation. The current study aimed to reuse wastewater sludge from the edible oil industry as a novel feedstock for both monounsaturated fats and bioethanol for the production of biodiesel. Preliminary results have shown that the fatty acid profile of the oilseed wastewater sludge is favourable for biodiesel production, with 48% (w/w) monounsaturated fats, and that the residue left after the extraction of fats from the sludge contains sufficient fermentable sugars, after steam explosion followed by enzymatic hydrolysis, for the successful production of bioethanol [29% (w/w)] using a commercial strain of Saccharomyces cerevisiae. A novel nano-magnetic catalyst was synthesised, using a modified sol-gel method, from mineral processing alkaline tailings mainly containing dolomite originating from cupriferous ores. The catalyst's elemental composition and structural properties were characterised by X-ray diffraction (XRD), scanning electron microscopy (SEM) and Fourier transform infrared (FTIR) spectroscopy, and by BET analysis, which gave a surface area of 14.3 m²/g and an average pore diameter of 34.1 nm. The mass magnetization of the nano-magnetic catalyst was 170 emu/g. Both the catalytic properties and the reusability of the catalyst were investigated.
A maximum biodiesel yield of 78% was obtained, which dropped to 52% after the fourth transesterification reaction cycle. The proposed approach has the potential to reduce the material costs, energy consumption and water usage associated with conventional biodiesel production technologies. It may also mitigate the impact of conventional biodiesel production on food and land security, while simultaneously reducing waste.
Keywords: biodiesel, bioethanol, edible oil wastewater sludge, nano-magnetism
Procedia PDF Downloads 145
223 Emergency Physician Performance for Hydronephrosis Diagnosis and Grading Compared with Radiologist Assessment in Renal Colic: The EPHyDRA Study
Authors: Sameer A. Pathan, Biswadev Mitra, Salman Mirza, Umais Momin, Zahoor Ahmed, Lubna G. Andraous, Dharmesh Shukla, Mohammed Y. Shariff, Magid M. Makki, Tinsy T. George, Saad S. Khan, Stephen H. Thomas, Peter A. Cameron
Abstract:
Study objective: Emergency physicians’ (EPs) ability to identify hydronephrosis on point-of-care ultrasound (POCUS) has been assessed in the past using CT scan as the reference standard. We aimed to assess EP interpretation of POCUS to identify and grade hydronephrosis in a direct comparison with the consensus interpretation of POCUS by radiologists, and also to compare EP and radiologist performance using CT scan as the criterion standard. Methods: Using data from a POCUS databank, a prospective interpretation study was conducted at an urban academic emergency department. All POCUS exams were performed on patients presenting with renal colic to the ED. Institutional approval was obtained for this study. All analyses were performed using Stata MP 14.0 (StataCorp, College Station, Texas). Results: A total of 651 patients were included, with paired sets of renal POCUS video clips and CT scans performed at the same ED visit. Hydronephrosis was reported in 69.6% of POCUS exams by radiologists and 72.7% of CT scans (p=0.22). The κ for consensus interpretation of POCUS between the radiologists to detect hydronephrosis was 0.77 (0.72 to 0.82), and the weighted κ for grading hydronephrosis was 0.82 (0.72 to 0.90), interpreted as good to very good. Using CT scan findings as the criterion standard, EPs had an overall sensitivity of 81.1% (95% CI: 79.6% to 82.5%), specificity of 59.4% (95% CI: 56.4% to 62.5%), PPV of 84.3% (95% CI: 82.9% to 85.7%), and NPV of 53.8% (95% CI: 50.8% to 56.7%); compared to radiologist sensitivity of 85.0% (95% CI: 82.5% to 87.2%), specificity of 79.7% (95% CI: 75.1% to 83.7%), PPV of 91.8% (95% CI: 89.8% to 93.5%), and NPV of 66.5% (95% CI: 61.8% to 71.0%). Testing for a report of moderate or high-grade hydronephrosis, EP specificity was 94.6% (95% CI: 93.7% to 95.4%), rising to 99.2% (95% CI: 98.9% to 99.5%) for identifying severe hydronephrosis alone.
Conclusion: EP POCUS interpretations were comparable to those of the radiologists for identifying moderate to severe hydronephrosis using CT scan results as the criterion standard. Among patients with a moderate or high pre-test probability of ureteric calculi, as calculated by the STONE score, the presence of moderate to severe (+LR 6.3 and –LR 0.69) or severe hydronephrosis (+LR 54.4 and –LR 0.57) was highly diagnostic of stone disease. Low-dose CT is indicated in such patients for evaluation of stone size and location.
Keywords: renal colic, point-of-care, ultrasound, bedside, emergency physician
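The likelihood ratios quoted in the conclusion follow directly from sensitivity and specificity: +LR = sens/(1 − spec) and −LR = (1 − sens)/spec. A sketch applying the formulas to the overall EP figures reported in the results (this reproduces the arithmetic only, not the STONE-score subgroup LRs, which come from a different 2x2 table):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios from test characteristics."""
    pos_lr = sensitivity / (1.0 - specificity)
    neg_lr = (1.0 - sensitivity) / specificity
    return pos_lr, neg_lr

# Overall EP performance vs CT reported in the study: sens 81.1%, spec 59.4%
pos_lr, neg_lr = likelihood_ratios(0.811, 0.594)
```

The resulting +LR of about 2.0 for any-grade hydronephrosis illustrates why the paper restricts its diagnostic claim to moderate-to-severe grades, where specificity (and hence +LR) is far higher.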
Procedia PDF Downloads 284
222 A Corpus Output Error Analysis of Chinese L2 Learners From America, Myanmar, and Singapore
Authors: Qiao-Yu Warren Cai
Abstract:
Due to the rise of big data, building corpora and using them to analyze Chinese L2 learners’ language output has become a trend. Various empirical research has been conducted using Chinese corpora built by different academic institutes. However, most of this research analyzed the data in the Chinese corpora using corpus-based qualitative content analysis with descriptive statistics. Descriptive statistics can summarize the numerical data from the subjects or samples a study has actually measured, but the collected data cannot be generalized to the population. Comte, a French positivist, argued in the 19th century that human knowledge, whether in the humanities and social sciences or the natural sciences, should be verified scientifically in order to construct a universal theory that explains the truth and human behavior. Inferential statistics, which can judge the probability that a difference observed between groups is dependable or caused by chance (Free Geography Notes, 2015) and can infer from the subjects or samples what the population might think or how it might behave, is just the right method to support Comte’s argument in the field of TCSOL. Inferential statistics is also a core of quantitative research, but little research has been conducted by combining corpora with inferential statistics. In particular, little research analyzes the differences in Chinese L2 learners’ corpus output errors using a One-way ANOVA, so the findings of previous research are limited to inferring the population's Chinese errors from the given samples’ Chinese corpora. To fill this knowledge gap in the professional development of Taiwanese TCSOL, the present study utilizes a One-way ANOVA to analyze the corpus output errors of Chinese L2 learners from America, Myanmar, and Singapore.
The results show that no significant difference exists in ‘shì (是) sentence’ and word order errors, but compared with American and Singaporean learners, it is significantly easier for Myanmar learners to produce ‘sentence blends.’ Based on the above results, the present study provides an instructional approach and contributes to further exploration of how Chinese L2 learners can have (and use) learning strategies to lower errors.
Keywords: Chinese corpus, error analysis, one-way analysis of variance, Chinese L2 learners, Americans, Myanmar, Singaporeans
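The One-way ANOVA the study relies on reduces to one ratio: the between-group mean square over the within-group mean square. A self-contained sketch of the F statistic (the error counts below are invented toy data, not the study's corpus figures):

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of groups of observations.

    F = MS_between / MS_within, with df_between = k-1 and df_within = n-k.
    """
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical per-learner error counts for three nationality groups
f_stat = one_way_anova_f([[1, 2, 3], [2, 3, 4], [3, 4, 5]])
```

The F statistic would then be compared against the F distribution with (k−1, n−k) degrees of freedom to decide significance, as the study does for each error category.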
Procedia PDF Downloads 106
221 Application Research of Stilbene Crystal for the Measurement of Accelerator Neutron Sources
Authors: Zhao Kuo, Chen Liang, Zhang Zhongbing, Ruan Jinlu, He Shiyi, Xu Mengxuan
Abstract:
Stilbene, C₁₄H₁₂, is well known as one of the most useful organic scintillators for the pulse shape discrimination (PSD) technique owing to its good scintillation properties. An on-line acquisition system, built from several CAMAC-standard plug-ins, NIM plug-ins and a neutron/γ discrimination plug-in (model 2160A), and an off-line acquisition system, based on a digital oscilloscope with a high sampling rate, were developed; both used stilbene crystals coupled to photomultiplier tube (PMT) detectors for accelerator neutron source measurements carried out at the China Institute of Atomic Energy. Pulse amplitude spectra and charge amplitude spectra were recorded in real time after good neutron/γ discrimination, whose best PSD figures of merit (FoMs) were 1.756 for the D-D accelerator neutron source and 1.393 for the D-T accelerator neutron source. With the on-line acquisition system, the probability of neutron events among total events was 80% and the neutron detection efficiency was 5.21% for the D-D accelerator neutron source; the corresponding values were 50% and 1.44% for the D-T accelerator neutron source after subtracting the scattering background. Pulse waveform signals were acquired randomly by the off-line acquisition system while the on-line acquisition system was working. The PSD FoMs obtained by the off-line acquisition system were 2.158 for the D-D accelerator neutron source and 1.802 for the D-T accelerator neutron source after off-line waveform digitization processing by the charge integration method, for just 1000 pulses. In addition, the probabilities of neutron events among total events obtained by the off-line acquisition system matched those of the on-line acquisition system very well. The pulse information recorded by the off-line acquisition system could be reused to adjust the parameters or methods of the PSD research and to obtain neutron charge amplitude spectra or pulse amplitude spectra after digital analysis with a limited number of pulses.
The off-line acquisition system showed equivalent or better measurement performance than the on-line system with a limited number of pulses, which indicates a feasible stilbene-crystal-based method for the measurement of prompt neutron sources, such as accelerator neutron sources that emit a large number of neutrons in a short time.
Keywords: stilbene crystal, accelerator neutron source, neutron/γ discrimination, figure-of-merit, CAMAC, waveform digitization
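The charge integration method named above discriminates neutrons from gammas by the fraction of a pulse's charge in its slow tail: neutron recoils in stilbene excite more delayed fluorescence than gamma-induced electrons. A sketch on synthetic pulses (the decay constants, sampling, and tail window are illustrative, not measured stilbene values):

```python
import math

def tail_fraction(pulse, tail_start):
    """Charge-integration PSD parameter: tail charge / total charge."""
    total = sum(pulse)
    tail = sum(pulse[tail_start:])
    return tail / total

# Synthetic scintillation pulses sampled at 1 ns steps: gamma-like (fast
# decay only) vs neutron-like (an added slow component in the tail)
t = range(200)
gamma_pulse = [math.exp(-ti / 5.0) for ti in t]
neutron_pulse = [math.exp(-ti / 5.0) + 0.1 * math.exp(-ti / 50.0) for ti in t]
psd_gamma = tail_fraction(gamma_pulse, tail_start=20)
psd_neutron = tail_fraction(neutron_pulse, tail_start=20)
```

A real analysis would histogram this PSD parameter for many pulses and compute the FoM as the peak separation divided by the sum of the two peaks' FWHMs, which is the quantity the abstract reports as 1.756 and 2.158.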
Procedia PDF Downloads 187
220 Improving Functionality of Radiotherapy Department Through: Systemic Periodic Clinical Audits
Authors: Kamal Kaushik, Trisha, Dandapni, Sambit Nanda, A. Mukherjee, S. Pradhan
Abstract:
INTRODUCTION: As the complexity of radiotherapy practice and processes increases, there is a need to assure quality control to a greater extent. At present, no international literature is available regarding optimal quality control indicators for radiotherapy; moreover, few clinical audits have been conducted in the field. The primary aim is to improve the processes that directly impact clinical outcomes for patients in terms of patient safety and quality of care. PROCEDURE: A team of an oncologist, a medical physicist and a radiation therapist was formed for weekly clinical audits of patients undergoing radiotherapy. The stages covered by the audits include pre-planning, simulation, planning, daily QA, and implementation and execution (with image guidance). Errors in all parts of the chain were evaluated and recorded for the development of departmental protocols for radiotherapy. EVALUATION: The errors at various stages of the radiotherapy chain were evaluated and recorded for comparison before and after starting the clinical audits in the department. The stage in which the maximum number of errors occurred was also identified. The clinical audits were used to structure standard protocols (in the form of checklists) in the department of radiotherapy, which may further reduce the occurrence of clinical errors in the radiotherapy chain. RESULTS: The aim of this study is to compare the number of errors in different parts of the RT chain in two groups (A: before audit; B: after audit). Group A: 94 pts. (48 males, 46 females), total errors in the RT chain: 19 (9 needed re-simulation). Group B: 94 pts. (61 males, 33 females), total errors in the RT chain: 8 (4 needed re-simulation). CONCLUSION: After systematic periodic clinical audits, the percentage of errors in the radiotherapy process was reduced by more than 50% within 2 months.
There is a great need to improve quality control in radiotherapy, and the role of clinical audits can only grow. Although clinical audits are time-consuming and complex undertakings, the potential benefits in terms of identifying and rectifying errors in quality control procedures are enormous. Radiotherapy is a chain of processes: there is always a probability that an error in any part of the chain will propagate through to the execution of treatment. Structuring departmental protocols and policies helps in reducing, if not completely eradicating, the occurrence of such incidents.
Keywords: audit, clinical, radiotherapy, improving functionality
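The headline ">50% reduction" follows from the two group counts reported in the results. A quick check of that arithmetic (the function name is ours; the figures 19/94 and 8/94 are the study's):

```python
def error_reduction(before_errors, before_n, after_errors, after_n):
    """Per-patient error rates for two audit periods and the relative reduction."""
    rate_before = before_errors / before_n
    rate_after = after_errors / after_n
    return rate_before, rate_after, 1.0 - rate_after / rate_before

# Figures reported in the study: Group A 19 errors / 94 pts (before audit),
# Group B 8 errors / 94 pts (after audit)
rate_a, rate_b, reduction = error_reduction(19, 94, 8, 94)
```

With equal group sizes the relative reduction is simply 1 − 8/19 ≈ 58%, consistent with the conclusion's "more than 50%".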
Procedia PDF Downloads 88
219 Synthesis and Prediction of Activity Spectra of Substances-Assisted Evaluation of Heterocyclic Compounds Containing Hydroquinoline Scaffolds
Authors: Gizachew Mulugeta Manahelohe, Khidmet Safarovich Shikhaliev
Abstract:
There has been a significant surge in interest in the synthesis of heterocyclic compounds that contain hydroquinoline fragments. This surge can be attributed to the broad range of pharmaceutical and industrial applications that these compounds possess. The present study provides a comprehensive account of the synthesis of both linear and fused heterocyclic systems that incorporate hydroquinoline fragments. Furthermore, the pharmacological activity spectra of the synthesized compounds were assessed using the in silico method, employing the prediction of activity spectra of substances (PASS) program. Hydroquinoline nitriles 7 and 8 were prepared through the reaction of the corresponding hydroquinolinecarbaldehyde using a hydroxylammonium chloride/pyridine/toluene system and iodine in aqueous ammonia under ambient conditions, respectively. 2-Phenyl-1,3-oxazol-5(4H)-ones 9a,b and 10a,b were synthesized via the condensation of compounds 5a,b and 6a,b with hippuric acid in acetic acid in 30–60% yield. When activated, 7-methylazolopyrimidines 11a and b were reacted with N-alkyl-2,2,4-trimethyl-1,2,3,4-tetrahydroquinoline-6-carbaldehydes 6a and b, and triazolo/pyrazolo[1,5-a]pyrimidin-6-yl carboxylic acids 12a and b were obtained in 60–70% yield. The condensation of 7-hydroxy-1,2,3,4-tetramethyl-1,2-dihydroquinoline 3 h with dimethylacetylenedicarboxylate (DMAD) and ethyl acetoacetate afforded cyclic products 16 and 17, respectively. The condensation reaction of 6-formyl-7-hydroxy-1,2,2,4-tetramethyl-1,2-dihydroquinoline 5e with methylene-active compounds such as ethyl cyanoacetate/dimethyl-3-oxopentanedioate/ethyl acetoacetate/diethylmalonate/Meldrum’s acid afforded 3-substituted coumarins containing dihydroquinolines 19 and 21. Pentacyclic coumarin 22 was obtained via the random condensation of malononitrile with 5e in the presence of a catalytic amount of piperidine in ethanol. The biological activities of the synthesized compounds were assessed using the PASS program. 
Based on the prognosis, compounds 13a, b, and 14 exhibited a high likelihood of being active as inhibitors of gluconate 2-dehydrogenase, as well as possessing antiallergic, antiasthmatic, and antiarthritic properties, with a probability value (Pa) ranging from 0.849 to 0.870. Furthermore, it was discovered that hydroquinoline carbonitriles 7 and 8 tended to act as effective progesterone antagonists and displayed antiallergic, antiasthmatic, and antiarthritic effects (Pa = 0.276–0.827). Among the hydroquinolines containing coumarin moieties, compounds 17, 19a, and 19c were predicted to be potent progesterone antagonists, with Pa values of 0.710, 0.630, and 0.615, respectively.
Keywords: heterocyclic compound, hydroquinoline, Vilsmeier–Haack formylation, quinolone
Procedia PDF Downloads 442
218 Expert Supporting System for Diagnosing Lymphoid Neoplasms Using Probabilistic Decision Tree Algorithm and Immunohistochemistry Profile Database
Authors: Yosep Chong, Yejin Kim, Jingyun Choi, Hwanjo Yu, Eun Jung Lee, Chang Suk Kang
Abstract:
For the past decades, immunohistochemistry (IHC) has played an important role in the diagnosis of human neoplasms, helping pathologists reach clearer decisions on differential diagnosis, subtyping, personalized treatment planning, and, finally, prognosis prediction. However, the IHC performed on the various tumors of daily practice often yields conflicting results that are very challenging to interpret. Even a comprehensive diagnosis synthesizing clinical, histologic, and immunohistochemical findings can fail in some intricate cases. Another important issue is that IHC data are increasing exponentially, and more and more information has to be taken into account. For these reasons, we set out to develop an expert supporting system to help pathologists make better decisions when diagnosing human neoplasms with IHC results. We devised a probabilistic decision tree algorithm and tested it with real case data of lymphoid neoplasms, in which the IHC profile is more important for a proper diagnosis than in other human neoplasms. We designed the probabilistic decision tree based on Bayes’ theorem, programmed the computational process using MATLAB (The MathWorks, Inc., USA), and prepared an IHC profile database (about 104 disease categories and 88 IHC antibodies) based on the WHO classification by reviewing the literature. The initial probability of each neoplasm was set with the epidemiologic data of lymphoid neoplasms in Korea. With the IHC results of 131 sequentially selected patients, the top three presumptive diagnoses for each case were made and compared with the original diagnoses. After review of the data, 124 of the 131 cases were used for the final analysis. As a result, the presumptive diagnoses were concordant with the original diagnoses in 118 cases (93.7%). The major reason for discordance was the similarity of the IHC profiles of two or three different neoplasms. 
The expert supporting system algorithm presented in this study is at an elementary stage and needs further optimization using more advanced technology, such as deep learning with real case data, especially for differentiating T-cell lymphomas. Although it needs more refinement, it may be used to aid pathological decision making in the future. A further application, determining the IHC antibodies to order for a certain subset of differential diagnoses, might also be possible in the near future. Keywords: database, expert supporting system, immunohistochemistry, probabilistic decision tree
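A system like this revolves around a Bayesian update of diagnostic probabilities as each marker result arrives. The sketch below illustrates that idea only; the disease categories, priors, and marker positivity rates are invented for illustration and are not values from the study's 104-category database:

```python
# Hypothetical sketch of the Bayesian update behind a probabilistic
# decision tree: priors over diagnoses are revised with each IHC result.
# Disease names, priors, and marker positivity rates are illustrative only.

def bayes_update(priors, likelihoods, result):
    """Revise P(disease) given one IHC marker result (True = positive).

    priors: {disease: P(disease)}
    likelihoods: {disease: P(marker positive | disease)}
    """
    posterior = {}
    for disease, prior in priors.items():
        p_pos = likelihoods[disease]
        posterior[disease] = prior * (p_pos if result else 1.0 - p_pos)
    total = sum(posterior.values())
    return {d: p / total for d, p in posterior.items()}

# Illustrative priors (as if from epidemiologic data), then two marker results.
priors = {"DLBCL": 0.40, "Follicular": 0.25, "T-cell": 0.35}
cd20 = {"DLBCL": 0.95, "Follicular": 0.90, "T-cell": 0.05}   # P(CD20+ | dx)
cd3 = {"DLBCL": 0.05, "Follicular": 0.05, "T-cell": 0.95}    # P(CD3+ | dx)

p = bayes_update(priors, cd20, True)    # CD20 positive
p = bayes_update(p, cd3, False)         # CD3 negative
top = max(p, key=p.get)                 # presumptive diagnosis
```

Ranking the posterior and reporting the top three entries would mirror the "top three presumptive diagnoses" step described in the abstract.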
Procedia PDF Downloads 224
217 Adapting to Rural Demographic Change: Impacts, Challenges and Opportunities for Ageing Farmers in Prachin Buri Province, Thailand
Authors: Para Jansuwan, Kerstin K. Zander
Abstract:
Most people in rural Thailand still depend on agriculture. The rural areas are undergoing changes in their demographic structures, with an increasing older population, out-migration of younger people, and a shift away from work in the agricultural sector towards manufacturing and service provisioning. These changes may lead to a decline in agricultural productivity and food insecurity. Our research aims to examine older farmers’ perceptions of how rural demographic change affects them, to investigate how farmers may change their agricultural practices to cope with their ageing, and to explore the factors affecting these changes, including the opportunities and challenges arising from them. The data were collected through a household survey of 368 farmers in Prachin Buri province in central Thailand, a main area for agricultural production. A series of binomial logistic regression models was applied to analyse the data. We found that most farmers suffered from age-related diseases, which compromised their working capacity. Most farmers attempted to reduce labour-intensive work by stopping farming and transferring the farmland to their children (41%), stopping farming and giving the land to others (e.g., selling or leasing it out) (28%), or continuing farming while making some changes (e.g., changing crops, employing additional workers) (24%). Farmers’ health and having a potential farm successor were positively associated with the probability of stopping farming by transferring the land to the children. Farmers with a successor were also less likely to stop farming by giving the land to others. Farmers’ age was negatively associated with the likelihood of continuing farming by making some changes. The results show that most farmers base their decisions on the hope that their children will take over the farms, and that without a successor, farmers lease out or sell the land. 
Without a successor, they also no longer invest in expansion and improvement of their farm production, especially in the adoption of innovative technologies that could help them maintain their farm productivity. To improve farmers’ quality of life and sustain their farm productivity, policies are needed to support the viability of farms, access to a pension system, and the smooth and successful transfer of the land to a successor. Keywords: rural demographic change, older farmer, stopping farming, continuing farming, health and age, farm successor, Thailand
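A binomial logistic regression of this kind can be sketched in a few lines; the fitted coefficient exponentiates to an odds ratio. The data here are synthetic, with a single binary predictor standing in for "has a potential farm successor" — the study itself fitted a series of models on the 368-farmer survey:

```python
import math
import random

# Minimal sketch of a binomial logistic regression like those in the study:
# outcome y = 1 if the farmer stops farming by transferring land to children.
# The data below are synthetic, not the survey's.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(xs, ys, lr=0.5, epochs=3000):
    """Fit P(y=1) = sigmoid(b0 + b1*x) by gradient ascent on the log-likelihood."""
    b0, b1 = 0.0, 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            err = y - sigmoid(b0 + b1 * x)
            g0 += err
            g1 += err * x
        b0 += lr * g0 / len(xs)
        b1 += lr * g1 / len(xs)
    return b0, b1

random.seed(0)
# x = 1 if a potential farm successor exists; successors raise P(transfer).
xs = [random.randint(0, 1) for _ in range(400)]
ys = [1 if random.random() < (0.7 if x else 0.2) else 0 for x in xs]

b0, b1 = fit_logistic(xs, ys)
odds_ratio = math.exp(b1)   # > 1 means a successor increases the odds
```

With these synthetic probabilities the true odds ratio is about 9, so the fitted `odds_ratio` comes out well above 1, mirroring the positive association reported for having a successor.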
Procedia PDF Downloads 114
216 Incidence and Predictors of Mortality Among HIV Positive Children on ART in Public Hospitals of Harer Town, Enrolled From 2011 to 2021
Authors: Getahun Nigusie
Abstract:
Background: Antiretroviral treatment (ART) reduces HIV-related morbidity and prolongs the survival of patients; however, there is a lack of up-to-date information concerning the treatment’s long-term effect on the survival of HIV-positive children, especially in the study area. Objective: To assess the incidence and predictors of mortality among HIV-positive children on ART in public hospitals of Harer town who were enrolled from 2011 to 2021. Methodology: An institution-based retrospective cohort study was conducted among 429 HIV-positive children enrolled in the ART clinic from January 1st, 2011 to December 30th, 2021. Data were collected from medical cards using a data extraction form. Descriptive analyses were used to summarize the results, and a life table was used to estimate the survival probability at specific points in time after the introduction of ART. The Kaplan–Meier survival curve together with the log-rank test was used to compare survival between different categories of covariates, and a multivariable Cox proportional hazards regression model was used to estimate adjusted hazard ratios. Variables with p-values ≤ 0.25 in the bivariable analysis were candidates for the multivariable analysis. Finally, variables with p-values < 0.05 were considered significant. Results: The study participants were followed for a total of 2549.6 child-years (30,596 child-months), with an overall mortality rate of 1.5 (95% CI: 1.1, 2.04) per 100 child-years. Their median survival time was 112 months (95% CI: 101–117). There were 38 children with unknown outcomes, 39 deaths, and 55 children transferred out to different facilities. The overall survival at 6, 12, 24, and 48 months was 98%, 96%, 95%, and 94%, respectively. 
Being in WHO clinical stage four (AHR = 4.55, 95% CI: 1.36, 15.24), having anemia (AHR = 2.56, 95% CI: 1.11, 5.93), a low baseline absolute CD4 count (AHR = 2.95, 95% CI: 1.22, 7.12), stunting (AHR = 4.1, 95% CI: 1.11, 15.42), wasting (AHR = 4.93, 95% CI: 1.31, 18.76), poor adherence to treatment (AHR = 3.37, 95% CI: 1.25, 9.11), having TB infection at enrollment (AHR = 3.26, 95% CI: 1.25, 8.49), and no history of regimen change (AHR = 7.1, 95% CI: 2.74, 18.24) were independent predictors of death. Conclusion: More than half of the deaths occurred within 2 years. Prevalent tuberculosis, anemia, wasting and stunting, socioeconomic factors, and baseline opportunistic infection were independent predictors of death. Increased early screening and management of these predictors are required. Keywords: human immunodeficiency virus-positive children, anti-retroviral therapy, survival, Ethiopia
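Survival probabilities like those quoted above come from a Kaplan–Meier estimator; a hand-rolled version makes the mechanics explicit. The follow-up times and events below are made up for illustration, not the cohort's data (event = 1 marks a death, event = 0 a censoring such as transfer out or unknown outcome):

```python
# A hand-rolled Kaplan-Meier estimator, sketching how survival probabilities
# at fixed time points are obtained. The data below are synthetic.

def kaplan_meier(times, events):
    """Return [(t, S(t))] at each distinct time where a death occurred."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = censored = 0
        while i < len(data) and data[i][0] == t:
            if data[i][1] == 1:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            surv *= (at_risk - deaths) / at_risk
            curve.append((t, surv))
        at_risk -= deaths + censored
    return curve

# Synthetic follow-up in months: deaths at 3, 6, 12, 24; the rest censored.
times  = [3, 6, 6, 10, 12, 18, 24, 30, 36, 48]
events = [1, 1, 0,  0,  1,  0,  1,  0,  0,  0]
curve = kaplan_meier(times, events)   # step-down survival curve
```

The study's comparison between covariate categories would add a log-rank test on two such curves, and the adjusted hazard ratios come from a Cox model rather than this estimator.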
Procedia PDF Downloads 22
215 The Role of Accounting and Auditing in Anti-Corruption Strategies: The Case of ECOWAS
Authors: Edna Gnomblerou
Abstract:
Given the current scale of the corruption epidemic in West African economies, governments are seeking immediate and effective measures to reduce its prevalence within the region. Generally, accountants and auditors are expected to help organizations detect illegal practices. However, their role in the fight against corruption is sometimes limited due to the collusive nature of corruption. The Danish anti-corruption model shows that the implementation of additional controls over public accounts and independent, efficient audits improves transparency and increases the probability of detection. This study reviews the existing anti-corruption policies of the Economic Community of West African States (ECOWAS) to observe the role attributed to accounting, auditing, and other managerial practices in its anti-corruption drive. It further discusses the usefulness of accounting and auditing in helping anti-corruption commissions control misconduct and increase the likelihood of detecting irregularities within public administration. The purpose of this initiative is to identify and assess the relevance of accounting and auditing in curbing corruption. To meet this purpose, the study was designed to answer the questions of whether accounting and auditing processes were included in the reviewed anti-corruption strategies and, if so, whether they were effective in the detection process. A descriptive research method was adopted to examine the role of accounting and auditing in West African anti-corruption strategies. The analysis reveals that proper recognition of accounting standards and implementation of financial audits are viewed as strategic mechanisms in tackling corruption. Additionally, codes of conduct, whistle-blowing, and information disclosure to the public are among the most common managerial practices used throughout anti-corruption policies to effectively and efficiently address the problem. 
These observations imply that sound anti-corruption strategies cannot ignore the value of including accounting and auditing processes. On the one hand, this suggests that governments should employ all available resources to improve accounting and auditing practices in the management of public sector organizations. On the other hand, governments must ensure that accounting and auditing practices are not limited to the private sector; when properly implemented, they constitute crucial mechanisms to control and reduce corrupt incentives in the public sector. Keywords: accounting, anti-corruption strategy, auditing, ECOWAS
Procedia PDF Downloads 256
214 Regression-Based Approach for Development of a Cuff-Less Non-Intrusive Cardiovascular Health Monitor
Authors: Pranav Gulati, Isha Sharma
Abstract:
Hypertension and hypotension are known to have repercussions on the health of an individual, with hypertension contributing to an increased risk of cardiovascular disease and hypotension resulting in syncope. This prompts the development of a non-invasive, non-intrusive, continuous, and cuff-less blood pressure monitoring system to detect blood pressure variations and to identify individuals with acute and chronic heart ailments; due to the unavailability of such devices for practical daily use, it is currently difficult to screen and subsequently regulate blood pressure. The complexities that hamper steady monitoring of blood pressure comprise the variations in physical characteristics from individual to individual and the postural differences at the site of monitoring. We propose to develop a continuous, comprehensive cardio-analysis tool based on reflective photoplethysmography (PPG). The proposed device, in the form of eyewear, captures the PPG signal and estimates the systolic and diastolic blood pressure using a sensor positioned near the temporal artery. This system relies on regression models based on the extraction of key points from a pair of PPG wavelets. The proposed system provides an edge over existing wearables in that it allows uniform contact and pressure with the temporal site, in addition to minimal disturbance by movement. Additionally, the feature extraction algorithms enhance the integrity and quality of the extracted features by rejecting unreliable data sets. We tested the system with 12 subjects, of whom 6 served as the training dataset. For this, we measured blood pressure using a cuff-based BP monitor (Omron HEM-8712) and at the same time recorded the PPG signal from our cardio-analysis tool. The complete test was conducted by using the cuff-based blood pressure monitor on the left arm while the PPG signal was acquired from the temporal site on the left side of the head. 
This acquisition served as the training input for the regression model on the selected features. The other 6 subjects were used to validate the model by conducting the same test on them. Results show that the developed prototype can robustly acquire the PPG signal and can therefore be used to reliably predict blood pressure levels. Keywords: blood pressure, photoplethysmograph, eyewear, physiological monitoring
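The regression step can be sketched as an ordinary least-squares fit from one PPG-derived feature to cuff-measured systolic pressure. The feature (a "rise time" in ms) and the training pairs below are illustrative assumptions, not the study's measurements, which used several key points extracted from a pair of PPG wavelets:

```python
import statistics

# Minimal regression sketch in the spirit of the cuff-less monitor: map a
# single hypothetical PPG feature (systolic-peak rise time, ms) to the
# cuff-measured systolic pressure. All numbers are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

rise_ms  = [120, 135, 150, 165, 180, 195]   # feature from the PPG wavelet
systolic = [138, 131, 126, 119, 113, 107]   # cuff reference (mmHg)

a, b = fit_line(rise_ms, systolic)
estimate = a + b * 160   # estimated systolic BP for a new reading
```

A real system would fit multivariate models on all extracted key points and validate on held-out subjects, as the study did with its 6/6 split.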
Procedia PDF Downloads 279
213 Inappropriate Prescribing Defined by START and STOPP Criteria and Its Association with Adverse Drug Events among Older Hospitalized Patients
Authors: Mohd Taufiq bin Azmy, Yahaya Hassan, Shubashini Gnanasan, Loganathan Fahrni
Abstract:
Inappropriate prescribing in older patients has been associated with resource utilization and adverse drug events (ADEs) such as hospitalization, morbidity, and mortality. Globally, there is a lack of published data on ADEs induced by inappropriate prescribing. Our study is specific to an older population and is aimed at identifying risk factors for ADEs and developing a model that links ADEs to inappropriate prescribing. The design of the study was prospective: computerized medical records of 302 hospitalized elderly patients aged 65 years and above in 3 public hospitals in Malaysia (Hospital Serdang, Hospital Selayang, and Hospital Sungai Buloh) were studied over a 7-month period from September 2013 until March 2014. Potentially inappropriate medications and potential prescribing omissions were determined using the published and validated START-STOPP criteria. Patients who had at least one inappropriate medication were included in Phase II of the study, where ADEs were identified by a local expert consensus panel based on the published and validated Naranjo ADR probability scale. The panel also assessed whether ADEs were causal or contributory to the current hospitalization. The association between inappropriate prescribing and ADEs (hospitalization, mortality, and adverse drug reactions) was determined by identifying whether or not the former was causal or contributory to the latter. The rate of ADE avoidability was also determined. Our findings revealed that the prevalence of potentially inappropriate prescribing was 58.6%. ADEs were detected in 31 of 105 patients (29.5%) when STOPP criteria were used to identify potentially inappropriate medication; all 31 ADEs (100%) were considered causal or contributory to admission. Of the 31 ADEs, 28 (90.3%) were considered avoidable or potentially avoidable. 
After adjusting for age, sex, comorbidity, dementia, baseline activities of daily living function, and number of medications, the likelihood of a serious avoidable ADE increased significantly when a potentially inappropriate medication was prescribed (odds ratio, 11.18; 95% confidence interval [CI], 5.014–24.93; p < .001). The medications identified by STOPP criteria are significantly associated with avoidable ADEs in older people that cause or contribute to urgent hospitalization, but contributed less towards morbidity and mortality. The findings of the study underscore the importance of preventing inappropriate prescribing. Keywords: adverse drug events, appropriate prescribing, health services research
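Figures such as "OR 11.18; 95% CI 5.014–24.93" come from exponentiated logistic-regression coefficients, but the basic unadjusted version of the calculation, from a 2×2 table with a Wald interval, is easy to sketch. The cell counts below are hypothetical, not the study's data:

```python
import math

# Unadjusted odds ratio with a Wald 95% CI from a 2x2 table of exposure
# (potentially inappropriate medication) vs outcome (avoidable ADE).
# The cell counts are hypothetical.

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(28, 77, 3, 92)   # illustrative counts
significant = lo > 1.0   # CI excludes 1, so the association is significant
```

The study's reported interval is narrower than this unadjusted sketch would give because its model adjusted for age, sex, comorbidity, and the other covariates listed above.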
Procedia PDF Downloads 399
212 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. 
It underscores the need for responsible and transparent practices in data analytics and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making. Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications
Procedia PDF Downloads 93
211 Factors Associated with Increase of Diabetic Foot Ulcers in Diabetic Patients in Nyahururu County Hospital
Authors: Daniel Wachira
Abstract:
The study aims to determine factors contributing to the increasing rates of diabetic foot ulcers (DFU) among diabetes mellitus (DM) patients attending clinics at Nyahururu County Referral Hospital, Laikipia County. The study objectives are: to determine the demographic factors contributing to increased rates of DFU among DM patients, to determine the sociocultural factors that contribute to increased rates of DFU among DM patients, and to determine the health facility factors contributing to increased rates of DFU among DM patients attending the DM clinic at Nyahururu County Referral Hospital, Laikipia County. This study will adopt a descriptive cross-sectional design, which involves the collection of data at a single time point without follow-up. This method is fast and inexpensive; there is no loss to follow-up, as the data are collected at one time point, and associations between variables can be determined. The study population includes all DM patients with or without DFU. A probability sampling technique, simple random sampling, will be used. The study will employ researcher-administered questionnaires to collect the required information. The questionnaire was developed in consultation with research experts (the supervisor) to ensure reliability. The questionnaire will be pre-tested by hand-delivering it to a sample of 10% of the sample size at J.M. Kariuki Memorial Hospital, Nyandarua County, and thereafter collecting the duly filled copies, followed by refining of errors to ensure the questionnaire is valid for collecting data relevant to this study. Data collection will begin after the approval of the project. 
Questionnaires will be administered only to participants who meet the selection criteria and agree to participate in the study, to collect key information with regard to the objectives of the study. Authority for the study will be obtained from the National Commission for Science, Technology and Innovation. Permission will also be obtained from the Nyahururu County Referral Hospital administration staff. The purpose of the study will be explained to the respondents in order to secure informed consent, and no names will be written on the questionnaires. All information will be treated with maximum confidentiality; neither the respondents’ identities nor their information will be disclosed. Keywords: diabetes, foot ulcer, social factors, hospital factors
Procedia PDF Downloads 17
210 Oral Hygiene Behaviors among Pregnant Women with Diabetes Who Attend Primary Health Care Centers at Baghdad City
Authors: Zena F. Mushtaq, Iqbal M. Abbas
Abstract:
Background: Diabetes mellitus during pregnancy is one of the major medical and social problems, with increasing prevalence in recent decades, and may make women more vulnerable to dental problems and at increased risk for periodontal diseases. Objectives: To assess oral hygiene behaviors among pregnant women with diabetes who attended primary health care centers and to find out the relationship between oral hygiene behaviors and the studied variables. Methodology: A cross-sectional study was conducted from 7 July to 30 September 2014 on a non-probability (convenience) sample of 150 pregnant women with diabetes selected from twelve primary health care centers in Baghdad city. The data collection tool was a questionnaire designed with three main parts: sociodemographic characteristics, reproductive characteristics, and items on oral hygiene behaviors among pregnant women with diabetes. Reliability of the questionnaire was determined through the internal-consistency correlation coefficient (R = 0.940), and content validity was determined through review by 12 experts in different specialties and through a pilot study. Descriptive and inferential statistics were used to analyze the collected data. Results: 35.3% of the study sample were 35–39 years old (mean ± SD = 33.57 ± 5.54 years), 34.7% had primary school education or less, half of the study sample were government employees or self-employed, 42.7% had moderate socioeconomic status, and the highest percentage (70.0%) were nonsmokers. The results indicate that oral hygiene behaviors had moderate mean scores on all items. There was no statistically significant association between the oral hygiene domain and the studied variables. 
Conclusions: All items of health behavior concerning oral hygiene had moderate mean scores, which may expose pregnant women with diabetes to a high risk of periodontal diseases. Recommendations: Dental care providers should perform a dental examination at least every three months for each pregnant woman with diabetes, including explanation of the effect of DM on periodontal health, oral hygiene instruction, oral prophylaxis, professional cleaning, and treatment of periodontal diseases (scaling and root planing) when needed. Keywords: diabetes, health behavior, pregnant women, oral hygiene
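The reliability figure above (R = 0.940) is an internal-consistency coefficient; Cronbach's alpha is the standard such measure, sketched below on made-up Likert responses (rows are respondents, columns are questionnaire items) rather than the study's actual instrument:

```python
import statistics

# Cronbach's alpha as an internal-consistency reliability coefficient.
# The Likert responses are fabricated for illustration.

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
    k = len(rows[0])
    item_vars = [statistics.variance([r[j] for r in rows]) for j in range(k)]
    total_var = statistics.variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

responses = [          # 5 respondents x 4 questionnaire items (1-5 scale)
    [4, 5, 4, 4],
    [3, 3, 3, 4],
    [5, 5, 4, 5],
    [2, 2, 3, 2],
    [4, 4, 5, 4],
]
alpha = cronbach_alpha(responses)   # values near 0.9 indicate high consistency
```

Coefficients above roughly 0.9, as reported for this questionnaire, are conventionally read as excellent internal consistency.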
Procedia PDF Downloads 287
209 Time Fetching Water and Maternal Childcare Practices: Comparative Study of Women with Children Living in Ethiopia and Malawi
Authors: Davod Ahmadigheidari, Isabel Alvarez, Kate Sinclair, Marnie Davidson, Patrick Cortbaoui, Hugo Melgar-Quiñonez
Abstract:
The burden of collecting water tends to fall disproportionately on women and girls in low-income countries. Specifically, women spend between one and eight hours per day fetching water for domestic use in Sub-Saharan Africa. While there has been research on the global time burden of collecting water, it has mainly focused on water quality parameters, leaving the relationship between water fetching and health outcomes understudied. There is little available evidence regarding the relationship between water fetching and maternal child care practices. The main objective of this study was to help fill this gap in the literature. Data from two surveys in Ethiopia and Malawi conducted by CARE Canada in 2016-2017 were used. Descriptive statistics indicate that women were predominantly responsible for collecting water in both Ethiopia (87%) and Malawi (99%), with the majority spending more than 30 minutes per day on water collection. With regards to child care practices, in both countries breastfeeding was relatively high (77% and 82%, respectively) and treatment for malnutrition was low (15% and 8%, respectively). However, the same consistency was not found for weighing: in Ethiopia only 16% took their children for weighing, in contrast to 94% in Malawi. These three practices were summed to create one variable for the regression analyses. Unadjusted logistic regression findings showed that only in Ethiopia was time fetching water significantly associated with child care practices. Once adjusted for covariates, this relationship was no longer significant. Adjusted logistic regressions also showed that the factors that did influence child care practices differed slightly between the two countries. 
In Ethiopia, a lack of access to a community water supply (OR = 0.668; P = 0.010), poor attitudes towards gender equality (OR = 0.608; P = 0.001), and no access to land (OR = 0.603; P = 0.000) significantly decreased a woman’s odds of using positive childcare practices. Notably, being a young woman aged 15-24 years (OR = 2.308; P = 0.017) or 25-29 years (OR = 2.065; P = 0.028) increased the probability of using positive childcare practices. In Malawi, higher maternal age and low decision-making power significantly decreased a woman’s odds of using positive childcare practices. In conclusion, this study found that even though the amount of time women spend fetching water makes a difference for childcare practices, it is not significantly related to women’s child care practices when controlling for the covariates. Importantly, women’s age contributes to child care practices in both Ethiopia and Malawi. Keywords: time fetching water, community water supply, women’s child care practices, Ethiopia, Malawi
Procedia PDF Downloads 202
208 Simulated Translator-Client Relations in Translator Training: Translator Behavior around Risk Management
Authors: Maggie Hui
Abstract:
Risk management is not a new concept; however, it is an uncharted area as applied to the translation process and translator training. Risk managers are responsible for managing risk, i.e. adopting strategies with the intention to minimize loss and maximize gains in spite of uncertainty. Which risk strategy to use often depends on the frequency of an event (i.e. probability) and the severity of its outcomes (i.e. impact). This is basically the way translation/localization project managers handle risk management. Although risk management could involve both positive and negative impacts, impact seems to be always negative in professional translators’ management models, e.g. how many days of project time are lost or how many clients are lost. However, for analysis of translation performance, the impact should be possibly positive (e.g. increased readability of the translation) or negative (e.g. loss of source-text information). In other words, the straight business model of risk management is not directly applicable to the study of risk management in the rendition process. This research aims to explore trainee translators’ risk management while translating in a simulated setting that involves translator-client relations. A two-cycle experiment involving two roles, the translator and the simulated client, was carried out with a class of translation students to test the effects of the main variable of peer-group interaction. The researcher made use of a user-friendly screen-voice recording freeware to record subjects’ screen activities, including every word the translator typed and every change they made to the rendition, the websites they browsed and the reference tools they used, in addition to the verbalization of their thoughts throughout the process. The research observes the translation procedures subjects considered and finally adopted, and looks into the justifications for their procedures, in order to interpret their risk management. 
The qualitative and quantitative results of this study have some implications for translator training: (a) the experience of being a client seems to reinforce the translator’s risk aversion; (b) there is a wide gap between the translator’s internal risk management and their external presentation of risk; and (c) the use of role-playing simulation can empower students’ learning by enhancing their attitudinal or psycho-physiological competence, interpersonal competence and strategic competence. Keywords: risk management, role-playing simulation, translation pedagogy, translator-client relations
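The frequency/severity logic described above maps naturally onto a small risk-matrix heuristic: score each translation risk by probability and impact, then pick a management strategy. The thresholds, strategy names, and example risks below are illustrative assumptions, not the study's coding scheme:

```python
# Classic probability x impact risk-matrix heuristic, applied to translation
# risks. Thresholds, strategies, and example risks are illustrative only.

def risk_strategy(probability, impact):
    """Pick a strategy from a 2x2 risk matrix (thresholds are assumptions)."""
    if probability >= 0.5 and impact >= 0.5:
        return "avoid"        # frequent and severe: rework the rendition
    if impact >= 0.5:
        return "transfer"     # rare but severe: flag the issue to the client
    if probability >= 0.5:
        return "mitigate"     # frequent but mild: add a glossary check
    return "accept"           # rare and mild: leave as is

risks = {  # (probability, impact), both on a 0-1 scale
    "loss of source-text information": (0.3, 0.9),
    "reduced readability": (0.6, 0.3),
    "terminology inconsistency": (0.7, 0.8),
}
plan = {name: risk_strategy(p, i) for name, (p, i) in risks.items()}
```

Note that, per the abstract, translation impacts can also be positive (e.g. increased readability), which a plain loss-oriented matrix like this one does not capture.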
Procedia PDF Downloads 261
207 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects
Authors: Mehran Barani Shikhrobat, Roger Flanagan
Abstract:
Construction planning and scheduling based on the current tools and techniques is deterministic in nature (Gantt chart, CPM) or applies only a very limited probability of completion (PERT) to each task. However, every project embodies assumptions and influences and should start with a complete set of clearly defined goals and constraints that remain constant throughout the duration of the project. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for “ideal projects,” neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model to consider the influence of complexity on the total project duration at the post-contract-award pre-construction stage of a project. The literature review showed that complexity originates from different sources: the environment, technical issues, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Project task complexity may originate from performance, lack of resources, or environmental changes for a specific task. Complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review also highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of the complexity of construction projects were therefore investigated through a questionnaire with industry experts. The results were used to develop a model that considers the core complexity factors and their interactions. System dynamics was used to investigate the model and consider the influence of complexity on project planning. 
Feedback from experts revealed 20 major complexity factors that impact project planning. The factors are divided into five categories, known as core complexity factors. To weigh the factors against one another, the Analytic Hierarchy Process (AHP) was used. The comparison showed that externalities have the biggest influence among the complexity factors. The research underlines that many internal and external factors impact both individual project activities and the project overall, and it shows the importance of considering the influence of complexity in the project master plan undertaken at the post-contract-award, pre-construction phase of a project.
Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling
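The AHP weighting step can be sketched with the row geometric mean approximation of the priority vector; the pairwise judgments below are hypothetical placeholders on Saaty's 1-9 scale, not the study's expert data:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights via the row geometric mean method."""
    n = len(matrix)
    geo_means = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo_means)
    return [g / total for g in geo_means]

# Hypothetical pairwise comparisons among three complexity categories
# (e.g. externalities vs. project tasks vs. organisation/management).
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(pairwise)  # largest weight goes to the first category
```

With these illustrative judgments the first category dominates, mirroring the paper's finding that one category (externalities) outranks the rest.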
Procedia PDF Downloads 138
206 AIR SAFE: an Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms
Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita
Abstract:
Nowadays, people spend most of their time in closed environments, in offices or at home. Secure and highly livable environmental conditions are therefore needed to reduce the probability of airborne viruses spreading. It is also important to reduce energy consumption to lower the human impact on the planet: Heating, Ventilation, and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1], so devising systems to control and regulate airflow is essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people's well-being, at home or in offices, and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) improved monitoring and prediction accuracy; (ii) enhanced decision-making and mitigation strategies; (iii) real-time air quality information; (iv) increased efficiency in data analysis and processing; (v) advanced early warning systems for air pollution events; (vi) automated and cost-effective monitoring networks; and (vii) a better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments by leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data, which are analyzed in order to take any corrective measures needed to ensure the occupants' wellness. The data are analyzed by AI algorithms able to predict future levels of temperature, relative humidity, and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening or closing a window or the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment.
In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach.
Keywords: air quality, internet of things, artificial intelligence, smart home
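The predict-then-act loop described above can be sketched as a minimal rule: forecast the next CO₂ reading and ventilate if it would exceed a comfort threshold. The naive one-step extrapolation stands in for AIR SAFE's AI predictor, and the 1000 ppm limit is an assumed threshold, not the system's configuration:

```python
CO2_LIMIT_PPM = 1000  # assumed comfort threshold, illustrative only

def forecast_next(readings):
    """Naive one-step forecast: linear extrapolation of the last two samples.
    (A stand-in for the AI prediction model in AIR SAFE.)"""
    if len(readings) < 2:
        return readings[-1]
    return readings[-1] + (readings[-1] - readings[-2])

def decide_action(co2_history):
    """Return a corrective action based on the predicted CO2 level."""
    predicted = forecast_next(co2_history)
    return "open_window" if predicted > CO2_LIMIT_PPM else "hold"
```

Acting on the forecast rather than the current reading lets the controller ventilate before the limit is actually crossed.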
Procedia PDF Downloads 93
205 Behavioral Analysis of Stock Using Selective Indicators from Fundamental and Technical Analysis
Authors: Vish Putcha, Chandrasekhar Putcha, Siva Hari
Abstract:
In the current digital era of free trading and a pandemic-driven remote-work culture, markets worldwide have gained momentum, allowing retail investors to trade easily from anywhere. The retail share of the market rose to 24% from a pre-pandemic level of 15%. Most are young retail traders with a higher risk tolerance than the previous generation of retail traders. This trend boosted the growth of subscription-based market predictors and market data vendors. Young traders bet on these predictors, assuming one of them is correct; however, 90% of retail traders end up on the losing side. This paper presents multiple indicators and attempts to derive behavioral patterns of the underlying stocks. The two major families of indicators that traders and investors follow are technical and fundamental. The famous investor Warren Buffett adheres to the "Value Investing" method, which is based on a stock's fundamental analysis. In this paper, we present multiple indicators from various methods to understand the behavioral patterns of stocks. For this research, we picked five stocks with a market capitalization of more than $200M, listed on an exchange for more than 20 years, and drawn from different industry sectors. To study the behavioral pattern of these five stocks over time, a total of 8 indicators were chosen from fundamental, technical, and financial measures: Price to Earnings (P/E), Price to Book Value (P/B), Debt to Equity (D/E), Beta, Volatility, Relative Strength Index (RSI), Moving Averages, and Dividend Yield, followed by detailed mathematical analysis. This is an interdisciplinary paper spanning engineering, accounting, and finance. The research takes a new approach to identifying clear indicators affecting stocks. Statistical analysis of the data is performed by fitting a probabilistic distribution and then determining the probability of the stock price exceeding a specific target value.
The chi-square test is used to assess the validity of the assumed distribution. Preliminary results indicate that this approach is working well; the complete results, presented in the final paper, will be beneficial to the community.
Keywords: stock pattern, stock market analysis, stock predictions, trading, investing, fundamental analysis, technical analysis, quantitative trading, financial analysis, behavioral analysis
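The distribution-validity check can be sketched with the Pearson chi-square goodness-of-fit statistic; the binned counts below are hypothetical, not the paper's stock data:

```python
# Hypothetical example: daily closing prices binned into four ranges,
# with expected counts from an assumed fitted distribution.
observed = [18, 25, 30, 27]
expected = [25, 25, 25, 25]

def chi_square_statistic(obs, exp):
    """Pearson chi-square goodness-of-fit statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(obs, exp))

stat = chi_square_statistic(observed, expected)
# With k - 1 = 3 degrees of freedom, the 5% critical value is about 7.815;
# a statistic below that means the assumed distribution is not rejected.
```

In this toy case the statistic is 3.12, so the assumed distribution would survive the test at the 5% level.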
Procedia PDF Downloads 85
204 Simulation of Maximum Power Point Tracking in a Photovoltaic System: A Circumstance Using Pulse Width Modulation Analysis
Authors: Asowata Osamede
Abstract:
Optimizing the output power of stand-alone photovoltaic (PV) systems has been a major focus of recent PV work, owing to PV's low carbon emissions and efficiency. Power failures or outages from commercial providers hold back development in both the public and private sectors and limit the growth of industries, so a well-structured PV system is important for efficient and cost-effective monitoring. The purpose of this paper is to validate the maximum power point of an off-grid PV system, taking into consideration the most effective tilt and orientation angles for PVs in the southern hemisphere. The system is analyzed from a pulse width modulation (PWM) perspective, with a solar charger with MPPT as the chosen power-conditioning device. The practical setup consists of a PV panel set to an orientation angle of 0° north, with corresponding tilt angles of 36°, 26°, and 16°. The load in this setup is three lead-acid batteries (LAB), for each of which the fully charged, charging, and not-charging conditions are observed. The results obtained in this research are used to draw conclusions that provide a benchmark for researchers and scientists worldwide on the best tilt and orientation angles for the maximum power point in a basic off-grid PV system. A quantitative analysis is employed: quantitative research focuses on measurement and proof, and inferential statistics are frequently used to generalize what is found in the study sample to the population as a whole.
This involves selecting and defining the research question, deciding on a study type, deciding on the data collection tools, selecting the sample and its size, and analyzing, interpreting, and validating the findings. Preliminary results, which include regression analysis (a normal probability plot and a residual plot using a degree-6 polynomial), showed the maximum power point of the system. The 36° tilt angle provided the best average on-time for maximum power point tracking, which in turn put the system into a pulse width modulation stage.
Keywords: power conversion, meteonorm, PV panels, DC-DC converters
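MPPT behavior of the kind the solar charger provides is commonly implemented with the classic perturb-and-observe algorithm. A minimal sketch on a toy power-voltage curve follows; the curve shape, 17 V peak, and step size are illustrative assumptions, not measurements from this setup:

```python
def pv_power(v):
    """Toy PV power-voltage curve with a single maximum near 17 V."""
    return max(0.0, -0.5 * (v - 17.0) ** 2 + 100.0)

def perturb_and_observe(v=12.0, step=0.5, iterations=50):
    """Classic MPPT: perturb the operating voltage and keep the direction
    while power rises; reverse the perturbation when power falls."""
    p_prev = pv_power(v)
    direction = 1
    for _ in range(iterations):
        v += direction * step
        p = pv_power(v)
        if p < p_prev:        # power fell: reverse the perturbation
            direction = -direction
        p_prev = p
    return v

v_mpp = perturb_and_observe()  # settles within one step of the 17 V peak
```

The tracker climbs the curve and then oscillates within one step of the maximum power point, which is the characteristic steady-state behavior of perturb-and-observe.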
Procedia PDF Downloads 147
203 Landsat Data from Pre Crop Season to Estimate the Area to Be Planted with Summer Crops
Authors: Valdir Moura, Raniele dos Anjos de Souza, Fernando Gomes de Souza, Jose Vagner da Silva, Jerry Adriani Johann
Abstract:
The estimate of the area of land to be planted with annual crops, and its stratification by municipality, are important variables in crop forecasting. In Brazil, this information is currently produced by the Brazilian Institute of Geography and Statistics (IBGE) and published in the report Assessment of the Agricultural Production. Due to the high cloud cover in the main crop growing season (October to March), it is difficult to acquire good orbital images; one alternative is to work with remote sensing data from dates before the growing season. This work presents the use of multitemporal Landsat data gathered in July and September (before the summer growing season) to estimate the area of land to be planted with summer crops in an area of São Paulo State, Brazil. Geographic Information Systems (GIS) and digital image processing techniques were applied to the available data. Supervised and unsupervised classifications were applied to data in digital-number and reflectance formats, as well as to multitemporal Normalized Difference Vegetation Index (NDVI) images, with the objective of discriminating the tracts most likely to be planted with summer crops. Classification accuracies were evaluated using a sampling system developed specifically for this study region, and the estimated areas were corrected using the error matrix derived from these evaluations. The classification techniques reached an excellent level according to the kappa index. The proportion of crops, stratified by municipality, was derived from field work during the growing season; these proportion coefficients were applied to the Landsat-derived area of land to be planted with summer crops, making it possible to derive the area of each summer crop by municipality. The discrepancies between official statistics and our results were attributed to the sampling and stratification procedures.
Nevertheless, this methodology can be improved to provide good crop-area estimates from remote sensing data despite the cloud cover during the growing season.
Keywords: area intended for summer culture, estimated area planted, agriculture, Landsat, planting schedule
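The NDVI computation underlying the multitemporal images is a simple band ratio; a per-pixel sketch follows, where the 0.4 decision threshold is an illustrative assumption, not the paper's calibration:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel,
    from near-infrared and red reflectance values in [0, 1]."""
    if nir + red == 0.0:
        return 0.0
    return (nir - red) / (nir + red)

def likely_summer_crop(nir, red, threshold=0.4):
    """Hypothetical decision rule for flagging tracts with a higher
    probability of being planted; the threshold is illustrative only."""
    return ndvi(nir, red) > threshold
```

In practice such a rule would be applied to each pixel of the July and September scenes, and the two dates compared to separate cropland from permanent vegetation.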
Procedia PDF Downloads 150
202 Optimal Data Selection in Non-Ergodic Systems: A Tradeoff between Estimator Convergence and Representativeness Errors
Authors: Jakob Krause
Abstract:
Past financial crises have shown that contemporary risk management models provide an unjustified sense of security and fail miserably in the situations in which they are needed most. In this paper, we start from the assumption that risk is a notion that changes over time, so past data points have only limited explanatory power for the current situation. Our objective is to derive the optimal amount of representative information by balancing two adverse forces: estimator convergence, which incentivizes us to use as much data as possible, and the aforementioned non-representativeness, which does the opposite. In this endeavor, the cornerstone assumption of having access to identically distributed random variables is weakened and substituted by the assumption that the law of the data-generating process changes over time. Hence, this paper gives a quantitative theory of how to perform statistical analysis in non-ergodic systems. As an application, we discuss the impact of a paragraph in the latest iteration of proposals by the Basel Committee on Banking Regulation. We start from the premise that the severity of assumptions should correspond to the robustness of the system they describe; in the formal description of physical systems the level of assumptions can be much higher, so every concept carried over from the natural sciences to economics must be checked for plausibility in its new surroundings. Most of probability theory has been developed for the analysis of physical systems and is based on the independent and identically distributed (i.i.d.) assumption. In economics, both parts of the i.i.d. assumption are inappropriate, yet only the independence part has so far been weakened to a sufficient degree. In this paper, an appropriate class of non-stationary processes is used, and their law is tied to a formal object measuring representativeness.
Subsequently, the data set is identified that, on average, minimizes the estimation error stemming from both insufficient and non-representative data. Applications are far-reaching across a variety of fields. In the paper itself, we apply the results to analyze a paragraph in the Basel III framework on banking regulation with severe implications for financial stability. Beyond finance, other potential applications include the reproducibility crisis in the social sciences (but not in the natural sciences) and modeling limited understanding and learning behavior in economics.
Keywords: banking regulation, non-ergodicity, risk management, semimartingale modeling
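The tradeoff described above, estimation variance shrinking with sample size while representativeness error grows with window length, can be illustrated with a stylized objective; the functional forms and constants are assumptions for illustration, not the paper's formal model:

```python
def total_error(n, sigma2=4.0, drift=0.01):
    """Stylized total error for a window of n past observations:
    estimation variance decays like sigma2 / n, while a representativeness
    penalty grows like drift * n as older, less relevant data enter."""
    return sigma2 / n + drift * n

def optimal_window(max_n=1000):
    """Brute-force search for the window size minimizing total error."""
    return min(range(1, max_n + 1), key=total_error)

n_star = optimal_window()  # balances convergence against representativeness
```

With these constants the optimum lands at an interior window size, neither all available history nor only the most recent point, which is exactly the qualitative message of the paper.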
Procedia PDF Downloads 148
201 High Performance Liquid Cooling Garment (LCG) Using ThermoCore
Authors: Venkat Kamavaram, Ravi Pare
Abstract:
Modern warfighters experience extreme environmental conditions in many of their operational and training activities. In temperatures exceeding 95°F, the body can no longer regulate its temperature through convection and radiation; the only remaining cooling mechanism is evaporation, which is often compromised by excessive humidity. These natural cooling mechanisms can be further compromised by clothing and protective gear, which trap hot air and moisture close to the body. Creating an efficient heat-extraction apparel system that is lightweight and does not hinder the dexterity or mobility of personnel working in extreme temperatures is a difficult technical challenge, and one that needs to be addressed to increase the probability of future success for the US military. To address this challenge, Oceanit Laboratories, Inc. has developed and patented a Liquid Cooled Garment (LCG) more effective than any on the market today. Oceanit's LCG is a form-fitting garment with a network of thermally conductive tubes that extracts body heat and can be worn under all authorized and chemical/biological protective clothing. Oceanit specifically designed and developed ThermoCore®, a thermally conductive polymer, for use in this apparel, optimizing the product for thermal conductivity, mechanical properties, manufacturability, and performance temperatures. Thermal manikin tests were conducted in an environmental chamber using a 20-zone sweating thermal manikin, in accordance with ASTM F2371, Standard Test Method for Measuring the Heat Removal Rate of Personal Cooling Systems Using a Sweating Heated Manikin. The manikin test results showed that Oceanit's LCG provides significantly higher heat extraction under the same environmental conditions than the currently fielded Environmental Control Vest (ECV), while also reducing weight.
Oceanit's LCG vests performed nearly 30% better in extracting body heat while weighing 15% less than the ECV. No cooling garment on the market provides the same thermal extraction performance, form factor, and reduced weight as Oceanit's LCG. The two cooling garments that are commercially available and most commonly used are the Environmental Control Vest (ECV) and the Microclimate Cooling Garment (MCG).
Keywords: thermally conductive composite, tubing, garment design, form fitting vest, thermocore
Procedia PDF Downloads 115