Search results for: occurrence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1256


1106 Possible Sulfur Induced Superconductivity in Nano-Diamond

Authors: J. Mona, R. R. da Silva, C.-L.Cheng, Y. Kopelevich

Abstract:

We report on the possible occurrence of superconductivity in 5 nm particle-size diamond powders treated with sulfur (S) at 500 °C for 10 hours in ~10⁻² Torr vacuum. Superconducting-like magnetization hysteresis loops M(H) have been measured up to ~50 K by means of a SQUID magnetometer (Quantum Design). Both X-ray (Θ-2Θ geometry) and Raman spectroscopy analyses revealed no impurity or additional phases. Nevertheless, the measured Raman spectra are characteristic of diamond with embedded disordered carbon and/or graphitic fragments, suggesting a link to previous reports of local or surface superconductivity in graphite- and amorphous carbon–sulfur composites.

Keywords: nanodiamond, sulfur, superconductivity, Raman spectroscopy

Procedia PDF Downloads 460
1105 Estimation of Dynamic Characteristics of a Middle Rise Steel Reinforced Concrete Building Using Long-Term

Authors: Fumiya Sugino, Naohiro Nakamura, Yuji Miyazu

Abstract:

In the earthquake-resistant design of buildings, the evaluation of vibration characteristics is important. In recent years, owing to the increase in super high-rise buildings, the evaluation of response is important not only for the first mode but also for higher modes. Knowledge of the vibration characteristics of buildings is mostly limited to the first mode, and knowledge of higher modes is still insufficient. In this paper, using earthquake observation records of an SRC building and applying a frequency filter to an ARX model, the characteristics of the first and second modes were studied. First, we studied the change in the eigenfrequency and the damping ratio during the 3.11 earthquake. The eigenfrequency gradually decreases from the time of earthquake occurrence and becomes almost stable after about 150 seconds. At this time, the decreasing rates of the 1st and 2nd eigenfrequencies are both about 0.7. Although the damping ratio has a larger error than the eigenfrequency, both the 1st and 2nd damping ratios are 3 to 5%. Also, there is a strong correlation between the 1st and 2nd eigenfrequencies, and the regression line is y = 3.17x. For the damping ratio, the regression line is y = 0.90x; the 1st and 2nd damping ratios are therefore approximately the same.
Next, we studied the eigenfrequency and damping ratio over the long term, from 1998 to 2014, with all the considered earthquakes connected in order of occurrence. The eigenfrequency slowly declined from immediately after completion and tended to stabilize after several years, although it declined greatly after the 3.11 earthquake. The decreasing rates of both the 1st and 2nd eigenfrequencies until about 7 years later are about 0.8. Both the 1st and 2nd damping ratios are about 1 to 6%; after the 3.11 earthquake, the 1st increases by about 1% and the 2nd by less than 1%. For the eigenfrequency, there is a strong correlation between the 1st and 2nd, and the regression line is y = 3.17x. For the damping ratio, the regression line is y = 1.01x, so the 1st and 2nd damping ratios are approximately the same. Based on the above results, the changes in eigenfrequency and damping ratio are summarized as follows. In the long-term study of the eigenfrequency, both the 1st and 2nd gradually declined from immediately after completion and tended to stabilize after a few years, declining further after the 3.11 earthquake. There is a strong correlation between the 1st and 2nd, and the declining time and decreasing rate are of the same degree. In the long-term study of the damping ratio, both the 1st and 2nd are about 1 to 6%; after the 3.11 earthquake, the 1st increases by about 1% and the 2nd by less than 1%, and the 1st and 2nd are approximately the same.
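The reported relation y = 3.17x between the 1st and 2nd eigenfrequencies corresponds to a least-squares regression through the origin. A minimal sketch of that fit, using synthetic frequency pairs (the values are hypothetical, not the building's records, which would come from ARX-model identification):

```python
import numpy as np

# Hypothetical (f1, f2) eigenfrequency pairs in Hz standing in for the
# values identified from earthquake records.
f1 = np.array([1.10, 1.05, 1.00, 0.95, 0.92])
f2 = 3.17 * f1 + np.random.default_rng(0).normal(0.0, 0.02, f1.size)

# Least-squares regression through the origin: slope = sum(x*y) / sum(x*x)
slope = np.dot(f1, f2) / np.dot(f1, f1)
print(f"f2 ~= {slope:.2f} * f1")
```

For comparison, an idealized uniform shear beam would give a second-to-first mode frequency ratio of exactly 3, close to the reported 3.17.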

Keywords: eigenfrequency, damping ratio, ARX model, earthquake observation records

Procedia PDF Downloads 194
1104 Risk Factors Affecting Construction Project Cost in Oman

Authors: Omar Amoudi, Latifa Al Brashdi

Abstract:

Construction projects are always subject to risks and uncertainties due to their unique and dynamic nature, the outdoor work environment, the wide range of skills employed, and the various parties involved, in addition to the state of the construction business environment at large. Altogether, these risks and uncertainties affect project objectives and lead to cost overruns, delays, and poor quality. Construction projects in Oman often experience cost overruns and delays. Managing these risks and reducing their impacts on construction cost requires first identifying them and then analyzing their severity on project cost to obtain a deep understanding of them, which in turn assists construction managers in managing and tackling these risks. This paper aims to investigate the main risk factors that affect construction project cost in the Sultanate of Oman. To achieve this aim, a literature review was carried out, from which thirty-three risk factors affecting construction cost were identified. A questionnaire survey was then designed and distributed among construction professionals (i.e., client, contractor, and consultant) to obtain their opinion on the probability of occurrence of each risk factor and its possible impact on construction project cost. The collected data was analyzed in several ways; the severity of each risk factor was obtained by multiplying its probability of occurrence by its impact. The findings of this study reveal that the most significant risk factors with a high severity impact on construction project cost are: Change of Oil Price, Delay of Materials and Equipment Delivery, Changes in Laws and Regulations, Improper Budgeting and Contingencies, Lack of Skilled Workforce and Personnel, Delays Caused by Contractor, Delays of Owner Payments, Delays Caused by Client, and Funding Risk. The results can be used as a basis for construction managers to make informed decisions and to produce risk response procedures and strategies to tackle these risks and reduce their negative impacts on construction project cost.
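The severity ranking described above (probability of occurrence multiplied by impact) can be sketched as follows; the scores are hypothetical illustrative values on a Likert-type scale, not the survey's data:

```python
# Severity of each risk factor = probability of occurrence * impact.
# The (probability, impact) scores below are hypothetical, for illustration.
risks = {
    "Change of Oil Price": (4.2, 4.5),
    "Delay of Materials and Equipment Delivery": (4.0, 4.1),
    "Lack of Skilled Workforce and Personnel": (3.8, 4.0),
    "Delays of Owner Payments": (3.5, 4.3),
}

severity = {name: p * i for name, (p, i) in risks.items()}
for name, s in sorted(severity.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{s:5.2f}  {name}")
```

Ranking factors by this product is what turns the two survey questions (likelihood and impact) into a single prioritized list.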

Keywords: construction cost, construction projects, Oman, risk factors, risk management

Procedia PDF Downloads 305
1103 Comparison of Risk Analysis Methodologies Through the Consequences Identification in Chemical Accidents Associated with Dangerous Flammable Goods Storage

Authors: Daniel Alfonso Reséndiz-García, Luis Antonio García-Villanueva

Abstract:

As a result of high industrial activity, which arises from the drive to satisfy society's needs for products and services, several chemical accidents have occurred, causing serious damage to different sectors: human, economic, infrastructure, and environmental losses. Historically, the study of these chemical accidents has determined that the causes are mainly due to human error (inexperienced personnel, negligence, lack of maintenance, and deficient risk analysis). Industries aim to increase production and reduce costs. However, it should be kept in mind that the cost of risk studies and of implementing barriers and safety systems is much lower than paying for the damage that could occur in the event of an accident, without forgetting that some things cannot be replaced, such as human lives. Therefore, it is of utmost importance to implement risk studies in all industries, as they provide information for prevention and planning. The aim of this study is to compare risk methodologies by identifying the consequences of accidents related to the storage of flammable dangerous goods, for decision making and emergency response. The methodologies considered in this study are qualitative and quantitative risk analysis and consequence analysis; the latter uses modeling software that provides the radius of affectation and the possible scope and magnitude of damage. Risk analysis is used to identify possible scenarios for chemical accidents in the storage of flammable substances. Once the possible risk scenarios have been identified, the characteristics of the substances, their storage, and the atmospheric conditions are entered into the software. The results provide information that allows the implementation of prevention, detection, control, and combat elements for emergency response, thus providing the necessary tools to avoid accidents and, if they do occur, to significantly reduce the magnitude of the damage. This study highlights the importance of risk studies applying the tools best suited to each case study. It also demonstrates the importance of knowing the risk exposure of industrial activities for better prevention, planning, and emergency response.

Keywords: chemical accidents, emergency response, flammable substances, risk analysis, modeling

Procedia PDF Downloads 45
1102 Modeling and Prediction of Hot Deformation Behavior of IN718

Authors: M. Azarbarmas, J. M. Cabrera, J. Calvo, M. Aghaie-Khafri

Abstract:

The modeling of hot deformation behavior under unseen conditions is important in metal forming. In this study, the hot deformation of IN718 has been characterized in the temperature range 950-1100 °C and strain rate range 0.001-0.1 s⁻¹ using hot compression tests. All stress-strain curves showed the occurrence of dynamic recrystallization. These curves were described quantitatively, and a constitutive equation relating the flow stress to the hot deformation parameters was then obtained successfully.
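Constitutive equations for hot deformation are commonly written in the hyperbolic-sine Arrhenius form, with the Zener-Hollomon parameter Z = ε̇·exp(Q/RT) and flow stress σ = (1/α)·asinh((Z/A)^(1/n)). A sketch with illustrative constants (A, α, n, and Q below are assumptions, not the fitted IN718 values from the paper):

```python
import math

R = 8.314  # J/(mol*K), universal gas constant
# Hypothetical material constants; the fitted IN718 values are in the paper.
A, alpha, n, Q = 1.0e14, 0.005, 4.5, 4.0e5   # s^-1, MPa^-1, -, J/mol

def flow_stress(strain_rate, T):
    """Flow stress (MPa) from the Zener-Hollomon parameter Z."""
    Z = strain_rate * math.exp(Q / (R * T))          # Z = eps_dot * exp(Q/RT)
    return math.asinh((Z / A) ** (1.0 / n)) / alpha  # sigma = asinh((Z/A)^(1/n)) / alpha

for T in (1223, 1323, 1373):  # 950-1100 C expressed in kelvin
    print(T, "K:", round(flow_stress(0.01, T), 1), "MPa")
```

With fitted constants in place of the illustrative ones, the function should reproduce the expected trend: stress falling with temperature and rising with strain rate.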

Keywords: compression test, constitutive equation, dynamic recrystallization, hot working

Procedia PDF Downloads 398
1101 Evaluation of Relationship between Job Stress Dimensions with Occupational Accidents in Industrial Factories in Southwest of Iran

Authors: Ali Ahmadi, Maryam Abbasi, Mohammad Mehdi Parsaei

Abstract:

Background: Stress in the workplace is today one of the most important public health concerns and a serious threat to the health of the workforce worldwide. Occupational stress can cause occupational accidents and reduce quality of life; as a result, it has a very undesirable impact on the performance of organizations, companies, and their human resources. This study aimed to evaluate the relationship between job stress dimensions and occupational accidents in industrial factories in Southwest Iran. Materials and Methods: This cross-sectional study was conducted among 200 workers in the summer of 2023 in the Southwest of Iran. Participants were selected using a convenience sampling method. The research tools were the Health and Safety Executive (HSE) stress questionnaire, with 35 questions and 7 dimensions, and a demographic form; a high score on this questionnaire indicates low job stress and pressure. All workers completed the informed consent form. Univariate analysis was performed using the chi-square test and the t-test. Multiple regression analysis was used to estimate odds ratios (OR) and 95% confidence intervals (CI) for the association of stress-related factors with job accidents. Stata 14.0 software was used for the analysis. Results: The mean age of the participants was 39.81 (6.36) years. The prevalence of job accidents was 28.0% (95% CI: 21.0, 34.0). Based on the results of the multiple logistic regression, after adjusting for confounding variables, a one-unit increase in the demand dimension score had a protective impact on the risk of job accidents (aOR=0.91, 95% CI: 0.85-0.95). Additionally, a one-unit increase in the managerial support (aOR=0.89, 95% CI: 0.83-0.95) and peer support (aOR=0.76, 95% CI: 0.67-0.87) scores was associated with fewer job accidents. Among the dimensions, a one-unit increase in the relationship (aOR=0.89, 95% CI: 0.80-0.98) and change (aOR=0.86, 95% CI: 0.74-0.96) scores reduced the odds of an accident occurring among the workers by 11% and 14%, respectively. However, there was no significant association between the role and control dimensions and job accidents (p>0.05). Conclusions: The results show that the prevalence of job accidents was alarmingly high. Our results suggest that an increase in the scores of the HSE questionnaire dimensions is significantly associated with a decrease in accident occurrence in the workplace. Therefore, planning to address stressful factors in the workplace seems necessary to prevent occupational accidents.
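The quoted percentage reductions follow directly from the adjusted odds ratios, since an aOR below 1 corresponds to a (1 - aOR) x 100% reduction in the odds per unit score. A quick check with the aOR values reported in the abstract:

```python
# aOR values as reported in the abstract; an aOR < 1 means lower odds of an
# accident for each one-unit increase in the dimension score.
dimensions = {
    "demand": 0.91,
    "managerial support": 0.89,
    "peer support": 0.76,
    "relationship": 0.89,
    "change": 0.86,
}

for name, aor in dimensions.items():
    reduction = (1 - aor) * 100
    print(f"{name}: odds reduced by {reduction:.0f}% per unit score")
```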

Keywords: HSE, Iran, job stress, occupational accident, safety, occupational health

Procedia PDF Downloads 29
1100 A Rare Case of Synchronous Colon Adenocarcinoma

Authors: Mohamed Shafi Bin Mahboob Ali

Abstract:

Introduction: A synchronous tumor is defined as the presence of more than one primary malignant lesion in the same patient at the index diagnosis. It is a rare occurrence, especially in the spectrum of colorectal cancer, where it accounts for less than 4% of cases. The underlying pathology of a synchronous tumor is thought to be a genomic factor, namely microsatellite instability (MSI), with the involvement of BRAF, KRAS, and the GSTM1 gene. There is no specific site of occurrence for synchronous colorectal tumors, but many studies have shown that synchronous tumors have about a 43% predominance in the ascending colon and are rare in the sigmoid colon. Case Report: We report the case of a young lady in her mid-30s, with no family history of colorectal cancer, who was diagnosed with synchronous adenocarcinoma of the descending colon and rectosigmoid region. Her presentation was quite perplexing, as she initially presented to the district hospital with simple, uncomplicated hemorrhoids and constipation. She was then referred to our center for further management after she developed a 'football'-sized right gluteal swelling with complete intestinal obstruction and bilateral lower-limb paralysis. We performed a CT scan and biopsy of the lesion, which found that the tumor engulfed the sacrococcygeal region, with more than one primary lesion in the colon as well as secondaries in the liver. The patient was operated on after a multidisciplinary meeting was held: pelvic exenteration with tumor debulking and anterior resection were performed. Postoperatively, she was referred to the oncology team for chemotherapy. She made a tremendous recovery in eight months, with partial regain of her lower-limb power. The patient remains under our follow-up, with an improved quality of life post-intervention. Discussion: Synchronous colon cancer is rare, with an incidence of 2.4% to 12.4%. It has a male predominance and is pathologically more advanced than a single colon lesion. Downstaging the disease by means of chemoradiotherapy has been shown to be effective in managing this tumor. It is commonly seen in the right colon, but in our case we found it in the left colon and the rectosigmoid. Conclusion: Managing a synchronous colon tumor can be challenging for surgeons, especially in deciding the extent of resection and the postoperative functional outcomes of the bowel; thus, individualized treatment strategies are needed to tackle this pathology.

Keywords: synchronous, colon, tumor, adenocarcinoma

Procedia PDF Downloads 79
1099 Crime Prevention with Artificial Intelligence

Authors: Mehrnoosh Abouzari, Shahrokh Sahraei

Abstract:

Today, with the increase in the quantity, severity, and variety of crimes, crime prevention faces a serious challenge that human resources alone, using traditional methods, cannot meet. One of the developments of the modern world is the presence of artificial intelligence in various fields, including criminal law; indeed, the use of artificial intelligence in criminal investigations and in fighting crime is a necessity in today's world. Its application in the struggle against crime goes far beyond, and is even separate from, that of other technologies, and its application in criminal science differs from prevention in that it extends to the prediction of crime. Crime prevention addresses three factors (the crime, the offender, and the victim) by changing the conditions of these factors: on the assumption that the criminal acts rationally, it increases the cost and risk of crime for him so that he desists from delinquency, and it makes the victim aware of self-care and of the possibility of exposure to danger, or otherwise makes crimes more difficult to commit. Artificial intelligence in the field of combating crime and social harm, like an all-seeing eye unconstrained by time and place, looks to the future and predicts the occurrence of a possible crime, thereby preventing crimes from occurring. The purpose of this article is to collect and analyze the studies conducted on the use of artificial intelligence in predicting and preventing crime: how capable is this technology of predicting crime and preventing it? The results show that the artificial intelligence technologies in use are capable of predicting and preventing crime and can find patterns in large data sets much more efficiently than humans. In crime prediction and prevention, the term artificial intelligence refers to the increasing use of technologies that apply algorithms to large data sets to assist or replace police work, including predicting the time and place of future criminal activities, effectively identifying patterns, and accurately predicting future behavior through data mining, machine learning, deep learning, data analysis, and neural networks. Because criminologists' knowledge can provide insight into risk factors for criminal behavior, among other issues, computer scientists can match this knowledge with the data sets that artificial intelligence uses.

Keywords: artificial intelligence, criminology, crime, prevention, prediction

Procedia PDF Downloads 52
1098 Geo-Spatial Distribution of Radio Refractivity and the Influence of Fade Depth on Microwave Propagation Signals over Nigeria

Authors: Olalekan Lawrence Ojo

Abstract:

Designing microwave terrestrial propagation networks requires a thorough evaluation of the severity of multipath fading, especially at frequencies below 10 GHz. In nations like Nigeria, without databases large enough to support the existing empirical models, the errors in the prediction techniques used for this evaluation may be severe. The need for higher bandwidth for various satellite applications makes the investigation of the effects of radio refractivity, fading due to multipath, and geoclimatic factors on satellite propagation links all the more important, and clear-air effects are one of the key elements to take into account for the optimal functioning of microwave frequencies. This work considers the geographical distribution of radio refractivity and fade depth over a number of stations in Nigeria. Five-year (2017-2021) measurements of atmospheric pressure, relative humidity, and temperature at two levels (the ground surface and a height of 100 m) from five locations in Nigeria (Akure, Enugu, Jos, Minna, and Sokoto) are studied to deduce their effects on signals propagated through a microwave communication link. The assessment considers microwave communication systems, the impacts of the dry and wet components of radio refractivity, and the effects of fade depth at various frequencies over a 20 km link distance. The results demonstrate that the dry term dominated the radio refractivity at the surface level, contributing a minimum of about 78% and a maximum of about 92%, while at a height of 100 meters it contributed a minimum of about 79% and a maximum of about 92%. The spatial distribution reveals that, regardless of height, the country's tropical rainforest (TRF) and freshwater swampy mangrove (FWSM) regions recorded the greatest values of radio refractivity. The statistical estimate shows that fading values can differ by as much as 1.5 dB, especially near the TRF and FWSM coastlines, even under clear-air conditions. The current findings will be helpful for budgeting Earth-space microwave links, particularly for the rollout of Nigeria's projected 5G and 6G microcellular networks.
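The dry and wet terms discussed above are conventionally computed from the ITU-R P.453 form of the refractivity, N = 77.6·P/T + 3.732e5·e/T², with total pressure P and water-vapour pressure e in hPa and T in kelvin. A sketch with hypothetical surface values for a humid station (the inputs are illustrative, not the measured Nigerian data):

```python
# Radio refractivity split into dry and wet terms (ITU-R P.453 form):
#   N = N_dry + N_wet = 77.6 * P / T  +  3.732e5 * e / T^2
def refractivity_terms(P_hpa, e_hpa, T_k):
    n_dry = 77.6 * P_hpa / T_k
    n_wet = 3.732e5 * e_hpa / T_k ** 2
    return n_dry, n_wet

# Hypothetical surface conditions for a warm, humid site (illustrative only).
n_dry, n_wet = refractivity_terms(P_hpa=1010.0, e_hpa=18.0, T_k=300.0)
share_dry = 100 * n_dry / (n_dry + n_wet)
print(f"N_dry = {n_dry:.1f} N-units, N_wet = {n_wet:.1f} N-units, "
      f"dry share = {share_dry:.0f}%")
```

For these illustrative inputs the dry term contributes about 78% of N, at the lower end of the range the abstract reports for surface level.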

Keywords: fade depth, geoclimatic factor, refractivity, refractivity gradient

Procedia PDF Downloads 47
1097 Atmospheric Circulation Patterns Inducing Coastal Upwelling in the Baltic Sea

Authors: Ewa Bednorz, Marek Polrolniczak, Bartosz Czernecki, Arkadiusz Marek Tomczyk

Abstract:

This study is meant as a contribution to research on the upwelling phenomenon, which is one of the most pronounced examples of sea-atmosphere coupling. The aim is to confirm the atmospheric forcing of sea water circulation and sea surface temperature along the variously oriented Baltic Sea coasts, and to identify the macroscale and regional circulation patterns triggering upwelling along different sections of this relatively small and semi-enclosed sea basin. Mean daily sea surface temperature data from the summer seasons (June-August) of the years 1982-2017 formed the basis for the detection of upwelling cases. For the atmospheric part of the analysis, monthly indices of the Northern Hemisphere macroscale circulation patterns were used. In addition, to identify the local direction of airflow, daily zonal and meridional regional circulation indices were constructed and introduced into the analysis. Finally, daily regional circulation patterns over the Baltic Sea region were distinguished by applying principal component analysis to gridded mean daily sea level pressure data. Within the Baltic Sea, upwelling is most frequent along the zonally oriented northern coast of the Gulf of Finland, the southern coasts of Sweden, and the middle part of the western Gulf of Bothnia coast. Among the macroscale circulation patterns, the Scandinavian type (SCAND), with a primary circulation center located over Scandinavia, has the strongest impact on the horizontal flow of surface sea waters in the Baltic Sea, which triggers upwelling. An anticyclonic center over Scandinavia in the positive phase of SCAND enhances the easterly airflow, which increases upwelling frequency along the southeastern Baltic coasts. The study showed that zonal circulation has a stronger impact on upwelling occurrence than meridional circulation, and that it can increase or decrease the chance of upwelling formation by more than 70% in some coastal sections. The positive and negative phases of the six distinguished regional daily circulation patterns yielded 12 different synoptic situations, which were analyzed in terms of their influence on upwelling formation. Each of them revealed some impact on the frequency of upwelling in some coastal section of the Baltic Sea; however, two kinds of synoptic situations seemed to have the strongest influence, namely pressure patterns enhancing the zonal flow, and synoptic patterns with cyclonic/anticyclonic centers over southern Scandinavia. Upwelling occurrence appeared to be particularly strongly reliant on atmospheric conditions in some specific coastal sections, namely the Gulf of Finland, the southeastern Baltic coasts (the Polish and Latvian-Lithuanian sections), and the western part of the Gulf of Bothnia. Concluding, it can be stated that atmospheric conditions strongly control the occurrence of upwelling within the Baltic Sea basin. Both local and macroscale circulation patterns, expressed by the location of the pressure centers, influence the frequency of this phenomenon; however, the strength of the impact varies depending on the coastal region. Acknowledgment: This research was funded by the National Science Centre, Poland, grant number 2016/21/B/ST10/01440.
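Distinguishing daily circulation patterns by principal component analysis of a gridded sea-level-pressure field can be sketched as follows; the SLP matrix here is random synthetic data (days x grid points) standing in for the real grids:

```python
import numpy as np

# Hypothetical daily SLP field: 120 days x 50 grid points (hPa).
rng = np.random.default_rng(1)
slp = rng.normal(1013.0, 8.0, size=(120, 50))

# PCA via SVD of the anomaly matrix (grid-point means removed).
anom = slp - slp.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)   # fraction of variance per component

patterns = vt[:2]            # two leading spatial patterns (EOFs)
scores = anom @ patterns.T   # daily PC time series = circulation indices
print(patterns.shape, scores.shape, f"var. explained: {explained[:2].sum():.2f}")
```

The rows of `vt` are the circulation patterns; the projection of each day's anomaly field onto them gives the daily indices whose positive and negative phases define the synoptic situations.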

Keywords: Baltic Sea, circulation patterns, coastal upwelling, synoptic conditions

Procedia PDF Downloads 97
1096 Impact Evaluation of Vaccination against Eight-Child-Killer Diseases on under-Five Children Mortality at Mbale District, Uganda

Authors: Lukman Abiodun Nafiu

Abstract:

This study evaluates the impact of vaccination against eight child-killer diseases on under-five child mortality in Mbale District. It was driven by three specific objectives: to determine the proportion of under-five child mortality due to the eight child-killer diseases in total under-five child mortality; to establish the cause-effect relationship between the eight child-killer diseases and under-five child mortality; and to establish the dependence of under-five child mortality on location in Mbale District. A community-based cross-sectional and longitudinal (panel) study design involving both quantitative and qualitative (focus group discussion and in-depth interview) approaches was employed over a period of 36 months. A multi-stage cluster design involving Health Sub-Districts (HSD), Forms of Ownership (FOO), and Health Facility Centres (HFC) as the first, second, and third stages, respectively, was used. Data were collected on the eight child-killer diseases, namely measles, pneumonia, pertussis (whooping cough), diphtheria, poliomyelitis (polio), tetanus, Haemophilus influenzae, and rotavirus gastroenteritis, and on mortality among immunized and non-immunized children aged 0-59 months; the children were monitored over a period of 24 months. The study used a sample of 384 children out of all the children registered each year at Mbale Referral Hospital and other primary health care centres (HCIV, HCIII, and HCII) in Mbale District between 2015 and 2019. These children were followed from birth to their current state (living or dead). The data collected were analysed using cross-tabulation and the chi-square test. The study concluded that the majority of mothers in Mbale District took their children for immunization, thus reducing the occurrence of under-five child mortality. Overall, 2.3%, 4.6%, 3.1%, 5.4%, 1.5%, 3.8%, 0.0%, and 0.0% of under-five children had polio, tetanus, diphtheria, measles, pertussis, pneumonia, Haemophilus influenzae, and rotavirus gastroenteritis, respectively, across all the sub-counties in Mbale District during the period considered. Also, location (sub-county) did not have a significant influence on the occurrence of these eight child-killer diseases among under-five children in Mbale District. The study therefore recommended that the government and agencies continue to work together to implement vaccination programs and increase access to basic health care, with continuous improvement of social interventions, to advance child survival.
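The chi-square test of independence between location and disease occurrence used above can be sketched as follows; the contingency table is hypothetical, not the study's counts:

```python
# Pearson chi-square statistic for an r x c contingency table:
#   sum over cells of (observed - expected)^2 / expected,
# where expected = (row total * column total) / grand total.
def chi_square(table):
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    return sum(
        (obs - rt * ct / total) ** 2 / (rt * ct / total)
        for r, rt in zip(table, rows)
        for obs, ct in zip(r, cols)
    )

# Hypothetical table: rows = sub-counties, columns = [cases, non-cases].
table = [[5, 95], [7, 93], [6, 94]]
print(round(chi_square(table), 3))
```

With 2 degrees of freedom the 5% critical value is about 5.99, so a statistic this small would, as in the study, give no evidence of a location effect.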

Keywords: diseases, mortality, children, vaccination

Procedia PDF Downloads 101
1095 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data

Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca

Abstract:

In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, their uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are developed for the data quality filtering of opportunistic species occurrence data used as input for species distribution models. Using an extensive database of 5.7 million citizen science records of 255 species in Flanders, the impact of three data quality filters on model performance was quantified, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic curve (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size.
Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species ‘quality profile’, resulting from a species classification based on the four traits related to data quality. The findings resulted in recommendations on when and how to filter volunteer generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.
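The AUC comparisons described above can be illustrated with the pairwise (Mann-Whitney) formulation of AUC: the probability that the model scores a random presence higher than a random background point. The model scores below are hypothetical, standing in for Maxent output before and after filtering:

```python
# Pairwise AUC: fraction of (presence, background) pairs ranked correctly,
# with ties counted as half.
def auc(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n) for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical model scores; filtering drops two low-quality presences.
unfiltered_pos = [0.9, 0.7, 0.4, 0.35, 0.3]
filtered_pos = [0.9, 0.7, 0.4]
background = [0.2, 0.3, 0.5, 0.1]

print("AUC unfiltered:", auc(unfiltered_pos, background))
print("AUC filtered:  ", auc(filtered_pos, background))
```

In this toy setup filtering raises the AUC, but the study's point stands: the gain from higher data quality has to be weighed against the loss of sample size.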

Keywords: citizen science, data quality filtering, species distribution models, trait profiles

Procedia PDF Downloads 167
1094 A Comparative Study of Insurance Policies Worldwide in Public Private Partnerships

Authors: Guanqun Shi, Xueqing Zhang

Abstract:

The frequent occurrence of failures in PPP projects, which have caused great losses, has drawn attention from governments as well as concessionaires. PPPs are complex arrangements owing to their long operation periods and multiple players, and many types of risks in PPP projects may cause a project to fail. Insurance is an important tool to transfer these risks. Through a comparison and analysis of international government PPP guidelines and contracts, as well as case studies worldwide, we have identified eight main insurance principles and discussed thirteen insurance types at different stages. An overall procedure is proposed to improve insurance practices in PPP projects.

Keywords: public private partnerships, insurance, contract, risk

Procedia PDF Downloads 252
1093 Seamless Mobility in Heterogeneous Mobile Networks

Authors: Mohab Magdy Mostafa Mohamed

Abstract:

The objective of this paper is to introduce a vertical handover (VHO) algorithm between wireless LANs (WLANs) and LTE mobile networks. The proposed algorithm is based on fuzzy control theory and takes into consideration the power level, subscriber velocity, and target cell load, instead of only the power level as in traditional algorithms. Simulation results show that network performance, in terms of the number of handovers and the handover occurrence distance, is improved.
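A fuzzy VHO controller combining the three inputs named above can be sketched in zero-order Sugeno style; the membership functions, rules, and weights below are illustrative assumptions, not the paper's tuned design:

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def handover_score(rss_dbm, speed_kmh, load):
    """Score near 1 -> hand over from WLAN to LTE; near 0 -> stay."""
    weak = tri(rss_dbm, -100, -90, -75)  # weak WLAN received signal
    fast = tri(speed_kmh, 20, 60, 120)   # high subscriber velocity
    busy = tri(load, 0.5, 0.8, 1.2)      # heavily loaded target cell

    # Rules (firing strength -> crisp consequent weight):
    #  R1: weak signal            -> hand over (1.0)
    #  R2: fast subscriber        -> hand over (0.8)
    #  R3: target cell overloaded -> stay      (0.0)
    w = [weak, fast, busy]
    z = [1.0, 0.8, 0.0]
    return sum(wi * zi for wi, zi in zip(w, z)) / max(sum(w), 1e-9)

print(round(handover_score(-88, 70, 0.3), 2))  # weak signal + fast user
```

A traditional power-only algorithm would reduce to rule R1; the extra inputs let the controller avoid handing over a slow user to an overloaded LTE cell.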

Keywords: vertical handover, fuzzy control theory, power level, speed, target cell load

Procedia PDF Downloads 318
1092 Improving Functionality of Radiotherapy Department through Systematic Periodic Clinical Audits

Authors: Kamal Kaushik, Trisha, Dandapni, Sambit Nanda, A. Mukherjee, S. Pradhan

Abstract:

INTRODUCTION: As complexity in radiotherapy practice and processes are increasing, there is a need to assure quality control to a greater extent. At present, no international literature available with regards to the optimal quality control indicators for radiotherapy; moreover, few clinical audits have been conducted in the field of radiotherapy. The primary aim is to improve the processes that directly impact clinical outcomes for patients in terms of patient safety and quality of care. PROCEDURE: A team of an Oncologist, a Medical Physicist and a Radiation Therapist was formed for weekly clinical audits of patient’s undergoing radiotherapy audits The stages for audits include Pre planning audits, Simulation, Planning, Daily QA, Implementation and Execution (with image guidance). Errors in all the parts of the chain were evaluated and recorded for the development of further departmental protocols for radiotherapy. EVALUATION: The errors at various stages of radiotherapy chain were evaluated and recorded for comparison before starting the clinical audits in the department of radiotherapy and after starting the audits. It was also evaluated to find the stage in which maximum errors were recorded. The clinical audits were used to structure standard protocols (in the form of checklist) in department of Radiotherapy, which may lead to further reduce the occurrences of clinical errors in the chain of radiotherapy. RESULTS: The aim of this study is to perform a comparison between number of errors in different part of RT chain in two groups (A- Before Audit and B-After Audit). Group A: 94 pts. (48 males,46 female), Total no. of errors in RT chain:19 (9 needed Resimulation) Group B: 94 pts. (61 males,33 females), Total no. of errors in RT chain: 8 (4 needed Resimulation) CONCLUSION: After systematic periodic clinical audits percentage of error in radiotherapy process reduced more than 50% within 2 months. 
There is a great need to improve quality control in radiotherapy, and the role of clinical audits can only grow. Although clinical audits are time-consuming and complex undertakings, the potential benefits in terms of identifying and rectifying errors in quality control procedures are enormous. Radiotherapy is a chain of processes, and there is always a probability that an error in any part of the chain will propagate through it until the execution of treatment. Structuring departmental protocols and policies helps in reducing, if not completely eradicating, the occurrence of such incidents.

Keywords: audit, clinical, radiotherapy, improving functionality

Procedia PDF Downloads 48
1091 Risk of Occupational Exposure to Cytotoxic Drugs: The Role of Handling Procedures of Hospital Workers

Authors: J. Silva, P. Arezes, R. Schierl, N. Costa

Abstract:

In order to study environmental contamination by cytostatic drugs in Portuguese hospitals, sampling campaigns were conducted in three hospitals in 2015 (112 samples). Platinum-containing drugs and fluorouracil were chosen because both were administered in high amounts. The detection limit was 0.01 pg/cm² for platinum and 0.1 pg/cm² for fluorouracil. The results show that spills occur mainly on the patient's chair, while the most frequently reported occurrence is due to an inadequately closed wrapper. Day hospital facilities had the largest number of contaminated samples and the highest levels of contamination.

Keywords: cytostatic, contamination, hospital, procedures, handling

Procedia PDF Downloads 269
1090 Study on Energy Absorption Characteristic of Cab Frame with FEM

Authors: Shigeyuki Haruyama, Oke Oktavianty, Zefry Darmawan, Tadayuki Kyoutani, Ken Kaminishi

Abstract:

The cab frame's strength is considered an important factor in excavator operator safety, especially during roll-over. In this study, we use a model of the cab frame with different thicknesses and perform elastoplastic numerical analysis using the Finite Element Method (FEM). The deformation mode and energy absorption of the cab frame parts are investigated under two conditions, with and without wrinkling. The occurrence of wrinkling when the cab frame deforms can reduce energy absorption, and among the 4 parts with wrinkling, the energy absorption decreases most significantly in part C. The residual stress generated during the bending process of part C is analyzed to confirm its potential for increasing the energy absorption.

Keywords: ROPS, FEM, hydraulic excavator, cab frame

Procedia PDF Downloads 405
1089 Speech Impact Realization via Manipulative Argumentation Techniques in Modern American Political Discourse

Authors: Zarine Avetisyan

Abstract:

The paper presents the scholarly discussion concerning speech impact, the peculiarities of its realization, and speech strategies and techniques. Departing from the viewpoints of many prominent linguists, the paper suggests that manipulative argumentation be viewed as a most pervasive speech strategy with a certain set of techniques, which are to be found in modern American political discourse. The prevalence of their occurrence allows us to regard them as pragmatic patterns of speech impact realization in effective public speaking.

Keywords: speech impact, manipulative argumentation, political discourse, technique

Procedia PDF Downloads 471
1088 Biomedical Definition Extraction Using Machine Learning with Synonymous Feature

Authors: Jian Qu, Akira Shimazu

Abstract:

OOV (Out Of Vocabulary) terms are terms that cannot be found in many dictionaries. Although it is possible to translate such OOV terms, the translations do not provide any real information for a user. We present an OOV term definition extraction method that uses information available from the Internet. We use features such as the occurrence of synonyms and location distances, and apply a machine learning method to find the correct definitions for OOV terms. We tested our method on both biomedical-type and name-type OOV terms; our method outperforms existing work with an accuracy of 86.5%.

Keywords: information retrieval, definition retrieval, OOV (out of vocabulary), biomedical information retrieval

Procedia PDF Downloads 463
1087 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics, such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently.
For a demonstration, a total of 13,254 metabolic syndrome training records are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, those associated with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, are intentionally included to gain predictive analytics insights on variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the hypotheses generated are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of the observations, such as bootstrap resampling with an appropriate sample size.
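The co-occurrence graph construction described above can be sketched in a few lines. This is a minimal illustration of the general idea, not the platform used in the study; the basket items below are hypothetical risk-factor labels:

```python
from itertools import combinations
from collections import Counter

def cooccurrence_graph(baskets):
    """Count node and arc frequencies across baskets; items in a basket
    are treated as fully connected (one arc per unordered pair)."""
    node_freq = Counter()
    arc_freq = Counter()
    for basket in baskets:
        items = sorted(set(basket))          # dedupe, canonical pair order
        node_freq.update(items)
        arc_freq.update(combinations(items, 2))
    return node_freq, arc_freq

# hypothetical "rows" of predictor values present in each observation
baskets = [
    ["obesity", "hypertension", "smoking"],
    ["obesity", "hypertension"],
    ["smoking", "low_activity"],
]
nodes, arcs = cooccurrence_graph(baskets)
```

Ranking the resulting arcs by frequency (or by surprise, observed vs. expected counts) is what surfaces candidate clusters, i.e., candidate hypotheses.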

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 244
1086 Lessons from Nature: Defensive Designs for the Built Environment

Authors: Rebecca A. Deek

Abstract:

There is evidence that erratic and extreme weather is becoming a common occurrence, and even predictions that it will become more frequent and more severe. It also appears that the severity of earthquakes is intensifying. Some observers believe that human conduct has given rise to such change; others attribute it to environmental and geological cycles. However, as some physicists, environmental scientists, politicians, and others continue to debate the connection between weather events, seismic activities, and climate change, other scientists, engineers, and urban planners are exploring how our habitat can become more responsive and resilient to such phenomena. A number of recent instances of nature's destructive events provide a basis for the development of defensive measures.

Keywords: biomimicry, natural disasters, protection of human lives, resilient infrastructures

Procedia PDF Downloads 474
1085 Drug Therapy Problem and Its Contributing Factors among Pediatric Patients with Infectious Diseases Admitted to Jimma University Medical Center, South West Ethiopia: Prospective Observational Study

Authors: Desalegn Feyissa Desu

Abstract:

Drug therapy problems are a significant challenge to providing high-quality health care for patients. They are associated with morbidity, mortality, increased hospital stay, and reduced quality of life. Moreover, pediatric patients are quite susceptible to drug therapy problems. Thus, this study aimed to assess drug therapy problems and their contributing factors among pediatric patients diagnosed with infectious disease admitted to the pediatric ward of Jimma University Medical Center from April 1 to June 30, 2018. A prospective observational study was conducted among pediatric patients with infectious disease admitted during this period. Drug therapy problems were identified using Cipolle and Strand's drug-related problem classification method. Patients' written informed consent was obtained after explaining the purpose of the study. Patient-specific data were collected using a structured questionnaire. Data were entered into EpiData version 4.0.2 and then exported to a statistical software package (version 21.0) for analysis. To identify predictors of drug therapy problem occurrence, multiple stepwise backward logistic regression analysis was done. The 95% CI was used to show the accuracy of the data analysis, and statistical significance was considered at p-value < 0.05. A total of 304 pediatric patients were included in the study. Of these, 226 (74.3%) patients had at least one drug therapy problem during their hospital stay, and a total of 356 drug therapy problems were identified among these 226 patients. Non-compliance (28.65%) and dose too low (27.53%) were the most common types of drug-related problems, while disease comorbidity [AOR=3.39, 95% CI=(1.89-6.08)], polypharmacy [AOR=3.16, 95% CI=(1.61-6.20)], and more than six days' stay in hospital [AOR=3.37, 95% CI=(1.71-6.64)] were independent predictors of drug therapy problem occurrence. Drug therapy problems were common in pediatric patients with infectious disease in the study area.
The presence of comorbidity, polypharmacy, and prolonged hospital stay were predictors of drug therapy problems in the study area. Therefore, to overcome the significant gaps in pediatric pharmaceutical care, clinical pharmacists, pediatricians, and other health care professionals have to work in collaboration.
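The adjusted odds ratios (AORs) above come from multivariable logistic regression; as a simpler, hedged sketch of the same kind of quantity, a crude (unadjusted) odds ratio with a Woolf-type 95% confidence interval can be computed directly from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio for a 2x2 table with a Woolf-type CI.
    a: exposed with DTP, b: exposed without, c: unexposed with, d: unexposed without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# hypothetical counts: drug therapy problems among comorbid vs non-comorbid patients
or_, lo, hi = odds_ratio_ci(80, 20, 40, 30)
```

Since the interval excludes 1, the (hypothetical) association would be statistically significant at the 5% level; the study's AORs additionally adjust for the other covariates.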

Keywords: drug therapy problem, pediatric, infectious disease, Ethiopia

Procedia PDF Downloads 127
1084 A Ground Observation Based Climatology of Winter Fog: Study over the Indo-Gangetic Plains, India

Authors: Sanjay Kumar Srivastava, Anu Rani Sharma, Kamna Sachdeva

Abstract:

Every year, fog formation over the Indo-Gangetic Plains (IGPs) of India during the winter months of December and January is believed to create numerous hazards, inconvenience, and economic loss for the inhabitants of this densely populated region of the Indian subcontinent. The aim of the paper is to analyze the spatial and temporal variability of winter fog over the IGPs. Long-term ground observations of visibility and other meteorological parameters (1971-2010) have been analyzed to understand the fog phenomenon and its relevance during the peak winter months of December and January over the IGP of India. In order to examine the temporal variability, time series and trend analyses were carried out using the Mann-Kendall statistical test. The trend analysis accepts the alternate hypothesis with a 95% confidence level, indicating that a trend exists. Kendall's tau statistic showed a positive correlation between time and fog frequency. Further, the Theil-Sen median slope estimate showed that the magnitude of the trend is positive. The magnitude is higher in January than in December for the entire IGP, except over the western IGP, where it is higher in December. Decade-wise time series analysis revealed a continuous increase in fog days, with a net overall increase of 99% over the IGP in the last four decades. Diurnal variability and average daily persistence were computed using descriptive statistical techniques. Geo-statistical analysis of fog was carried out to understand its spatial variability; it revealed that the IGP is a highly fog-prone zone, with fog occurring on more than 66% of days during the study period. Diurnal variability indicates that the peak occurrence of fog is between 06:00 and 10:00 local time, and average daily fog persistence extends to 5 to 7 hours during the peak winter season.
The results would offer a new perspective for taking proactive measures to reduce the irreparable damage that could be caused by the changing trends of fog.
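The trend machinery used above (the Mann-Kendall S statistic and the Theil-Sen median slope) is simple enough to sketch directly. The fog-day series below is hypothetical, not the observed IGP record:

```python
def mann_kendall_s(series):
    """Mann-Kendall S statistic: the sum of signs of all later-minus-earlier
    differences. S > 0 suggests an increasing trend."""
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)   # sign of the difference
    return s

def theil_sen_slope(series):
    """Theil-Sen estimator: the median of all pairwise slopes."""
    slopes = sorted(
        (series[j] - series[i]) / (j - i)
        for i in range(len(series) - 1)
        for j in range(i + 1, len(series))
    )
    m = len(slopes)
    mid = m // 2
    return slopes[mid] if m % 2 else (slopes[mid - 1] + slopes[mid]) / 2

# hypothetical fog-day counts for seven consecutive winters
fog_days = [12, 15, 14, 18, 21, 20, 25]
```

A positive S and a positive median slope together point to an increasing trend; significance is then judged by comparing S against its variance under the no-trend null.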

Keywords: fog, climatology, Mann-Kendall test, trend analysis, spatial variability, temporal variability, visibility

Procedia PDF Downloads 214
1083 A Multilevel Approach for Stroke Prediction Combining Risk Factors and Retinal Images

Authors: Jeena R. S., Sukesh Kumar A.

Abstract:

Stroke is one of the major causes of adult disability and morbidity in many developing countries, such as India. Early diagnosis of stroke is essential for timely prevention and cure. Various conventional statistical methods and computational intelligence models have been developed for predicting the risk and outcome of stroke. This research work focuses on a multilevel approach for predicting the occurrence of stroke based on various risk factors and non-invasive techniques like retinal imaging. This risk prediction model can aid in clinical decision making and help patients have an improved and reliable risk prediction.

Keywords: prediction, retinal imaging, risk factors, stroke

Procedia PDF Downloads 269
1082 Relation Between Traffic Mix and Traffic Accidents in a Mixed Industrial Urban Area

Authors: Michelle Eliane Hernández-García, Angélica Lozano

Abstract:

Studies of traffic accidents usually consider the relation between factors such as the type of vehicle, its operation, and the road infrastructure. Traffic accidents can be explained by different factors of greater or lesser relevance. Two zones are studied: a mixed industrial zone and its extended zone. The first zone has mainly residential (57%) and industrial (23%) land uses. Trucks travel mainly on the roads where industries are located, and four sensors give information about traffic and speed on the main roads. The extended zone (which includes the first zone) has mainly residential (47%) and mixed residential (43%) land uses, and just 3% industrial use. Its traffic mix is composed mainly of non-trucks, and 39 traffic and speed sensors are located on its main roads. The traffic mix in a mixed land use zone could be related to traffic accidents. To understand this relation, it is necessary to identify the elements of the traffic mix which are linked to traffic accidents. Models that attempt to explain which factors are related to traffic accidents have faced multiple methodological problems in obtaining robust databases. Poisson regression models are used to explain the accidents. The objective of the Poisson analysis is to estimate a vector that provides an estimate of the natural logarithm of the mean number of accidents per period; this estimate is obtained by standard maximum likelihood procedures. For the estimation of the relation between traffic accidents and the traffic mix, the database comprises eight variables, with 17,520 observations and six vectors.
In the model, the dependent variable is the occurrence or non-occurrence of accidents, and the explanatory vectors correspond to the vehicle classes C1 to C6, standing respectively for cars, microbuses and vans, buses, unitary trucks (2 to 6 axles), articulated trucks (3 to 6 axles), and bi-articulated trucks (5 to 9 axles); in addition, there is a vector for the average speed of the traffic mix. A Poisson model is applied, using a logarithmic link function and a Poisson family. For the first zone, the Poisson model shows a positive relation between traffic accidents and C6, average speed, C3, C2, and C1 (in decreasing order). The analysis of the coefficients shows a strong relation with bi-articulated trucks and buses (C6 and C3), indicating an important contribution of freight trucks. For the expanded zone, the Poisson model shows a positive relation between traffic accidents and average speed, bi-articulated trucks (C6), and microbuses and vans (C2). The coefficients obtained in both Poisson models show a stronger relation between freight trucks and traffic accidents in the industrial zone than in the expanded zone.
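The Poisson regression described above maximizes a log-likelihood in which the log of the mean accident count is linear in the explanatory vectors. As a hedged sketch of that objective (toy data with one covariate, not the study's 17,520 observations or six vehicle-class vectors):

```python
import math

def poisson_loglik(beta, X, y):
    """Log-likelihood of a Poisson regression with log link:
    log(mu_i) = x_i . beta, so ll = sum_i [y_i*eta_i - exp(eta_i) - log(y_i!)]."""
    ll = 0.0
    for x_i, y_i in zip(X, y):
        eta = sum(b * x for b, x in zip(beta, x_i))  # linear predictor
        ll += y_i * eta - math.exp(eta) - math.lgamma(y_i + 1)
    return ll

# toy data: an intercept plus one hypothetical "truck count" covariate
X = [[1, 0], [1, 1], [1, 2], [1, 3]]
y = [0, 1, 1, 3]   # accident counts per period
# the fitted coefficients are those maximizing this function over beta
# (standard software does so by Newton-Raphson / IRLS)
```

A positive fitted coefficient on a vehicle-class covariate then means that class multiplies the expected accident count, which is how the C6 (bi-articulated truck) effect above is read.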

Keywords: freight transport, industrial zone, traffic accidents, traffic mix, trucks

Procedia PDF Downloads 110
1081 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, the three polysemantic synonyms true, loyal, and faithful were analyzed in terms of the differences and similarities in their semantic structure. The corpus-based approach to the study of the above-mentioned adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special requirements regarding the genre, mode, or time of the texts included in the corpora. Of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g., true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study.
Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders or 'True,' said Phoebe, 'but I'd probably get to be a Union Official immediately' were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
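A minimal sketch of the co-occurrence extraction step is given below. It is a toy stand-in for the concordance queries run against the BNC and COCA, using a hypothetical two-sentence corpus:

```python
import re
from collections import Counter

def cooccurrences(texts, target, window=3):
    """Count words co-occurring with `target` within +/- `window` tokens,
    mimicking a collocation query over a corpus."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z']+", text.lower())
        for i, tok in enumerate(tokens):
            if tok == target:
                lo = max(0, i - window)
                context = tokens[lo:i] + tokens[i + 1:i + 1 + window]
                counts.update(context)
    return counts

# hypothetical miniature corpus
corpus = [
    "He remained faithful to his principles.",
    "She was faithful to her word.",
]
counts = cooccurrences(corpus, "faithful")
```

The resulting frequency list is exactly the kind of output that was then filtered by hand (dropping substantivized uses, bare occurrences, etc.) and classified into distribution groups.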

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 294
1080 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers

Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang

Abstract:

Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease, which is not only labor intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection, this paper proposes a detection approach based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and is composed of five main steps: image acquisition, pre-processing, segmentation, feature extraction, and classification. Two strawberry fields were used in this study. Images of healthy leaves and leaves infected with PM (Sphaerotheca macularis) were acquired under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. The colour co-occurrence matrix (CCM) was introduced for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from CCMs of National Television System Committee (NTSC) luminance and hue, saturation, and intensity (HSI) images. The normalized feature data were utilized for training and validation, respectively, using the developed classifiers. The classifiers were evaluated with internal, external, and cross-validation, and the best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier achieved 98.33%, 85.33%, 87.33%, 93.33%, and 95.0% accuracy on internal, external-I, external-II, 4-fold cross, and 5-fold cross-validation, respectively, whereas the kNN results showed 90.0%, 72.00%, 74.66%, 89.33%, and 90.3% classification accuracy, respectively.
The outcome of this study demonstrated that SVMs classified PM disease with a highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. Therefore, the overall results show that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
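The CCM construction reduces, per channel, to a gray-level co-occurrence matrix from which Haralick-style textural features are computed. The sketch below shows one offset and one feature (contrast) on a hypothetical 3-level image; it is an illustration of the technique, not the study's 40-feature pipeline:

```python
def cooccurrence_matrix(image, levels, dx=1, dy=0):
    """Gray-level co-occurrence matrix for one offset (dx, dy); the same
    construction underlies colour co-occurrence matrices computed per
    HSI channel. Counts pairs (image[r][c], image[r+dy][c+dx])."""
    m = [[0] * levels for _ in range(levels)]
    rows, cols = len(image), len(image[0])
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dy, c + dx
            if 0 <= r2 < rows and 0 <= c2 < cols:
                m[image[r][c]][image[r2][c2]] += 1
    return m

def contrast(m):
    """Haralick contrast: sum of (i - j)^2 weighted by normalized counts."""
    total = sum(sum(row) for row in m)
    return sum((i - j) ** 2 * v / total
               for i, row in enumerate(m) for j, v in enumerate(row))

# hypothetical 3x3 image quantized to 3 intensity levels
img = [[0, 0, 1],
       [1, 2, 2],
       [2, 2, 2]]
ccm = cooccurrence_matrix(img, levels=3)
```

Features such as contrast, energy, and homogeneity computed from these matrices form the vectors that the SVM and kNN classifiers are trained on.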

Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors

Procedia PDF Downloads 95
1079 Ethnic Conflict and African Women's Capacity for Preventive Diplomacy

Authors: Olaifa Temitope Abimbola

Abstract:

The occurrence of ethnic conflict in Nigeria, and indeed Africa, is sporadic and, to say the least, alarming. To scholars of ethnic conflict in Africa, it has defied all logical approaches to its resolution. Based on this fact, international organisations have begun to look for alternative means of approaching these conflicts, and not a few have agreed that wars are better and more cheaply prevented than resolved or transformed. In light of this, this paper sets out to examine the concepts of preventive diplomacy and ethnic conflict, as well as women and the role they play in mitigating conflict, by researching the activities of women in pre- and post-conflict situations in selected African conflicts; it establishes the peculiar capacity of women for dousing tension at both domestic and communal levels.

Keywords: preventive diplomacy, gender, peacebuilding, low

Procedia PDF Downloads 556
1078 Survival Data with Incomplete Missing Categorical Covariates

Authors: Madaki Umar Yusuf, Mohd Rizam B. Abubakar

Abstract:

Survival-censored data with incomplete covariate data are a common occurrence in many studies in which the outcome is survival time. When the missing covariates are categorical, a useful technique for obtaining parameter estimates is the EM algorithm by the method of weights. The survival outcome for the class of generalized linear models is applied, and this method requires the estimation of the parameters of the distribution of the covariates. In this paper, we consider clinical trial data with five covariates, four of which have some missing values, which clearly show that the data were censored.

Keywords: EM algorithm, incomplete categorical covariates, ignorable missing data, missing at random (MAR), Weibull Distribution

Procedia PDF Downloads 376
1077 Nullity of t-Tupple Graphs

Authors: Khidir R. Sharaf, Didar A. Ali

Abstract:

The nullity η(G) of a graph is the multiplicity of zero as an eigenvalue in its spectrum. A zero-sum weighting of a graph G is a real-valued function f from the vertices of G to the set of real numbers such that, for each vertex v of G, the sum of the weights f(w) over all neighbors w of v is zero. A high zero-sum weighting of G is one that uses the maximum number of non-zero independent variables. If G is a graph with an end vertex, and if H is the induced sub-graph of G obtained by deleting this vertex together with the vertex adjacent to it, then η(G) = η(H). In this paper, the high zero-sum weighting technique and the end-vertex procedure are applied to evaluate the nullity of t-tupple and generalized t-tupple graphs, which is derived and determined for some special types of graphs. We also introduce and prove some important results about the t-tupple coalescence, Cartesian, and Kronecker products of nut graphs.
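Since the nullity equals n minus the rank of the adjacency matrix, it can be computed exactly by Gaussian elimination over the rationals. The sketch below is a generic illustration of that definition, not the weighting technique of the paper:

```python
from fractions import Fraction

def nullity(adj):
    """Nullity = n - rank(A): the multiplicity of the eigenvalue 0 of the
    adjacency matrix. Uses exact Gauss-Jordan elimination over Fractions."""
    a = [[Fraction(v) for v in row] for row in adj]
    n = len(a)
    rank, row = 0, 0
    for col in range(n):
        # find a pivot at or below the current row
        pivot = next((r for r in range(row, n) if a[r][col] != 0), None)
        if pivot is None:
            continue
        a[row], a[pivot] = a[pivot], a[row]
        for r in range(n):
            if r != row and a[r][col] != 0:
                f = a[r][col] / a[row][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[row])]
        rank += 1
        row += 1
    return n - rank

# the path P3 has spectrum {sqrt(2), 0, -sqrt(2)}, so its nullity is 1
p3 = [[0, 1, 0],
      [1, 0, 1],
      [0, 1, 0]]
```

For P3 a zero-sum weighting is (a, 0, -a), using one non-zero independent variable, which matches η(P3) = 1 as the high zero-sum weighting technique predicts.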

Keywords: graph theory, graph spectra, nullity of graphs, statistic

Procedia PDF Downloads 204