Search results for: systemic methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15759

14439 Policy Analysis and Program Evaluation: Need to Designate a Navigable Spatial Identity for Slum Dwellers in India to Maximize Accessibility and Policy Impact

Authors: Resham Badri

Abstract:

Cities today are unable to ensure equitable distribution of their socio-economic and infrastructural benefits to the marginalized urban poor, and the emergence of a pressing pandemic like COVID-19 has amplified the impact of this failure. Lack of identity, vulnerability, and inaccessibility contribute to exclusion. Owing to systemic gaps in institutional processes, urban development policies fail to represent and cater to the urban poor. This paper aims to be a roadmap for the Indian Government to understand the significance of designating a navigable spatial identity for slum dwellers in the form of a digital address, which can form the fundamental basis of identification and enable access not only to basic services but also to other utilities. Capitalizing on such a granular, technology-backed approach would allow governments to target and reach the urban poor strategically and aid effective urban governance. This paper adopts a three-pronged approach: (i) policy analysis, understanding gaps in existing urban policies of India, such as the Pradhan Mantri Awas Yojana, Swachh Bharat Mission, and Aadhaar Card policy; (ii) program evaluation, analyzing a case study in which slum dwellers in Kolhapur city in India have been provided with navigable addresses using Google Plus Codes and have gained access to basic services, vaccinations, and other emergency deliveries in COVID-19 times; and (iii) policy recommendation. The designation of a navigable spatial identity has tremendous potential to form the foundation on which policies can base their data collection and service delivery processes, supporting not only basic services but also other infrastructural and social welfare initiatives. Hence, a massive window of opportunity lies in addressing the unaddressed to elevate their living standards and respond to their basic needs.

Keywords: policy analysis, urban poor, navigable spatial identity, accessibility

Procedia PDF Downloads 80
14438 Agile Management and Its Relationship to Administrative Ambidexterity: An Applied Study in the Alexandria Library

Authors: Samar Sheikhelsouk, Dina Abdel Qader, Nada Rizk

Abstract:

An organization's own planning may impede its progress and creativity, especially when it operates in independent environments and fast-shifting markets, unless the leaders and minds of the organization use the set of practices, tools, and techniques encapsulated in so-called "agile" or "lightweight" methods. This research paper therefore examines agile management as a flexible, dynamic approach and its relationship to administrative ambidexterity at the Alexandria Library. The sample of the study is the employees of the Alexandria Library. The study is expected to provide both theoretical and practical implications: it will bridge the gap between agile management and administrative approaches in the literature, and it will help managers comprehend the role of agile management in establishing administrative ambidexterity in their organizations.

Keywords: agile management, administrative innovation, Alexandria library, Egypt

Procedia PDF Downloads 82
14437 Introduction of the Harmfulness of the Seismic Signal in the Assessment of the Performance of Reinforced Concrete Frame Structures

Authors: Kahil Amar, Boukais Said, Kezmane Ali, Hannachi Naceur Eddine, Hamizi Mohand

Abstract:

The principle of seismic performance evaluation methods is to provide a measure of the extent to which a building, or a set of buildings, is liable to be damaged by an earthquake. The common objective of many of these methods is to supply classification criteria. The purpose of this study is to present a method for assessing the seismic performance of structures based on the pushover method. We are particularly interested in reinforced concrete frame structures, which represent a significant percentage of the structures damaged after a seismic event. The work is based on the characterization of the seismic motion of the various earthquake zones in terms of PGA and PGD, obtained by means of the SIMQK_GR and PRISM software, and a correlation between the performance points and the scalar characterizing the earthquakes will be developed.

Keywords: seismic performance, pushover method, characterization of seismic motion, harmfulness of the seismic signal

Procedia PDF Downloads 381
14436 Implementation of Enhanced Recovery after Surgery (ERAS) Protocols in Laparoscopic Sleeve Gastrectomy (LSG): A Systematic Review and Meta-analysis

Authors: Misbah Nizamani, Saira Malik

Abstract:

Introduction: Bariatric surgery is the most effective treatment for patients suffering from morbid obesity, and laparoscopic sleeve gastrectomy (LSG) accounts for over 50% of all bariatric procedures. The aim of our meta-analysis is to investigate the effectiveness and safety of Enhanced Recovery After Surgery (ERAS) protocols for patients undergoing LSG. Method: To gather data, we searched PubMed, Google Scholar, ScienceDirect, and Cochrane Central. Eligible studies were randomized controlled trials and cohort studies involving adult patients (≥18 years) undergoing laparoscopic sleeve gastrectomy. Outcome measures included length of stay (LOS), postoperative narcotic usage, postoperative pain score, postoperative nausea and vomiting, postoperative complications and mortality, emergency department visits, and readmission rates. RevMan version 5.4 was used to analyze outcomes. Results: Three RCTs and three cohort studies with 1522 patients were included. The ERAS and control groups were compared on eight outcomes. LOS was reduced significantly in the intervention group (p=0.00001), readmission rates did not differ significantly (p=0.35), and postoperative complications were higher in the control group but not significantly so (p=0.68), whereas the postoperative pain score was significantly reduced (p=0.005). Total morphine milligram equivalent (MME) requirements became significant after sensitivity analysis (p=0.0004). Postoperative mortality could not be analyzed because two cohort studies reported 0% mortality, leaving no events to compare. Conclusion: This systematic review indicates that applying ERAS protocols in LSG reduces length of stay, postoperative pain, and total postoperative MME requirements, supporting the feasibility of their application.
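The pooling that RevMan performs for a continuous outcome such as LOS can be sketched as a fixed-effect inverse-variance meta-analysis. The per-study mean differences and standard errors below are illustrative placeholders, not the included studies' data:

```python
import numpy as np

# Hypothetical per-study mean differences in length of stay (days)
# and their standard errors -- illustrative numbers only.
md = np.array([-0.8, -1.2, -0.5, -1.0])
se = np.array([0.30, 0.45, 0.25, 0.40])

w = 1.0 / se**2                       # inverse-variance weights
pooled = np.sum(w * md) / np.sum(w)   # fixed-effect pooled mean difference
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled MD = {pooled:.2f} days, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```

A random-effects model (as RevMan also offers) would additionally inflate each study's variance by a between-study heterogeneity estimate before weighting.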

Keywords: eras protocol, sleeve gastrectomy, bariatric surgery, enhanced recovery after surgery

Procedia PDF Downloads 39
14435 Robust Numerical Method for Singularly Perturbed Semilinear Boundary Value Problem with Nonlocal Boundary Condition

Authors: Habtamu Garoma Debela, Gemechis File Duressa

Abstract:

In this work, our primary interest is to provide ε-uniformly convergent numerical techniques for solving singularly perturbed semilinear boundary value problems with a non-local boundary condition. These singular perturbation problems are described by differential equations in which the highest-order derivative is multiplied by an arbitrarily small parameter ε, known as the singular perturbation parameter. This leads to the existence of boundary layers, narrow regions in the neighborhood of the boundary of the domain where the gradient of the solution becomes steep as the perturbation parameter tends to zero. Because of these layer phenomena, it is a challenging task to provide ε-uniform numerical methods. The term 'ε-uniform' refers to numerical methods in which the approximate solution converges to the corresponding exact solution (measured in the supremum norm) independently of the perturbation parameter ε. Thus, the purpose of this work is to develop, analyze, and improve ε-uniform numerical methods for solving singularly perturbed problems. These methods are based on a nonstandard fitted finite difference method. The basic idea behind the fitted operator finite difference method is to replace the denominator functions of the classical derivatives with positive functions derived in such a way that they capture notable properties of the governing differential equation. A uniformly convergent numerical method is constructed via the nonstandard fitted operator method combined with numerical integration, and the non-local boundary condition is treated using numerical integration techniques. Additionally, the Richardson extrapolation technique, which improves the first-order accuracy of the standard scheme to second-order convergence, is applied to singularly perturbed convection-diffusion problems using the proposed numerical method.
Maximum absolute errors and rates of convergence for different values of the perturbation parameter and mesh sizes are tabulated for the numerical example considered. The method is shown to be ε-uniformly convergent. Finally, extensive numerical experiments are conducted that support the theoretical findings. A concise conclusion is provided at the end of this work.
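The mechanism of Richardson extrapolation, which lifts a first-order scheme to second order, can be illustrated on a much simpler first-order approximation (a forward-difference derivative). This is a sketch of the principle only, not the paper's fitted-operator scheme:

```python
import math

def fwd_diff(f, x, h):
    """First-order forward-difference approximation of f'(x), error O(h)."""
    return (f(x + h) - f(x)) / h

x, h = 1.0, 0.1
d_h  = fwd_diff(math.sin, x, h)
d_h2 = fwd_diff(math.sin, x, h / 2)
# Richardson extrapolation: combine the approximations at h and h/2 so the
# leading O(h) error term cancels, leaving O(h^2) accuracy.
d_rich = 2 * d_h2 - d_h

exact = math.cos(x)
print(abs(d_h - exact), abs(d_rich - exact))
```

The same two-mesh combination, applied to the fitted finite difference solution on meshes of size h and h/2, is what raises the scheme's convergence order in the paper.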

Keywords: nonlocal boundary condition, nonstandard fitted operator, semilinear problem, singular perturbation, uniformly convergent

Procedia PDF Downloads 141
14434 An Automated R-Peak Detection Method Using Common Vector Approach

Authors: Ali Kirkbas

Abstract:

R peaks in an electrocardiogram (ECG) are signs of cardiac activity that reveal valuable information about cardiac abnormalities, which can lead to mortality in some cases. This paper examines the problem of detecting R-peaks in ECG signals, which is, in fact, a two-class pattern classification problem. To handle this problem with reliably high accuracy, we propose to use the common vector approach, a successful machine learning algorithm. The dataset used in the proposed method is obtained from the publicly available MIT-BIH database. The results are compared with those of other popular methods under standard performance metrics. The obtained results show that the proposed method performs better than the compared methods in terms of diagnostic accuracy and simplicity, and it can be operated on wearable devices.
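The core idea of the common vector approach is that every training sample of a class reduces to the same "common vector" once the class's difference subspace is projected out. A minimal numpy sketch of that property, with synthetic vectors standing in for ECG-beat features (not the paper's implementation):

```python
import numpy as np

def common_vector(X, ref=0):
    """Common vector of one class: the residue of a reference sample after
    projecting out the class difference subspace span{x_i - x_ref}.
    X has shape (n_samples, n_features) with n_samples - 1 < n_features."""
    diffs = (np.delete(X, ref, axis=0) - X[ref]).T   # basis of the subspace
    Q, _ = np.linalg.qr(diffs)                       # orthonormalize it
    return X[ref] - Q @ (Q.T @ X[ref])               # remove in-class variation

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 16))        # 4 samples of one class, 16 features
cv0 = common_vector(X, ref=0)
cv2 = common_vector(X, ref=2)
print(np.allclose(cv0, cv2))        # the common vector is reference-independent
```

Classification then assigns a test beat to the class whose common vector is nearest to the beat's projection onto that class's complement subspace.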

Keywords: ECG, R-peak classification, common vector approach, machine learning

Procedia PDF Downloads 62
14433 The Impact of Diseases and Epidemics in the Field of Medicine and Health in General

Authors: Nedjar Abdelhadi

Abstract:

The pharmaceutical industry is one of the most important structures and foundations for the management and development of the modern world, particularly its advanced economies, with some exceptions among third-world countries. The world has witnessed radical transformations and changes, some of which improved it and some of which affected the path of its growth. The research opens with a detailed overview of the current state of the world in terms of growth and development, which serves as its introduction. The first chapter is divided into three sections, each devoted to one of the new methods of manufacturing, deriving, and developing medicines; several examples of various recently developed medicines are given. The second chapter deals with the defects and shortcomings suffered by pioneers and drug makers at various levels, in various regions, and in major companies regarded as international, especially those specialized in the manufacture of medicines for viral, chronic, and incurable diseases. The third chapter is devoted to marketing methods, methods of achieving sales, and the basics of distributing medicines and preparing the minds of consumers. The research concludes that the current world has become completely different from the world we used to know in the manufacture, sale, and marketing of medicines, and it notes that one of the biggest factors driving change in the field of medicine was the coronavirus disaster. The research closes by showing the importance and necessity of the pharmaceutical industry and its effective role, not only in the development of mankind but, above all, in its survival.

Keywords: health, diseases, medicine, epidemics

Procedia PDF Downloads 68
14432 Identifying Autism Spectrum Disorder Using Optimization-Based Clustering

Authors: Sharifah Mousli, Sona Taheri, Jiayuan He

Abstract:

Autism spectrum disorder (ASD) is a complex developmental condition involving persistent difficulties with social communication, restricted interests, and repetitive behavior. The challenges associated with ASD can interfere with an affected individual’s ability to function in social, academic, and employment settings. Although, to the best of our knowledge, there is no effective medication to treat ASD, early intervention can significantly improve an affected individual’s overall development. Hence, an accurate diagnosis of ASD at an early phase is essential, and machine learning approaches can improve and speed up that diagnosis. In this paper, we focus on the application of unsupervised clustering methods to ASD, since the large volume of ASD data generated through hospitals, therapy centers, and mobile applications has no pre-existing labels. We conduct a comparative analysis using seven clustering approaches (K-means, agglomerative hierarchical, model-based, fuzzy C-means, affinity propagation, self-organizing maps, and linear vector quantisation) as well as the recently developed optimization-based clustering (COMSEP-Clust) approach. We evaluate the performance of the clustering methods extensively on real-world ASD datasets encompassing different age groups: toddlers, children, adolescents, and adults. Our experimental results suggest that the COMSEP-Clust approach outperforms the other seven methods in recognizing ASD with well-separated clusters.
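Such a comparison can be miniaturized with scikit-learn on synthetic data. This is illustrative only: the study uses real ASD screening datasets, and COMSEP-Clust is not a scikit-learn estimator.

```python
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.metrics import adjusted_rand_score

# Two well-separated synthetic clusters stand in for ASD/non-ASD groups.
X, y = make_blobs(n_samples=300, centers=2, cluster_std=1.0, random_state=42)

scores = {}
for name, model in [("k-means", KMeans(n_clusters=2, n_init=10, random_state=0)),
                    ("agglomerative", AgglomerativeClustering(n_clusters=2))]:
    labels = model.fit_predict(X)
    # Cluster labels are arbitrary, so evaluate against the generating
    # labels with a permutation-invariant index (adjusted Rand index).
    scores[name] = adjusted_rand_score(y, labels)

print(scores)
```

On real, unlabeled ASD data, internal indices such as the silhouette score would replace the adjusted Rand index, since no ground-truth labels exist.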

Keywords: autism spectrum disorder, clustering, optimization, unsupervised machine learning

Procedia PDF Downloads 115
14431 Prediction of Road Accidents in Qatar by 2022

Authors: M. Abou-Amouna, A. Radwan, L. Al-kuwari, A. Hammuda, K. Al-Khalifa

Abstract:

There is growing concern over the increasing incidence of road accidents and the consequent loss of human life in Qatar. In light of the planned future event in Qatar, the 2022 World Cup, Qatar should take into consideration future deaths caused by road accidents, and past trends should be considered to give a reasonable picture of what may happen. Qatar's roads should be arranged and paved in a way that accommodates the high population expected at that time, since there will be a huge number of visitors from around the world. Qatar should also consider the road accident risks raised in that period and plan to maintain a high level of safety strategies. Given the increase in the number of road accidents in Qatar from 1995 to 2012, the elements affecting and causing road accidents are analyzed. This paper aims to identify and critically assess the factors that most strongly cause road accidents in the State of Qatar and to predict the total number of road accidents in Qatar in 2022. Alternative methods are discussed, and the most applicable ones according to previous research are selected for further study. The methods that suit the existing case in Qatar were the multiple linear regression model (MLR) and the artificial neural network (ANN). Those methods are analyzed and their findings compared. Using MLR, the number of accidents in 2022 is predicted to reach 355,226; using ANN, 216,264. We conclude that MLR gave better results than ANN because the artificial neural network does not fit data with large variability well.
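The regression-and-extrapolate step can be sketched in a few lines. The yearly counts below are synthetic placeholders with an upward trend, not Qatar's statistics, and a single time trend stands in for the study's multi-variable MLR model:

```python
import numpy as np

# Synthetic yearly accident counts for 1995-2012 (illustrative only).
years = np.arange(1995, 2013)
rng = np.random.default_rng(1)
accidents = 4000 + 2000 * (years - 1995) + rng.normal(0, 800, years.size)

# Fit a linear trend and extrapolate it to the target year.
coef = np.polyfit(years, accidents, deg=1)
pred_2022 = np.polyval(coef, 2022)
print(round(pred_2022))
```

Extrapolating ten years beyond the fitted range is exactly where model classes diverge most, which is consistent with the large gap between the MLR and ANN forecasts in the abstract.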

Keywords: road safety, prediction, accident, model, Qatar

Procedia PDF Downloads 257
14430 Review on Optimization of Drinking Water Treatment Process

Authors: M. Farhaoui, M. Derraz

Abstract:

In drinking water treatment processes, optimization of the treatment is an issue of particular concern. In general, the process consists of several units: settling, coagulation, flocculation, sedimentation, filtration, and disinfection. Optimizing the process consists of measures to decrease managing and monitoring expenses and improve the quality of the produced water. The objective of this study is to provide water treatment operators with methods and practices that enable the most effective use of the facility and, in consequence, optimize the price per cubic meter of treated water. This paper reviews the optimization of the drinking water treatment process by analyzing each of the treatment units and gives some solutions for maximizing water treatment performance without compromising water quality standards. Some of these solutions and methods have been implemented in the water treatment plant located in Meknes, in central Morocco.

Keywords: coagulation process, optimization, turbidity removal, water treatment

Procedia PDF Downloads 420
14429 Passenger Preferences on Airline Check-In Methods: Traditional Counter Check-In Versus Common-Use Self-Service Kiosk

Authors: Cruz Queen Allysa Rose, Bautista Joymeeh Anne, Lantoria Kaye, Barretto Katya Louise

Abstract:

The study presents passengers' preferences regarding the quality of service provided by the two airline check-in methods currently present in airports: traditional counter check-in and common-use self-service kiosks. One study has shown that airlines perceive self-service kiosks alone as sufficient to ensure adequate service and customer satisfaction; in contrast, agents and passengers stated that kiosks alone are not enough and that human interaction is essential. With reference to former studies that established opposing ideas about which airline check-in method is more favorable, the purpose of this study is to present a recommendation that fills the gap between these conflicting ideas by comparing perceived quality of service through the RATER model. Furthermore, this study discusses the major competencies present in each method, supported by two theories: the FIRO Theory of Needs, which upholds the importance of inclusion, control, and affection, and queueing theory, which points to the discipline of passengers and the length of the queue line as important factors affecting service quality. The findings of the study were based on data gathered from selected Thomasian third-year and fourth-year college students enrolled in the first semester of academic year 2014-2015 who had already experienced both airline check-in methods, selected through stratified probability sampling. The statistical treatments applied to interpret the data were mean, frequency, standard deviation, t-test, logistic regression, and the chi-square test. The study revealed a greater effect on passenger preference of the satisfaction experienced with common-use self-service kiosks than with traditional counter check-in.

Keywords: traditional counter check-in, common-use self-service Kiosks, airline check-in methods

Procedia PDF Downloads 405
14428 Expression of ULK-1 mRNA in Human Peripheral Blood Mononuclear Cells from Patients with Alzheimer's Disease

Authors: Ali Bayram, Remzi Yiğiter

Abstract:

Objective: Alzheimer's disease (AD), the most common cause of dementia, is a progressive neurodegenerative disease. At present, AD is typically diagnosed late in its course. We therefore attempted to find peripheral biomarkers for the early diagnosis of AD. Herein, we investigated unc-51-like autophagy activating kinase 1 (ULK1) mRNA expression levels in peripheral blood mononuclear cells from patients with AD. Method: To determine whether ULK1 gene expression is altered in AD patients, we measured its expression in peripheral blood cells from 50 patients with AD and 50 age- and gender-matched healthy controls by the quantitative real-time PCR technique. Results: ULK1 gene expression in peripheral blood cells was significantly decreased in patients with AD compared with controls (p<0.05). Lower levels of ULK1 gene expression were significantly associated with increased risk for AD. Conclusions: ULK1 is a serine/threonine protein kinase involved in autophagy in response to starvation. It acts upstream of the phosphatidylinositol 3-kinase PIK3C3 to regulate the formation of autophagophores, the precursors of autophagosomes, and is part of regulatory feedback loops in autophagy, acting both as a downstream effector and a negative regulator of mammalian target of rapamycin complex 1 (mTORC1) via interaction with RPTOR. It is activated via phosphorylation by AMPK and in turn regulates AMPK by mediating phosphorylation of the AMPK subunits PRKAA1, PRKAB2, and PRKAG1, negatively regulating AMPK activity. It may phosphorylate ATG13/KIAA0652 and RPTOR, although such data need additional evidence. It also plays a role early in neuronal differentiation and is required for granule cell axon formation.
Our results indicate that ULK1 gene expression is decreased in AD patients, suggesting a possible systemic involvement of autophagy in AD.
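Relative expression in qRT-PCR studies of this kind is commonly computed with the comparative 2^(-ΔΔCt) method (Livak and Schmittgen). A sketch with hypothetical Ct values, not the study's measurements:

```python
def fold_change(ct_target_sample, ct_ref_sample, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression via the comparative 2^(-ddCt) method:
    normalize the target gene to a reference (housekeeping) gene in each
    group, then compare patient to control."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    dd_ct = d_ct_sample - d_ct_ctrl
    return 2 ** (-dd_ct)

# Hypothetical Ct values: ULK1 vs a housekeeping gene,
# AD patient vs healthy control (illustrative numbers only).
fc = fold_change(26.0, 18.0, 24.5, 18.0)
print(fc)   # a value below 1 means lower ULK1 expression in the patient sample
```

A later Ct means less starting template, so a positive ΔΔCt (as here) translates into a fold change below 1, matching the reported decrease in ULK1 expression.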

Keywords: Alzheimer’s disease, ULK1, mRNA expression, RT-PCR

Procedia PDF Downloads 396
14427 Preparation and Characterization of Nanometric Ni-Zn Ferrite via Different Methods

Authors: Ebtesam. E. Ateia, L. M. Salah, A. H. El-Bassuony

Abstract:

The aim of the presented study was to explore the possibility of developing a nanosized material with enhanced structural properties suitable for many applications. Nanostructured ferrite of composition Ni0.5 Zn0.5 Cr0.1 Fe1.9 O4 was prepared by the sol-gel, co-precipitation, citrate-gel, flash, and oxalate precursor methods. Structural and microstructural analyses of the investigated samples were carried out. It was observed that the lattice parameter of the cubic spinel was constant and that the positions of both the tetrahedral and octahedral bands were fixed. The values of the lattice parameter play a significant role in determining the stoichiometric cation distribution of the composition. The average crystallite sizes of the investigated samples ranged from 16.4 to 69 nm. A comparison of the average crystallite sizes of the samples indicates that co-precipitation was the most effective method for producing samples with small crystallite size.
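Crystallite sizes in this range are typically estimated from X-ray diffraction peak broadening via the Scherrer equation, D = Kλ / (β cos θ). A sketch with illustrative peak values, not the study's diffraction data:

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, k=0.9):
    """Crystallite size (nm) from XRD line broadening via the Scherrer
    equation D = K*lambda / (beta * cos(theta)); beta is the peak FWHM
    converted to radians, theta is half the diffraction angle."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2)
    return k * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation and a hypothetical width for the spinel (311) peak.
size_nm = scherrer_size(0.15406, 0.5, 35.5)
print(round(size_nm, 1))
```

In practice the instrumental broadening is subtracted from the measured FWHM first, and several peaks are averaged to report a mean crystallite size.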

Keywords: chemical preparation, ferrite, grain size, nanocomposites, sol-gel

Procedia PDF Downloads 339
14426 Finite-Sum Optimization: Adaptivity to Smoothness and Loopless Variance Reduction

Authors: Bastien Batardière, Joon Kwon

Abstract:

For finite-sum optimization, variance-reduced (VR) gradient methods compute at each iteration the gradient of a single function (or of a mini-batch), yet achieve faster convergence than SGD thanks to a carefully crafted lower-variance stochastic gradient estimator that reuses past gradients. Another important line of research of the past decade in continuous optimization is adaptive algorithms such as AdaGrad, which dynamically adjust the (possibly coordinate-wise) learning rate to past gradients and thereby adapt to the geometry of the objective function. Variants such as RMSprop and Adam demonstrate outstanding practical performance that has contributed to the success of deep learning. In this work, we present AdaLVR, which combines the AdaGrad algorithm with loopless variance-reduced gradient estimators such as SAGA or L-SVRG, and benefits from a straightforward construction and a streamlined analysis. We show that AdaLVR inherits both the good convergence properties of VR methods and the adaptive nature of AdaGrad: in the case of L-smooth convex functions, we establish a gradient complexity of O(n + (L + √nL)/ε) without prior knowledge of L. Numerical experiments demonstrate the superiority of AdaLVR over state-of-the-art methods. Moreover, we empirically show that RMSprop and Adam combined with variance-reduced gradient estimators achieve even faster convergence.
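The combination can be sketched in numpy: an L-SVRG estimator ("loopless" meaning the reference point is refreshed at random instead of in an outer loop) feeding a coordinate-wise AdaGrad step. This is a hedged illustration on a least-squares finite sum, in the spirit of AdaLVR but not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
b = rng.normal(size=n)

# Finite sum f(w) = (1/n) * sum_i 0.5 * (a_i . w - b_i)^2
def grad_i(w, i):
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
ref, mu = w.copy(), full_grad(w)   # L-SVRG reference point and its full gradient
G = np.zeros(d)                    # AdaGrad per-coordinate accumulator
eta, p = 0.5, 0.1
for _ in range(5000):
    i = rng.integers(n)
    g = grad_i(w, i) - grad_i(ref, i) + mu   # unbiased, variance-reduced estimate
    G += g * g
    w -= eta * g / np.sqrt(G + 1e-8)         # coordinate-wise adaptive step
    if rng.random() < p:                     # loopless: random reference refresh
        ref, mu = w.copy(), full_grad(w)

print(np.linalg.norm(full_grad(w)))          # should be near zero
```

Because the estimator's variance vanishes near the optimum, the AdaGrad accumulator stops growing and the effective step size stabilizes, which is what lets the combination retain fast VR-style convergence.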

Keywords: convex optimization, variance reduction, adaptive algorithms, loopless

Procedia PDF Downloads 68
14425 Application of Queuing Theory in Warehouse Optimization

Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova

Abstract:

Optimizing store management involves more than designing the store itself, including its equipment, technology, and operation. It also requires synchronizing the technological, transport, storage, and service operations throughout the whole logistic chain, so that a natural flow of material from provider to consumer is achieved by the shortest possible route, in the shortest possible time, in the requested quality and quantity, and with minimum costs. The paper deals with the application of queuing theory to the optimization of warehouse processes. The first part gives general information about warehousing and the use of mathematical methods for optimizing logistics chains. The second part develops a model of a warehouse within queuing theory. The paper concludes with two examples of using queuing theory in practice.
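A minimal building block of such a warehouse model is the M/M/1 queue, whose steady-state metrics have closed forms. The sketch below (an illustration, not the paper's model) treats a loading dock as a single server with Poisson truck arrivals:

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue: Poisson arrivals at rate
    lam, exponential service at rate mu, one server."""
    rho = lam / mu                  # utilisation; must be < 1 for stability
    if rho >= 1:
        raise ValueError("unstable queue: arrival rate >= service rate")
    L = rho / (1 - rho)             # mean number in system
    W = 1 / (mu - lam)              # mean time in system (Little's law: L = lam*W)
    Lq = rho**2 / (1 - rho)         # mean number waiting in queue
    Wq = rho / (mu - lam)           # mean waiting time in queue
    return {"rho": rho, "L": L, "W": W, "Lq": Lq, "Wq": Wq}

# Trucks arriving at 4 per hour to a dock that can serve 6 per hour:
m = mm1_metrics(4, 6)
print(m)
```

Varying lam and mu in such formulas shows directly how close-to-capacity operation explodes waiting times, which is the quantitative core of queue-based warehouse sizing.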

Keywords: queuing theory, logistics system, mathematical methods, warehouse optimization

Procedia PDF Downloads 590
14424 Performance Evaluation of Contemporary Classifiers for Automatic Detection of Epileptic EEG

Authors: K. E. Ch. Vidyasagar, M. Moghavvemi, T. S. S. T. Prabhat

Abstract:

Epilepsy is a global problem, and with seizures eluding even the smartest of diagnoses, automatic detection using the electroencephalogram (EEG) would have a huge impact on diagnosis of the disorder. Among the multitude of methods for automatic epilepsy detection, one should find the best method, based on accuracy, for classification. This paper reasons out and rationalizes the best methods for classification. Accuracy depends on the classifier, and thus this paper discusses classifiers such as quadratic discriminant analysis (QDA), classification and regression trees (CART), support vector machines (SVM), the naive Bayes classifier (NBC), linear discriminant analysis (LDA), K-nearest neighbors (KNN), and artificial neural networks (ANN). Results show that the ANN is the most accurate of all the above classifiers, with 97.7% accuracy, 97.25% specificity, and 98.28% sensitivity. It is followed closely by the SVM, with only 1% variation in the results. These results should help researchers choose the best classifier for the detection of epilepsy.
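A scaled-down version of such a classifier comparison can be run with scikit-learn on synthetic feature vectors. This is illustrative only; the study extracts features from real EEG recordings and also evaluates QDA, CART, NBC, and ANN models:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score

# Synthetic two-class "seizure vs non-seizure" feature vectors.
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=7)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=7)

accs = {}
for name, clf in [("KNN", KNeighborsClassifier()),
                  ("SVM", SVC()),
                  ("LDA", LinearDiscriminantAnalysis())]:
    clf.fit(Xtr, ytr)
    accs[name] = accuracy_score(yte, clf.predict(Xte))

print(accs)
```

On imbalanced seizure data, sensitivity and specificity (as the abstract reports) are more informative than raw accuracy, so a confusion-matrix-based evaluation would normally accompany this loop.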

Keywords: classification, seizure, KNN, SVM, LDA, ANN, epilepsy

Procedia PDF Downloads 519
14423 Screening and Optimization of Pretreatments for Rice Straw and Their Utilization for Bioethanol Production Using Developed Yeast Strain

Authors: Ganesh Dattatraya Saratale, Min Kyu Oh

Abstract:

Rice straw is one of the most abundant lignocellulosic waste materials, with an annual world production of about 731 Mt. This study addresses the effective utilization of this waste biomass for biofuel production. We present a comparative assessment of numerous pretreatment strategies for rice straw, comprising the major physical, chemical, and physicochemical methods. Among the different methods employed, alkaline pretreatment in combination with sodium chlorite/acetic acid delignification proved the most efficient, with a significant improvement in the enzymatic digestibility of rice straw. A cellulase dose of 20 filter paper units (FPU) released a maximum of 63.21 g/L of reducing sugar, corresponding to a 94.45% hydrolysis yield and a 64.64% glucose yield from rice straw. The effects of the different pretreatment methods on biomass structure and complexity were investigated by FTIR, XRD, and SEM. Finally, the enzymatic hydrolysate of rice straw was fermented with the developed Saccharomyces cerevisiae SR8 strain, which efficiently ferments both xylose and glucose and gave higher ethanol production. The development of bioethanol production from lignocellulosic waste biomass is thus a generic, applicable methodology with great implications for using 'green raw materials' and producing 'green products', much needed today.

Keywords: rice straw, pretreatment, enzymatic hydrolysis, FPU, Saccharomyces cerevisiae SR8, ethanol fermentation

Procedia PDF Downloads 537
14422 Environmental Exposure Assessment among Refuellers at Brussels South Charleroi Airport

Authors: Mostosi C., Stéphenne J., Kempeneers E.

Abstract:

Introduction: Refuellers at Brussels South Charleroi Airport (BSCA) expressed concerns about the risks involved in handling JET-A1 fuel. The HSE Manager of BSCA, in collaboration with the occupational physician and the industrial hygiene unit of the External Service of Occupational Medicine, decided to assess the toxicological exposure of these workers. Materials and methods: Two measurement methods were used. The first was to assay three types of metabolites in urine to highlight exposure to the xylenes, toluene, and benzene in aircraft fuels. Of the 32 refuellers in the department, 26 participated in the sampling and 23 samples were usable. The second method targeted the assessment of environmental exposure to certain potentially hazardous substances that refuellers are likely to breathe in work areas at the airport. Two ambient air measurement campaigns were carried out, using static systems on the one hand and, on the other, individual sensors worn by the refuellers at the level of the respiratory tract. Volatile organic compounds and diesel particles were analyzed. Results: Despite the fears that motivated these analyses, the overall results showed low levels of exposure, far below the existing limit values, both in air quality and in urinary measurements. Conclusion: These results are comparable to those of a study carried out in several French airports. The staff could be reassured, and the medical surveillance was adapted by the occupational physician. With the development of aviation at BSCA, equipment and methods are evolving, and exposure will have to be reassessed.

Keywords: refuelling, airport, exposure, fuel, occupational health, air quality

Procedia PDF Downloads 84
14421 Novel Aminoglycosides to Target Resistant Pathogens

Authors: Nihar Ranjan, Derrick Watkins, Dev P. Arya

Abstract:

Current methods in the study of the antibiotic activity of ribosome-targeted antibiotics depend on cell-based bacterial inhibition assays or various forms of ribosomal binding assays. These assays are typically independent of each other, and little direct correlation between ribosomal binding and bacterial inhibition is established with the complementary assay. We have developed novel high-throughput-capable assays for ribosome-targeted drug discovery. One such assay examines a compound's ability to bind to a model ribosomal RNA A-site. We have also coupled this assay to other functional orthogonal assays. Such analysis can provide a valuable understanding of the relationship between two complementary drug screening methods and could be used as a standard analysis to correlate the affinity of a compound for its target with the effect the compound has on a cell.

Keywords: bacterial resistance, aminoglycosides, screening, drugs

Procedia PDF Downloads 369
14420 Hybrid Approach for the Min-Interference Frequency Assignment

Authors: F. Debbat, F. T. Bendimerad

Abstract:

Efficient frequency assignment for radio communications becomes increasingly crucial as new information technologies and their applications develop. It consists in defining an assignment of frequencies to the radio links to be established between base stations and mobile transmitters. Separation of the assigned frequencies is necessary to avoid interference. However, unnecessary separation causes an excess requirement for spectrum, the cost of which may be very high. The problem is NP-hard, so conventional exact optimization algorithms cannot solve realistic instances, and it is therefore necessary to use metaheuristic methods. This paper proposes a hybrid approach based on simulated annealing (SA) and tabu search (TS) to solve this problem. Computational results, obtained on a number of standard problem instances, attest to the effectiveness of the proposed approach.
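The abstract does not give implementation details, but the core of such a hybrid can be sketched in a few lines. The instance below (links, separation constraints, and interference counted as the number of violated separations) is a toy assumption for illustration, not the authors' benchmark or exact formulation:

```python
import math
import random

# Toy min-interference instance: assign a frequency to each radio link;
# constrained link pairs must keep a minimum channel separation.
LINKS = 8
FREQS = list(range(6))
CONSTRAINTS = [(0, 1, 2), (1, 2, 1), (2, 3, 2), (3, 4, 1),
               (4, 5, 2), (5, 6, 1), (6, 7, 2), (0, 7, 1)]

def interference(assign):
    """Cost: number of violated separation constraints."""
    return sum(1 for a, b, sep in CONSTRAINTS if abs(assign[a] - assign[b]) < sep)

def hybrid_sa_ts(iters=5000, t0=2.0, cooling=0.999, tabu_len=20, seed=1):
    """Simulated annealing with a tabu list forbidding recently undone moves."""
    rng = random.Random(seed)
    assign = [rng.choice(FREQS) for _ in range(LINKS)]
    best, best_cost = assign[:], interference(assign)
    tabu, temp = [], t0
    for _ in range(iters):
        link, freq = rng.randrange(LINKS), rng.choice(FREQS)
        if (link, freq) in tabu:           # TS component: skip tabu moves
            continue
        cand = assign[:]
        cand[link] = freq
        delta = interference(cand) - interference(assign)
        # SA component: always accept improvements, sometimes accept uphill moves
        if delta <= 0 or rng.random() < math.exp(-delta / temp):
            tabu.append((link, assign[link]))  # undoing this move is now tabu
            if len(tabu) > tabu_len:
                tabu.pop(0)
            assign = cand
            if interference(assign) < best_cost:
                best, best_cost = assign[:], interference(assign)
        temp *= cooling
    return best, best_cost
```

On an instance this small, the hybrid typically drives the violation count to zero; on the standard benchmarks mentioned in the abstract, the neighborhood and cost function would be the full frequency-assignment ones.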

Keywords: cellular mobile communication, frequency assignment problem, optimization, tabu search, simulated annealing

Procedia PDF Downloads 382
14419 Circular Labour Migration and Its Consequences in Georgia

Authors: Manana Lobzhanidze

Abstract:

Introduction: The paper will argue that labor migration is the most important problem Georgia faces today. The structure of Georgian labor migration by age and gender is analyzed, and the main driving factors of circular labor migration during the last ten years are identified. While studying migration, it is necessary to discuss the interconnection of economic, social, and demographic features, also taking into consideration state regulation policy in education and professional training. Methodology: Different research methods are applied in the presented paper: statistical ones, such as selection, grouping, observation, and trend analysis, and qualitative research methods, namely analysis, synthesis, induction, deduction, and comparison. Main Findings: Labor migrants fill the host labor market as low-salary workers. The main positive effect of migration from developing countries is poverty eradication, but this process is accompanied by problems such as 'brain drain': the country loses an important part of its intellectual potential, in which households or the state itself have invested. Conclusions: Labor migration is characterized as temporary, but the socio-economic problems of the country often push labor migration toward long-term and illegal migration. Countries with developed economies try to tighten migration policy and fight illegal migration with different methods; circular migration helps solve this problem. Conclusions and recommendations are included about the consequences of circular labor migration in Georgia and its influence on the reduction of the unemployment level.

Keywords: migration, circular labor migration, labor migration employment, unemployment

Procedia PDF Downloads 177
14418 A Neural Approach for the Offline Recognition of the Arabic Handwritten Words of the Algerian Departments

Authors: Salim Ouchtati, Jean Sequeira, Mouldi Bedda

Abstract:

In this work, we present an offline system for the recognition of the Arabic handwritten words of the Algerian departments. The study is based mainly on evaluating the performance of a neural network trained with the gradient back-propagation algorithm. The parameters used to form the input vector of the neural network are extracted from the binary image of the handwritten word by several methods: distribution parameters, the centered moments of the different projections, and the Barr features. It should be noted that these methods are applied to segments obtained by dividing the binary image of the word into six segments. The classification is achieved by a multilayer perceptron. Detailed experiments are carried out, and satisfactory recognition results are reported.
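As an illustration of the projection-based features, the sketch below computes centered moments of row and column projections over six vertical segments of a binary word image. The moment orders and segmentation scheme are simplified assumptions, and the Barr features are omitted:

```python
# Sketch of projection-moment feature extraction for a binary word image,
# represented as a list of lists of 0/1. The six-segment split follows the
# paper, but the exact moment orders here are illustrative assumptions.

def projections(img):
    """Horizontal and vertical projections (row/column pixel counts)."""
    rows = [sum(r) for r in img]
    cols = [sum(c) for c in zip(*img)]
    return rows, cols

def centered_moments(proj, orders=(2, 3)):
    """Centered moments of a 1-D projection profile."""
    total = sum(proj) or 1
    mean = sum(i * p for i, p in enumerate(proj)) / total
    return [sum(((i - mean) ** k) * p for i, p in enumerate(proj)) / total
            for k in orders]

def word_features(img, n_segments=6):
    """Split the word image into vertical segments and stack their features."""
    width = len(img[0])
    step = max(1, width // n_segments)
    feats = []
    for s in range(n_segments):
        seg = [row[s * step:(s + 1) * step] for row in img]
        rows, cols = projections(seg)
        feats += centered_moments(rows) + centered_moments(cols)
    return feats
```

The resulting feature vector is what would be fed to the multilayer perceptron for classification.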

Keywords: handwritten word recognition, neural networks, image processing, pattern recognition, features extraction

Procedia PDF Downloads 513
14417 Generating Product Description with Generative Pre-Trained Transformer 2

Authors: Minh-Thuan Nguyen, Phuong-Thai Nguyen, Van-Vinh Nguyen, Quang-Minh Nguyen

Abstract:

Research on automatically generating descriptions for e-commerce products has gained increasing attention in recent years. However, the descriptions generated by existing systems are often uninformative and unattractive because of a lack of training data or the limitations of approaches that rely on templates or statistical methods. In this paper, we explore a method to generate product descriptions using the GPT-2 model. In addition, we apply text paraphrasing and task-adaptive pretraining techniques to improve the quality of the descriptions generated by the GPT-2 model. Experimental results show that our models outperform the baseline model in both automatic and human evaluation. In particular, our methods achieve promising results not only on the seen test set but also on the unseen test set.
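Generation with GPT-2 ultimately reduces to repeatedly sampling from the model's next-token distribution. The stdlib-only sketch below shows one top-k sampling step over hypothetical logits, independent of any particular model library, to make the decoding idea concrete; the token scores are invented for illustration:

```python
import math
import random

def top_k_sample(logits, k=3, temperature=0.8, seed=0):
    """One decoding step: keep the k highest-scoring tokens, apply a softmax
    with temperature, and sample. `logits` maps token -> raw model score."""
    rng = random.Random(seed)
    top = sorted(logits.items(), key=lambda kv: kv[1], reverse=True)[:k]
    scaled = [score / temperature for _, score in top]
    peak = max(scaled)
    weights = [math.exp(s - peak) for s in scaled]   # numerically stable softmax
    r, acc = rng.random() * sum(weights), 0.0
    for (token, _), w in zip(top, weights):
        acc += w
        if r <= acc:
            return token
    return top[-1][0]

# Hypothetical next-token scores after a prompt such as "soft and breathable":
logits = {"fabric": 4.1, "cotton": 3.7, "design": 2.9, "banana": -1.0}
```

In a full system, `logits` would come from the (task-adaptively pretrained) language model at each step, and the sampled token would be appended to the prompt before the next step.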

Keywords: GPT-2, product description, transformer, task-adaptive, language model, pretraining

Procedia PDF Downloads 195
14416 Inhalable Lipid-Coated-Chitosan Nano-Embedded Microdroplets of an Antifungal Drug for Deep Lung Delivery

Authors: Ranjot Kaur, Om P. Katare, Anupama Sharma, Sarah R. Dennison, Kamalinder K. Singh, Bhupinder Singh

Abstract:

Respiratory microbial infections, being among the leading causes of death worldwide, are difficult to treat, as the microbes reside deep inside the airways, where only a small fraction of drug can reach after traditional oral or parenteral administration. As a result, high doses of drugs are required to maintain drug levels above the minimum inhibitory concentration (MIC) at the infection site, unfortunately leading to severe systemic side effects. Delivering antimicrobials directly to the respiratory tract therefore provides an attractive way out in such situations. In this context, the current study embarks on the systematic development of lung-lipid-modified chitosan nanoparticles for inhalation of voriconazole. Following the principles of quality by design, the chitosan nanoparticles were prepared by the ionic gelation method and further coated with the major lung lipid by the precipitation method. The factor screening studies were performed by fractional factorial design, followed by optimization of the nanoparticles by Box-Behnken design. The optimized formulation has a particle size range of 170-180 nm, PDI of 0.3-0.4, zeta potential of 14-17, entrapment efficiency of 45-50%, and drug loading of 3-5%. The presence of a lipid coating was confirmed by FESEM, FTIR, and XRD. Furthermore, the nanoparticles were found to be safe up to 40 µg/ml on A549 and Calu-3 cell lines. The quantitative and qualitative uptake studies also revealed the uptake of nanoparticles in lung epithelial cells. Moreover, data from Spraytec and next-generation impactor studies confirmed the deposition of nanoparticles in the lower airways. The interaction of the nanoparticles with DPPC monolayers also signifies their biocompatibility with the lungs. Overall, the study describes the methodology and potential of lipid-coated chitosan nanoparticles as a future inhalation nanomedicine for the management of pulmonary aspergillosis.

Keywords: dipalmitoylphosphatidylcholine, nebulization, DPPC monolayers, quality-by-design

Procedia PDF Downloads 142
14415 Forecasting Residential Water Consumption in Hamilton, New Zealand

Authors: Farnaz Farhangi

Abstract:

Many people in New Zealand believe that access to water is inexhaustible, a belief that comes from a history of virtually unrestricted access to it. For a region like Hamilton, one of New Zealand's fastest-growing cities, it is crucial for policy makers to know about future water consumption and the implementation of rules and regulations such as universal water metering. Hamilton residents use water freely and have little idea how much water they use. Hence, one of the objectives of this research is forecasting water consumption using different methods. Residential water consumption time series exhibit seasonal and trend variations. Seasonality is the pattern caused by repeating events such as weather conditions in summer and winter, public holidays, etc. The problem with this seasonal fluctuation is that it dominates the other time series components and makes it difficult to identify other variations (such as the effects of educational campaigns, regulation, etc.) in the series. Apart from seasonality, a stochastic trend is also combined with the seasonality and affects the forecasting results. According to the forecasting literature, pre-processing (de-trending and de-seasonalization) is essential for better forecasting performance, while other researchers argue that seasonally non-adjusted data should be used. Hence, I address the question: is pre-processing essential? A wide range of forecasting methods exists, with different pros and cons. In this research, I apply double seasonal ARIMA and an artificial neural network (ANN), considering diverse elements such as seasonality and calendar effects (public and school holidays), and combine their results to find the best predicted values. I examine the results of the combined method (hybrid model) and the individual methods, comparing their accuracy and robustness. In order to use ARIMA, the data should be stationary. ANN, in turn, has successful forecasting applications for seasonal and trend time series. Using a hybrid model is a way to improve the accuracy of the individual methods: because water demand is dominated by different seasonalities, I combine different methods to capture their sensitivity to weather conditions, calendar effects, and other seasonal patterns. The advantage of this combination is the reduction of errors by averaging the individual models. It is also useful when we are not sure about the accuracy of each forecasting model, and it can ease the problem of model selection. Using daily residential water consumption data from January 2000 to July 2015 in Hamilton, I show how predictions by the different methods vary. ANN produces more accurate forecasts than the other methods, and pre-processing is essential when using seasonal time series. The hybrid model reduces average forecasting errors and increases performance.
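The combination step of the hybrid model is essentially pointwise averaging of the individual forecasts. The sketch below uses small synthetic numbers (not the Hamilton data) to show why averaging helps when the component models err in opposite directions:

```python
def mae(actual, forecast):
    """Mean absolute error between observed and forecast values."""
    return sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)

def combine(*forecasts):
    """Hybrid forecast: pointwise mean of the individual model outputs."""
    return [sum(vals) / len(vals) for vals in zip(*forecasts)]

# Synthetic daily consumption and two imperfect model outputs: the
# "ARIMA-like" series over-predicts, the "ANN-like" series under-predicts.
actual = [100, 120, 115, 130, 125]
arima_fc = [104, 125, 119, 133, 130]
ann_fc = [97, 116, 112, 126, 121]

hybrid_fc = combine(arima_fc, ann_fc)
```

Here the opposite-signed errors nearly cancel in the average, so the hybrid's MAE falls below both components'; with correlated errors, the gain would be smaller.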

Keywords: artificial neural network (ANN), double seasonal ARIMA, forecasting, hybrid model

Procedia PDF Downloads 336
14414 Statistical Analysis to Select Evacuation Route

Authors: Zaky Musyarof, Dwi Yono Sutarto, Dwima Rindy Atika, R. B. Fajriya Hakim

Abstract:

Each country should be responsible for the safety of its people, especially those living in disaster-prone areas. One such responsibility is to provide an evacuation route for them. Until now, however, the selection of evacuation routes has seemed poorly organized: when a disaster happens, many people accumulate on the steps of the evacuation route. That condition is dangerous because it hampers the evacuation process. Using several methods of statistical analysis, the authors suggest how to prepare an evacuation route that is organized and based on people's habits. Those methods are association rules, sequential pattern mining, hierarchical cluster analysis, and fuzzy logic.
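Of the four methods, association rules are the simplest to make concrete. The sketch below mines support/confidence rules from hypothetical drill records of route segments, only to illustrate the kind of habit pattern such an analysis would feed into route planning:

```python
from itertools import combinations

# Hypothetical records of route segments used by evacuees in past drills.
transactions = [
    {"gate_A", "stair_1", "exit_N"},
    {"gate_A", "stair_1", "exit_N"},
    {"gate_A", "stair_2", "exit_S"},
    {"gate_B", "stair_1", "exit_N"},
    {"gate_B", "stair_2", "exit_S"},
]

def support(itemset):
    """Fraction of records containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.4, min_conf=0.7):
    """Pairwise association rules (antecedent -> consequent, confidence)
    meeting the support and confidence thresholds."""
    items = set().union(*transactions)
    found = []
    for a, b in combinations(sorted(items), 2):
        for ante, cons in ((a, b), (b, a)):
            s = support({ante, cons})
            if s >= min_support and s / support({ante}) >= min_conf:
                found.append((ante, cons, s / support({ante})))
    return found
```

A rule such as "stair_1 -> exit_N with confidence 1.0" would suggest sizing the northern exit for everyone who habitually takes stairway 1.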

Keywords: association rules, sequential pattern mining, cluster analysis, fuzzy logic, evacuation route

Procedia PDF Downloads 502
14413 Challenges in the Material and Action-Resistance Factor Design for Embedded Retaining Wall Limit State Analysis

Authors: Kreso Ivandic, Filip Dodigovic, Damir Stuhec

Abstract:

The paper deals with the proposed 'material factor' and 'action-resistance factor' design methods for embedded retaining walls. A parametric analysis was performed to evaluate the differences between the output values of the two methods and to compare them with the classic-approach computation. Choosing between the calculation design methods proposed in Eurocode 7 is challenging with respect to current technical regulations and regular engineering practice. The basic criterion for applying a particular design method is to ensure at minimum an equal degree of reliability relative to current practice. The procedure of combining the relevant partial coefficients according to the design methods was carried out. The use of the mentioned partial coefficients should result in the same level of safety regardless of load combinations, material characteristics, and problem geometry. This proposed approach to the partial coefficients related to the material and/or action-resistance is aimed at building a bridge between the calculations used so far and pure probability analysis. The measure used to compare the results was an equivalent safety factor determined for each analysis. The results show a visibly wide span of equivalent values of the classic safety factors.
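The notion of an equivalent safety factor can be made concrete with a minimal calculation: when a partial-factor check E_d <= R_d is just satisfied, the implied global factor R_k / E_k equals the product of the partial factors. The factor values below are generic illustrations, not the specific Eurocode 7 design-approach sets analyzed in the paper:

```python
def design_check(e_k, r_k, gamma_e, gamma_r):
    """Partial-factor limit-state check: design action effect vs design resistance."""
    e_d = gamma_e * e_k      # factored (design) action effect
    r_d = r_k / gamma_r      # factored (design) resistance
    return e_d <= r_d

def equivalent_global_fs(gamma_e, gamma_r):
    """Global safety factor R_k / E_k implied when the check is just satisfied."""
    return gamma_e * gamma_r

# Generic example: action factor 1.35, resistance factor 1.4 (illustrative only).
fs_equiv = equivalent_global_fs(1.35, 1.4)
```

For an embedded wall the mapping is far less direct, since earth pressure acts as both action and resistance, which is precisely why the paper finds a wide span of equivalent classic safety factors rather than a single product of coefficients.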

Keywords: action-resistance factor design, classic approach, embedded retaining wall, Eurocode 7, limit states, material factor design

Procedia PDF Downloads 229
14412 Comparison of Cyclone Design Methods for Removal of Fine Particles from Plasma Generated Syngas

Authors: Mareli Hattingh, I. Jaco Van der Walt, Frans B. Waanders

Abstract:

A waste-to-energy plasma system was designed by Necsa for commercial use to create electricity from unsorted municipal waste. Fly ash particles must be removed from the syngas stream at operating temperatures of 1000 °C and recycled back into the reactor for complete combustion. A 2D2D high-efficiency cyclone separator was chosen for this purpose. During this study, two cyclone design methods were explored: the Classic Empirical Method (smaller cyclone) and the Flow Characteristics Method (larger cyclone). These designs were optimized with regard to efficiency, so as to remove at minimum 90% of the fly ash particles of average size 10 μm by 50 μm. Wood was used as the feed source at a concentration of 20 g/m3 syngas. The two designs were then compared at room temperature, using Perspex test units and three feed gases of different densities, namely nitrogen, helium, and air. System conditions were imitated by adapting the gas feed velocity and particle load for each gas respectively. Helium, the least dense of the three gases, simulates higher temperatures, whereas air, the densest gas, simulates a lower temperature. The average cyclone efficiencies ranged between 94.96% and 98.37%, reaching up to 99.89% in individual runs. The lowest efficiency attained was 94.00%. Furthermore, the design of the smaller cyclone proved to be more robust, while the larger cyclone demonstrated a stronger correlation between its separation efficiency and the feed temperature. The larger cyclone can be assumed to achieve slightly higher efficiencies at elevated temperatures. However, both design methods led to good designs. At room temperature, the difference in efficiency between the two cyclones was almost negligible. At higher temperatures, however, these general tendencies are expected to be amplified, so that the difference between the two design methods will become more obvious. Though the design specifications were met by both designs, the smaller cyclone is recommended as the default particle separator for the plasma system due to its robust nature.
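The classic empirical route can be illustrated with the textbook Lapple cut-size model and its grade-efficiency curve. The geometry and operating values below are placeholders for roughly air at room temperature, not Necsa's actual design data:

```python
import math

def lapple_cut_diameter(mu, width, n_turns, v_inlet, rho_p, rho_g):
    """Lapple cut size d50 [m]: the particle diameter collected at 50% efficiency."""
    return math.sqrt(9 * mu * width /
                     (2 * math.pi * n_turns * v_inlet * (rho_p - rho_g)))

def collection_efficiency(d, d50):
    """Lapple grade-efficiency curve for a particle of diameter d [m]."""
    return 1.0 / (1.0 + (d50 / d) ** 2)

# Placeholder operating point (illustrative values only):
mu = 1.8e-5        # gas viscosity [Pa s]
width = 0.05       # cyclone inlet width [m]
n_turns = 6        # effective number of gas turns inside the cyclone
v_inlet = 15.0     # inlet velocity [m/s]
rho_p, rho_g = 2000.0, 1.2   # particle and gas density [kg/m^3]

d50 = lapple_cut_diameter(mu, width, n_turns, v_inlet, rho_p, rho_g)
```

Substituting the density and viscosity of helium, or of hot syngas at 1000 °C, shifts d50 and hence the predicted efficiency, which is the temperature sensitivity the study probed with its three test gases.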

Keywords: cyclone, design, plasma, renewable energy, solid separation, waste processing

Procedia PDF Downloads 211
14411 Effect of Different Porous Media Models on Drug Delivery to Solid Tumors: Mathematical Approach

Authors: Mostafa Sefidgar, Sohrab Zendehboudi, Hossein Bazmara, Madjid Soltani

Abstract:

Based on findings from clinical applications, most drug treatments fail to eliminate malignant tumors completely, even though drug delivery through systemic administration may inhibit their growth. A better understanding of tumor formation is therefore crucial in developing more effective therapeutics. For this purpose, solid tumor modeling and simulation results are nowadays used to predict how therapeutic drugs are transported to tumor cells by blood flow through capillaries and tissues. A solid tumor is treated as a porous medium for fluid flow simulation. Most studies use the Darcy model for the porous medium; in the Darcy model, fluid friction is neglected and a few simplifying assumptions are made. In this study, the effect of these assumptions is examined by also considering the Brinkman model. A multi-scale mathematical method that calculates fluid flow to a solid tumor is used to investigate how neglecting fluid friction affects the solid tumor simulation. In this work, the mathematical model of our previous studies is extended by considering two models of the momentum equation for porous media: Darcy and Brinkman. The mathematical method involves processes such as fluid flow through the solid tumor as a porous medium, extravasation of blood from the vessels, blood flow through the vessels, solute diffusion, and convective transport in the extracellular matrix. The sprouting angiogenesis model is used for generating the capillary network, and the fluid flow governing equations are then applied to calculate blood flow through the tumor-induced capillary network. Finally, the two porous media models are used for modeling fluid flow in normal and tumor tissues for three different tumor shapes. Simulations of interstitial fluid transport in a solid tumor demonstrate that the simplifications used in the Darcy model affect the interstitial velocity: the Brinkman model predicts a lower interstitial velocity than the Darcy model does.
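The difference between the two momentum models can be seen already in one dimension: Darcy's law gives a uniform velocity, while Brinkman's viscous term brings the velocity to zero at a no-slip boundary. The finite-difference sketch below uses illustrative tissue-like parameters, not the paper's tumor data:

```python
# 1-D comparison of Darcy and Brinkman velocity profiles across a porous slab
# with no-slip walls. Parameter values are illustrative assumptions only.

def brinkman_profile(n=201, length=1e-3, k=1e-10, mu=1e-3, dpdx=-1e4):
    """Finite-difference solve of mu*u'' - (mu/k)*u = dp/dx with u=0 at both walls."""
    h = length / (n - 1)
    m = n - 2                                # number of interior nodes
    a = [mu / h**2] * m                      # sub-diagonal
    b = [-2 * mu / h**2 - mu / k] * m        # diagonal
    c = [mu / h**2] * m                      # super-diagonal
    d = [dpdx] * m                           # right-hand side
    for i in range(1, m):                    # Thomas algorithm: forward sweep
        w = a[i] / b[i - 1]
        b[i] -= w * c[i - 1]
        d[i] -= w * d[i - 1]
    u = [0.0] * n                            # walls stay at zero velocity
    u[n - 2] = d[-1] / b[-1]
    for i in range(m - 2, -1, -1):           # back substitution
        u[i + 1] = (d[i] - c[i] * u[i + 2]) / b[i]
    return u

def darcy_velocity(k=1e-10, mu=1e-3, dpdx=-1e4):
    """Darcy's law: uniform u = -(k/mu) * dp/dx, fluid friction neglected."""
    return -(k / mu) * dpdx
```

With these values, the Brinkman profile matches the Darcy velocity away from the walls but is sharply reduced within a boundary layer of thickness on the order of sqrt(k), so its interstitial velocity is lower on average, consistent with the comparison reported in the abstract.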

Keywords: solid tumor, porous media, Darcy model, Brinkman model, drug delivery

Procedia PDF Downloads 305
14410 Atherosclerosis Prevalence Within Populations of the Southeastern United States

Authors: Samuel P. Prahlow, Anthony Sciuva, Katherine Bombly, Emily Wilson, Shiv Dhiman, Savita Arya

Abstract:

A prevalence cohort study of atherosclerotic lesions in cadavers was performed to better understand and characterize the prevalence of atherosclerosis among Georgia residents within the Philadelphia College of Osteopathic Medicine (PCOM) - Georgia body donor program. We procured specimens from cadavers used for cadaveric anatomical dissection by medical, physical therapy, and biomedical science students at PCOM - South Georgia and PCOM - Georgia. Tissues were prepared as hematoxylin and eosin (H&E)-stained histological slides by Colquitt Regional Medical Center Laboratory Services. One section of each of the following arteries was taken after cadaveric dissection, at the site of greatest calcification palpated grossly (if present): the left anterior descending coronary artery, left internal carotid artery, abdominal aorta, splenic artery, and hepatic artery. All specimens were graded and categorized according to the American Heart Association's Modified and Conventional Standards for Atherosclerotic Lesions using x4, x10, and x40 microscopic magnification. Our study cohort included 22 cadavers, 16 female and 6 male. The average age was 72.54, and the median age was 72, with a range of 52 to 90 years. A cause-of-death determination listing vascular and/or cardiovascular causes was present on 6 of the 22 death certificates. 19 of the 22 (86%) cadavers had at least a single artery graded > 5. Of the cadavers with at least a single artery graded greater than 5, only 5 of 19 (26%) had a vascular or cardiovascular cause of death reported. Malignancy was listed as a cause of death on 7 (32%) death certificates. The average atherosclerosis grades of the common hepatic, splenic, and left internal carotid arteries (2.15, 3.05, and 3.36, respectively) were lower than those of the left anterior descending artery and the abdominal aorta (5.16 and 5.86, respectively). This prevalence study characterizes the atherosclerosis found in five medium and large systemic arteries in cadavers from the state of Georgia.

Keywords: pathology, atherosclerosis, histology, cardiovascular

Procedia PDF Downloads 214