Search results for: agile methods
14678 The Effectiveness of Kinesiotaping Methods in Rehabilitation Therapy
Authors: Ana-Katarina Nikich
Abstract:
Background: The kinesiotaping method is often used in physiotherapy and rehabilitation. The purpose of this study was to evaluate the effectiveness of taping in the rehabilitation process of patients. Materials and methods: The study involved 90 male and female patients (aged 40-50 years) with various conditions requiring rehabilitation, such as injuries of the musculoskeletal system, sports injuries and other ailments. All patients were divided into two groups: experimental (n=40) and control (n=50). Both groups received 20 days of standard rehabilitation. In the experimental group, kinesiotaping methods were used, taking into account the individual characteristics of each patient. The control group performed regular exercises and physical therapy but did not use kinesiotape. During the study, physical parameters were monitored, interviews were conducted, and the conditions of patients from both groups were compared. Results and discussion: The use of the kinesiotaping method in the rehabilitation process led to a significant improvement in physical parameters and pain reduction in patients. Significant improvement (p < 0.005) was observed in all evaluated parameters among the patients of the experimental group. The control group also showed substantial improvement (p < 0.005), but the percentage improvement in the experimental group was higher. Over the course of the observation, the patients of the experimental group showed faster and more complete rehabilitation compared to the control group. The kinesiotaping method makes it possible to reduce the load on the damaged areas, improve blood circulation and lymphatic drainage, and increase stability and coordination of movements. Conclusions: Kinesiotaping, as one of the modern therapeutic methods, has shown its effectiveness in the rehabilitation process, contributing to the optimal recovery of patients with various conditions requiring rehabilitation. The use of tapes should be included in a comprehensive rehabilitation program to achieve the best results and reduce recovery time.
Keywords: kinesiotaping, rehabilitation, therapy, pain
Procedia PDF Downloads 71
14677 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation
Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski
Abstract:
Edgeworth approximation is one of the most important statistical methods, making a considerable contribution to reducing the standard deviations of the independent variables' coefficients in a Quantile Regression Model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products in real data taken from a local business. The Linear Regression of the generated profit and the realized sales was not free of autocorrelation and heteroscedasticity, which is why we have used this model instead of Linear Regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth Approximation for Independent and Identically Distributed (IID) cases, the Bootstrap version of the model, and the Edgeworth approximation for the Bootstrap Quantile Regression Model. The graphics and the results presented here identify the best approximating model of our study.
Keywords: bootstrap, Edgeworth approximation, IID, quantile
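A minimal illustration of the kind of model this abstract describes — a median (quantile) regression of profit on realized sales with bootstrapped coefficient variability — might look as follows; the data and variable names are placeholders, not the authors' dataset or estimator.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Placeholder data standing in for the business dataset (not the authors' data):
# realized sales and a profit that depends on them plus heteroscedastic noise.
sales = rng.uniform(100, 1000, size=200)
profit = 0.3 * sales + rng.normal(0, 0.05 * sales)

X = sm.add_constant(sales)                         # design matrix: intercept + sales
median_fit = sm.QuantReg(profit, X).fit(q=0.5)     # conditional median model
print(median_fit.params)                           # intercept and slope at the median

# Pairs bootstrap of the slope to gauge its variability.
boot_slopes = []
for _ in range(500):
    idx = rng.integers(0, len(sales), len(sales))
    fit_b = sm.QuantReg(profit[idx], X[idx]).fit(q=0.5)
    boot_slopes.append(fit_b.params[1])
print("bootstrap SE of slope:", np.std(boot_slopes))
```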
Procedia PDF Downloads 159
14676 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution
Authors: Haiyan Wu, Ying Liu, Shaoyun Shi
Abstract:
Authorship attribution extracts features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or some transparent machine learning methods gives a portrait of the authors' writing style. But these methods do not capture syntactic (e.g., dependency relationship) or semantic (e.g., topics) information. In recent years, some researchers have modeled syntactic trees or latent semantic information by neural networks. However, few works take them together. Besides, predictions by neural networks are difficult to explain, and explainability is vital in authorship attribution tasks. In this paper, we not only utilize the statistical style and content features but also take advantage of both syntactic and semantic features. Different from an end-to-end neural model, feature selection and prediction are two steps in our method. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features can improve the state-of-the-art methods on three benchmark datasets.
Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction
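The two-step idea above — select n-gram features, then use an interpretable classifier — can be illustrated with a far simpler baseline than the attentive network the authors describe; the sketch below uses plain character n-gram counts and logistic regression on made-up toy documents, not the benchmark datasets.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Toy corpus (placeholder text, not a benchmark dataset).
docs = ["the quick brown fox jumps over the lazy dog",
        "a quick movement of the enemy will jeopardize six gunboats",
        "she sells sea shells by the sea shore",
        "the shells she sells are surely seashells"]
authors = ["A", "A", "B", "B"]

# Character n-grams (2-4) stand in for the style/content features.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(2, 4))
X = vectorizer.fit_transform(docs)

# Logistic regression gives predictions plus interpretable per-feature weights.
clf = LogisticRegression(max_iter=1000).fit(X, authors)
print(clf.predict(vectorizer.transform(["sea shells on the shore"])))
```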
Procedia PDF Downloads 136
14675 Effects of Performance Appraisal on Employee Productivity in Yobe State University, Damaturu, (A Case Study of the Department of Islamic Studies)
Authors: Adam Abdullahi Mohammed
Abstract:
Performance appraisal is an assessment made to establish the level of a worker's productivity in a given period of time. The appraisal system is divided into two categories, traditional methods and modern methods, with the emphasis placed on the evaluation of work results. In the traditional approach to staff appraisal, which puts more emphasis on individual traits, supervisors are required to measure employees through interactions based on what they achieved with reference to job descriptions, as well as to rate them based on questionnaires without staff interaction. These methods are not effective because staff may give biased information. The study will attempt to assess the effect of performance appraisal on employee productivity at Yobe State University, Damaturu. It is aimed at assessing the process, methods, and objectives of performance appraisal and its feedback to know how they affect the success of the appraisal, its results, and employee productivity. In this study, a quantitative research method is adopted for collecting and analyzing data, and a questionnaire will be used as the data collection instrument. As it is a case study, the target population is the staff of the Department of Islamic Studies. The research will employ a census sampling technique, where all the subjects in the target population are given a chance to participate in the study. This sampling method was considered because the entire target population is considered researchable. The expected finding is that staff performance appraisal in the Department of Islamic Studies has an effect on employee productivity; that is, if it is given due consideration and the necessary actions are taken, it will improve employee productivity.
Keywords: performance appraisal, employee productivity, Yobe State University, appraisal feedback
Procedia PDF Downloads 72
14674 BIM-based Construction Noise Management Approach With a Focus on Inner-City Construction
Authors: Nasim Babazadeh
Abstract:
Growing demand for a quieter dwelling environment has turned the attention of construction companies to reducing the noise propagated by their projects. In inner-city construction, the close distance between the construction site and surrounding buildings lessens the efficiency of passive noise control methods. Dwellers of the nearby areas may file complaints and lawsuits against the construction companies due to the emitted construction noise, thereby leading to the interruption of processes, compensation costs, or even suspension of the project. Therefore, construction noise should be predicted along with the project schedule. The advantage of managing the noise in the pre-construction phase is two-fold. Firstly, changes in the time plan and construction methods can be applied more flexibly; thus, the costs related to rescheduling can be avoided. Secondly, noise-related legal problems are expected to be reduced. To implement noise mapping methods for the mentioned prediction, the required detailed information (such as the location of the noisy process and the duration of the noisy work) can be exported from the 4D BIM model. The results obtained from the noise maps would be used to help the planners define different work scenarios. The proposed approach has been applied to the foundation and earthwork of a site located in a residential area, and the obtained results are discussed.
Keywords: building information modeling, construction noise management, noise mapping, 4D BIM
Procedia PDF Downloads 185
14673 Digital Retinal Images: Background and Damaged Areas Segmentation
Authors: Eman A. Gani, Loay E. George, Faisel G. Mohammed, Kamal H. Sager
Abstract:
Digital retinal images are more appropriate for the automatic screening of diabetic retinopathy systems. Unfortunately, a significant percentage of these images are of poor quality, which hinders further analysis, due to many factors (such as patient movement, inadequate or non-uniform illumination, acquisition angle, and retinal pigmentation). Retinal images of poor quality need to be enhanced before the extraction of features and abnormalities. Segmentation of the retinal image is essential for this purpose: it is employed to smooth and strengthen the image by separating the background and damaged areas from the overall image, thus resulting in retinal image enhancement and less processing time. In this paper, methods for segmenting colored retinal images are proposed to improve the quality of retinal image diagnosis. The methods generate two segmentation masks, i.e., a background segmentation mask for extracting the background area and a poor-quality mask for removing the noisy areas from the retinal image. The standard retinal image databases DIARETDB0, DIARETDB1, STARE, and DRIVE, as well as some images obtained from ophthalmologists, have been used to validate the proposed segmentation technique. Experimental results indicate that the introduced methods are effective and can lead to high segmentation accuracy.
Keywords: retinal images, fundus images, diabetic retinopathy, background segmentation, damaged areas segmentation
Procedia PDF Downloads 403
14672 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images
Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn
Abstract:
The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimum detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using the mitochondrial fluorescence dataset. Ground truth labels generated using a Lab kit were also used to evaluate the performance of our detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation
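A minimal sketch of the three-stage pipeline described above (pre-processing, binarization, coarse segmentation) using OpenCV is shown below; the file name and parameter values are placeholders, not the authors' settings.

```python
import cv2

# Load a fluorescence image as grayscale (placeholder path).
img = cv2.imread("mitochondria.png", cv2.IMREAD_GRAYSCALE)

# Stage 1: pre-processing - CLAHE to even out illumination, then denoising.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = clahe.apply(img)
smoothed = cv2.GaussianBlur(enhanced, (5, 5), 0)

# Stage 2: binarization - Otsu threshold separates foreground from background.
_, binary = cv2.threshold(smoothed, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Stage 3: coarse segmentation - connected contours as candidate mitochondria,
# filtered by area (shape information / descriptive statistics).
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
candidates = [c for c in contours if 20 < cv2.contourArea(c) < 5000]
print(f"{len(candidates)} candidate mitochondria detected")
```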
Procedia PDF Downloads 357
14671 Detection of Parkinsonian Freezing of Gait
Authors: Sang-Hoon Park, Yeji Ho, Gwang-Moon Eom
Abstract:
Fast and accurate detection of Freezing of Gait (FOG) is desirable for the appropriate application of cueing, which has been shown to ameliorate FOG. Utilization of the frequency spectrum of leg acceleration to derive the freeze index requires much calculation, which would lead to delayed cueing. We hypothesized that FOG can be reasonably detected from the time-domain amplitude of foot acceleration. A time instant was recognized as FOG if the mean amplitude of the acceleration in the time window surrounding that instant was in the specific FOG range. The parameters required in the FOG detection were optimized by simulated annealing. The suggested time-domain methods showed performances comparable to those of frequency-domain methods.
Keywords: freezing of gait, detection, Parkinson's disease, time-domain method
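The windowed mean-amplitude rule described above can be sketched in a few lines; the window length and amplitude range below are illustrative placeholders, not the values optimized by simulated annealing in the study.

```python
import numpy as np

def detect_fog(acc, fs, window_s=2.0, amp_range=(0.5, 2.0)):
    """Flag time instants whose surrounding window has a mean acceleration
    amplitude inside a pre-defined FOG range (placeholder bounds, not the
    optimized values from the study)."""
    half = int(window_s * fs / 2)
    amp = np.abs(acc - np.mean(acc))          # amplitude about the mean
    flags = np.zeros(len(acc), dtype=bool)
    for i in range(half, len(acc) - half):
        m = amp[i - half:i + half].mean()     # mean amplitude of the window
        flags[i] = amp_range[0] <= m <= amp_range[1]
    return flags

# Example with synthetic foot-acceleration data sampled at 100 Hz.
fs = 100
t = np.arange(0, 10, 1 / fs)
acc = np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.random.randn(len(t))
print(detect_fog(acc, fs).sum(), "samples flagged as possible FOG")
```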
Procedia PDF Downloads 444
14670 Wired Network Services in Mobile Phones
Authors: Subhash Reddy
Abstract:
Mobile communication in today’s world means a lot to humankind; through it, many deals are made and others are broken within seconds. That is because of our sophisticated methods of transporting data at very high speeds and over very long distances in no time. It is also because we kept changing the method of serving the connections as the number of connections kept increasing, which has led to many methods, such as TDMA, CDMA, and FDMA, in wireless communications. In addition, the areas where the connections are provided are divided into CELLS, which are the basic building blocks of cellular communications. Along with the wireless network, providing a wired network in mobile phones would serve as a very good alternative and would divert the extra traffic of a cell, so that a CELL providing the wireless network can operate more efficiently.
Keywords: CDMA, FDMA, TDMA, CELL
Procedia PDF Downloads 486
14669 Analysis of Expert Possibilities While Identifying Human Teeth
Authors: Saule Mussabekova
Abstract:
Forensic investigation of human teeth plays an important role in the detection of crime, particularly in cases of personal identification of dead bodies changed by putrefactive processes or skeletonized bodies, as well as when finding bodies of unknown persons. 152 teeth have been investigated; 85 of them belonged to men and 67 belonged to women, taken from living people of different ages. The teeth were investigated after extraction. Two types of teeth have been investigated: teeth without violation of the integrity of the dental crown and teeth with different degrees of such violation. Additionally, 517 teeth collected from dead bodies have been investigated, 252 of which belonged to women and 265 to men, whatever the cause of death, with the time since death ranging from 1 month to 20 years. Isohemagglutinating serums and Coliclons of different series have been used for the research of tooth-group specificity by serological methods according to the ABO system. Standard protocols of different techniques have been used for DNA purification from teeth (with the Chelex 100 reagent produced by Bio-Rad, the 'DNA IQ System' reagent kit produced by Promega (USA), and 'QIAamp DNA Investigator Kit' columns produced by Qiagen). The results of the comparative forensic investigation of human teeth using serological and molecular genetic methods have shown that the use of serological methods for forensic identification is sensible only for preselection prior to subsequent molecular genetic investigation, as well as in cases where the corresponding genetic investigation is impossible for various objective reasons. A number of advantages of molecular genetic methods in dental investigation have been noted, particularly for personal identification in the presence of putrefactive changes. Key aspects of the current state of personal identification based on dental status have been outlined. Prospective directions for the advance preparation of material for tooth identification in forensic practice have been emphasized.
Keywords: dental state, forensic identification, molecular genetic analysis, teeth
Procedia PDF Downloads 141
14668 An Investigation on Hot-Spot Temperature Calculation Methods of Power Transformers
Authors: Ahmet Y. Arabul, Ibrahim Senol, Fatma Keskin Arabul, Mustafa G. Aydeniz, Yasemin Oner, Gokhan Kalkan
Abstract:
In the standards IEC 60076-2 and IEC 60076-7, three different hot-spot temperature estimation methods are suggested. In this study, the algorithms used in hot-spot temperature calculations are analyzed by comparing them with the results of an experimental set-up built around a Transformer Monitoring System (TMS) in use. In the tested system, the TMS uses only the top-oil temperature and the load ratio for the hot-spot temperature calculation, together with some constants taken from the agreed-values tables of the standards. During the tests, it emerged that the hot-spot temperature calculation method performs only a simple calculation and does not use all the other significant variables that could affect the hot-spot temperature.
Keywords: hot-spot temperature, monitoring system, power transformer, smart grid
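For orientation, a calculation based only on top-oil temperature and load ratio typically reduces to a simplified steady-state relation in the spirit of the IEC 60076-7 exponent equations; the sketch below uses illustrative placeholder constants, not the values of the monitored transformer or the exact algorithm of the tested TMS.

```python
def hotspot_steady_state(top_oil_temp_c, load_ratio,
                         hotspot_factor=1.3,        # H, illustrative
                         rated_gradient_k=20.0,     # g_r, winding-to-oil gradient at rated load (K)
                         winding_exponent=1.3):     # y, illustrative
    """Simplified steady-state hot-spot estimate from top-oil temperature and
    load ratio: hot-spot = top-oil + H * g_r * K**y. All constants here are
    placeholders, not the monitored transformer's values."""
    hotspot_rise = hotspot_factor * rated_gradient_k * load_ratio ** winding_exponent
    return top_oil_temp_c + hotspot_rise

# Example: top-oil at 65 degC, transformer at 90% of rated load.
print(f"{hotspot_steady_state(65.0, 0.9):.1f} degC")
```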
Procedia PDF Downloads 573
14667 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data
Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu
Abstract:
Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the Particle Filter is developed, which accounts for the anomalous fluctuation scaling known as Taylor's law. This method is extended to handle sales data that are incomplete because of stock-outs by introducing maximum likelihood estimation for censored data. A way of determining optimal stock levels that prices the cost of waste reduction is also proposed. This study focuses on the examination of the methods for large sales numbers, where Taylor's law is obvious. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, the pricing of the cost of waste reduction reveals that a small profit loss realizes substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, around a 1% profit loss realizes a halving of disposal at a proportionality constant of 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially with large sales numbers.
Keywords: food waste reduction, particle filter, point-of-sales, sustainable development goals, Taylor's law, time series analysis
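Taylor's law here refers to a power-law scaling between the mean and the fluctuation of sales counts; a minimal sketch of estimating its parameters from synthetic POS-like data (not the study's data, and not its Particle Filter method) could look like this.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic POS-like data: daily sales counts for many items with different
# mean demand levels (placeholder data, not actual POS records).
means = np.logspace(0, 3, 50)                        # item-level mean daily sales
sales = rng.poisson(means[:, None], size=(50, 365))  # one year of daily counts per item

mu = sales.mean(axis=1)
var = sales.var(axis=1)

# Taylor's law: var = a * mu**b  ->  log(var) = log(a) + b*log(mu).
b, log_a = np.polyfit(np.log(mu), np.log(var), 1)
print(f"Taylor exponent b ~= {b:.2f}, prefactor a ~= {np.exp(log_a):.2f}")
# For Poisson-like fluctuations b is close to 1; larger b signals the
# anomalous scaling that makes stock and waste optimization harder.
```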
Procedia PDF Downloads 131
14666 Determinants for Discontinuing Contraceptive Use and Regional Variations in Bangladesh: A Sociological Perspective
Authors: Md. Shahriar Sabuz
Abstract:
Bangladesh, a South Asian developing country, has experienced an increasing rate of contraceptive use in the last few decades. But one-third of pregnancies are still unintended, and the fertility rate surpasses the desired number of children. This may be because of the discontinuation of the use of contraceptive methods, so it is necessary to find out the reasons for discontinuing the use of contraceptives. Moreover, the rate of contraception discontinuation varies between rural and urban areas and from region to region. In this study, our objectives are to find out the reasons behind the discontinuation of the use of contraceptive methods and the regional variations in the rates of those reasons. We use the Bangladesh Demographic and Health Survey (BDHS) 2014 dataset for this study and consider ever-married women of Bangladesh aged 15-49 who have discontinued the use of contraceptive methods. The data were collected from the seven districts of the country. The findings show that 23% of women have currently stopped using their contraception. The most common reasons for stopping use are that the women are either pregnant or want to become pregnant. A significant number of people do not use contraceptive methods because of the fear of side effects. Though the rate of non-use is higher in rural areas than in urban areas, the reasons for method discontinuation are not significantly different between urban and rural areas. However, the reasons for discontinuing contraceptive methods vary significantly from region to region.
Keywords: discontinuation of contraceptive, health, pregnant, fertility
Procedia PDF Downloads 95
14665 Dynamic Compaction Assessment for Improving Pasdaran Highway
Authors: Alireza Motamadnia, Roohollah Zohdi Oliayi, Hümeyra Bolakar, Ahmet Tortum
Abstract:
Dynamic compaction as a method of soil improvement has been considered by engineers and experts in recent decades. Three methods in particular, deep dynamic compaction, dynamic soil densification, and rapid dynamic compaction, have been proposed and implemented to improve the subgrade conditions of highway roads. The northern highway route in Tabriz (Pasdaran), Iran, which was placed on manual soil, was the main concern. The engineering properties of the soil have been investigated experimentally and theoretically. Among the three methods, rapid dynamic compaction has been suggested for the highway to improve the soil subgrade conditions.
Keywords: manual soil, subsidence, improvement, dynamic compression
Procedia PDF Downloads 601
14664 Study of the Stability of Underground Mines by Numerical Method: The Mine Chaabet El Hamra, Algeria
Authors: Nakache Radouane, M. Boukelloul, M. Fredj
Abstract:
Room and pillar sizes are key factors for safe mining and their recovery in open-stope mining. This method is advantageous due to its simplicity and the little information it requires. It is probably the most representative method among the total load approach methods, although it also remains a safe design method. Using the finite element software PLAXIS 3D, analyses were carried out with an elasto-plastic model, and comparisons were made with methods based on the total load approach. The results are presented as an optimization for improving the ore recovery rate while maintaining a safe working environment.
Keywords: room and pillar, mining, total load approach, elasto-plastic
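For context, the total load (tributary area) approach mentioned above is commonly summarized by the following relations for a square room-and-pillar layout; the notation is generic and is not reproduced from the paper.

```latex
% Tributary-area (total load) estimate of average pillar stress
% (generic notation, not the paper's):
\sigma_v = \gamma H, \qquad
\sigma_p = \sigma_v \,\frac{(w_p + w_o)^2}{w_p^2}, \qquad
\mathrm{FS} = \frac{S_p}{\sigma_p}
% \gamma: unit weight of overburden, H: depth, w_p: pillar width,
% w_o: room (opening) width, S_p: pillar strength, FS: safety factor.
```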
Procedia PDF Downloads 330
14663 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh
Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila
Abstract:
Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data it uses is of high quality. This is where the concept of data mesh comes in. Data mesh is an organizational and architectural decentralized approach to data management that can help organizations improve the quality of data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing the reliance on centralized data teams and allowing domain experts to take charge of their data. This paper intends to discuss how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by allowing better insights into the business as an effect of better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps in identifying and addressing data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, like data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by AEKIDEN's experience feedback. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.
Keywords: data culture, data-driven organization, data mesh, data quality for business success
Procedia PDF Downloads 135
14662 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform
Authors: Sadam Alwadi
Abstract:
Outlier values are a problem that frequently occurs in the data observation or recording process. Thus, data imputation has become an essential matter. This work makes use of the methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximal Overlap Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
Keywords: outlier values, imputation, stock market data, detecting, estimation
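The Tukey method flags points outside the interquartile fences; a minimal sketch on synthetic price-like data (not the ASE series) with a simple neighbourhood-median imputation is shown below.

```python
import numpy as np

def tukey_outliers(x, k=1.5):
    """Return a boolean mask of values outside Tukey's fences
    [Q1 - k*IQR, Q3 + k*IQR]; k=1.5 is the conventional choice."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

# Synthetic closing prices with a few injected spikes (placeholder data).
rng = np.random.default_rng(7)
prices = np.cumsum(rng.normal(0, 1, 500)) + 100
prices[[50, 200, 350]] += [25, -30, 40]

mask = tukey_outliers(prices)
# A simple imputation: replace flagged points with the median of their neighbours.
clean = prices.copy()
for i in np.where(mask)[0]:
    lo, hi = max(0, i - 5), min(len(prices), i + 6)
    clean[i] = np.median(prices[lo:hi][~mask[lo:hi]])
print(f"{mask.sum()} outliers detected and imputed")
```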
Procedia PDF Downloads 81
14661 Treatment of Cyanide Effluents with Platinum Impregnated on Mg-Al Layered Hydroxides
Authors: María R. Contreras, Diana Endara
Abstract:
Cyanide leaching is the most widely used technology in the gold mining industry, and it produces large amounts of effluents requiring treatment. In Ecuador, the gold mining industry has grown, causing significant environmental impacts due to the heavy use of cyanide; it is estimated that 10 g of extracted gold generates 7000 liters of water contaminated with 300 mg/L of free cyanide. The most common methods used nowadays are treatment with peroxodisulfuric acid, ozonation, H₂O₂, and other reactants, which are expensive and present disadvantages. Several methods, such as heterogeneous catalysis, have been developed to treat this contaminant. Layered double hydroxides (LDHs) have received much attention due to their wide applications, for example as catalyst supports. Therefore, in this study, an Mg-Al LDH was synthesized by the coprecipitation method, and then platinum was impregnated on it in order to enhance its catalytic activity. Two methods of impregnation were used: the first one, called incipient wetness impregnation, and the second one, developed by continuous agitation of the LDH in contact with a chloroplatinic acid solution for 24 h. The impregnated support was analyzed by X-ray diffraction, FTIR, and SEM. Finally, the oxidation of the cyanide ion was performed by preparing synthetic solutions of sodium cyanide (NaCN) with an initial concentration of 500 mg/L at pH 10.5 and an air flow of 180 NL/h. After 8 hours of treatment, 80% oxidation of the cyanide ion was achieved.
Keywords: catalysis, cyanide, LDHs, mining
Procedia PDF Downloads 146
14660 Effect of Rubber Treatment on Compressive Strength and Modulus of Elasticity of Self-Compacting Rubberized Concrete
Authors: I. Miličević, M. Hadzima Nyarko, R. Bušić, J. Simonović Radosavljević, M. Prokopijević, K. Vojisavljević
Abstract:
This paper investigates the effects of different treatment methods of rubber aggregates for self-compacting concrete (SCC) on compressive strength and modulus of elasticity. SCC mixtures with 10% replacement of fine aggregate with crumb rubber by total aggregate volume and with different aggregate treatment methods were investigated. The rubber aggregate was treated by three different methods: dry process, water soaking, and NaOH treatment plus water soaking. The properties of the SCC in the fresh and hardened states were tested and evaluated. Scanning electron microscope (SEM) analyses of three different SCC batches were made and discussed. It was observed that applying the proposed NaOH-plus-water-soaking method resulted in the improvement of fresh and hardened concrete properties. It resulted in a more uniform distribution of rubber particles in the cement matrix, a better bond between rubber particles and the cement matrix, and higher compressive strength of the SCC rubberized concrete.
Keywords: compressive strength, modulus of elasticity, NaOH treatment, rubber aggregate, self-compacting rubberized concrete, scanning electron microscope analysis
Procedia PDF Downloads 108
14659 Anisotropic Approach for Discontinuity Preserving in Optical Flow Estimation
Authors: Pushpendra Kumar, Sanjeev Kumar, R. Balasubramanian
Abstract:
Estimation of optical flow from a sequence of images using variational methods is one of the most successful approaches. Discontinuities between different motions are among the challenging problems in flow estimation. In this paper, we design a new anisotropic diffusion operator, which is able to provide smooth flow over a region and efficiently preserve discontinuities in the optical flow. This operator is designed on the basis of the intensity differences of the pixels and an isotropic operator using an exponential function. Their combination is used to control the propagation of the flow. Experimental results on different datasets verify the robustness and accuracy of the algorithm and also validate the effect of the anisotropic operator on discontinuity preservation.
Keywords: optical flow, variational methods, computer vision, anisotropic operator
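The general idea of weighting flow diffusion by an exponential function of local intensity differences can be sketched as below; this is a generic edge-stopping weight in the spirit of the abstract, not the authors' exact operator, and kappa is an illustrative parameter.

```python
import numpy as np

def diffusion_weights(image, kappa=10.0):
    """Anisotropic-style weights: small where the intensity gradient is large
    (likely a motion/object boundary), close to 1 in smooth regions.
    kappa is an illustrative contrast parameter, not the paper's value."""
    gy, gx = np.gradient(image.astype(float))
    grad_mag = np.sqrt(gx**2 + gy**2)
    return np.exp(-(grad_mag / kappa) ** 2)

def smooth_flow(flow_u, flow_v, image, iters=50, tau=0.2, kappa=10.0):
    """Weighted diffusion of a flow field: smoothing is suppressed across
    strong intensity edges so motion discontinuities are preserved."""
    w = diffusion_weights(image, kappa)
    u, v = flow_u.copy(), flow_v.copy()
    for _ in range(iters):
        for f in (u, v):
            lap = (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                   np.roll(f, 1, 1) + np.roll(f, -1, 1) - 4 * f)
            f += tau * w * lap            # diffuse only where w is large
    return u, v
```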
Procedia PDF Downloads 873
14658 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology
Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando
Abstract:
Proteomics studies of organisms are considered to be significantly information-rich compared to their genomic counterparts because the proteome of an organism represents the expressed state of all of its proteins at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods employed are gel-based methods, such as two-dimensional (2D) electrophoresis, and mass spectrometry-based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen an incremental trend in the usage of ML and AI techniques in recent years. The use of the aforesaid techniques in the field of proteomics studies is only beginning to materialise now. Although there is a wealth of information available in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the various aspects of the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to the known proteomics workflows in order to extract more meaningful information that could be useful in a plethora of applications such as medicine, agriculture, and biotechnology.
Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry
Procedia PDF Downloads 151
14657 Realization of a (GIS) for Drilling (DWS) through the Adrar Region
Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz
Abstract:
Geographic Information Systems (GIS) include various methods and computer techniques to model, digitally capture, store, manage, view, and analyze geographic data. Geographic information systems appeal to many scientific and technical fields and draw on many methods. In this article, we present a complete and operational geographic information system, following the theoretical principles of data management and adapting them to spatial data, especially data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The expected results of this system are, on the one hand, standard features for consulting, updating, and editing beneficiary and geographical data and, on the other hand, specific functionality for contractors: data entry, parameterized calculations, and statistics.
Keywords: GIS, DWS, drilling, Adrar
Procedia PDF Downloads 309
14656 Diagnosis of Choledocholithiasis with Endosonography
Authors: A. Kachmazova, A. Shadiev, Y. Teterin, P. Yartcev
Abstract:
Introduction: Biliary calculi disease (LCS) still occupies the leading position among urgent diseases of the abdominal cavity, manifesting itself in forms ranging from an asymptomatic course to life-threatening states. Nowadays, the arsenal of diagnostic methods for choledocholithiasis is quite wide: ultrasound, hepatobiliary scintigraphy (HBSG), magnetic resonance imaging (MRI), and endoscopic retrograde cholangiopancreatography (ERCP). Among them, transabdominal ultrasound (TA ultrasound) is the most accessible and routine diagnostic method. Nowadays, ERCP is the 'gold' standard in the diagnosis and one-stage treatment of biliary tract obstruction. However, transpapillary techniques are accompanied by serious postoperative complications (post-manipulation pancreatitis (3-5%), bleeding after endoscopic papillosphincterotomy (2%), cholangitis (1%)), with a lethality of 0.4%. HBSG and MRI are also quite informative methods in the diagnosis of choledocholithiasis. The small size of concrements and their localization in the intrapancreatic and retroduodenal parts of the common bile duct significantly reduce the informativeness of all the diagnostic methods described above, which demands further study of this problem. Materials and Methods: 890 patients with a diagnosis of cholelithiasis (calculous cholecystitis) were admitted to the Sklifosovsky Scientific Research Institute of Hospital Medicine in the period from August 2020 to June 2021. Of them, 115 had mechanical jaundice caused by concrements in the bile ducts. Results: A final EUS diagnosis was made in all patients (100.0%). In all patients in whom the diagnosis of choledocholithiasis was revealed or confirmed after EUS, ERCP was performed urgently (within two days of its detection), as soon as the X-ray operating room was available; it confirmed the presence of concrements. All stones were removed by lithoextraction using a Dormia basket. The postoperative period in these patients was without complications. Conclusions: EUS is the most informative and safe diagnostic method; it allows choledocholithiasis to be detected in the shortest time in patients with discrepancies between clinical-laboratory and instrumental diagnostic findings, which in turn helps to decide promptly on further treatment tactics. We consider it reasonable to include EUS in the diagnostic algorithm for choledocholithiasis. Disclosure: Nothing to disclose.
Keywords: endoscopic ultrasonography, choledocholithiasis, common bile duct, concrement, ERCP
Procedia PDF Downloads 85
14655 Research Study on the Environmental Conditions in the Foreign
Authors: Vahid Bairami Rad, Shapoor Norazar, Moslem Talebi Asl
Abstract:
The fast-growing accessibility and capability of emerging technologies have created enormous possibilities for designing, developing, and implementing innovative teaching methods in the classroom. Using teaching methods and technology together produces excellent results, because the global technological scenario has paved the way for new pedagogies in the teaching-learning process. On the other side, methods that focus on students and the ways they learn can demonstrate logical ways of improving student achievement in English as a foreign language in Iran. The study sample consisted of 90 tenth-grade students of a high school located in Ardebil. A pretest-posttest equivalent-group design was used to compare the achievement of the groups. Students were divided into three groups: control-based, computer-based, and method-and-technology-based. A pretest and a posttest containing 30 items each from the English textbook were developed and administered, and the obtained data were then analyzed. The results showed that there was an important difference: the performance of the third group was better than that of the other groups. On the basis of this result, combining methods with a technology-based environment was clearly recommended for improving teaching-learning capabilities.
Keywords: method, technology-based environment, computer-based environment, English as a foreign language, student achievement
Procedia PDF Downloads 474
14654 The Modelling of Real Time Series Data
Authors: Valeria Bondarenko
Abstract:
We proposed algorithms for the estimation of fBm parameters (volatility and Hurst exponent) and for the approximation of random time series by a functional of fBm. We proved the consistency of the estimators that constitute the above algorithms and proved the optimality of the forecast of the approximated time series. The adequacy of the estimation, approximation, and forecasting algorithms is confirmed by numerical experiments. During the software development process, a system with a hierarchical structure was created. A comparative analysis of the proposed algorithms with other methods gives evidence of the advantage of the approximation method. The results can be used to develop methods for the analysis and modeling of time series describing economic, physical, biological, and other processes.
Keywords: mathematical model, random process, Wiener process, fractional Brownian motion
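One simple way to estimate the Hurst exponent of an fBm-like series is from the scaling of mean squared increments with lag; the sketch below uses synthetic data and illustrates only this parameter-estimation step, not the authors' estimator or approximation functional.

```python
import numpy as np

def hurst_from_increments(x, max_lag=50):
    """Estimate H from Var[x(t+lag) - x(t)] ~ lag**(2H) via a log-log fit."""
    lags = np.arange(2, max_lag)
    msd = [np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return slope / 2.0

# Synthetic test series: ordinary Brownian motion, for which H should be ~0.5.
rng = np.random.default_rng(3)
bm = np.cumsum(rng.normal(0, 1, 10_000))
print(f"estimated Hurst exponent: {hurst_from_increments(bm):.2f}")
```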
Procedia PDF Downloads 358
14653 Effect of Building Construction Sizes on Project Delivery Methods in Nigeria
Authors: Nuruddeen Usman, Mohammad Sani
Abstract:
The performance of project delivery methods has been an issue of concern to various stakeholders in the construction industry. The contracting system of project delivery is the traditional system used in the delivery of most public projects in Nigeria. The direct labor system is often used as an alternative to the traditional system. There have been many complaints about the performance of the contracting system and the suitability of direct labor as an alternative for the delivery of public projects. Therefore, this paper is aimed at investigating the effect of project size on project delivery methods in completed public buildings. Questionnaires were self-administered to managerial staff in the study area and analyzed using descriptive statistics. The findings reveal that the contracting system was chosen for large-size building construction project delivery with a higher frequency (F) of 40 (76.9%), against direct labor with 12 (23.1%). For small-size projects, the results revealed a frequency (F) of 26 (50%) each for the contracting system and the direct labor system. Based on the research findings, the contracting system was recommended for all sizes of building construction project delivery, while the direct labor system can only be used as an alternative for small-size building construction project delivery.
Keywords: construction size, contracting system, direct labour, effect
Procedia PDF Downloads 457
14652 A Comparison Between Different Discretization Techniques for the Doyle-Fuller-Newman Li+ Battery Model
Authors: Davide Gotti, Milan Prodanovic, Sergio Pinilla, David Muñoz-Torrero
Abstract:
Since its proposal, the Doyle-Fuller-Newman (DFN) lithium-ion battery model has gained popularity in the electrochemical field. In fact, this model provides the user with theoretical support for designing the lithium-ion battery parameters, such as the material particle size or the direction in which to adjust the diffusion coefficient. However, the model is mathematically complex, as it is composed of several partial differential equations (PDEs), such as Fick's law of diffusion and the MacInnes and Ohm equations, among other phenomena. Thus, to efficiently use the model in a time-domain simulation environment, the selection of the discretization technique is of pivotal importance. There are several numerical methods available in the literature that can be used to carry out this task. In this study, a comparison between the explicit Euler, Crank-Nicolson, and Chebyshev discretization methods is proposed. These three methods are compared in terms of accuracy, stability, and computational times. Firstly, the explicit Euler discretization technique is analyzed. This method is straightforward to implement and is computationally fast. In this work, the accuracy of the method and its stability properties are shown for the electrolyte diffusion partial differential equation. Subsequently, the Crank-Nicolson method is considered. It represents a combination of the implicit and explicit Euler methods that has the advantage of being second order in time and intrinsically stable, thus overcoming the disadvantages of the simpler explicit Euler method. As shown in the full paper, the Crank-Nicolson method provides accurate results when applied to the DFN model. Its stability does not depend on the integration time step, so it is feasible for both short- and long-term tests. This last remark is particularly important, as this discretization technique would allow the user to implement parameter estimation and optimization techniques, such as system or genetic parameter identification methods, using this model. Finally, the Chebyshev discretization technique is implemented in the DFN model. This discretization method features swift convergence properties and, like other spectral methods used to solve differential equations, achieves the same accuracy with a smaller number of discretization nodes. However, as shown in the literature, these methods are not suitable for handling sharp gradients, which are common during the first instants of the charge and discharge phases of the battery. The numerical results obtained and presented in this study aim to provide guidelines on how to select an adequate discretization technique for the DFN model according to the type of application to be performed, highlighting the pros and cons of the three methods. Specifically, the non-eligibility of the simple Euler method for long-term tests will be presented. Afterwards, the Crank-Nicolson and the Chebyshev discretization methods will be compared in terms of accuracy and computational times under a wide range of battery operating scenarios. These include both long-term simulations for aging tests and short- and mid-term battery charge/discharge cycles, typically relevant in battery applications like grid primary frequency and inertia control and electric vehicle braking and acceleration.
Keywords: Doyle-Fuller-Newman battery model, partial differential equations, discretization, numerical methods
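To illustrate the kind of scheme being compared, here is a minimal Crank-Nicolson discretization of a 1D Fickian diffusion equation dc/dt = D d2c/dx2; the grid, diffusivity, and boundary treatment are illustrative placeholders and are unrelated to the DFN implementation in the paper.

```python
import numpy as np

# 1D diffusion dc/dt = D * d2c/dx2 on [0, L] with zero-flux boundaries,
# solved with Crank-Nicolson (second order in time, unconditionally stable).
D, L, N, dt, steps = 1e-10, 1e-4, 50, 1.0, 100   # illustrative values
dx = L / (N - 1)
r = D * dt / (2 * dx**2)

# Build (I - r*A) c_{n+1} = (I + r*A) c_n, where A is the Laplacian stencil.
A = np.zeros((N, N))
for i in range(1, N - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0
A[0, 0], A[0, 1] = -2.0, 2.0          # mirrored ghost node gives zero flux
A[-1, -1], A[-1, -2] = -2.0, 2.0

I = np.eye(N)
lhs = I - r * A
rhs = I + r * A

c = np.full(N, 1000.0)                # uniform initial concentration
c[: N // 2] += 200.0                  # a step perturbation to relax
for _ in range(steps):
    c = np.linalg.solve(lhs, rhs @ c)
print(f"final mean concentration: {c.mean():.1f}")
```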
Procedia PDF Downloads 23
14651 A Comparative Study of the Maximum Power Point Tracking Methods for PV Systems Using Boost Converter
Authors: M. Doumi, A. Miloudi, A.G. Aissaoui, K. Tahir, C. Belfedal, S. Tahir
Abstract:
Studies on photovoltaic systems are increasing extensively because they offer a large, secure, essentially inexhaustible, and broadly available resource as a future energy supply. However, the output power induced in the photovoltaic modules is influenced by the intensity of solar radiation, the temperature of the solar cells, and so on. Therefore, to maximize the efficiency of the photovoltaic system, it is necessary to track the maximum power point of the PV array; for this, Maximum Power Point Tracking (MPPT) techniques are used. These algorithms are based on the Perturb and Observe, Incremental Conductance, and Fuzzy Logic methods. These techniques vary in many aspects, such as simplicity, convergence speed, digital or analog implementation, sensors required, cost, and range of effectiveness. This paper presents a comparative study of three widely adopted MPPT algorithms; their performance is evaluated from the energy point of view using the simulation tool Simulink®, considering different solar irradiance variations. MPPT using fuzzy logic shows superior performance and more reliable control than the other methods for this application.
Keywords: photovoltaic system, MPPT, perturb and observe (P&O), incremental conductance (INC), Fuzzy Logic (FLC)
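The Perturb and Observe logic referred to above is simple enough to sketch in a few lines; the step size and the way measurements are obtained below are placeholders, not the simulated system's parameters.

```python
def perturb_and_observe(v_meas, p_meas, v_prev, p_prev, v_ref, step=0.5):
    """One P&O iteration: keep moving the voltage reference in the direction
    that increased power, reverse it otherwise. step is an illustrative
    perturbation size in volts."""
    dP = p_meas - p_prev
    dV = v_meas - v_prev
    if dP == 0:
        return v_ref                     # at (or oscillating around) the MPP
    if (dP > 0 and dV > 0) or (dP < 0 and dV < 0):
        return v_ref + step              # power rose with voltage: keep climbing
    return v_ref - step                  # otherwise step back down

# Usage inside a control loop (measurements come from the boost converter):
# v_ref = perturb_and_observe(v_now, v_now * i_now, v_last, p_last, v_ref)
```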
Procedia PDF Downloads 411
14650 Design of Black-Seed Pulp Biomass-Derived New Bio-Sorbent by Combining Methods of Mineral Acids and High-Temperature for Arsenic Removal
Authors: Mozhgan Mohammadi, Arezoo Ghadi
Abstract:
Arsenic is known as a potential threat to the environment. Therefore, the aim of this research is to assess the arsenic removal efficiency from an aqueous solution with a new biosorbent composed of black seed pulp (BSP). To treat the BSP, a combination of two methods (i.e., treatment with mineral acids and exposure to high temperature) was used, yielding a designed biosorbent called BSP-activated/carbonized. BSP-activated and BSP-carbonized sorbents were also prepared using HCl and a temperature of 400°C, respectively, to compare the results of all three methods. Subsequently, adsorption parameters such as pH, initial ion concentration, biosorbent dosage, contact time, and temperature were assessed. It was found that the combination method provided a higher adsorption capacity, with up to ~99% arsenic removal observed for BSP-activated/carbonized at a pH of 7.0 and 40°C. The removal efficiencies for BSP-carbonized and BSP-activated were 87.92% (pH 7, 60°C) and 78.50% (pH 6, 90°C), respectively. Moreover, the adsorption kinetics data indicated the best fit with the pseudo-second-order model. The maximum biosorption capacity, by the Langmuir isotherm model, was also recorded for BSP-activated/carbonized (53.47 mg/g). It is notable that arsenic adsorption on the studied biosorbents is spontaneous and occurs through chemisorption; the biosorption process is endothermic, with a reduction of randomness at the solid-liquid interface.
Keywords: black seed pulp, bio-sorbents, treatment of sorbents, adsorption isotherms
Procedia PDF Downloads 95
14649 An Analysis on the Appropriateness and Effectiveness of CCTV Location for Crime Prevention
Authors: Tae-Heon Moon, Sun-Young Heo, Sang-Ho Lee, Youn-Taik Leem, Kwang-Woo Nam
Abstract:
This study aims to investigate the possibility of crime prevention through CCTV by analyzing the appropriateness of the CCTV locations, i.e., whether cameras are installed in the hotspots of crime-prone areas, and by exploring the crime prevention effect and the displacement (transition) effect. The real crime and CCTV locations of the case city were converted into spatial data using GIS. The data were analyzed by hotspot analysis and the weighted displacement quotient (WDQ). As study methods, first, existing relevant studies were analyzed to identify the trends of CCTV and crime studies based on big data from 1800 to 2014 and to understand the relation between CCTV and crime. Second, the current situation of nationwide CCTV was investigated, and the guidelines for CCTV installation and operation were analyzed to draw attention to the problems and key issues of domestic CCTV use. Third, crime occurrence in the case areas and the current situation of CCTV installation were investigated in spatial terms, and the appropriateness and effectiveness of CCTV installation were analyzed to suggest a rational installation of CCTV and a strategic direction for crime prevention. The results demonstrate that the installation of CCTV had no significant effect on crime prevention. This indicates that CCTV should be installed and managed in a more scientific way reflecting local crime situations. For CCTV, methods of spatial analysis such as GIS, which can evaluate the installation effect, and methods of economic analysis such as cost-benefit analysis should be developed. In addition, these methods should be distributed to local governments across the nation for the appropriate installation and operation of CCTV. This study intended to find a design guideline for optimum CCTV installation. In this regard, it is meaningful in that it will contribute to the creation of a safe city.
Keywords: CCTV, safe city, crime prevention, spatial analysis
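For reference, the weighted displacement quotient used above is commonly computed from crime counts in the target area (A), a surrounding buffer area (B), and a control area (C) before (t0) and after (t1) installation; the form below follows that common definition and is not reproduced from the paper.

```latex
% Weighted displacement quotient (common form; generic notation):
\mathrm{WDQ} \;=\; \frac{B_{t1}/C_{t1} \;-\; B_{t0}/C_{t0}}
                        {A_{t1}/C_{t1} \;-\; A_{t0}/C_{t0}}
% A: target (CCTV) area, B: buffer area, C: control area;
% t0, t1: periods before and after installation.
% The sign and magnitude of WDQ indicate whether crime was displaced to the
% buffer area or whether benefits diffused to it.
```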
Procedia PDF Downloads 438