Search results for: iterative methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15538

14698 Artificial Intelligence Technologies Used in Healthcare: Its Implication on the Healthcare Workforce and Applications in the Diagnosis of Diseases

Authors: Rowanda Daoud Ahmed, Mansoor Abdulhak, Muhammad Azeem Afzal, Sezer Filiz, Usama Ahmad Mughal

Abstract:

This paper discusses important aspects of AI in the healthcare domain. The increase of healthcare data, in both size and complexity, opens more room for artificial intelligence applications. Our focus is to review the main AI methods within the scope of the healthcare domain. The results of the review show that recommendations for diagnosis, recommendations for treatment, patient engagement, and administrative tasks are the key applications of AI in healthcare. Understanding the potential of AI methods in the domain of healthcare would benefit healthcare practitioners and improve patient outcomes.

Keywords: AI in healthcare, technologies of AI, neural network, future of AI in healthcare

Procedia PDF Downloads 112
14697 Advancing in Cricket Analytics: Novel Approaches for Pitch and Ball Detection Employing OpenCV and YOLOV8

Authors: Pratham Madnur, Prathamkumar Shetty, Sneha Varur, Gouri Parashetti

Abstract:

In order to overcome conventional obstacles, this research paper investigates novel approaches for cricket pitch and ball detection that make use of cutting-edge technologies. The research integrates OpenCV for pitch inspection and adapts the YOLOv8 model for cricket ball detection to address the shortcomings of manual pitch assessment and traditional ball detection techniques. To ensure flexibility in a range of pitch environments, the pitch detection method leverages OpenCV’s color space transformation, contour extraction, and accurate color range definition. For ball detection, the YOLOv8 model emphasizes the preservation of minor object details to improve accuracy and is trained specifically on the unique properties of cricket balls. The methods are made more reliable by the careful preparation of the datasets, which include novel ball and pitch information. These cutting-edge methods not only improve cricket analytics but also set the stage for flexible methods in more general sports technology applications.
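
As a rough illustration of the pitch-detection steps named above (color space transformation, color range definition, contour extraction), the following Python/OpenCV sketch uses a hypothetical frame file and illustrative HSV bounds; it is not the authors' implementation, and the YOLOv8 ball-detection stage is omitted.

```python
import cv2
import numpy as np

img = cv2.imread("pitch_frame.jpg")                 # hypothetical broadcast frame
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)          # color space transformation

# Assumed HSV range for a dry pitch strip; a real system would tune this per venue.
lower, upper = np.array([10, 40, 120]), np.array([30, 255, 255])
mask = cv2.inRange(hsv, lower, upper)
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((15, 15), np.uint8))

# Contour extraction: the largest matching region is taken as the pitch.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("pitch_detected.jpg", img)
```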

Keywords: OpenCV, YOLOv8, cricket, custom dataset, computer vision, sports

Procedia PDF Downloads 81
14696 Enhancement of Long Term Peak Demand Forecast in Peninsular Malaysia Using Hourly Load Profile

Authors: Nazaitul Idya Hamzah, Muhammad Syafiq Mazli, Maszatul Akmar Mustafa

Abstract:

The peak demand forecast is crucial for identifying the future generation plant-up needed in the long-term capacity planning analysis for Peninsular Malaysia, as well as for transmission and distribution network planning activities. Currently, the peak demand forecast (in megawatts) is derived from the generation forecast by using a load factor assumption. However, a forecast using this method has underperformed due to structural changes in the economy, emerging trends and weather uncertainty. The dynamic changes of these drivers will result in many possible outcomes of peak demand for Peninsular Malaysia. This paper looks into an independent model of peak demand forecasting. The model begins with the selection of driver variables to capture long-term growth. This selection and construction of variables, which include econometric, emerging trend and energy variables, will have an impact on the peak forecast. The actual framework begins with the development of the system energy and load shape forecasts using the system’s hourly data. The shape forecast represents the system shape assuming all embedded technology and use patterns continue in the future. This is necessary to identify movements in the peak hour or changes in the system load factor. The next step is developing the peak forecast, which involves an iterative process to explore model structures and variables. The final step is combining the system energy, shape and peak forecasts into the hourly system forecast and then modifying it with forecast adjustments. Forecast adjustments include, among others, sales forecasts for electric vehicles, solar and other adjustments. The framework results in an hourly forecast that captures growth, peak usage and new technologies. The advantage of this approach compared to the current methodology is that the peaks capture the impacts of new technologies that change the load shape.
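
The forecast framework above (shape forecast from hourly data, energy forecast, recombination into an hourly series whose maximum gives the peak) can be sketched in a few lines. The sketch below uses synthetic hourly data and an assumed 4% energy growth purely for illustration; it is not the paper's model.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2022-01-01", periods=8760, freq="h")
hist = pd.Series(rng.normal(15000, 2000, 8760).clip(8000), index=idx)   # MW, synthetic

# 1) Load-shape forecast: average normalised profile per (month, hour).
shape = (hist / hist.mean()).groupby([hist.index.month, hist.index.hour]).mean()

# 2) System energy forecast for the target year (assumed growth driver).
energy_mwh = hist.sum() * 1.04

# 3) Recombine energy and shape into an hourly forecast; the peak is its maximum.
fut = pd.date_range("2023-01-01", periods=8760, freq="h")
profile = pd.Series([shape.loc[(t.month, t.hour)] for t in fut], index=fut)
hourly = profile / profile.sum() * energy_mwh
print(f"forecast peak demand: {hourly.max():,.0f} MW")
```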

Keywords: hourly load profile, load forecasting, long term peak demand forecasting, peak demand

Procedia PDF Downloads 172
14695 Task Scheduling and Resource Allocation in Cloud-based on AHP Method

Authors: Zahra Ahmadi, Fazlollah Adibnia

Abstract:

Scheduling of tasks and the optimal allocation of resources in the cloud are driven by the dynamic nature of tasks and the heterogeneity of resources. Applications based on scientific workflows are among the most widely used applications in this field and are characterized by high processing power and storage requirements. In order to increase their efficiency, it is necessary to schedule the tasks properly and select the best virtual machine in the cloud. The goals of the system are effective factors in task scheduling and resource selection, which depend on various criteria such as time, cost, current workload and processing power. Multi-criteria decision-making methods are a good choice in this field. In this research, a new method of task scheduling and resource allocation in a heterogeneous environment based on a modified AHP algorithm is proposed. In this method, the scheduling of input tasks is based on two criteria: execution time and size. Resource allocation combines the AHP algorithm with a first-come, first-served policy. Resource prioritization is done with the criteria of main memory size, processor speed and bandwidth. To modify the AHP algorithm, the Linear Max-Min and Linear Max normalization methods are considered; they are the best choice for the mentioned algorithm and have a great impact on the ranking. The simulation results show a decrease in the average response time, turnaround time and execution time of input tasks in the proposed method compared to similar (basic) methods.
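
A minimal Python sketch of the AHP prioritization and normalization steps described above is given below. The pairwise judgements and VM figures are invented for illustration; the task queue, first-come-first-served tie-breaking and cloud simulator are not reproduced.

```python
import numpy as np

# Pairwise comparison of the three resource criteria from the abstract:
# memory size, processor speed, bandwidth (judgement values are assumptions).
A = np.array([[1.0, 2.0, 4.0],
              [0.5, 1.0, 3.0],
              [0.25, 1 / 3, 1.0]])

# Classic AHP priority vector: normalise each column, then average across rows.
weights = (A / A.sum(axis=0)).mean(axis=1)

# Candidate VMs described by (memory GB, CPU GHz, bandwidth Mbps) - illustrative.
vms = np.array([[8, 2.4, 500],
                [16, 2.0, 200],
                [4, 3.2, 1000]], dtype=float)

# The two normalizations mentioned in the abstract, for benefit-type criteria.
linear_max = vms / vms.max(axis=0)
linear_max_min = (vms - vms.min(axis=0)) / (vms.max(axis=0) - vms.min(axis=0))

scores = linear_max @ weights
print("criteria weights:", weights.round(3))
print("VM ranking (best first):", np.argsort(-scores))
```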

Keywords: hierarchical analytical process, work prioritization, normalization, heterogeneous resource allocation, scientific workflow

Procedia PDF Downloads 145
14694 Drinking Water Quality Assessment Using Fuzzy Inference System Method: A Case Study of Rome, Italy

Authors: Yas Barzegar, Atrin Barzegar

Abstract:

Drinking water quality assessment is a major issue today; technology and practices are continuously improving, and Artificial Intelligence (AI) methods have proven their efficiency in this domain. The current research develops a hierarchical fuzzy model for predicting drinking water quality in Rome (Italy). The Mamdani fuzzy inference system (FIS) is applied with different defuzzification methods. The proposed model includes three intermediate fuzzy models and one final fuzzy model. Each fuzzy model consists of three input parameters and 27 fuzzy rules. The model is developed for water quality assessment with a dataset considering nine parameters (alkalinity, hardness, pH, Ca, Mg, fluoride, sulphate, nitrates, and iron). Fuzzy-logic-based methods have been demonstrated to be appropriate for addressing uncertainty and subjectivity in drinking water quality assessment; they are effective for managing complicated, uncertain water systems and predicting drinking water quality. The FIS method can provide an effective solution for complex systems and can be modified easily to improve performance.
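
To make the Mamdani inference concrete, here is a minimal two-input, three-rule sketch in plain Python/NumPy with triangular membership functions and centroid defuzzification. The membership ranges and rules are assumptions for illustration; the paper's model uses three inputs and 27 rules per sub-model.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

q = np.linspace(0, 100, 501)                     # output universe: quality score
out = {"poor": tri(q, 0, 0, 50), "good": tri(q, 25, 50, 75), "excellent": tri(q, 50, 100, 100)}

def infer(ph, hardness):
    """Mamdani inference: min implication, max aggregation, centroid defuzzification."""
    m_ph = {"low": tri(ph, 5.0, 6.0, 7.0), "ok": tri(ph, 6.5, 7.5, 8.5), "high": tri(ph, 8.0, 9.0, 10.0)}
    m_h = {"soft": tri(hardness, 0, 0, 150), "medium": tri(hardness, 100, 200, 300), "hard": tri(hardness, 250, 500, 500)}
    rules = [
        (min(m_ph["ok"], m_h["soft"]), "excellent"),
        (min(m_ph["ok"], m_h["medium"]), "good"),
        (min(max(m_ph["low"], m_ph["high"]), m_h["hard"]), "poor"),
    ]
    agg = np.zeros_like(q)
    for strength, label in rules:
        agg = np.maximum(agg, np.minimum(strength, out[label]))
    return float(np.sum(agg * q) / (np.sum(agg) + 1e-9))

print(f"predicted quality score: {infer(ph=7.2, hardness=180):.1f}")
```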

Keywords: water quality, fuzzy logic, smart cities, water attribute, fuzzy inference system, membership function

Procedia PDF Downloads 75
14693 Translation Methods Applied While Dealing With System-Bound Terms (Polish-English Translation)

Authors: Anna Kizinska

Abstract:

The research aims at discussing Polish and British incongruent terms that refer to company law. The Polish terms under analysis appear in the Polish Code of Commercial Partnerships and Companies and constitute legal terms or factual terms. The English equivalents of each Polish term under research appear in two translations of the Polish Code of Commercial Partnerships and Companies into English. The theoretical part of the paper includes the presentation of the definitions of a system-bound term and of incongruity of terms. The aim of the analysis is to check whether the classification of translation methods used in the translation of civil law terms covers the translation methods applied while translating company law terms into English. The translation procedures are defined according to Newmark. The stages of the research include 1) presentation of a definition of a Polish term, 2) enumerating the English equivalents of a given Polish term published so far and comparing their definitions (provided they appear in English law dictionaries) with the definition of the given Polish term under analysis, 3) checking whether an English equivalent appears or not in, among others, the sources of British law (the legislation.gov.uk database), 4) identifying the translation method that was applied while forming a given English equivalent.

Keywords: translation, legal terms, equivalence, company law, incongruency

Procedia PDF Downloads 90
14692 Impact of Enhanced Business Models on Technology Companies in the Pandemic: A Case Study about the Revolutionary Change in Management Styles

Authors: Murat Colak, Berkay Cakir Saridogan

Abstract:

Since the dawn of modern corporations, almost every employee has been working in the same loop, which contains three basic steps: going to work, meeting the needs of the work, and getting back home. Only a small number of people were able to break that standard and live outside the box. As the 2019 pandemic hit and most companies shut down their physical offices, that loop had to change for everyone. This means that the old management styles had to be significantly re-arranged into "work from home" types of business methods. These methods include online conferences and meetings, time and task tracking using algorithms, globalization of the work, and, most importantly, remote working. After the global epidemic started, even the tech giants were concerned. Now, it can be seen that technology companies have had an incredible step-up in their shares compared to other companies because they know how to manage such situations better than every other industry. This study aims to take the old traditional management styles in big companies and compare them with the post-Covid methods (2019-2022). As a result of this comparison, made using annual reports and shared statistics, this study aims to explain why the winners of this crisis are the technology companies.

Keywords: Covid-19, technology companies, business models, remote work

Procedia PDF Downloads 64
14691 An Image Stitching Approach for Scoliosis Analysis

Authors: Siti Salbiah Samsudin, Hamzah Arof, Ainuddin Wahid Abdul Wahab, Mohd Yamani Idna Idris

Abstract:

Standard X-ray spine images produced by the conventional screen-film technique have a limited field of view. This limitation may obstruct a complete inspection of the spine unless images of different parts of the spine are placed next to each other contiguously to form a complete structure. Another solution for producing a whole spine image is to assemble the digitized x-ray images of its parts automatically using image stitching. This paper presents a new Medical Image Stitching (MIS) method that utilizes Minimum Average Correlation Energy (MACE) filters to identify and merge pairs of x-ray medical images. The effectiveness of the proposed method is demonstrated in two sets of experiments involving two databases which contain a total of 40 pairs of overlapping and non-overlapping spine images. The experimental results are compared with those produced by the Normalized Cross Correlation (NCC) and Phase Only Correlation (POC) methods. It is found that the proposed method outperforms the NCC and POC methods in identifying both the overlapping and non-overlapping medical images. The efficacy of the proposed method is further vindicated by its average execution time, which is about two to five times shorter than those of the POC and NCC methods.
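
As a simple point of reference for the correlation-based matching the paper builds on, the sketch below estimates the overlap between two vertically adjacent spine image strips with plain NCC (the baseline the MACE filter is compared against); substituting a MACE correlation-filter response for the NCC score would move the sketch toward the paper's approach. Array shapes and the blending rule are assumptions.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equally sized image strips."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def find_overlap(top, bottom, min_rows=20):
    """Overlap height (rows) that maximises NCC between the strips' shared band."""
    best_score, best_rows = -1.0, 0
    for rows in range(min_rows, min(top.shape[0], bottom.shape[0]) // 2):
        score = ncc(top[-rows:].astype(float), bottom[:rows].astype(float))
        if score > best_score:
            best_score, best_rows = score, rows
    return best_rows, best_score

def stitch(top, bottom):
    """Merge two grayscale arrays of equal width by averaging the detected overlap band."""
    rows, score = find_overlap(top, bottom)
    blended = (top[-rows:].astype(float) + bottom[:rows].astype(float)) / 2
    return np.vstack([top[:-rows], blended.astype(top.dtype), bottom[rows:]]), score
```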

Keywords: image stitching, MACE filter, panorama image, scoliosis

Procedia PDF Downloads 458
14690 Artificial Neural Network Approach for Modeling and Optimization of Conidiospore Production of Trichoderma harzianum

Authors: Joselito Medina-Marin, Maria G. Serna-Diaz, Alejandro Tellez-Jurado, Juan C. Seck-Tuoh-Mora, Eva S. Hernandez-Gress, Norberto Hernandez-Romero, Iaina P. Medina-Serna

Abstract:

Trichoderma harzianum is a fungus that has been utilized as a low-cost fungicide for the biological control of pests, and it is important to determine the optimal conditions to produce the highest amount of conidiospores of Trichoderma harzianum. In this work, the conidiospore production of Trichoderma harzianum is modeled and optimized by using Artificial Neural Networks (ANNs). In order to gather data on this process, 30 experiments were carried out taking into account the number of hours of culture (10 distributed values from 48 to 136 hours) and the culture humidity (70, 75 and 80 percent), with the number of conidiospores per gram of dry mass obtained as the response. The experimental results were used to develop an iterative algorithm to create 1,110 ANNs with different configurations, ranging from one to three hidden layers, with every hidden layer having from 1 to 10 neurons. Each ANN was trained with the Levenberg-Marquardt backpropagation algorithm, which is used to learn the relationship between input and output values. The ANN with the best performance was chosen in order to simulate the process and maximize conidiospore production. The ANN with the highest performance has 2 inputs, 1 output and three hidden layers with 3, 10 and 10 neurons, respectively. The ANN performance shows an R² value of 0.9900, and the Root Mean Squared Error is 1.2020. This ANN predicted a maximum of 644,175,467 conidiospores per gram of dry mass, obtained at 117 hours of culture and 77% culture humidity. In summary, the ANN approach is suitable for representing the conidiospore production of Trichoderma harzianum because the R² value denotes a good fit to the experimental results, and the obtained ANN model was used to find the parameters that produce the largest amount of conidiospores per gram of dry mass.
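
The topology search described above (1 to 3 hidden layers, 1 to 10 neurons each, 1,110 candidate networks) can be sketched with scikit-learn as below. Since scikit-learn has no Levenberg-Marquardt trainer, L-BFGS is used instead, and the data are a synthetic stand-in for the 30 experiments; both are assumptions, not the authors' setup.

```python
import itertools
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = np.column_stack([rng.uniform(48, 136, 30), rng.choice([70, 75, 80], 30)])  # hours, humidity
y = 6e8 * np.exp(-((X[:, 0] - 117) / 40) ** 2) * (X[:, 1] / 80)                # synthetic response

best_layers, best_r2 = None, -np.inf
for depth in (1, 2, 3):
    for layers in itertools.product(range(1, 11), repeat=depth):
        net = MLPRegressor(hidden_layer_sizes=layers, solver="lbfgs",
                           max_iter=5000, random_state=0)
        r2 = cross_val_score(net, X, y, cv=3, scoring="r2").mean()
        if r2 > best_r2:
            best_layers, best_r2 = layers, r2
print("best topology:", best_layers, "mean R2:", round(best_r2, 3))
```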

Keywords: Trichoderma harzianum, modeling, optimization, artificial neural network

Procedia PDF Downloads 158
14689 Introduction to Various Innovative Techniques Suggested for Seismic Hazard Assessment

Authors: Deepshikha Shukla, C. H. Solanki, Mayank K. Desai

Abstract:

Amongst all the natural hazards, earthquakes have the potential for causing the greatest damage. Since earthquake forces are random in nature and unpredictable, their quantification becomes important in order to assess the hazard. The time and place of a future earthquake are both uncertain. Since earthquakes can neither be prevented nor predicted, engineers have to design and construct in such a way that the damage to life and property is minimized. Seismic hazard analysis plays an important role in the design of earthquake-resistant structures by providing rational values of input parameters. In this paper, both mathematical and computational methods adopted by researchers globally in the past five years will be discussed. Some mathematical approaches involving the concepts of Poisson's ratio, Convex Set Theory, the Empirical Green's Function, Bayesian probability estimation applied to seismic hazard, and FOSM (first-order second-moment) algorithm methods will be discussed. Computational approaches and the numerical model SSIFiBo, developed in MATLAB to study dynamic soil-structure interaction problems, are also discussed in this paper. GIS-based tools, which are predominantly used in the assessment of seismic hazards, will be discussed as well.
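
As a small worked example of the probabilistic reasoning underlying these hazard methods (a standard Poisson-occurrence relation, not taken from the paper itself), the exceedance probability over a design life follows directly from the annual rate:

```python
import math

def exceedance_probability(annual_rate, years):
    """P(at least one exceedance in `years`) under a Poisson occurrence model."""
    return 1.0 - math.exp(-annual_rate * years)

# A ground motion with a 475-year return period over a 50-year design life.
print(f"{exceedance_probability(1 / 475, 50):.3f}")   # ~0.100, the familiar 10% in 50 years
```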

Keywords: computational methods, MATLAB, seismic hazard, seismic measurements

Procedia PDF Downloads 340
14688 Requirements Gathering for Improved Software Usability and the Potential for Usage-Centred Design

Authors: Kholod J. Alotaibi, Andrew M. Gravell

Abstract:

Usability is an important software quality that is often neglected at the design stage. Although methods exist to incorporate elements of usability engineering, there is a need for more balanced, usability-focused methods that can enhance the experience of software usability for users. In this regard, the potential of Usage-Centred Design (UgCD) is explored with respect to requirements gathering and is shown to lead to high software usability, besides other benefits. It achieves this through its focus on usage: defining essential use cases, conducting task modeling, encouraging user collaboration, refining requirements, and so on. The requirements gathering process in UgCD is described in detail.

Keywords: requirements gathering, usability, usage-centred design, computer science

Procedia PDF Downloads 358
14687 Relevancy Measures of Errors in Displacements of Finite Elements Analysis Results

Authors: A. B. Bolkhir, A. Elshafie, T. K. Yousif

Abstract:

This paper highlights the methods of error estimation in finite element analysis (FEA) results. It indicates that the discretization error could be eliminated by performing finite element analysis with successively finer meshes or by extrapolating response predictions from an orderly sequence of relatively low degree-of-freedom analysis results. In addition, the paper eliminates the round-off error by running the code at a higher precision. The paper provides applications to finite element analysis results and draws conclusions based on the results of applying these error estimation methods.
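
The extrapolation idea above can be shown with a short sketch. The displacement values below are hypothetical; Richardson extrapolation combines results from two meshes (refinement ratio r, convergence order p) into an estimate of the mesh-independent response.

```python
import math

def richardson(u_coarse, u_fine, r, p):
    """Richardson extrapolation of a monotonically converging FEA response."""
    return u_fine + (u_fine - u_coarse) / (r ** p - 1)

# Hypothetical tip displacements from coarse, medium and fine meshes (r = 2).
u1, u2, u3 = 1.820, 1.902, 1.926
p = math.log(abs((u2 - u1) / (u3 - u2))) / math.log(2)   # observed order of convergence
print(f"observed order p = {p:.2f}, extrapolated displacement = {richardson(u2, u3, 2, p):.4f}")
```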

Keywords: finite element analysis (FEA), discretization error, round-off error, mesh refinement, Richardson extrapolation, monotonic convergence

Procedia PDF Downloads 495
14686 From Past to Present Awareness about Complementary Therapies

Authors: Olcay Çam, Ayşegül Bilge, Merve Uğuryol, Hacer Demirkol

Abstract:

Complementary and alternative medicine is important for human health. From past to present, people have resorted particularly to Turkish bathhouses, cupping therapy, mud baths, hirudotherapy and healing waters for the purpose of recovering from diseases and refreshing their souls. Methods such as herbal treatments, massage, aromatherapy, prayer, meditation, yoga and thermal springs have recently been observed to be the most frequently used complementary therapies in Turkey. These methods are not precisely known by the public. Complementary therapies applied along with modern therapies in Turkey are, as a result, considered to be effective in maintaining and improving individuals’ health.

Keywords: complementary therapy, health, health services, modern therapies

Procedia PDF Downloads 279
14685 An Experimental Study for Assessing Email Classification Attributes Using Feature Selection Methods

Authors: Issa Qabaja, Fadi Thabtah

Abstract:

Email phishing classification is one of the vital problems in the online security research domain that has attracted several scholars due to its impact on the payments users perform online daily. One way to reach good performance with detection algorithms in the email phishing problem is to identify the minimal set of features that significantly impact the phishing detection rate. This paper investigates three known feature selection methods, namely Information Gain (IG), Chi-square and Correlation Feature Set (CFS), on the email phishing problem to separate highly influential features from less influential ones in phishing detection. We measure the degree of influence by applying four data mining algorithms to a large set of features. We compare the accuracy of these algorithms on the complete feature set before and after feature selection has been applied. After conducting experiments, the results show that 12 common significant features were chosen among the considered features by the feature selection methods. Further, the average detection accuracy derived by the data mining algorithms on the reduced 12-feature set was only slightly affected when compared with the one derived from the 47-feature set.
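
Two of the three feature selection methods named above are available directly in scikit-learn; the snippet below sketches the before/after-selection comparison on synthetic data standing in for the 47 phishing features (CFS has no built-in scikit-learn implementation and is omitted here).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, chi2, mutual_info_classif
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import MinMaxScaler

X, y = make_classification(n_samples=1000, n_features=47, n_informative=12, random_state=0)
X = MinMaxScaler().fit_transform(X)               # chi2 requires non-negative inputs

for name, score_fn in [("Information Gain", mutual_info_classif), ("Chi-square", chi2)]:
    kept = np.flatnonzero(SelectKBest(score_fn, k=12).fit(X, y).get_support())
    full = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=5).mean()
    reduced = cross_val_score(DecisionTreeClassifier(random_state=0), X[:, kept], y, cv=5).mean()
    print(f"{name}: accuracy {full:.3f} (47 features) -> {reduced:.3f} (12 features)")
```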

Keywords: data mining, email classification, phishing, online security

Procedia PDF Downloads 432
14684 Identifying a Drug Addict Person Using Artificial Neural Networks

Authors: Mustafa Al Sukar, Azzam Sleit, Abdullatif Abu-Dalhoum, Bassam Al-Kasasbeh

Abstract:

Use and abuse of drugs by teens is very common and can have dangerous consequences. Drugs contribute to physical and sexual aggression such as assault or rape. Some teenagers regularly use drugs to compensate for depression, anxiety or a lack of positive social skills. Teenagers' resort to smoking should not be minimized because it can be a "gateway drug" to other drugs (marijuana, cocaine, hallucinogens, inhalants, and heroin). The combination of teenagers' curiosity, risk-taking behavior, and social pressure makes it very difficult to say no. This leads most teenagers to the question: "Will it hurt to try once?" Nowadays, technological advances are changing our lives very rapidly and adding many technologies that help us to track the risk of drug abuse, such as smartphones, Wireless Sensor Networks (WSNs), the Internet of Things (IoT), etc. These technologies may help us with the early discovery of drug abuse in order to prevent an aggravation of the influence of drugs on the abuser. In this paper, we have developed a Decision Support System (DSS) for detecting drug abuse using an Artificial Neural Network (ANN); we used a Multilayer Perceptron (MLP) feed-forward neural network in developing the system. The input layer includes 50 variables, while the output layer contains one neuron which indicates whether the person is a drug addict. An iterative process is used to determine the number of hidden layers and the number of neurons in each one. We used multiple experimental models built with the log-sigmoid transfer function. In particular, 10-fold cross-validation schemes are used to assess the generalization of the proposed system. The experimental results show 98.42% classification accuracy for correct diagnosis in our system. The data were taken from 184 cases in Jordan according to a set of questions compiled by specialists, and the data were obtained through the families of drug abusers.
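
A minimal sketch of the classifier setup described above (50-input MLP, log-sigmoid activation, 10-fold cross-validation) is shown below with scikit-learn; the data are a synthetic stand-in for the 184 questionnaire records, and the hidden-layer size is an assumption.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=184, n_features=50, n_informative=15, random_state=0)

# activation='logistic' is scikit-learn's log-sigmoid transfer function.
clf = MLPClassifier(hidden_layer_sizes=(20,), activation="logistic",
                    max_iter=3000, random_state=0)
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv)
print(f"10-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```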

Keywords: drug addiction, artificial neural networks, multilayer perceptron (MLP), decision support system

Procedia PDF Downloads 299
14683 The Effectiveness of Kinesiotaping Methods in Rehabilitation Therapy

Authors: Ana-Katarina Nikich

Abstract:

Background: The kinesiotaping method is often used in physiotherapy and rehabilitation. The purpose of this study was to evaluate the effectiveness of taping in the rehabilitation process of patients. Materials and methods: The study involved 90 male and female patients (average age 40-50 years) with various conditions requiring rehabilitation, such as injuries of the musculoskeletal system, sports injuries and other ailments. All patients were divided into two groups: experimental (n=40) and control (n=50). Both groups received 20 days of standard rehabilitation. In the experimental group, kinesiotaping methods were used, taking into account the individual characteristics of each patient. The control group performed regular exercises and physical therapy, but without using kinesiotape. During the study, physical parameters were monitored, interviews were conducted and the conditions of patients from both groups were compared. Results and discussion: The use of the kinesiotaping method in the rehabilitation process led to a significant improvement in physical parameters and a reduction in pain in patients. Significant improvement (p<0.005) was observed in all evaluated parameters among the patients of the experimental group. The control group also showed sufficient improvement (p<0.005), but the improvement in the experimental group was greater. As a result of the observation, the patients of the experimental group showed faster and more complete rehabilitation compared to the control group. The use of the kinesiotaping method allows the load on the damaged areas to be reduced, blood circulation and lymphatic drainage to be improved, and stability and coordination of movements to be increased. Conclusions: Kinesiotaping, as one of the modern therapeutic methods, has shown its effectiveness in the rehabilitation process, contributing to the optimal recovery of patients with various conditions requiring rehabilitation. The use of tapes should be included in a comprehensive rehabilitation program to achieve the best results and reduce recovery time.

Keywords: kinesiotaping, rehabilitation, therapy, pain

Procedia PDF Downloads 71
14682 The Profit Trend of Cosmetics Products Using Bootstrap Edgeworth Approximation

Authors: Edlira Donefski, Lorenc Ekonomi, Tina Donefski

Abstract:

Edgeworth approximation is one of the most important statistical methods, making a considerable contribution to reducing the standard deviations of the independent variables’ coefficients in a quantile regression model. This model estimates the conditional median or other quantiles. In this paper, we have applied approximating statistical methods to an economic problem. We have created and generated a quantile regression model to see how the profit gained is connected with the realized sales of cosmetic products, using real data taken from a local business. The linear regression of the generated profit on the realized sales was not free of autocorrelation and heteroscedasticity, which is the reason we have used this model instead of linear regression. Our aim is to analyze in more detail the relation between the variables under study, the profit and the realized sales, and how to minimize the standard errors of the independent variable involved in this study, the level of realized sales. The statistical methods that we have applied in our work are the Edgeworth approximation for independent and identically distributed (IID) cases, the bootstrap version of the model, and the Edgeworth approximation for the bootstrap quantile regression model. The graphics and the results presented here identify the best approximating model of our study.
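
For readers who want to reproduce the modelling step, a median (0.5-quantile) regression of profit on sales with a simple bootstrap of the slope can be sketched as follows; the data are a synthetic stand-in and the Edgeworth correction itself is not implemented here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
sales = rng.uniform(100, 1000, 200)
profit = 0.3 * sales + rng.standard_t(3, 200) * 20        # heavy-tailed noise, illustrative

X = sm.add_constant(sales)
fit = sm.QuantReg(profit, X).fit(q=0.5)
print("median-regression coefficients:", np.asarray(fit.params).round(4))

# Bootstrap the slope to gauge its standard error without normality assumptions.
slopes = []
for _ in range(500):
    i = rng.integers(0, len(sales), len(sales))
    slopes.append(np.asarray(sm.QuantReg(profit[i], X[i]).fit(q=0.5).params)[1])
print("bootstrap SE of slope:", round(float(np.std(slopes)), 4))
```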

Keywords: bootstrap, edgeworth approximation, IID, quantile

Procedia PDF Downloads 159
14681 New Standardized Framework for Developing Mobile Applications (Based On Real Case Studies and CMMI)

Authors: Ammar Khader Almasri

Abstract:

Software processes play a vital role in delivering a high-quality software system that meets users’ needs. There are many software development models used by most system developers, which can be categorized into two groups: traditional and new methodologies. Mobile applications, like desktop applications, need an appropriate and well-working software development process. Nevertheless, mobile applications have different characteristics which limit their performance and efficiency, such as application size and mobile hardware features. This research aims to help developers use a standardized model for developing mobile applications.

Keywords: software development process, agile methods, mobile application development, traditional methods

Procedia PDF Downloads 387
14680 Rapid Formation of Ortho-Boronoimines and Derivatives for Reversible and Dynamic Bioconjugation Under Physiological Conditions

Authors: Nicholas C. Rose, Christopher D. Spicer

Abstract:

The regeneration of damaged or diseased tissues would provide an invaluable therapeutic tool in biological research and medicine. Cells must be provided with a number of different biochemical signals in order to form mature tissue through complex signaling networks that are difficult to recreate in synthetic materials. The ability to attach and detach bioactive proteins from material in an iterative and dynamic manner would therefore present a powerful way to mimic natural biochemical signaling cascades for tissue growth. We propose to reversibly attach these bioactive proteins using ortho-boronoimine (oBI) linkages and related derivatives formed by the reaction of an ortho-boronobenzaldehyde with a nucleophilic amine derivative. To enable the use of oBIs for biomaterial modification, we have studied binding and cleavage processes with precise detail in the context of small molecule models. A panel of oBI complexes has been synthesized and screened using a novel Förster resonance energy transfer (FRET) assay, using a cyanine dye FRET pair (Cy3 and Cy5), to identify the most reactive boron-aldehyde/amine nucleophile pairs. Upon conjugation of the dyes, FRET occurs under Cy3 excitation and the resultant ratio of Cy3:Cy5 emission directly correlates to conversion. Reaction kinetics and equilibria can be accurately quantified for reactive pairs, with dissociation constants of oBI derivatives in water (KD) found to span 9-orders of magnitude (10⁻²-10⁻¹¹ M). These studies have provided us with a better understanding of oBI linkages that we hope to exploit to reversibly attach bioconjugates to materials. The long-term aim of the project is to develop a modular biomaterial platform that can be used to help combat chronic diseases such as osteoarthritis, heart disease, and chronic wounds by providing cells with potent biological stimuli for tissue engineering.

Keywords: dynamic, bioconjugation, boronoimine, rapid, physiological

Procedia PDF Downloads 96
14679 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution

Authors: Haiyan Wu, Ying Liu, Shaoyun Shi

Abstract:

Authorship attribution is the task of extracting features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or by transparent machine learning methods gives a portrait of the authors' writing style. But these methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topics) information. In recent years, some researchers have modeled syntactic trees or latent semantic information with neural networks. However, few works take them together. Besides, predictions by neural networks are difficult to explain, and explainability is vital in authorship attribution tasks. In this paper, we not only utilize the statistical style and content features but also take advantage of both syntactic and semantic features. Different from an end-to-end neural model, feature selection and prediction are two separate steps in our method. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features can improve on the state-of-the-art methods on three benchmark datasets.
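
To illustrate the two-step design (feature selection, then a transparent classifier), the sketch below uses character n-gram TF-IDF features, chi-square selection standing in for the paper's attentive n-gram network, and logistic regression; the corpus is a generic scikit-learn dataset standing in for an authorship benchmark.

```python
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

data = fetch_20newsgroups(subset="train", categories=["sci.space", "rec.autos"],
                          remove=("headers", "footers", "quotes"))

pipeline = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4), max_features=20000),  # style n-grams
    SelectKBest(chi2, k=2000),            # stand-in for the attentive n-gram selector
    LogisticRegression(max_iter=1000),    # transparent final predictor, as in the paper
)
print(cross_val_score(pipeline, data.data, data.target, cv=5).mean())
```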

Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction

Procedia PDF Downloads 136
14678 Effects of Performance Appraisal on Employee Productivity in Yobe State University, Damaturu, (A Case Study of the Department of Islamic Studies)

Authors: Adam Abdullahi Mohammed

Abstract:

Performance appraisal is an assessment made to determine the level of a worker’s productivity in a given period of time. Appraisal systems are divided into two categories, traditional methods and modern methods, with the emphasis placed on the evaluation of work results. In the traditional approach to staff appraisal, which puts more emphasis on individual traits, supervisors are required to measure employees through interactions based on what they achieved with reference to job descriptions, or to rate them based on questionnaires without staff interaction. These methods are not effective because staff may give biased information. The study will attempt to assess the effect of performance appraisal on employee productivity at Yobe State University, Damaturu. It is aimed at assessing the process, methods, and objectives of performance appraisal and its feedback to know how they affect the success of the appraisal, its results, and employee productivity. In this study, a quantitative research method is adopted for collecting and analyzing data, and a questionnaire will be used as the data collection instrument. As it is a case study, the target population is the staff of the Department of Islamic Studies. The research will employ a census sampling technique, where all the subjects in the target population are given a chance to participate in the study. This sampling method was considered because the entire target population is considered researchable. The expected finding is that staff performance appraisal in the Department of Islamic Studies has an effect on employee productivity; that is, if appraisal is given due consideration and acted upon, employee productivity will improve.

Keywords: performance appraisal, employee productivity, Yobe state University, appraisal feedback

Procedia PDF Downloads 71
14677 BIM-based Construction Noise Management Approach With a Focus on Inner-City Construction

Authors: Nasim Babazadeh

Abstract:

Growing demand for a quieter dwelling environment has turned the attention of construction companies to reducing the noise propagated by their projects. In inner-city construction, the close distance between the construction site and the surrounding buildings lessens the efficiency of passive noise control methods. Dwellers of nearby areas may file complaints and lawsuits against the construction companies due to the emitted construction noise, thereby leading to the interruption of processes, compensation costs, or even suspension of the project. Therefore, construction noise should be predicted along with the project schedule. The advantage of managing the noise in the pre-construction phase is two-fold. Firstly, changes in the time plan and construction methods can be applied more flexibly, so the costs related to rescheduling can be avoided. Secondly, noise-related legal problems are expected to be reduced. To implement noise mapping methods for the mentioned prediction, the required detailed information (such as the location of the noisy process and the duration of the noisy work) can be exported from the 4D BIM model. The results obtained from the noise maps would be used to help planners define different work scenarios. The proposed approach has been applied to the foundation and earthwork of a site located in a residential area, and the obtained results are discussed.
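
As a toy version of the noise-mapping step, the sketch below propagates a single point source (whose position and sound power would, in a real workflow, come from the 4D BIM task data) over a site grid using hemispherical free-field spreading; barriers, reflections and frequency weighting are ignored, and all numbers are assumptions.

```python
import numpy as np

def free_field_level(lw, r):
    """Sound pressure level (dB) at distance r (m) from a point source of sound
    power level lw (dB re 1 pW), hemispherical spreading over hard ground."""
    return lw - 20 * np.log10(np.maximum(r, 1.0)) - 8

source_xy, lw = np.array([20.0, 30.0]), 110.0          # excavator position and power, assumed
xs, ys = np.meshgrid(np.arange(0, 100), np.arange(0, 100))   # 1 m grid over the neighbourhood
levels = free_field_level(lw, np.hypot(xs - source_xy[0], ys - source_xy[1]))
print(f"predicted level at grid cell (80, 60): {levels[60, 80]:.1f} dB")
```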

Keywords: building information modeling, construction noise management, noise mapping, 4D BIM

Procedia PDF Downloads 185
14676 Digital Retinal Images: Background and Damaged Areas Segmentation

Authors: Eman A. Gani, Loay E. George, Faisel G. Mohammed, Kamal H. Sager

Abstract:

Digital retinal images are more appropriate for automatic screening in diabetic retinopathy systems. Unfortunately, a significant percentage of these images are of poor quality, which hinders further analysis, due to many factors (such as patient movement, inadequate or non-uniform illumination, acquisition angle and retinal pigmentation). Retinal images of poor quality need to be enhanced before the extraction of features and abnormalities. Segmentation of the retinal image is essential for this purpose: it is employed to smooth and strengthen the image by separating the background and damaged areas from the overall image, resulting in retinal image enhancement and less processing time. In this paper, methods for segmenting colored retinal images are proposed to improve the quality of retinal image diagnosis. The methods generate two segmentation masks, i.e., a background segmentation mask for extracting the background area and a poor-quality mask for removing the noisy areas from the retinal image. The standard retinal image databases DIARETDB0, DIARETDB1, STARE and DRIVE, and some images obtained from ophthalmologists, have been used to test the validity of the proposed segmentation technique. Experimental results indicate that the introduced methods are effective and can lead to high segmentation accuracy.
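
A minimal OpenCV sketch of the two masks described above is given below; the threshold values and the use of the red channel for the field-of-view mask are assumptions for illustration, not the paper's exact procedure.

```python
import cv2
import numpy as np

img = cv2.imread("fundus.png")                          # e.g. a DRIVE or STARE image
red = img[:, :, 2]                                      # red channel: bright fundus disc vs dark surround

# Background mask: everything outside the circular field of view.
_, fov = cv2.threshold(red, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
background_mask = cv2.bitwise_not(fov)

# Poor-quality mask: very dark or saturated regions inside the field of view.
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
damaged = (((gray < 15) | (gray > 240)) * 255).astype(np.uint8)
damaged_mask = cv2.bitwise_and(damaged, fov)

cv2.imwrite("background_mask.png", background_mask)
cv2.imwrite("damaged_mask.png", damaged_mask)
```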

Keywords: retinal images, fundus images, diabetic retinopathy, background segmentation, damaged areas segmentation

Procedia PDF Downloads 403
14675 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make the task of detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and AI required to utilize these tools poses a challenge to their full adoption and use in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimum detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the low-cost, open-source Python and OpenCV libraries, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated using a mitochondrial fluorescence dataset. Ground truth labels generated using a Lab kit were also used to evaluate the performance of our detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology from the fluorescence mitochondrial dataset. A discussion of the methods and future perspectives of fully automated frameworks concludes the paper.
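
The three stages named above map naturally onto a short OpenCV pipeline; the sketch below (CLAHE pre-processing, Otsu binarization, connected-component statistics as a coarse segmentation) is an illustrative reading of the framework, with the file name and area threshold chosen arbitrarily.

```python
import cv2

frame = cv2.imread("mito_fluorescence.tif", cv2.IMREAD_GRAYSCALE)

# 1) Pre-processing: contrast-limited adaptive histogram equalisation + light blur.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
enhanced = cv2.GaussianBlur(clahe.apply(frame), (3, 3), 0)

# 2) Binarization: Otsu threshold separates mitochondria from the background.
_, binary = cv2.threshold(enhanced, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3) Coarse segmentation: connected components plus per-object shape statistics.
n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
for i in range(1, n):                                   # label 0 is the background
    area = stats[i, cv2.CC_STAT_AREA]
    if area < 10:                                       # assumed noise threshold
        continue
    w, h = stats[i, cv2.CC_STAT_WIDTH], stats[i, cv2.CC_STAT_HEIGHT]
    print(f"object {i}: area={area}px, elongation={max(w, h) / max(1, min(w, h)):.2f}")
```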

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 357
14674 Detection of Parkinsonian Freezing of Gait

Authors: Sang-Hoon Park, Yeji Ho, Gwang-Moon Eom

Abstract:

Fast and accurate detection of Freezing of Gait (FOG) is desirable for the appropriate application of cueing, which has been shown to ameliorate FOG. Utilizing the frequency spectrum of leg acceleration to derive the freeze index requires much calculation, which would lead to delayed cueing. We hypothesized that FOG can be reasonably detected from the time-domain amplitude of foot acceleration. A time instant was recognized as FOG if the mean amplitude of the acceleration in the time window surrounding that instant was in the specific FOG range. The parameters required in the FOG detection were optimized by simulated annealing. The suggested time-domain method showed performance comparable to that of frequency-domain methods.
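
The time-domain rule described above (flag an instant as FOG when the windowed mean acceleration amplitude falls inside a band) is easy to prototype; the band limits, window length and the synthetic signal below are assumptions, whereas in the paper these parameters are tuned by simulated annealing.

```python
import numpy as np

def detect_fog(acc, fs, window_s=2.0, low=0.3, high=1.2):
    """Flag samples whose surrounding-window mean |acceleration| lies in [low, high] (g)."""
    half = int(window_s * fs / 2)
    amp = np.abs(acc - acc.mean())
    flags = np.zeros(len(acc), dtype=bool)
    for i in range(half, len(acc) - half):
        flags[i] = low <= amp[i - half:i + half].mean() <= high
    return flags

fs = 100
t = np.arange(0, 20, 1 / fs)
# Synthetic foot acceleration: normal gait (large 1 Hz swings) then a freezing episode (small 6 Hz trembling).
acc = np.where(t < 10, 3.0 * np.sin(2 * np.pi * 1.0 * t), 0.6 * np.sin(2 * np.pi * 6.0 * t))
flags = detect_fog(acc, fs)
print(f"freeze segment flagged: {flags[1100:1900].mean():.0%}")
```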

Keywords: freezing of gait, detection, Parkinson's disease, time-domain method

Procedia PDF Downloads 444
14673 Wired Network Services in Mobile Phones

Authors: Subhash Reddy

Abstract:

Mobile communication in today’s world means a lot to humankind; through it, many deals are made and others broken within seconds. That is because of our sophisticated methods of transporting data at very high speeds over very long distances in no time. It is also because we kept changing the method of serving connections as the number of connections kept increasing, which has led to many multiple-access methods such as TDMA, CDMA and FDMA in wireless communications. The areas where connections are provided are also divided into cells, which are the basic blocks of cellular communications. Along with the wireless network, providing a wired network in mobile phones would serve as a very good alternative and would divert the extra traffic of a cell, so that a cell which is providing the wireless network can operate more efficiently.

Keywords: CDMA, FDMA, TDMA, CELL

Procedia PDF Downloads 486
14672 Analysis of Expert Possibilities While Identifying Human Teeth

Authors: Saule Mussabekova

Abstract:

Forensic investigation of human teeth plays an important role in the detection of crime, particularly in the personal identification of dead bodies altered by putrefactive processes or skeletonized bodies, as well as when the bodies of unknown persons are found. 152 teeth have been investigated; 85 of them belonged to men and 67 to women, taken from living people of different ages. The teeth were investigated after extraction. Two types of teeth have been investigated: teeth with the dental crown intact and teeth with different degrees of crown damage. Additionally, 517 teeth collected from dead bodies have been investigated, 252 of which belonged to women and 265 to men, regardless of the cause of death, with the time since death ranging from 1 month to 20 years. Isohemagglutinating serums and Coliclons of different series have been used for the research of tooth group specificity by serological methods according to the AB0 system. Standard protocols of different techniques have been used for DNA purification from teeth (the Chelex 100 reagent produced by Bio-Rad, the 'DNA IQ System' reagent kit produced by Promega (USA), and 'QIAamp DNA Investigator Kit' columns produced by Qiagen). The results of the comparative forensic investigation of human teeth using serological and molecular genetic methods have shown that the use of serological methods for forensic identification is sensible only for preselection prior to subsequent molecular genetic investigation, as well as in cases where the corresponding genetic investigation is impossible for various objective reasons. A number of advantages of molecular genetic methods in dental investigation have been noted, particularly for personal identification in the presence of putrefactive changes. Key aspects of the current state of personal identification based on dental status are reflected. Prospective directions for the advance preparation of material for the identification of teeth in forensic practice are emphasized.

Keywords: dental state, forensic identification, molecular genetic analysis, teeth

Procedia PDF Downloads 141
14671 Computer-Aided Ship Design Approach for Non-Uniform Rational Basis Spline Based Ship Hull Surface Geometry

Authors: Anu S. Nair, V. Anantha Subramanian

Abstract:

This paper presents a surface development and fairing technique combining the features of a modern computer-aided design tool, namely the Non-Uniform Rational Basis Spline (NURBS), with an algorithm to obtain a rapidly faired hull form. Some of the older series-based designs give a sectional area distribution, such as the Wageningen-Lap Series. Others, such as FORMDATA, give more comprehensive offset data points. Nevertheless, this basic data still requires fairing to obtain an acceptable faired hull form. This method uses the input of a sectional area distribution as an example and arrives at the faired form. Characteristic section shapes define any general ship hull form in the entrance, parallel mid-body and run regions. The method defines a minimum number of control points at each section and, using the golden section search method or the bisection method, the section shape converges to the one with the prescribed sectional area with a minimized error in the area fit. The section shapes are combined to evolve the faired surface by NURBS, which typically takes 20 iterations. The advantage of the method is that it is fast, robust and evolves the faired hull form through minimal iterations. The curvature criterion check for the hull lines shows the evolution of the smooth faired surface. The method is applicable to hull forms from any parent series, and the evolved form can be evaluated for hydrodynamic performance as is done in more modern design practice. The method can handle complex shapes such as that of the bulbous bow. Surface patches developed fit together at their common boundaries with curvature continuity and a fairness check. The development is coded in MATLAB, and the example illustrates the development of the method. The most important advantage is speed: the rapid iterative fairing of the hull form.
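
The per-section area fit can be illustrated with a small Python stand-in (the paper's code is in MATLAB): golden section search adjusts a single shape parameter of a toy section until its area matches the prescribed sectional area. The parabolic-type section and all dimensions are assumptions, not the paper's control-point formulation.

```python
import math

def golden_section_minimise(f, a, b, tol=1e-6):
    """Golden-section search for the minimiser of a unimodal function on [a, b]."""
    invphi = (math.sqrt(5) - 1) / 2
    c, d = b - invphi * (b - a), a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            a, c = c, d
            d = a + invphi * (b - a)
    return (a + b) / 2

# Toy section: half-breadth y(z) = (B/2) * (z/T)**n, so the section area is B*T/(n+1).
B, T, target_area = 20.0, 8.0, 70.0
area = lambda n: B * T / (n + 1)
n_opt = golden_section_minimise(lambda n: (area(n) - target_area) ** 2, 0.1, 5.0)
print(f"shape exponent n = {n_opt:.3f}, fitted area = {area(n_opt):.2f} m^2 (target {target_area})")
```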

Keywords: computer-aided design, methodical series, NURBS, ship design

Procedia PDF Downloads 169
14670 An Investigation on Hot-Spot Temperature Calculation Methods of Power Transformers

Authors: Ahmet Y. Arabul, Ibrahim Senol, Fatma Keskin Arabul, Mustafa G. Aydeniz, Yasemin Oner, Gokhan Kalkan

Abstract:

In the standards IEC 60076-2 and IEC 60076-7, three different hot-spot temperature estimation methods are suggested. In this study, the algorithms used in hot-spot temperature calculations are analyzed by comparing them with the results of an experimental set-up based on a Transformer Monitoring System (TMS) in use. In the tested system, the TMS uses only the top-oil temperature and the load ratio for the hot-spot temperature calculation. It also uses some constants from the standards, which are given in the agreed statement tables. During the tests, it emerged that the hot-spot temperature calculation method performs only a simple calculation and does not use all the other significant variables that could affect the hot-spot temperature.
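
For orientation, a simplified steady-state relation of the kind used in IEC 60076-7 estimates the hot-spot temperature from exactly the two quantities the TMS measures, plus constants from the standard; the factor values below are placeholders, not those of the tested transformer.

```python
def hot_spot_temperature(top_oil_c, load_ratio, h=1.3, g_r=14.5, y=1.3):
    """Simplified steady-state hot-spot estimate: top-oil temperature plus the
    winding gradient term H * g_r * K**y (constants normally taken from the
    standard's tables or a heat-run test; the values here are assumptions)."""
    return top_oil_c + h * g_r * load_ratio ** y

print(f"{hot_spot_temperature(top_oil_c=65.0, load_ratio=0.9):.1f} deg C")   # ~81.4
```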

Keywords: Hot-spot temperature, monitoring system, power transformer, smart grid

Procedia PDF Downloads 573
14669 Retail Strategy to Reduce Waste Keeping High Profit Utilizing Taylor's Law in Point-of-Sales Data

Authors: Gen Sakoda, Hideki Takayasu, Misako Takayasu

Abstract:

Waste reduction is a fundamental problem for sustainability. Methods for waste reduction with point-of-sales (POS) data are proposed, utilizing the knowledge of a recent econophysics study on a statistical property of POS data. Concretely, a non-stationary time series analysis method based on the Particle Filter is developed, which considers the anomalous fluctuation scaling known as Taylor's law. This method is extended to handle sales data that are incomplete because of stock-outs, by introducing maximum likelihood estimation for censored data. A way of determining the optimal stock while pricing the cost of waste reduction is also proposed. This study focuses on the examination of the methods for large sales numbers, where Taylor's law is obvious. Numerical analysis using aggregated POS data shows the effectiveness of the methods in reducing food waste while maintaining a high profit for large sales numbers. Moreover, the pricing of the cost of waste reduction reveals that a small profit loss realizes substantial waste reduction, especially when the proportionality constant of Taylor's law is small. Specifically, around a 1% profit loss realizes half the disposal when the constant is 0.12, which is the actual value for the processed food items used in this research. The methods provide practical and effective solutions for waste reduction while keeping a high profit, especially with large sales numbers.
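
To show how Taylor's law feeds into stock setting, the toy sketch below assumes that for large sales numbers the demand fluctuation scales linearly with the mean (sigma = c * mean, with c = 0.12 as quoted for the processed-food items) and sets the stock at a Gaussian service-level quantile; this is an illustrative simplification, not the paper's particle-filter estimator.

```python
from scipy.stats import norm

def stock_level(mean_demand, c=0.12, service_level=0.95):
    """Stock = forecast mean + quantile * sigma, with sigma = c * mean (assumed Taylor's-law regime)."""
    return mean_demand + norm.ppf(service_level) * c * mean_demand

for mean in (100, 1000, 10000):
    s = stock_level(mean)
    print(f"mean demand {mean:>6}: stock {s:>8.0f}  (safety margin {(s - mean) / mean:.1%})")
```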

Keywords: food waste reduction, particle filter, point-of-sales, sustainable development goals, Taylor's law, time series analysis

Procedia PDF Downloads 131