Search results for: AI algorithm internal audit
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6283

4873 Clinical Audit on the Introduction of Apremilast into Ireland

Authors: F. O’Dowd, G. Murphy, M. Roche, E. Shudell, F. Keane, M. O’Kane

Abstract:

Introduction: Apremilast (Otezla®) is an oral phosphodiesterase-4 (PDE4) inhibitor indicated for the treatment of adult patients with moderate to severe plaque psoriasis who have a contraindication to, have failed, or are intolerant of standard systemic therapy and/or phototherapy, and of adult patients with active psoriatic arthritis. Apremilast influences intracellular regulation of inflammatory mediators. Two randomized, placebo-controlled trials evaluating apremilast in 1426 patients with moderate to severe plaque psoriasis (ESTEEM 1 and 2) demonstrated that the commonest adverse reactions (AEs) leading to discontinuation were nausea (1.6%), diarrhoea (1.0%), and headaches (0.8%). The overall proportion of subjects discontinuing due to adverse reactions was 6.1%. At week 16, these trials demonstrated that significantly more apremilast-treated patients (33.1%) achieved the primary end point PASI-75 than placebo (5.3%). We began prescribing apremilast in July 2015. Aim: To evaluate the efficacy and tolerability of apremilast in an Irish teaching hospital psoriasis population. Methods: A proforma documenting clinical evaluation parameters, prior treatment experience and AEs was completed prospectively on all patients commenced on apremilast from July 2015 to July 2017. Data were collected at weeks 0, 6, 12, 24, 36 and 52, with 20/71 patients having passed week 52. Efficacy was assessed using the Psoriasis Area and Severity Index (PASI) and Dermatology Life Quality Index (DLQI). AEs documented included GI effects, infections, and changes in weight and mood. Retrospective chart review and telephone review were utilised for missing data. Results: A total of 71 adult subjects (38 male, 33 female; age range 23-57), with moderate to severe psoriasis, were evaluated. Prior treatment: 37/71 (52%) were systemic/biologic/phototherapy naïve; 14/71 (20%) had prior phototherapy alone; 20/71 (28%) had previous systemic/biologic exposure; 12/71 (17%) had both psoriasis and psoriatic arthritis. PASI responses: mean baseline PASI was 10.1 and DLQI was 15. Week 6: N=71, n=15 (21%) achieved PASI 75. Week 12: N=48, n=6 (13%) achieved PASI 100; n=16 (34.5%) achieved PASI 75. Week 24: N=40, n=10 (25%) achieved PASI 100; n=15 (37.5%) achieved PASI 75. Week 52: N=20, n=4 (20%) achieved PASI 100; n=16 (80%) achieved PASI 75. (N = number of patients having passed the time point indicated; n = number of patients, out of N, achieving PASI or DLQI responses at that time.) DLQI responses: week 24: N=40, n=30 (75%) achieved a DLQI score of 0; n=5 (12.5%) achieved a DLQI score of 1; n=1 (2.5%) achieved a DLQI score of 10 (due to lack of efficacy). Adverse events: the proportion of patients that discontinued treatment due to AEs was n=7 (9.8%). One patient experienced nausea alleviated by dose reduction; another developed significant dysgeusia for certain foods; both continued therapy. Two patients lost 2-3 kg. Conclusion: Initial Irish patient experience of apremilast appears comparable to that observed in trials, with good efficacy and tolerability.

Keywords: Apremilast, introduction, Ireland, clinical audit

Procedia PDF Downloads 149
4872 Characteristic Sentence Stems in Academic English Texts: Definition, Identification, and Extraction

Authors: Jingjie Li, Wenjie Hu

Abstract:

Phraseological units in academic English texts have been a central focus in recent corpus linguistic research. A wide variety of phraseological units have been explored, including collocations, chunks, lexical bundles, patterns, semantic sequences, etc. This paper describes a special category of clause-level phraseological units, namely, Characteristic Sentence Stems (CSSs), with a view to describing their defining criteria and extraction method. CSSs are contiguous lexico-grammatical sequences which contain a subject-predicate structure and which are frame expressions characteristic of academic writing. The extraction of CSSs consists of six steps: Part-of-speech tagging, n-gram segmentation, structure identification, significance of occurrence calculation, text range calculation, and overlapping sequence reduction. Significance of occurrence calculation is the crux of this study. It includes the computing of both the internal association and the boundary independence of a CSS and tests the occurring significance of the CSS from both inside and outside perspectives. A new normalization algorithm is also introduced into the calculation of LocalMaxs for reducing overlapping sequences. It is argued that many sentence stems are so recurrent in academic texts that the most typical of them have become the habitual ways of making meaning in academic writing. Therefore, studies of CSSs could have potential implications and reference value for academic discourse analysis, English for Academic Purposes (EAP) teaching and writing.

Keywords: characteristic sentence stem, extraction method, phraseological unit, the statistical measure

Procedia PDF Downloads 166
4871 Optimization of Water Pipeline Routes Using a GIS-Based Multi-Criteria Decision Analysis and a Geometric Search Algorithm

Authors: Leon Mortari

Abstract:

The Metropolitan East region of Rio de Janeiro state, Brazil, faces a historic water scarcity. Among the alternatives studied to solve this situation, the possibility of adduction of the available water in the Lagoa de Juturnaíba reservoir to supply the region's municipalities stands out. The route of a linear engineering project must be chosen through an evaluation of different aspects, such as altitude, slope, proximity to roads, distance from watercourses, land use and occupation, and physical and chemical features of the soil. This work aims to apply a multi-criteria model that combines geoprocessing techniques, decision-making, and a geometric search algorithm to optimize a hypothetical adductor system in the scenario of expanding the water supply system that serves this region, known as Imunana-Laranjal, using the Lagoa de Juturnaíba as the source. This study proposes the construction of a spatial database related to the presented evaluation criteria, the treatment and rasterization of these data, and the standardization and reclassification of this information in a Geographic Information System (GIS) platform. The methodology involves the integrated analysis of these criteria, using their relative importance defined by weighting them based on expert consultations and the Analytic Hierarchy Process (AHP) method. Three approaches are defined for weighting the criteria by AHP: the first treats all criteria as equally important, the second considers weighting based on a pairwise comparison matrix, and the third establishes a hierarchy based on the priority of the criteria. For each approach, a distinct group of weightings is defined. In the next step, map algebra tools are used to overlay the layers and generate cost surfaces, which indicate the resistance to the passage of the adductor route, using the three groups of weightings. The Dijkstra algorithm, a geometric search algorithm, is then applied to these cost surfaces to find an optimized path within the geographical space, aiming to minimize resources, time, investment, maintenance, and environmental and social impacts.
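
As a minimal illustration of the final step, the sketch below runs Dijkstra's algorithm over a small synthetic cost raster to find a least-cost route between a source and a destination cell. The cost values, grid size, step-cost rule, and 8-neighbour connectivity are illustrative assumptions, not the study's actual GIS layers or AHP weightings.

```python
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    """Dijkstra's algorithm over a 2D cost raster (8-connected cells).

    cost  : 2D array of per-cell traversal resistance (higher = worse)
    start : (row, col) of the source cell (e.g., the reservoir intake)
    goal  : (row, col) of the destination cell (e.g., the supply point)
    """
    rows, cols = cost.shape
    dist = np.full((rows, cols), np.inf)
    prev = {}
    dist[start] = 0.0
    pq = [(0.0, start)]
    neighbours = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                  (0, 1), (1, -1), (1, 0), (1, 1)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue  # stale queue entry
        for dr, dc in neighbours:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                # step cost: mean resistance of the two cells, scaled for diagonals
                step = 0.5 * (cost[r, c] + cost[nr, nc]) * (2 ** 0.5 if dr and dc else 1.0)
                nd = d + step
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # walk back from the goal to reconstruct the route
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# Synthetic cost surface standing in for the AHP-weighted map algebra output
rng = np.random.default_rng(0)
surface = rng.uniform(1.0, 10.0, size=(50, 50))
route, total_cost = least_cost_path(surface, (0, 0), (49, 49))
print(len(route), total_cost)
```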

Keywords: geometric search algorithm, GIS, pipeline, route optimization, spatial multi-criteria analysis model

Procedia PDF Downloads 31
4870 Parametric Analysis and Optimal Design of Functionally Graded Plates Using Particle Swarm Optimization Algorithm and a Hybrid Meshless Method

Authors: Foad Nazari, Seyed Mahmood Hosseini, Mohammad Hossein Abolbashari, Mohammad Hassan Abolbashari

Abstract:

The present study is concerned with the optimal design of functionally graded (FG) plates using the particle swarm optimization (PSO) algorithm. The meshless local Petrov-Galerkin (MLPG) method is employed to obtain the FG plate's natural frequencies. The effects of two parameters, the thickness-to-height ratio and the volume fraction index, on the natural frequencies and total mass of the plate are studied using the MLPG results. Then the first natural frequency of the plate, for conditions where MLPG data are not available, is predicted by an artificial neural network (ANN) approach trained with the back-error propagation (BEP) technique. The ANN results show that the predicted data are in good agreement with the actual data. To maximize the first natural frequency and minimize the mass of the FG plate simultaneously, the weighted-sum optimization approach and the PSO algorithm are used. Overall, the proposed optimization process can provide the designers of FG plates with useful data.
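
The sketch below shows how a weighted-sum objective of this kind can be searched with a basic particle swarm optimizer. The two surrogate functions for the first natural frequency and the plate mass, the design-variable bounds, and the weights are placeholders for illustration only; the study's MLPG/ANN models are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder surrogates over x = [thickness-to-height ratio, volume fraction index];
# the real study would evaluate these with the trained ANN / MLPG results instead.
def first_natural_frequency(x):
    return 100.0 + 40.0 * x[0] - 5.0 * x[1] ** 2

def plate_mass(x):
    return 10.0 + 8.0 * x[0] + 2.0 * x[1]

def weighted_objective(x, w=0.5):
    # maximize frequency and minimize mass -> minimize the weighted sum
    return -w * first_natural_frequency(x) + (1.0 - w) * plate_mass(x)

def pso(obj, bounds, n_particles=30, n_iter=200, inertia=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([obj(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()
    for _ in range(n_iter):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        if vals.min() < gbest_val:
            gbest, gbest_val = pos[vals.argmin()].copy(), vals.min()
    return gbest, gbest_val

best_x, best_f = pso(weighted_objective, bounds=[(0.05, 0.2), (0.0, 10.0)])
print(best_x, best_f)
```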

Keywords: optimal design, natural frequency, FG plate, hybrid meshless method, MLPG method, ANN approach, particle swarm optimization

Procedia PDF Downloads 367
4869 Design of IMC-PID Controller Cascaded Filter for Simplified Decoupling Control System

Authors: Le Linh, Truong Nguyen Luan Vu, Le Hieu Giang

Abstract:

In this work, an IMC-PID controller cascaded with a filter, based on the Internal Model Control (IMC) scheme, is systematically proposed for the simplified decoupling control system. The simplified decoupling is firstly introduced for multivariable processes by using coefficient matching to obtain a stable, proper, and causal simplified decoupler. Accordingly, transfer functions of the decoupled apparent processes can be expressed as a set of n equivalent independent processes and then derived as a ratio of the original open-loop transfer function to the diagonal element of the dynamic relative gain array. The IMC-PID controller in series with a filter is then directly employed to enhance the overall performance of the decoupling control system while avoiding difficulties arising from properties inherent to simplified decoupling. Some simulation studies are considered to demonstrate the simplicity and effectiveness of the proposed method. Simulations were conducted by tuning various controllers of multivariable processes with multiple time delays. The results indicate that the proposed method consistently performs well with fast and well-balanced closed-loop time responses.

Keywords: coefficient matching method, internal model control (IMC) scheme, PID controller cascaded filter, simplified decoupler

Procedia PDF Downloads 442
4868 Prediction of Bariatric Surgery Publications by Using Different Machine Learning Algorithms

Authors: Senol Dogan, Gunay Karli

Abstract:

Identification of relevant publications based on a Medline query is time-consuming and error-prone. An AI-based process has the potential to solve this problem without any manual work. To the best of our knowledge, our study is the first to investigate the ability of machine learning to identify relevant articles accurately. Five different machine learning algorithms were tested using 23 predictors based on several metadata fields attached to publications. We find that the boosted model is the best-performing algorithm, with an overall accuracy of 96%. In addition, the specificity and sensitivity of the algorithm are 97% and 93%, respectively. As a result of this work, we conclude that the same procedure could be applied to cancer gene expression big data.
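
The abstract does not list the five algorithms in full; the sketch below shows one plausible way such a comparison could be set up with scikit-learn, reporting accuracy, sensitivity, and specificity for a boosted model, a tree, logistic regression, and an ANN on a synthetic binary "relevant publication" dataset. The model choices and data are illustrative assumptions, not the study's pipeline.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 23 metadata-based predictors
X, y = make_classification(n_samples=2000, n_features=23, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "Boosted": GradientBoostingClassifier(random_state=0),
    "Tree": DecisionTreeClassifier(random_state=0),
    "Logistic regression": LogisticRegression(max_iter=1000),
    "ANN": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    print(f"{name}: acc={accuracy:.2f} sens={sensitivity:.2f} spec={specificity:.2f}")
```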

Keywords: prediction of publications, machine learning, algorithms, bariatric surgery, comparison of algorithms, boosted, tree, logistic regression, ANN model

Procedia PDF Downloads 209
4867 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer, and (2) double-hidden-layer feedforward back-propagation networks. Results revealed that, in general, the GDM optimisation algorithm, with its adaptive learning capability, required relatively less time in both the training and validation phases than the Levenberg-Marquardt (LM) and Bayesian regularisation (Br) algorithms, although learning may not be fully consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day and 5-day-ahead forecasts. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, respectively, for the training and validation phases. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANN for real-time forecasting should employ training algorithms that do not have the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and quality of the forecast as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
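
For reference, the skill scores quoted above can be computed as shown in this minimal sketch of the coefficient of efficiency (Nash-Sutcliffe CE) and the relative error statistics (MAE, MAPE, MSRE). The observed and simulated flow arrays are placeholders, not the study's data.

```python
import numpy as np

def forecast_metrics(observed, simulated):
    """Coefficient of efficiency and relative error statistics for streamflow."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    error = observed - simulated
    ce = 1.0 - np.sum(error ** 2) / np.sum((observed - observed.mean()) ** 2)
    mae = np.mean(np.abs(error))
    mape = np.mean(np.abs(error / observed)) * 100.0   # mean absolute percentage error
    msre = np.mean((error / observed) ** 2)            # mean squared relative error
    return {"CE": ce, "MAE": mae, "MAPE": mape, "MSRE": msre}

# Placeholder daily flows (m^3/s): observed vs. one-day-ahead ANN forecast
obs = np.array([12.0, 15.5, 20.1, 35.0, 28.4, 18.2, 14.9])
sim = np.array([11.4, 16.2, 19.0, 31.5, 29.8, 17.6, 15.3])
print(forecast_metrics(obs, sim))
```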

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 152
4866 Visualization Tool for EEG Signal Segmentation

Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh

Abstract:

This work is about developing a tool for the visualization and segmentation of electroencephalograph (EEG) signals based on frequency domain features. Changes in the frequency domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm provides a way to represent the change in mental states using the different frequency band powers in the form of a segmented EEG signal. Many segmentation algorithms have been suggested in the literature, with applications in brain-computer interfaces, epilepsy and cognition studies, that have been used for data classification. The proposed method, however, focuses mainly on better presentation of the signal, which makes it a useful visualization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed by principal component analysis and a wavelet transform based de-noising method. Frequency domain features are used for segmentation, considering the fact that the spectrum power of different frequency bands describes the mental state of the subject. Two sliding windows are further used for segmentation; one provides the time scale and the other applies the segmentation rule. The segmented data are displayed second by second successively with different color codes. The segment length can be selected as per the needs of the objective. The proposed algorithm has been tested on the EEG data set obtained from the University of California, San Diego's online data repository. The proposed tool gives a better visualization of the signal in the form of segmented epochs of desired length representing the power spectrum variation in the data. The algorithm is designed in such a way that it takes the data points with respect to the sampling frequency for each time frame, and so it can be improved for use in real-time visualization with a desired epoch length.
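
A minimal sketch of the core signal-processing steps described above: band-pass and notch filtering followed by windowed band-power estimation with Welch's method, labelling each epoch by its dominant band. The sampling rate, window length, notch frequency, and band definitions are assumptions; the PCA/wavelet de-noising and the color-coded display are omitted for brevity.

```python
import numpy as np
from scipy.signal import butter, filtfilt, iirnotch, welch

FS = 256                      # assumed sampling frequency (Hz)
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

def basic_filter(x, fs=FS):
    """Band-pass 0.1-45 Hz plus a 50 Hz notch (power-line interference)."""
    b, a = butter(4, [0.1, 45.0], btype="band", fs=fs)
    x = filtfilt(b, a, x)
    bn, an = iirnotch(50.0, Q=30.0, fs=fs)
    return filtfilt(bn, an, x)

def band_powers(segment, fs=FS):
    """Relative power of each frequency band within one window."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), fs * 2))
    total = psd.sum()
    return {name: psd[(freqs >= lo) & (freqs < hi)].sum() / total
            for name, (lo, hi) in BANDS.items()}

def segment_signal(x, fs=FS, epoch_s=1.0):
    """Slide a fixed window and label each epoch by its dominant band."""
    x = basic_filter(x, fs)
    step = int(epoch_s * fs)
    labels = []
    for start in range(0, len(x) - step + 1, step):
        powers = band_powers(x[start:start + step], fs)
        labels.append(max(powers, key=powers.get))
    return labels

# Synthetic 10 s single-channel EEG-like signal: an alpha burst over noise
t = np.arange(0, 10, 1 / FS)
signal = 0.5 * np.random.randn(t.size) + np.sin(2 * np.pi * 10 * t) * (t > 5)
print(segment_signal(signal))
```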

Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation

Procedia PDF Downloads 397
4865 Equalization Algorithm for the Optical OFDM System Based on the Fractional Fourier Transform

Authors: A. Cherifi, B. Bouazza, A. O. Dahmane, B. Yagoubi

Abstract:

Transmission over optical channels will introduce inter-symbol interference (ISI) as well as inter-channel (or inter-carrier) interference (ICI). To decrease the effects of ICI, this paper proposes an equalizer for the optical OFDM system based on the fractional Fourier transform (FrFT). In this FrFT-OFDM system, the traditional Fourier transform is replaced by the fractional Fourier transform to modulate and demodulate the data symbols. The proposed equalizer samples the received signal at different time instants within each symbol period. Theoretical analysis and numerical simulation are discussed.

Keywords: OFDM, fractional Fourier transform (FrFT), optical OFDM, equalization algorithm

Procedia PDF Downloads 430
4864 Genetic Algorithms for Parameter Identification of DC Motor ARMAX Model and Optimal Control

Authors: A. Mansouri, F. Krim

Abstract:

This paper presents two techniques for DC motor parameter identification. We propose a numerical method using the adaptive extensive recursive least squares (AERLS) algorithm for real-time parameter estimation. This algorithm, based on minimization of a quadratic criterion, is realized in simulation for parameter identification of a DC motor autoregressive moving average with extra inputs (ARMAX) model. As an advanced technique, we use genetic algorithm (GA) identification with biased estimation for high dynamic performance speed regulation. DC motors are extensively used in variable speed drives, and for robot and solar panel trajectory control. GA effectiveness is demonstrated through comparison of the two approaches.

Keywords: ARMAX model, DC motor, AERLS, GA, optimization, parameter identification, PID speed regulation

Procedia PDF Downloads 379
4863 Insider Fraud and its Risks to FinTechs

Authors: Claire Maillet

Abstract:

Insider fraud, including its various forms such as employee fraud or internal fraud, is a major financial crime threat whereby an employee defrauds (or attempts to defraud) their current, prospective or past employer. ‘Employee’ covers anyone employed by the company, including contractors, agency workers, directors and part-time staff. Insider fraud is even more of a concern given the impacts of the Coronavirus pandemic and the cost-of-living crisis, which have generated multiple opportunities to commit insider fraud. Insider fraud is not necessarily thought of as a significant financial crime, yet without the face-to-face, ‘over the shoulder’ capability of employers to keep an eye on their employees, there is a heightened reliance on trust and transparency, and with this, naturally, comes an increased risk of insider fraud. Given that the number of FinTechs is on the rise and there is a significant lack of empirically based solutions for reducing insider fraud, these are gaps in the research space that this thesis aims to fill. Finally, Kassem (2022) notes that “academic research plays a crucial role in raising awareness about fraud and researching effective methods for countering it”. Thus, this thesis may be used as an opportune tool to provide an extensive list of controls spanning detection, deterrence and prevention, that are recommended to be implemented to help combat the insider threat.

Keywords: insider fraud, internal fraud, pandemic, Covid-19

Procedia PDF Downloads 22
4862 Advanced Machine Learning Algorithm for Credit Card Fraud Detection

Authors: Manpreet Kaur

Abstract:

When legitimate credit card users are mistakenly labelled as fraudulent in numerous finance-related applications, serious ethical problems arise. The innovative machine learning approach suggested in this research outperforms the current models and shows how to model a data set for credit card fraud detection while minimizing false positives. As a result, we advise using random forests as the best machine learning method for predicting and identifying credit card transaction fraud. The majority of victims of these fraudulent transactions were found to be credit card users over the age of 60, with a higher percentage of fraudulent transactions taking place during specific hours.
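
A minimal random forest sketch in the spirit of the recommendation above, trained on a synthetic, highly imbalanced transaction dataset. The feature set, class weighting, and split are illustrative assumptions rather than the study's actual model or data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic imbalanced data: ~1% of transactions are fraudulent
X, y = make_classification(n_samples=20000, n_features=15, n_informative=8,
                           weights=[0.99, 0.01], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, test_size=0.3,
                                          random_state=42)

# class_weight="balanced" compensates for the rare fraud class, which helps
# catch fraud without flooding legitimate users with false positives
clf = RandomForestClassifier(n_estimators=300, class_weight="balanced",
                             random_state=42, n_jobs=-1)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te), digits=3))
```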

Keywords: automated fraud detection, isolation forest method, local outlier factor, ML algorithm, credit card

Procedia PDF Downloads 113
4861 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging

Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa

Abstract:

Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists, recognized in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 cases out of 1200 undergoing MRm, in which the MRm assessment gave an uncertain classification (non-specific nodules), were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy), finding 35 benign and 12 malignant. All MR images were acquired at 1.5T, with a first basal T1w sequence and then four T1w acquisitions after the paramagnetic contrast injection. After a manual segmentation of the lesions, done by a radiologist, and the extraction of 150 radiomic features (30 features at each of 5 subsequent time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clear malignant histological exam) were added to the dataset in order to allow the ML system to better learn from the data. Results: The NaiveBayes algorithm, working on 79 features selected by the TWIST system, proved to be the best-performing ML system, with a sensitivity of 96%, a specificity of 78% and a global accuracy of 87% (average values of two training-testing procedures ab-ba). The results showed that in the subset of 47 non-specific nodules, the algorithm predicted the outcome of 45 nodules that an expert radiologist could not classify. Conclusion: In this pilot study, we identified a radiomic approach allowing ML systems to perform well in the diagnosis of a non-specific nodule at MR mammography. This algorithm could be a great support for the early diagnosis of malignant breast tumors in the event the radiologist is not able to identify the kind of lesion, and could reduce the need for long follow-up. Clinical Relevance: This machine learning algorithm could be essential to support the radiologist in the early diagnosis of non-specific nodules, in order to avoid strenuous follow-up and painful biopsy for the patient.
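
The sketch below illustrates the general modelling pattern (feature selection followed by a Naive Bayes classifier, evaluated with cross-validated sensitivity and specificity) on synthetic data. SelectKBest is used here only as a simple stand-in for the TWIST evolutionary selection, and none of the radiomic features or patient data are reproduced.

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: 75 lesions x 150 radiomic features, label 1 = malignant
X, y = make_classification(n_samples=75, n_features=150, n_informative=20,
                           weights=[0.7, 0.3], random_state=7)

# Univariate selection of 79 features as a crude proxy for TWIST,
# followed by Naive Bayes, evaluated with cross-validated predictions
model = make_pipeline(SelectKBest(f_classif, k=79), GaussianNB())
y_pred = cross_val_predict(model, X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, y_pred).ravel()
print("sensitivity", tp / (tp + fn),
      "specificity", tn / (tn + fp),
      "accuracy", (tp + tn) / len(y))
```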

Keywords: breast, machine learning, MRI, radiomics

Procedia PDF Downloads 267
4860 Poly(S/DVB)HIPE Filled with Cellulose from Water Hyacinth

Authors: Metinee Kawsomboon, Thanchanok Tulaphol, Manit Nithitanakul, Jitima Preechawong

Abstract:

PolyHIPE is a porous polymeric material obtained from the polymerization of a high internal phase emulsion (HIPE), which contains 74% internal (dispersed) phase and 26% external (continuous) phase. Typically, polyHIPE is prepared from styrene (S) and divinylbenzene (DVB), and these materials are used in various kinds of applications such as catalyst supports, gas adsorption, separation membranes, and tissue engineering scaffolds due to their high specific surface areas, high porosity, and ability to adsorb large quantities of liquid. In this research, cellulose from water hyacinth (Eichornia Crassipes), an aquatic plant that grows and spreads rapidly in rivers and waterways in Thailand, was added into polyHIPE to increase the mechanical properties of the polyHIPE. Addition of unmodified and modified cellulose to poly(S/DVB)HIPE resulted in a decrease in the surface area and thermal stability of the resulting materials. Mechanically, the resulting polyHIPEs filled with both unmodified and modified cellulose exhibited higher compressive strength and Young's modulus, by 146.3% and 162.5% respectively, compared to unfilled polyHIPEs. The water adsorption capacity of the filled polyHIPE was also improved.

Keywords: porous polymer, PolyHIPE, cellulose, surface modification, water hyacinth

Procedia PDF Downloads 142
4859 Scientific Linux Cluster for BIG-DATA Analysis (SLBD): A Case of Fayoum University

Authors: Hassan S. Hussein, Rania A. Abul Seoud, Amr M. Refaat

Abstract:

Scientific researchers face challenges in the analysis of very large data sets, which are growing at a noticeable rate in today's and tomorrow's technologies. Hadoop and Spark are software frameworks developed to address this problem, and the Hadoop framework is suitable for many different hardware platforms. In this research, a scientific Linux cluster for Big Data analysis (SLBD) is presented. SLBD runs open source software with large computational capacity on a high performance cluster infrastructure. SLBD is composed of one cluster containing identical, commodity-grade computers interconnected via a small LAN. SLBD consists of a fast switch and Gigabit-Ethernet cards which connect four nodes. Cloudera Manager is used to configure and manage an Apache Hadoop stack. Hadoop is a framework that allows storing and processing big data across the cluster by using the MapReduce algorithm. The MapReduce algorithm divides a task into smaller tasks which are assigned to the network nodes; the algorithm then collects the results and forms the final result dataset. The SLBD clustering system allows fast and efficient processing of large amounts of data resulting from different applications. SLBD also provides high performance, high throughput, high availability, expandability and cluster scalability.
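
To illustrate the MapReduce pattern described above (split work into small tasks, then collect and combine the partial results), the following is a minimal word-count map and reduce pair with the shuffle simulated in memory. The actual analyses run on SLBD would use application-specific map and reduce functions executed by Hadoop across the nodes.

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Mapper: emit (word, 1) pairs for each word in one input record."""
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(key, values):
    """Reducer: combine all partial counts emitted for one key."""
    return key, sum(values)

def map_reduce(records):
    # In Hadoop the mapper output is shuffled to the reducer nodes by key;
    # here the shuffle is simulated with an in-memory dictionary.
    shuffled = defaultdict(list)
    for key, value in chain.from_iterable(map_phase(r) for r in records):
        shuffled[key].append(value)
    return dict(reduce_phase(k, v) for k, v in shuffled.items())

documents = ["big data on the cluster",
             "the cluster stores big data",
             "hadoop splits the big job into small tasks"]
print(map_reduce(documents))
```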

Keywords: big data platforms, cloudera manager, Hadoop, MapReduce

Procedia PDF Downloads 358
4858 A Method for Solving a Bi-Objective Transportation Problem under Fuzzy Environment

Authors: Sukhveer Singh, Sandeep Singh

Abstract:

A bi-objective fuzzy transportation problem, with the objectives of minimizing the total fuzzy cost and fuzzy time of transportation without assigning priorities to them, is considered. To the best of our knowledge, there is no method in the literature to find efficient solutions of the bi-objective transportation problem under uncertainty. In this paper, a bi-objective transportation problem in an uncertain environment has been formulated. An algorithm has been proposed to find efficient solutions of the bi-objective transportation problem under uncertainty. The proposed algorithm avoids degeneracy and gives the optimal solution faster than other existing algorithms for the given uncertain transportation problem.
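
As a crisp (non-fuzzy) illustration of the weighted-sum idea behind a bi-objective transportation problem, the sketch below solves a small balanced instance with scipy's linprog, combining a cost matrix and a time matrix into a single objective. The fuzzy formulation, ranking function, and the paper's specific algorithm are not reproduced; all numbers are placeholders.

```python
import numpy as np
from scipy.optimize import linprog

# Small balanced instance: 3 sources, 4 destinations
supply = np.array([20, 30, 25])
demand = np.array([15, 25, 20, 15])
cost = np.array([[8, 6, 10, 9],
                 [9, 12, 13, 7],
                 [14, 9, 16, 5]], dtype=float)
time = np.array([[3, 5, 4, 6],
                 [4, 2, 5, 3],
                 [6, 4, 3, 2]], dtype=float)

w = 0.6                                  # relative priority of cost vs. time
objective = (w * cost + (1 - w) * time).ravel()

m, n = cost.shape
A_eq, b_eq = [], []
for i in range(m):                       # each source ships all of its supply
    row = np.zeros(m * n)
    row[i * n:(i + 1) * n] = 1.0
    A_eq.append(row); b_eq.append(supply[i])
for j in range(n):                       # each destination receives its demand
    row = np.zeros(m * n)
    row[j::n] = 1.0
    A_eq.append(row); b_eq.append(demand[j])

res = linprog(objective, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
              bounds=[(0, None)] * (m * n), method="highs")
plan = res.x.reshape(m, n)
print(plan)
print("total cost:", (cost * plan).sum(), "total time:", (time * plan).sum())
```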

Keywords: uncertain transportation problem, efficient solution, ranking function, fuzzy transportation problem

Procedia PDF Downloads 525
4857 Alternating Expectation-Maximization Algorithm for a Bilinear Model in Isoform Quantification from RNA-Seq Data

Authors: Wenjiang Deng, Tian Mou, Yudi Pawitan, Trung Nghia Vu

Abstract:

Estimation of isoform-level gene expression from RNA-seq data depends on simplifying assumptions, such as uniform read distribution, that are easily violated in real data. Such violations typically lead to biased estimates. Most existing methods provide a bias correction step(s), which is based on biological considerations, such as GC content, and applied to single samples separately. The main problem is that not all biases are known. For example, new technologies such as single-cell RNA-seq (scRNA-seq) may introduce new sources of bias not seen in bulk-cell data. This study introduces a method called XAEM based on a more flexible and robust statistical model. Existing methods are essentially based on a linear model Xβ, where the design matrix X is known and derived based on the simplifying assumptions. In contrast, XAEM considers Xβ as a bilinear model with both X and β unknown. Joint estimation of X and β is made possible by simultaneous analysis of multi-sample RNA-seq data. Compared to existing methods, XAEM automatically performs empirical correction of potentially unknown biases. XAEM implements an alternating expectation-maximization (AEM) algorithm, alternating between estimation of X and β. For speed, XAEM utilizes quasi-mapping for read alignment, leading to a fast algorithm. Overall, XAEM performs favorably compared to other recent advanced methods. For simulated datasets, XAEM obtains higher accuracy for multiple-isoform genes, particularly for paralogs. In a differential-expression analysis of a real scRNA-seq dataset, XAEM achieves substantially greater rediscovery rates in an independent validation set.

Keywords: alternating EM algorithm, bias correction, bilinear model, gene expression, RNA-seq

Procedia PDF Downloads 142
4856 Anomaly: A Case of Babri Masjid Dispute

Authors: Karitikeya Sonker

Abstract:

Religion as a discrete system, through its lawful internal working, produces an output in the form of a realised spatial order with its social logic and a social order with its spatial logic. Thus, it appears to exhibit a duality of the spatial and the trans-spatial. The components of this system share a relevance, forming a collective. This shared relevance creates meaning, forming a group where all collectives share one identity. This group, with its new social order and its spatial logic, revives the already existing spatial order. Religious groups do so with a tendency to expand, resulting in the production of space in a situation of encounter where they have found relevance. But an encounter without the lawful internal working of a discrete system results in anomaly, because groups do not find relevance due to the absence of a collective identity. Events happen all around. One of the main reasons we could say that something became an event is conflict: conflict not in its definitive sense, but any occurrence that happens because of an intervention that creates an event worth remembering. The unfolding of such events creates cities and urban spaces, which exhibit the same duality of the spatial and the trans-spatial. This paper makes an effort to explore one such event in the case of the Babri Mosque and Ramjanmabhumi, Ayodhya, to explain the anomaly as a transposition of the social and the spatial. Through the case study, it attempts to generate an equation explaining the two different situations of religious encounter, the former reviving the social and spatial order and the other resulting in anomaly.

Keywords: Babri Masjid, Ayodhya, conflict, religion

Procedia PDF Downloads 275
4855 MapReduce Algorithm for Geometric and Topological Information Extraction from 3D CAD Models

Authors: Ahmed Fradi

Abstract:

In a digital world of perpetual evolution and acceleration, where data are ever more voluminous, rich and varied, the new software solutions that emerged with the Big Data phenomenon offer companies new opportunities, enabling them not only to optimize their business and evolve their production model, but also to reorganize themselves to increase competitiveness and identify new strategic axes. Design and manufacturing industrial companies, like others, face these challenges; data represent a major asset, provided that they know how to capture, refine, combine and analyze them. The objective of our paper is to propose a solution allowing geometric and topological information extraction from 3D CAD model databases (specifically STEP files), with a specific algorithm based on the MapReduce programming paradigm. Our proposal is the first step of our future approach to 3D CAD object retrieval.

Keywords: Big Data, MapReduce, 3D object retrieval, CAD, STEP format

Procedia PDF Downloads 540
4854 Development of Evolutionary Algorithm by Combining Optimization and Imitation Approach for Machine Learning in Gaming

Authors: Rohit Mittal, Bright Keswani, Amit Mithal

Abstract:

This paper gives an overview of the computational intelligence techniques used to develop computer games, especially car racing. To develop a deeper sense and knowledge of artificial intelligence, this paper is divided into various sections, namely optimization, imitation, innovation, and a combined approach of optimization and imitation. This paper is mainly concerned with the combined approach, which covers different aspects of using fitness measures and supervised learning techniques to imitate aspects of behavior. The main achievement of this paper is based on modelling player behaviour and evolving new game content, such as racing tracks, for single-car racing on a single track.

Keywords: evolution algorithm, genetic, optimization, imitation, racing, innovation, gaming

Procedia PDF Downloads 646
4853 Nelder-Mead Parametric Optimization of Elastic Metamaterials with Artificial Neural Network Surrogate Model

Authors: Jiaqi Dong, Qing-Hua Qin, Yi Xiao

Abstract:

Some of the most fundamental challenges of elastic metamaterial (EMM) optimization can be attributed to the high consumption of computational power resulting from finite element analysis (FEA) simulations, which renders the optimization process inefficient. Furthermore, due to the inherent mesh dependence of FEA, minuscule geometry features, which often emerge during the later stages of optimization, induce very fine elements, resulting in enormously high time consumption, particularly when repetitive solutions are needed for computing the objective function. In this study, a surrogate modelling algorithm is developed to reduce computational time in the structural optimization of EMMs. The surrogate model is constructed based on a multilayer feedforward artificial neural network (ANN) architecture, trained with eigenfrequency data prepopulated from FEA simulations and optimized through regime selection with a genetic algorithm (GA) to improve its accuracy in predicting the location and width of the primary elastic band gap. With the optimized ANN surrogate at the core, a Nelder-Mead (NM) algorithm is established and its performance inspected in comparison to the FEA solution. The ANN-NM model shows remarkable accuracy in predicting the band gap width and a reduction of time consumption by 47%.
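
A minimal sketch of the surrogate-plus-Nelder-Mead pattern: an MLPRegressor is trained on precomputed (here synthetic) objective samples, and scipy's Nelder-Mead then searches the cheap surrogate instead of calling FEA. The geometry parameters, the objective function, and the training data are placeholders, not the study's metamaterial model.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(3)

# Placeholder for the expensive FEA evaluation as a function of two geometry
# parameters (the real study uses eigenfrequency data from FEA instead)
def expensive_objective(x):
    return np.sin(3 * x[0]) * np.cos(2 * x[1]) + 0.1 * (x[0] ** 2 + x[1] ** 2)

# Prepopulate training samples once, then fit the ANN surrogate
X_train = rng.uniform(-2, 2, size=(400, 2))
y_train = np.array([expensive_objective(x) for x in X_train])
surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000,
                         random_state=0).fit(X_train, y_train)

# Nelder-Mead searches the surrogate rather than the FEA model
def surrogate_objective(x):
    return float(surrogate.predict(x.reshape(1, -1))[0])

result = minimize(surrogate_objective, x0=np.array([0.5, 0.5]),
                  method="Nelder-Mead")
print("surrogate optimum:", result.x, "predicted value:", result.fun)
print("true value at optimum:", expensive_objective(result.x))
```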

Keywords: artificial neural network, machine learning, mechanical metamaterials, Nelder-Mead optimization

Procedia PDF Downloads 128
4852 A Laundry Algorithm for Colored Textiles

Authors: H. E. Budak, B. Arslan-Ilkiz, N. Cakmakci, I. Gocek, U. K. Sahin, H. Acikgoz-Tufan, M. H. Arslan

Abstract:

The aim of this study is to design a novel laundry algorithm for colored textiles that have a significant decoloring problem. During the experimental work, bleached knitted single jersey fabric made of 100% cotton and dyed with reactive dyestuff was utilized, since, according to a conducted survey, textiles made of cotton are the most demanded textile products in the textile market, and reactive dyestuffs are the ones most commonly used in the textile industry for dyeing cotton-made products. Therefore, the fabric used in this study was selected and purchased in accordance with the survey results. The fabric samples cut out of this fabric were dyed with different dyeing parameters by using Remazol Brilliant Red 3BS dyestuff in a Gyrowash machine at laboratory conditions. From the alternative reactive-dyed cotton fabric samples, the ones with a high tendency to color loss were determined and examined. Accordingly, the parameters of the dyeing process used for these fabric samples were evaluated, and the dyeing process chosen to cause a high tendency to color loss in the cotton fabrics was determined, in order to reveal clearly the level of improvement in color loss achieved during this study. Afterwards, all of the untreated fabric samples cut out of the purchased fabric were dyed with the selected dyeing process. When the dyeing process was completed, an experimental design was created for the laundering process by using the Minitab® program, considering temperature, time and mechanical action as parameters. All of the washing experiments were performed in a domestic washing machine. 16 washing experiments were performed, with 8 different experimental conditions and 2 repeats for each condition. After each of the washing experiments, water samples of the main wash of the laundering process were measured with a UV spectrophotometer. The values obtained were compared with the calibration curve of the materials used for the dyeing process. The results of the washing experiments were statistically analyzed with the Minitab® program. According to the results, the most suitable washing algorithm in terms of the parameters temperature, time and mechanical action for domestic washing machines for minimizing fabric color loss was chosen. The laundry algorithm proposed in this study has the ability to minimize the color loss of colored textiles in washing machines by eliminating the negative effects of the laundering parameters on the color of textiles, without compromising the proper performance of the basic cleaning action. Therefore, since fabric color loss is minimized with this washing algorithm, dyestuff residuals will be lower in the grey water released from the laundering process. In addition, with this laundry algorithm it is possible to wash and clean other types of textile products with a proper cleaning effect and minimized color loss.

Keywords: color loss, laundry algorithm, textiles, domestic washing process

Procedia PDF Downloads 357
4851 Monocular Visual Odometry for Three Different View Angles by Intel Realsense T265 with the Measurement of Remote

Authors: Heru Syah Putra, Aji Tri Pamungkas Nurcahyo, Chuang-Jan Chang

Abstract:

The MOIL-SDK method refers to the spatial angle that forms a view with a different perspective from the fisheye image. Visual odometry is a trusted technique for extended projects, tracking motion using image sequences. We present a real-time, precise, and persistent approach that contributes to this work by taking datasets and generating ground truth as a reference for the estimates of each image, using the FAST algorithm to find keypoints that are evaluated during the tracking process with the 5-point algorithm and RANSAC, and producing accurate estimates of the camera trajectory for rotational and translational movement on the X, Y, and Z axes.
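
The following is a minimal two-frame monocular odometry step using OpenCV: FAST keypoints tracked with optical flow, then the essential matrix estimated with RANSAC (the 5-point algorithm) and decomposed into rotation and translation. The camera intrinsics, the image file names, and the use of LK tracking are assumptions, and the MOIL-SDK fisheye handling from the paper is omitted.

```python
import cv2
import numpy as np

# Assumed pinhole intrinsics after fisheye rectification (placeholder values)
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])

def relative_pose(img1, img2, K):
    """Estimate camera rotation R and unit translation t between two frames."""
    fast = cv2.FastFeatureDetector_create(threshold=25)
    kp1 = fast.detect(img1, None)
    pts1 = np.float32([kp.pt for kp in kp1]).reshape(-1, 1, 2)

    # Track the FAST keypoints into the second frame with pyramidal LK flow
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, pts1, None)
    good1 = pts1[status.ravel() == 1]
    good2 = pts2[status.ravel() == 1]

    # 5-point algorithm inside a RANSAC loop rejects outlier tracks
    E, inliers = cv2.findEssentialMat(good1, good2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good1, good2, K, mask=inliers)
    return R, t   # t is known only up to scale for a monocular camera

if __name__ == "__main__":
    # Hypothetical consecutive grayscale frames from the dataset
    frame1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    frame2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
    R, t = relative_pose(frame1, frame2, K)
    print("rotation:\n", R, "\ntranslation direction:\n", t.ravel())
```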

Keywords: MOIL-SDK, intel realsense T265, Fisheye image, monocular visual odometry

Procedia PDF Downloads 134
4850 A Multifactorial Algorithm to Automate Screening of Drug-Induced Liver Injury Cases in Clinical and Post-Marketing Settings

Authors: Osman Turkoglu, Alvin Estilo, Ritu Gupta, Liliam Pineda-Salgado, Rajesh Pandey

Abstract:

Background: Hepatotoxicity can be linked to a variety of clinical symptoms and histopathological signs, posing a great challenge in the surveillance of suspected drug-induced liver injury (DILI) cases in the safety database. Additionally, the majority of such cases are rare, idiosyncratic, highly unpredictable, and tend to demonstrate unique individual susceptibility; these qualities, in turn, lend to a pharmacovigilance monitoring process that is often tedious and time-consuming. Objective: Develop a multifactorial algorithm to assist pharmacovigilance physicians in identifying high-risk hepatotoxicity cases associated with DILI from the sponsor’s safety database (Argus). Methods: Multifactorial selection criteria were established using Structured Query Language (SQL) and the TIBCO Spotfire® visualization tool, via a combination of word fragments, wildcard strings, and mathematical constructs, based on Hy’s law criteria and pattern of injury (R-value). These criteria excluded non-eligible cases from monthly line listings mined from the Argus safety database. The capabilities and limitations of these criteria were verified by comparing a manual review of all monthly cases with system-generated monthly listings over six months. Results: On average, over a period of six months, the algorithm accurately identified 92% of DILI cases meeting the established criteria. The automated process easily compared liver enzyme elevations with baseline values, reducing the screening time to under 15 minutes, as opposed to the multiple hours spent on a cognitively laborious, manual process. Limitations of the algorithm include its inability to identify cases associated with non-standard laboratory tests, naming conventions, and/or incomplete/incorrectly entered laboratory values. Conclusions: The newly developed multifactorial algorithm proved to be extremely useful in detecting potential DILI cases, while heightening the vigilance of the drug safety department. Additionally, the application of this algorithm may be useful in identifying a potential signal for DILI in drugs not yet known to cause liver injury (e.g., drugs in the initial phases of development). This algorithm also carries the potential for universal application, due to its product-agnostic data and keyword mining features. Plans for the tool include improving it into a fully automated application, thereby completely eliminating the manual screening process.
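
As a simplified illustration of the screening logic, the sketch below flags hypothetical cases against the published Hy's law thresholds (transaminases ≥ 3× ULN together with total bilirubin ≥ 2× ULN) and classifies the pattern of injury by the R-value ((ALT/ULN)/(ALP/ULN)). The column names and data are hypothetical, and the sponsor's actual SQL/Spotfire criteria are considerably more extensive than this sketch.

```python
import pandas as pd

# Hypothetical line listing: peak lab values expressed as multiples of the
# upper limit of normal (xULN) for each case
cases = pd.DataFrame({
    "case_id":   ["C-001", "C-002", "C-003"],
    "alt_xuln":  [5.2, 1.4, 8.0],    # alanine aminotransferase
    "ast_xuln":  [4.1, 1.1, 6.5],    # aspartate aminotransferase
    "alp_xuln":  [1.2, 3.5, 0.9],    # alkaline phosphatase
    "tbil_xuln": [2.4, 0.8, 1.1],    # total bilirubin
})

# R-value: ratio of ALT elevation to ALP elevation (both as xULN)
cases["r_value"] = cases["alt_xuln"] / cases["alp_xuln"]
cases["injury_pattern"] = pd.cut(cases["r_value"], bins=[0, 2, 5, float("inf")],
                                 labels=["cholestatic", "mixed", "hepatocellular"])

# Simplified Hy's law screen: ALT or AST >= 3x ULN together with bilirubin >= 2x ULN
cases["hys_law_flag"] = (
    (cases[["alt_xuln", "ast_xuln"]].max(axis=1) >= 3.0)
    & (cases["tbil_xuln"] >= 2.0)
)
print(cases[["case_id", "r_value", "injury_pattern", "hys_law_flag"]])
```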

Keywords: automation, drug-induced liver injury, pharmacovigilance, post-marketing

Procedia PDF Downloads 152
4849 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm

Authors: Xiang Jianhong, Wang Cong, Wang Linyu

Abstract:

With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid the sampling of a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to the joint block sparse model (JBSM), and a comparative study of sparse transformations and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction of this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
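
A minimal compressed-sensing sketch in the spirit of the system described above: a sparse signal is measured with a Bernoulli matrix and reconstructed with orthogonal matching pursuit, used here only as a readily available stand-in for the JBMOLS algorithm, which is not part of standard libraries. The synthetic sparse vector stands in for a wavelet-sparsified FECG block.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(5)

n, m, k = 256, 96, 8                       # signal length, measurements, sparsity
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.normal(0, 1, size=k)      # synthetic k-sparse signal

# Bernoulli (+1/-1) measurement matrix, as in the proposed transmission mode
phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)
y = phi @ x                                # compressed measurements sent over the WBAN

# Greedy sparse recovery; JBMOLS would additionally exploit the joint block structure
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(phi, y)
x_hat = omp.coef_

err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print("compression ratio:", n / m, "relative reconstruction error:", err)
```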

Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal

Procedia PDF Downloads 128
4848 Investigating University Language Teacher’s Perception of Their Identities in the Algerian Multilingual Context

Authors: Yousra Drissi

Abstract:

This research explores language teacher identity in a multilingual context where both teachers and students come from different linguistic backgrounds. It seeks to understand how teachers perceive themselves as language teachers in this context in relation to different influencing factors, both internal and external. This study is being conducted due to the importance of language teacher identity (LTI) in the university context, which is neglected in the present literature; it thereby attempts to address this gap. The broader aim of this study is to bring attention to language teacher identity along with the different influencing elements which can either promote or hinder its development. This research draws on sociocultural theory and post-structural theory, and uses a mixed methods approach to collect and analyse relevant data. A structured survey was distributed to language teachers from different universities around Algeria, followed by in-depth interviews. Results are expected to show the points of self-perception that these teachers share or differ in. They will also help identify the different internal and external factors that can have an influence. Moreover, the results of this research can be used by institutions as well as decision-makers to better understand university teachers and help them improve their teaching practices by empowering their language teacher identity, starting from teacher education programs through to continuous teacher development programs.

Keywords: identity, language teacher identity, multilingualism, university teacher

Procedia PDF Downloads 77
4847 Design and Implementation of DC-DC Converter with Inc-Cond Algorithm

Authors: Mustafa Engin Başoğlu, Bekir Çakır

Abstract:

The most important component affecting the efficiency of photovoltaic power systems is the solar panel. The efficiency of these systems is significantly affected by the low efficiency of solar panels. Therefore, solar panels should be operated under maximum power point conditions through a power converter. In this study, a boost converter with maximum power point tracking (MPPT) operation has been designed and implemented with the incremental conductance (Inc-Cond) algorithm using direct duty control. Furthermore, it is shown that the performance of the boost converter with MPPT operation fails when a low load resistance is connected.
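
A minimal sketch of the incremental conductance decision rule with direct duty-cycle control: at the maximum power point dP/dV = 0, which is equivalent to dI/dV = -I/V, and the duty cycle is nudged depending on which side of that condition the operating point falls. The step size, duty limits, and measured values are placeholders; in the real system the panel voltage and current would come from ADC readings in the converter firmware.

```python
def inc_cond_step(v, i, v_prev, i_prev, duty, step=0.005,
                  duty_min=0.05, duty_max=0.95):
    """One incremental-conductance MPPT update with direct duty control.

    For a boost converter, raising the duty cycle lowers the panel operating
    voltage, so the duty cycle is decreased when the panel voltage should rise.
    """
    dv = v - v_prev
    di = i - i_prev
    if dv == 0:
        if di > 0:
            duty -= step          # irradiance rose: raise the panel voltage
        elif di < 0:
            duty += step
    else:
        inc_conductance = di / dv
        if inc_conductance > -i / v:      # left of the MPP: raise panel voltage
            duty -= step
        elif inc_conductance < -i / v:    # right of the MPP: lower panel voltage
            duty += step
        # equality: at the MPP, leave the duty cycle unchanged
    return min(max(duty, duty_min), duty_max)

# Toy usage: two consecutive panel measurements (volts, amps)
duty = 0.5
duty = inc_cond_step(v=17.8, i=5.4, v_prev=18.0, i_prev=5.3, duty=duty)
print("new duty cycle:", duty)
```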

Keywords: boost converter, incremental conductance (Inc-Cond), MPPT, solar panel

Procedia PDF Downloads 1046
4846 Stochastic Simulation of Random Numbers Using Linear Congruential Method

Authors: Melvin Ballera, Aldrich Olivar, Mary Soriano

Abstract:

Digital computers nowadays must have a utility capable of generating random numbers. Usually, computer-generated random numbers are not truly random: given predefined values such as a starting point and end points, the sequence is almost predictable. There are many applications of random numbers, such as business simulation, manufacturing, the services domain, and the entertainment sector, among other equally important areas, making it worthwhile to design a unique method that allows unpredictable random numbers. Applying stochastic simulation using the linear congruential algorithm shows that as the seed and the range increase, the numbers randomly produced or selected by the computer become unique. If this is implemented in an environment where random numbers are very much needed, the reliability of the random numbers is guaranteed.
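
A minimal linear congruential generator sketch illustrating the recurrence X_{n+1} = (a * X_n + c) mod m. The multiplier, increment, and modulus used here are the well-known Numerical Recipes constants, not necessarily the parameters used in the study.

```python
def lcg(seed, a=1664525, c=1013904223, m=2 ** 32):
    """Linear congruential generator: X_{n+1} = (a * X_n + c) mod m."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m            # scale into [0, 1) like a uniform variate

def sample(seed, lo, hi, count):
    """Draw `count` pseudo-random integers in [lo, hi] from a given seed."""
    gen = lcg(seed)
    return [lo + int(next(gen) * (hi - lo + 1)) for _ in range(count)]

# The same seed reproduces the same sequence; a different seed gives another one
print(sample(seed=42, lo=1, hi=100, count=10))
print(sample(seed=43, lo=1, hi=100, count=10))
```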

Keywords: stochastic simulation, random numbers, linear congruential algorithm, pseudorandomness

Procedia PDF Downloads 316
4845 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data

Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani

Abstract:

Dosimetry is an indispensable and precious factor in patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, in accordance with the suitable characteristics of DOTATOC and ¹⁷⁷Lu, and after preparing ¹⁷⁷Lu-DOTATOC under optimal conditions for the first time in Iran, the radionuclidic and radiochemical purity of the solution were investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrated that ¹⁷⁷Lu-DOTATOC is a profitable selection for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and rapidity of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry and data extrapolated from animal to human, using the MIRD method. By utilizing a compartmental model based on the experimental data, this approach may increase the accuracy of dosimetric data.

Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry

Procedia PDF Downloads 219
4844 Selling Electric Vehicles: Experiences from Car Salesmen in Sweden

Authors: Jens Hagman, Jenny Janhager Stier, Ellen Olausson, Anne Y. Faxer, Ana Magazinius

Abstract:

Sweden has the second highest electric vehicle (plug-in hybrid and battery electric vehicle) sales per capita in Europe, but in relation to sales of internal combustion engine vehicles, electric vehicle sales are still minuscule (< 4%). Much research effort has been placed on various technical and user-focused barriers and enablers for the adoption of electric vehicles. Less effort has been placed on investigating the retail (dealership-customer) sales process of vehicles in general and electric vehicles in particular. Arguably, no one ought to be better informed about the needs and desires of potential electric vehicle buyers than car salesmen, originating from their daily encounters with customers at the dealership. The aim of this paper is to explore the conditions of selling electric vehicles from a car salesman's perspective. This includes identifying barriers and enablers for electric vehicle sales originating from internal (dealership and brand) and external (customer, government) sources. In this interview study, five car brands (manufacturers) that sell both electric and internal combustion engine vehicles have been investigated. A total of 15 semi-structured interviews have been conducted (three per brand, in rural and urban settings and at different dealerships). Initial analysis reveals several barriers and enablers, experienced by car salesmen, which influence electric vehicle sales. Examples of identified barriers, as reported by car salesmen, are: -Electric vehicles earn car salesmen less commission on average compared to internal combustion engine vehicles. -It takes more time to sell and deliver an electric vehicle than an internal combustion engine vehicle. -Current leasing contracts entail relatively low second-hand value estimations for electric vehicles and thus a high leasing fee, which negatively affects the attractiveness of electric vehicles for private consumers in particular. -A high purchasing price discourages many consumers from considering electric vehicles. -The level of education and knowledge about electric vehicles differs between car salesmen, which could affect their self-confidence in meeting well-prepared and question-prone electric vehicle buyers. Examples of identified enablers are: -Company car tax regulation promotes sales of electric vehicles; in particular, plug-in hybrid electric vehicles are sold extensively to companies (up to 95% of sales). -The low operating cost of electric vehicles, such as fuel and service, is an advantage when understood by consumers. -The drive performance of electric vehicles (quick, silent and fun to drive) is attractive to consumers. -Environmental aspects are considered important for certain consumer groups. -Fast technological improvements, such as increased range, are opening up a wider market for electric vehicles. -For one of the brands, attractive private lease campaigns have proved effective in promoting sales. This paper gives insights into an important but often overlooked aspect of the diffusion of electric vehicles (and durable products in general): the interaction between car salesmen and customers at the critical acquisition moment, extracted through interviews with multiple car salesmen. The results illuminate untapped potential for sellers (salesmen, dealerships and brands) to mitigate sales barriers and strengthen sales enablers, and thus become a more important actor in the electric vehicle diffusion process.

Keywords: customer barriers, electric vehicle promotion, sales of electric vehicles, interviews with car salesmen

Procedia PDF Downloads 229