Search results for: multi-objective linear programming
2585 Cloud Computing in Data Mining: A Technical Survey
Authors: Ghaemi Reza, Abdollahi Hamid, Dashti Elham
Abstract:
Cloud computing poses a diversity of challenges for data mining operations arising from the dynamic structure of data distribution, as opposed to the typical database scenarios of conventional architectures. Due to the immense number of users seeking data on a daily basis, there are serious security concerns for cloud providers as well as for the data providers who place their data in the cloud computing environment. Big data analytics use compute-intensive data mining algorithms (hidden Markov models, MapReduce parallel programming, the Mahout project, the Hadoop distributed file system, K-Means and K-Medoids, Apriori) that require efficient high-performance processors to produce timely results. Data mining algorithms are used to solve problems or optimize model parameters. The challenges such operations encounter include establishing successful transactions with the existing virtual machine environment and keeping the databases under control. Several factors have led to the shift from normal or centralized mining to distributed data mining. The approach is offered as SaaS, which uses multi-agent systems to implement the different tasks of the system. There are still open problems in data mining based on cloud computing, including the design and selection of data mining algorithms. Keywords: cloud computing, data mining, computing models, cloud services
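The survey names K-Means among the compute-intensive mining algorithms. As a rough illustration of the clustering step (the toy points and parameters below are hypothetical, not from the survey), a minimal pure-Python sketch:

```python
import random

def k_means(points, k, iters=20, seed=0):
    """Plain k-means on n-dimensional points (tuples/lists of floats)."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # assignment step: attach each point to its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # update step: move each centroid to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = [sum(x) / len(cl) for x in zip(*cl)]
    return centroids, clusters

# two well-separated toy clusters
pts = [(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (8.3, 7.9)]
cents, cls = k_means(pts, k=2)
```

In a distributed (MapReduce-style) setting, the assignment step maps over partitions of the data and the update step reduces per-cluster sums, which is what makes K-Means a natural fit for the cloud platforms the survey discusses.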
Procedia PDF Downloads 479
2584 The Relationship between Amplitude and Stability of Circadian Rhythm with Sleep Quality and Sleepiness: A Population Study, Kerman 2018
Authors: Akram Sadat Jafari Roodbandi, Farzaneh Akbari, Vafa Feyzi, Zahra Zare, Zohreh Foroozanfar
Abstract:
Introduction: The circadian rhythm, or 24-hour sleep-wake cycle, is one of the important factors affecting physiological and psychological characteristics in humans; it contributes to biochemical, physiological, and behavioral processes and helps set the brain and body for sleep or active wakefulness during certain hours. The purpose of this study was to investigate the relationship between the characteristics of circadian rhythms and sleep quality and sleepiness, taking demographic characteristics such as age into account. Methods: This cross-sectional descriptive-analytic study was carried out among the general population of Kerman, aged 15-84 years. After dividing participants into 10-year age groups, a demographic characteristics questionnaire, a circadian type questionnaire, the Pittsburgh sleep quality questionnaire, and the Epworth sleepiness questionnaire were completed in equal numbers by men and women of each age group. Using cluster sampling with a design effect equal to 2, 1300 questionnaires were distributed at various hours of the day in public places in Kerman city. Data analysis was done using SPSS software with univariate tests and linear regressions at a significance level of 0.05. Results: In total, 1147 subjects were included in the study; 584 (50.9%) were male and the rest were women. The mean age was 39.50 ± 15.38 years. Of the participants, 133 (11.60%) had sleepiness and 308 (26.90%) had undesirable sleep quality. Using linear regression, sleep quality was significantly correlated with sex, hours of sleep needed per 24 hours, chronic illness, sleepiness, and circadian rhythm amplitude. Sleepiness was significantly related to marital status, the sleep-wake schedule of other family members, and the stability of the circadian rhythm. In both women and men, sleep quality decreased and sleepiness increased with age.
Conclusion: Age, sex, circadian type, sleep need per 24 hours, marital status, and the sleep-wake schedule of other family members are significant factors related to sleep quality and sleepiness, and to adaptation to night shift work. Keywords: circadian type, sleep quality, sleepiness, age, shift work
Procedia PDF Downloads 154
2583 Dependence of the Photoelectric Exponent on the Source Spectrum of the CT
Authors: Rezvan Ravanfar Haghighi, V. C. Vani, Suresh Perumal, Sabyasachi Chatterjee, Pratik Kumar
Abstract:
The X-ray attenuation coefficient [µ(E)] of any substance, for energy (E), is a sum of the contributions from Compton scattering [µCom(E)] and the photoelectric effect [µPh(E)]. In terms of the electron density (ρe) and the effective atomic number (Zeff), µCom(E) is proportional to [(ρe)fKN(E)], while µPh(E) is proportional to [(ρeZeffx)/Ey], with fKN(E) being the Klein-Nishina formula and x and y being the exponents for the photoelectric effect. By taking the sample's HU at two different excitation voltages (V=V1, V2) of the CT machine, we can solve for X=ρe and Y=ρeZeffx from these two independent equations, as is attempted in DECT inversion. Since µCom(E) and µPh(E) are both energy dependent, the coefficients of inversion also depend on (a) the source spectrum S(E,V) and (b) the detector efficiency D(E) of the CT machine. In the present paper we tabulate these coefficients of inversion for different practical manifestations of S(E,V) and D(E). The HU(V) values from the CT follow: <µ(V)>=<µw(V)>[1+HU(V)/1000], where the subscript 'w' refers to water and the averaging process <…> accounts for the source spectrum S(E,V) and the detector efficiency D(E). Linearity of µ(E) with respect to X and Y implies that (a) <µ(V)> is a linear combination of X and Y, and (b) for inversion, X and Y can be written as linear combinations of two independent observations <µ(V1)>, <µ(V2)> with V1≠V2. These coefficients of inversion naturally depend upon S(E,V) and D(E). We numerically investigate this dependence for some practical cases, taking V = 100, 140 kVp, as used in cardiological investigations. The S(E,V) are generated using the Boone-Seibert source spectrum, superposed on aluminium filters of different thickness lAl with 7mm≤lAl≤12mm, and the D(E) is taken to be that of a typical Si[Li] solid state and GdOS scintillator detector.
In the values of X and Y found by using the calculated inversion coefficients, errors are below 2% for data with solutions of glycerol, sucrose and glucose. For low Zeff materials like propionic acid, Zeffx is overestimated by 20%, with X being within 1%. For high Zeff materials like KOH, the value of Zeffx is underestimated by 22%, while the error in X is +15%. These results imply that the source may have additional filtering beyond the aluminium filter specified by the manufacturer. It is also found that the difference between the inversion coefficients for the two types of detectors is negligible: the type of detector does not affect the DECT inversion algorithm used to find the unknown chemical characteristics of the scanned materials. The effect of the source, however, should be considered an important factor in calculating the coefficients of inversion. Keywords: attenuation coefficient, computed tomography, photoelectric effect, source spectrum
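Once the inversion coefficients are fixed, the DECT step above reduces to solving two linear equations in two unknowns, X = ρe and Y = ρeZeffˣ. A minimal sketch of that 2×2 solve (the numeric coefficients below are placeholders, not the calibrated values the paper tabulates):

```python
def invert_dect(mu1, mu2, a1, b1, a2, b2):
    """Solve  mu1 = a1*X + b1*Y  and  mu2 = a2*X + b2*Y  for (X, Y).

    a_i, b_i are the machine-dependent inversion coefficients that fold
    in the source spectrum S(E, V) and detector efficiency D(E); the
    values used below are illustrative only.
    """
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("voltages too close: equations not independent")
    X = (mu1 * b2 - mu2 * b1) / det  # electron density term
    Y = (a1 * mu2 - a2 * mu1) / det  # photoelectric (rho_e * Zeff^x) term
    return X, Y

# hypothetical averaged attenuations at V1 = 100 kVp and V2 = 140 kVp
X, Y = invert_dect(mu1=0.22, mu2=0.19, a1=0.18, b1=0.04, a2=0.17, b2=0.02)
```

The guard on the determinant mirrors the paper's requirement V1 ≠ V2: if the two spectra are too similar, the two observations are no longer independent.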
Procedia PDF Downloads 400
2582 Development of a Thermodynamic Model for Ladle Metallurgy Steel Making Processes Using FactSage and Its Macro Facility
Authors: Prasenjit Singha, Ajay Kumar Shukla
Abstract:
To produce high-quality steel in larger volumes, dynamic control of composition and temperature throughout the process is essential. In this paper, we developed a mass transfer model based on thermodynamics to simulate the ladle metallurgy steel-making process using FactSage and its macro facility. The overall heat and mass transfer processes consist of one equilibrium chamber, two non-equilibrium chambers, and one adiabatic reactor. The flow of material, as well as heat transfer, occurs across four interconnected unit chambers and a reactor. We used the macro programming facility of the FactSage™ software to build the thermochemical model of the secondary steel-making process. In our model, we varied the oxygen content during the process and studied its effect on the composition of the final hot metal and slag. The model has been validated against plant data for the steel composition, which is similar to the ladle metallurgy steel-making process used in industry. The resulting composition profile serves as a guiding tool to optimize the ladle metallurgy process in steel-making industries. Keywords: desulphurization, degassing, FactSage, reactor
Procedia PDF Downloads 217
2581 Mapping of Urban Green Spaces Towards a Balanced Planning in a Coastal Landscape
Authors: Rania Ajmi, Faiza Allouche Khebour, Aude Nuscia Taibi, Sirine Essasi
Abstract:
Urban green spaces (UGS) are an important contributor to sustainable development. A spatial method was employed to assess and map the spatial distribution of UGS in five districts in Sousse, Tunisia. Ecological management of UGS is an essential factor for the sustainable development of the city; hence the municipality of Sousse has decided to support the districts according to the characteristics of their green spaces. To implement this policy, (1) a new GIS web application was developed, (2) the various green spaces were inventoried, (3) a spatial mapping of UGS using Quantum GIS was realized, and (4) data processing and statistical analysis were carried out in RStudio. The intersection of the spatial and statistical analyses highlighted an imbalance in the spatial distribution of UGS in the study area. The discontinuity between the coast and the city's green spaces was not designed in a spirit of network and connection, hence the lack of a greenway connecting these spaces to the city. Finally, this GIS support will be used by decision-makers to assess and monitor green spaces in the city of Sousse and will contribute to improving the well-being of the local population. Keywords: distributions, GIS, green space, imbalance, spatial analysis
Procedia PDF Downloads 204
2580 Design of a LabVIEW-Based DAQ System
Authors: Omar A. A. Shaebi, Matouk M. Elamari, Salaheddin Allid
Abstract:
The Information Computing System of Monitoring (ICSM) for the research reactor of the Tajoura Nuclear Research Centre (TNRC) has been out of service since early 1991. According to the regulations, the computer is necessary to operate the reactor up to its maximum power (10 MW). Funding was secured via the IAEA to develop a modern computer-based data acquisition system to replace the old computer. This paper presents the development of a LabVIEW-based data acquisition system that allows automated measurements using National Instruments hardware and its LabVIEW software. The developed system consists of an SCXI 1001 chassis; the chassis houses four SCXI 1100 modules, each able to handle 32 variables. The chassis is interfaced with the PC using an NI PCI-6023 DAQ card. LabVIEW, developed by National Instruments, is used to run and operate the DAQ system. LabVIEW is a graphical programming environment suited to high-level design; it allows integrating different signal processing components or subsystems within a graphical framework. The results showed the system's capabilities in monitoring variables and acquiring and saving data, as well as LabVIEW's ability to control the DAQ hardware. Keywords: data acquisition, LabVIEW, signal conditioning, National Instruments
Procedia PDF Downloads 494
2579 An Overview of Posterior Fossa Associated Pathologies and Segmentation
Authors: Samuel J. Ahmad, Michael Zhu, Andrew J. Kobets
Abstract:
Segmentation tools continue to advance, evolving from manual methods to automated contouring technologies utilizing convolutional neural networks. These techniques have been used to evaluate ventricular and hemorrhagic volumes in the past but may be applied in novel ways to assess posterior fossa-associated pathologies such as Chiari malformations. Herein, we summarize the literature pertaining to segmentation in the context of this and other posterior fossa-based diseases such as trigeminal neuralgia, hemifacial spasm, and posterior fossa syndrome. A literature search for volumetric analysis of the posterior fossa identified 27 papers in which semi-automated segmentation, automated segmentation, manual segmentation, linear measurement-based formulas, and the Cavalieri estimator were utilized. These studies produced better data than older methods relying on formulas for rough volumetric estimation. The most commonly used technique was semi-automated segmentation (12 studies), followed by manual segmentation (7 studies). Automated segmentation techniques (4 studies) and the Cavalieri estimator (3 studies), a point-counting method that uses a grid of points to estimate the volume of a region, were the next most common. The least commonly utilized technique was linear measurement-based formulas (1 study). Semi-automated segmentation produced accurate, reproducible results. However, it is apparent that no single semi-automated software package, open source or otherwise, has been widely applied to the posterior fossa. Fully automated segmentation via open source software such as FSL and FreeSurfer produced highly accurate posterior fossa segmentations. Various forms of segmentation have been used to assess posterior fossa pathologies, and each has its advantages and disadvantages. According to our results, semi-automated segmentation is the predominant method.
However, atlas-based automated segmentation is an extremely promising method that produces accurate results. Future evolution of segmentation technologies will undoubtedly yield superior results, which may be applied to posterior fossa related pathologies, and medical professionals will save time and effort analyzing large sets of data thanks to these advances. Keywords: Chiari, posterior fossa, segmentation, volumetric
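The Cavalieri estimator mentioned above has a very simple arithmetic core: each counted grid point represents an area of (grid spacing)², and each slice a known thickness, so V ≈ t · a_p · ΣP. A sketch with hypothetical counts (the slice counts and spacings below are illustrative, not taken from any of the reviewed studies):

```python
def cavalieri_volume(points_per_slice, slice_thickness_mm, grid_spacing_mm):
    """Cavalieri point-counting volume estimate.

    Each counted point stands for an area of grid_spacing**2 on its
    slice, and each slice for a thickness t, giving
        V  ~  t * a_p * sum(P_i).
    """
    area_per_point = grid_spacing_mm ** 2
    return slice_thickness_mm * area_per_point * sum(points_per_slice)

# hypothetical point counts over 5 slices covering a posterior fossa ROI
counts = [42, 57, 63, 51, 30]
volume_mm3 = cavalieri_volume(counts, slice_thickness_mm=3.0,
                              grid_spacing_mm=2.0)
```

Its appeal, relative to full segmentation, is that an observer only counts grid intersections falling inside the region; its weakness, as the review notes, is that it trades precision for speed.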
Procedia PDF Downloads 106
2578 The Collaboration between Resident and Non-Resident Patent Applicants as a Strategy to Accelerate Technological Advance in Developing Nations
Authors: Hugo Rodríguez
Abstract:
Migrations of researchers, scientists, and inventors are a widespread phenomenon in modern times. In some cases, migrants stay linked to research groups in their countries of origin, either out of their own conviction or because of government policies. We examine different linear models of technological development (using the Ordinary Least Squares (OLS) technique) in eight selected countries and find that collaborations between resident and non-resident patent applicants correlate with different levels of performance of technological policies in three different scenarios. Therefore, the reinforcement of that link should be considered a powerful tool for technological development. Keywords: development, collaboration, patents, technology
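The OLS technique behind the linear models above has a closed form in the single-predictor case: b1 = cov(x, y)/var(x) and b0 = ȳ − b1·x̄. A minimal sketch on hypothetical data (the collaboration shares and performance index values are invented for illustration, not the paper's data):

```python
def ols_fit(x, y):
    """Ordinary least squares for y = b0 + b1*x (one predictor).

    b1 = cov(x, y) / var(x);  b0 = mean(y) - b1 * mean(x).
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
         sum((xi - mx) ** 2 for xi in x)
    b0 = my - b1 * mx
    return b0, b1

# hypothetical yearly data: share of resident/non-resident collaborative
# patent applications vs. a technology-performance index
collab_share = [0.10, 0.15, 0.20, 0.30, 0.40]
perf_index = [1.1, 1.4, 1.9, 2.8, 3.9]
b0, b1 = ols_fit(collab_share, perf_index)
```

A positive fitted slope b1 is the kind of correlation the abstract reports; in practice the paper's multi-country models would include more predictors and significance tests.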
Procedia PDF Downloads 127
2577 The Use of Language as a Cognitive Tool in French Immersion Teaching
Authors: Marie-Josée Morneau
Abstract:
A literacy-based approach, centred on the use of the language of instruction as a cognitive tool, can increase the L2 communication skills of French immersion students. Academic subject areas such as science and mathematics offer an authentic language-learning context in which students can become more proficient speakers while using specific vocabulary and language structures to learn, interact, and communicate their reasoning, when provided the opportunities and guidance to do so. In this Canadian quasi-experimental study, the effects of teaching specific language elements during mathematics classes through literacy-based activities in Early French Immersion programming were compared between two Grade 7/8 groups: the experimental group, which received literacy-based teaching for a 6-week period, and the control group, which received regular teaching instruction. The results showed that the participants from the experimental group made more progress in their mathematical communication skills, which suggests that targeting the L2 as a cognitive tool can benefit immersion learners studying mathematics concepts, and reminds us that all L2 teachers are language teachers. Keywords: mathematics, French immersion, literacy-based, oral communication, L2
Procedia PDF Downloads 76
2576 Purification and Characterization of a Novel Extracellular Chitinase from Bacillus licheniformis LHH100
Authors: Laribi-Habchi Hasiba, Bouanane-Darenfed Amel, Drouiche Nadjib, Pausse André, Mameri Nabil
Abstract:
Chitin, a linear 1,4-linked N-acetyl-D-glucosamine (GlcNAc) polysaccharide, is the major structural component of fungal cell walls, insect exoskeletons, and the shells of crustaceans. It is one of the most abundant naturally occurring polysaccharides and has attracted tremendous attention in the fields of agriculture, pharmacology, and biotechnology. Each year, a vast amount of chitin waste is released from the aquatic food industry, where crustaceans (prawn, crab, shrimp, and lobster) constitute one of the main agricultural products, creating a serious environmental problem. This linear polymer can be hydrolyzed by bases, acids, or enzymes such as chitinases. In this context, an extracellular chitinase (ChiA-65) was produced and purified from the newly isolated strain LHH100. Pure protein was obtained after heat treatment and ammonium sulphate precipitation followed by Sephacryl S-200 chromatography. Based on matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF/MS) analysis, the purified enzyme is a monomer with a molecular mass of 65,195.13 Da. The sequence of the 27 N-terminal residues of the mature ChiA-65 showed high homology with family-18 chitinases. Optimal activity was achieved at pH 4 and 75 °C. Among the inhibitors and metals tested, p-chloromercuribenzoic acid, N-ethylmaleimide, Hg2+ and Hg+ completely inhibited enzyme activity. Chitinase activity was high on colloidal chitin, glycol chitin, glycol chitosan, chitotriose, and chitooligosaccharide. Chitinase activity towards the synthetic substrates p-NP-(GlcNAc)n (n = 2-4) was in the order p-NP-(GlcNAc)2 > p-NP-(GlcNAc)4 > p-NP-(GlcNAc)3. Our results suggest that ChiA-65 preferentially hydrolyzes the second glycosidic link from the non-reducing end of (GlcNAc)n. ChiA-65 obeyed Michaelis-Menten kinetics, the Km and kcat values being 0.385 mg colloidal chitin/ml and 5000 s−1, respectively.
ChiA-65 exhibited remarkable biochemical properties, suggesting that this enzyme is suitable for the bioconversion of chitin waste. Keywords: Bacillus licheniformis LHH100, characterization, extracellular chitinase, purification
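The reported Michaelis-Menten constants make the rate law easy to evaluate: v = kcat·[E]₀·[S]/(Km + [S]), and at [S] = Km the rate is exactly half of Vmax. A small sketch using the paper's Km and kcat (the enzyme concentration [E]₀ = 1 is an arbitrary normalisation, not a measured value):

```python
def mm_rate(S, kcat, Km, E0=1.0):
    """Michaelis-Menten rate  v = kcat * E0 * S / (Km + S).

    S and Km share units (mg colloidal chitin / ml); E0 is an
    assumed, normalised enzyme concentration.
    """
    return kcat * E0 * S / (Km + S)

# constants reported for ChiA-65
Km, kcat = 0.385, 5000.0            # mg/ml, 1/s

# at S = Km the rate is Vmax/2 = kcat*E0/2 by construction
half_vmax_rate = mm_rate(S=Km, kcat=kcat, Km=Km)
```

This is only the kinetic identity implied by the reported constants; translating rates into absolute turnover would require the actual enzyme concentration used in the assay.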
Procedia PDF Downloads 437
2575 Classifications of Images for the Recognition of People's Behaviors by SIFT and SVM
Authors: Henni Sid Ahmed, Belbachir Mohamed Faouzi, Jean Caelen
Abstract:
Behavior recognition has been studied for realizing driver-assistance systems and automated navigation, and is an important field of study for intelligent buildings. In this paper, a method for recognizing behavior from real images was studied. Images were divided into several categories according to the actual weather, distance, angle of view, etc. SIFT (Scale Invariant Feature Transform) was first used to detect and describe key points, because SIFT features are invariant to image scale and rotation and are robust to changes in viewpoint and illumination. Our goal is to develop a robust and reliable system composed of two fixed cameras in every room of the intelligent building, connected to a computer for the acquisition of video sequences. Using these video sequences as inputs, we use SIFT to represent the different images of the video sequences and SVM (support vector machine) Light as a classification tool, in order to classify people's behaviors in the intelligent building and provide maximum comfort with optimized energy consumption. Keywords: video analysis, people behavior, intelligent building, classification
Procedia PDF Downloads 378
2574 Automated Heart Sound Classification from Unsegmented Phonocardiogram Signals Using Time Frequency Features
Authors: Nadia Masood Khan, Muhammad Salman Khan, Gul Muhammad Khan
Abstract:
Cardiologists perform cardiac auscultation to detect abnormalities in heart sounds. Since accurate auscultation is a crucial first step in screening patients with heart diseases, there is a need to develop computer-aided detection/diagnosis (CAD) systems to assist cardiologists in interpreting heart sounds and provide second opinions. In this paper, different algorithms are implemented for automated heart sound classification using unsegmented phonocardiogram (PCG) signals. Support vector machine (SVM), artificial neural network (ANN), and cartesian genetic programming evolved artificial neural network (CGPANN), without the application of any segmentation algorithm, have been explored in this study. The signals are first pre-processed to remove any unwanted frequencies. Both time- and frequency-domain features are then extracted for training the different models. The different algorithms are tested in multiple scenarios and their strengths and weaknesses are discussed. Results indicate that SVM outperforms the rest with an accuracy of 73.64%. Keywords: pattern recognition, machine learning, computer aided diagnosis, heart sound classification, feature extraction
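As a sketch of what "time- and frequency-domain features" of a PCG frame can look like (the paper does not list its exact feature set, so the two below, RMS energy and spectral centroid, are illustrative choices; the test signal is a synthetic tone, not a heart sound):

```python
import cmath, math

def dft_mag(x):
    """Naive DFT magnitude spectrum (O(n^2); fine for short frames)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

def frame_features(x, fs):
    """Two example features of one signal frame."""
    n = len(x)
    mean = sum(x) / n
    rms = math.sqrt(sum(v * v for v in x) / n)        # time domain
    mags = dft_mag([v - mean for v in x])[: n // 2]   # one-sided spectrum
    total = sum(mags) or 1.0
    centroid = sum(k * (fs / n) * m for k, m in enumerate(mags)) / total
    return {"rms": rms, "spectral_centroid_hz": centroid}

# synthetic 50 Hz tone standing in for a heart-sound frame (fs = 1 kHz)
fs = 1000
sig = [math.sin(2 * math.pi * 50 * t / fs) for t in range(200)]
feats = frame_features(sig, fs)
```

Feature vectors of this kind, computed per frame over the unsegmented recording, are what the SVM, ANN, and CGPANN models would be trained on.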
Procedia PDF Downloads 263
2573 Discrimination between Defective and Non-Defective Coffee Beans Using a Laser Prism Spectrometer
Abstract:
The concentration- and temperature-dependent refractive indices of solutions extracted from defective and non-defective coffee beans have been investigated using a He-Ne laser. The refractive index has a linear relationship with the presumed concentration of the coffee solutions in the range of 0.5-3%. Higher and lower values of the refractive index were obtained for immature and non-defective coffee beans, respectively. The refractive index of bean extracts can thus be used to separate defective from non-defective beans. Keywords: coffee extract, refractive index, temperature dependence
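A linear index-vs-concentration relationship like the one reported can be used in reverse: fit a calibration line n = a + b·c on known concentrations, then read an unknown concentration off a measured index. A sketch with invented calibration points in the stated 0.5-3% range (the index values are hypothetical, not the paper's measurements):

```python
def calibrate(conc, n_meas):
    """Least-squares fit of the calibration line n = a + b*c."""
    k = len(conc)
    mc, mn = sum(conc) / k, sum(n_meas) / k
    b = sum((c - mc) * (n - mn) for c, n in zip(conc, n_meas)) / \
        sum((c - mc) ** 2 for c in conc)
    return mn - b * mc, b          # intercept a, slope b

def concentration_from_index(n_obs, a, b):
    """Invert the calibration line to estimate concentration."""
    return (n_obs - a) / b

# hypothetical calibration data (concentration in %, refractive index)
conc = [0.5, 1.0, 2.0, 3.0]
n_meas = [1.3340, 1.3348, 1.3364, 1.3380]
a, b = calibrate(conc, n_meas)
c_est = concentration_from_index(1.3356, a, b)
```

In practice the temperature dependence the paper reports would be handled by calibrating at, or correcting to, a fixed temperature.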
Procedia PDF Downloads 150
2572 Training of Future Computer Science Teachers Based on Machine Learning Methods
Authors: Meruert Serik, Nassipzhan Duisegaliyeva, Danara Tleumagambetova
Abstract:
The article highlights and describes the characteristic features of real-time face detection in images and videos using machine learning algorithms. The research work was reviewed by students of the educational programs "6B01511-Computer Science", "7M01511-Computer Science", "7M01525-STEM Education", and "8D01511-Computer Science" of L.N. Gumilyov Eurasian National University. As a result, the advantages and disadvantages of the Haar Cascade (Haar Cascade OpenCV), HoG SVM (Histogram of Oriented Gradients, Support Vector Machine), and MMOD CNN Dlib (Max-Margin Object Detection, convolutional neural network) detectors used for face detection were determined. Dlib is a general-purpose cross-platform software library written in the C++ programming language that includes detectors used for face detection. The Haar Cascade OpenCV algorithm is efficient for fast face detection. The work considered here forms the basis for the development of machine learning methods by future computer science teachers. Keywords: algorithm, artificial intelligence, education, machine learning
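The HoG descriptor underlying the HoG+SVM detector compared above is built from per-cell gradient-orientation histograms. A simplified sketch of one such cell (no block normalisation or SVM stage; the synthetic patch is illustrative):

```python
import math

def hog_cell(patch, bins=9):
    """Orientation histogram of one cell, the building block of HoG.

    patch: 2-D list of grayscale values. Gradients via central
    differences on the interior; unsigned orientation in [0, 180).
    """
    h, w = len(patch), len(patch[0])
    hist = [0.0] * bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = patch[y][x + 1] - patch[y][x - 1]
            gy = patch[y + 1][x] - patch[y - 1][x]
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / bins)) % bins] += mag
    s = sum(hist) or 1.0
    return [v / s for v in hist]   # L1-normalised histogram

# synthetic 8x8 patch with a vertical edge: left half dark, right bright
patch = [[0.0] * 4 + [10.0] * 4 for _ in range(8)]
hist = hog_cell(patch)
```

A full detector concatenates many such (block-normalised) cell histograms over a sliding window and feeds them to a linear SVM, which is what the Dlib and OpenCV implementations discussed in the article package up for students.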
Procedia PDF Downloads 73
2571 Innovative Design of Spherical Robot with Hydraulic Actuator
Authors: Roya Khajepour, Alireza B. Novinzadeh
Abstract:
In this paper, a spherical robot is modeled using the bond-graph approach. This breed of robots is typically employed in expedition missions to unknown territories. Its motion mechanism is based on convection of a fluid in a set of three donut-shaped vessels, arranged orthogonally in space. This robot is a non-linear, non-holonomic system. This paper utilizes the bond-graph technique to derive the torque generation mechanism in a spherical robot and, eventually, describes the motion of the sphere due to the exerted torque components. Keywords: spherical robot, bond graph, modeling, torque
Procedia PDF Downloads 350
2570 External Business Environment and Sustainability of Micro, Small and Medium Enterprises in Jigawa State, Nigeria
Authors: Shehu Isyaku
Abstract:
The general objective of the study was to investigate the relationship between the external business environment and the sustainability of micro, small and medium enterprises (MSMEs) in Jigawa State, Nigeria. Specifically, the study examined the relationship between 1) the economic environment, 2) the social environment, 3) the technological environment, and 4) the political environment and the sustainability of MSMEs in Jigawa State, Nigeria. The study drew on Resource-Based View (RBV) theory and the Knowledge-Based View (KBV). The study employed a descriptive cross-sectional survey design. A researcher-made questionnaire was used to collect data from 350 managers/owners, who were selected using stratified, purposive and simple random sampling techniques. Data analysis was done using means and standard deviations, factor analysis, correlation coefficients, and Pearson linear regression analysis. The findings revealed that the sustainability potential of the managers/owners was rated as high (economic, environmental, and social sustainability) on a 5-point Likert scale, and the external business environment was rated as highly effective on average. The results of the Pearson linear regression analysis rejected the hypothesized non-significant effect of the external business environment on the sustainability of MSMEs. Specifically, there is a positive significant relationship between 1) the economic environment and sustainability, 2) the social environment and sustainability, 3) the technological environment and sustainability, and 4) the political environment and sustainability.
The researcher concluded that MSME managers/owners have a high potential for economic, social and environmental sustainability and that all the constructs of the external business environment (economic, social, technological and political) have a positive significant relationship with the sustainability of MSMEs. Finally, the researcher recommended that 1) MSME managers/owners develop marketing strategies and intelligence systems to accumulate information about competitors and customers' demands, and 2) managers/owners treat customers' cultural and religious beliefs as an opportunity to be considered when formulating business strategies. Keywords: business environment, sustainability, small and medium enterprises, external business environment
Procedia PDF Downloads 53
2569 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the recent decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA (cfDNA). Accordingly, identifying the tissue of origin of these DNA fragments from plasma can result in more accurate and faster disease diagnosis and more precise treatment protocols. Open chromatin regions (OCRs) are important epigenetic features of DNA that reflect the cell types of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to a large cost in time and money. To overcome these limitations, the idea of predicting OCRs from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach to predict open chromatin regions, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, local sequencing depth is fed to our proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method applies signal processing to the sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering.
To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples in the ATAC-DB database. The percentage of overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates meaningful prediction. As expected, OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS, which showed accordance of around 52.04% with all genes and ~78% with the housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and reliance on multiple features. In contrast, we implemented graph signal clustering based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis. Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
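The first two stages of the pipeline (count normalization, then a Discrete Fourier Transform step) can be illustrated on a toy coverage-depth track. This sketch only low-pass filters a normalized depth signal; it stands in for the DFT stage and omits the graph construction, linear-programming graph cut, and clustering, and the depth values are invented:

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def lowpass_depth(depth, keep=3):
    """Normalize a depth track, then keep only the `keep` lowest
    frequency pairs of its DFT: a toy stand-in for the paper's
    normalization -> DFT stage."""
    mean = sum(depth) / len(depth)
    norm = [d / mean for d in depth]          # count normalization
    X = dft(norm)
    n = len(X)
    Xf = [X[k] if (k <= keep or k >= n - keep) else 0 for k in range(n)]
    return idft(Xf)

# toy depth track with a dip (a possible open-chromatin footprint)
depth = [30, 29, 31, 30, 18, 15, 17, 30, 31, 29, 30, 31]
smooth = lowpass_depth(depth)
```

In the actual method, the transformed signal feeds a graph whose cut (solved as a linear program) separates candidate OCR+ from OCR- windows.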
Procedia PDF Downloads 150
2568 Integral Image-Based Differential Filters
Authors: Kohei Inoue, Kenji Hara, Kiichi Urahama
Abstract:
We describe a relationship between integral images and differential images. First, we derive a simple difference filter from the conventional integral image. In the derivation, we show that an integral image and the corresponding differential image are related to each other by simultaneous linear equations, in which the numbers of unknowns and equations are the same; therefore, we can execute the integration and differentiation by solving the simultaneous equations. We applied this relationship to an image fusion problem and experimentally verified the effectiveness of the proposed method. Keywords: integral images, differential images, differential filters, image fusion
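The integration/differentiation pairing described above can be made concrete: the summed-area table is the "integration", and a four-term finite difference of it recovers any original pixel, the "differentiation". A minimal sketch (pure Python, not the paper's fusion pipeline):

```python
def integral_image(img):
    """Summed-area table: I[y][x] = sum of img[0..y][0..x]."""
    h, w = len(img), len(img[0])
    I = [[0.0] * w for _ in range(h)]
    for y in range(h):
        run = 0.0
        for x in range(w):
            run += img[y][x]
            I[y][x] = run + (I[y - 1][x] if y > 0 else 0.0)
    return I

def pixel_from_integral(I, y, x):
    """Recover img[y][x] by finite differences of the integral image,
    i.e. the differentiation side of the integration/differentiation
    pair: img = I(y,x) - I(y,x-1) - I(y-1,x) + I(y-1,x-1)."""
    a = I[y][x]
    b = I[y][x - 1] if x > 0 else 0.0
    c = I[y - 1][x] if y > 0 else 0.0
    d = I[y - 1][x - 1] if (x > 0 and y > 0) else 0.0
    return a - b - c + d

img = [[1.0, 2.0], [3.0, 4.0]]
I = integral_image(img)
```

Viewed as the paper does, the forward pass and the difference filter form a square linear system: one equation per pixel, one unknown per pixel, invertible in either direction.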
Procedia PDF Downloads 506
2567 Solving the Transportation Problem for Warehouses and Dealers in Bangalore City
Authors: S. Aditya, K. T. Nideesh, N. Guruprasad
Abstract:
Being a subclass of the linear programming problem, the transportation problem is a classic Operations Research problem in which the objective is to determine a schedule for transporting goods from sources to destinations that minimizes the shipping cost while satisfying supply and demand constraints. In this paper, we model the transportation problem for various warehouses and various dealers situated in Bangalore city in order to reduce the transportation cost they currently incur. The problem is solved by obtaining an initial basic feasible solution through various methods and then proceeding to the optimal cost. Keywords: NW method, optimum utilization, transportation problem, Vogel's approximation method
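The NW (North-West corner) method named in the keywords is the simplest way to obtain an initial basic feasible solution: start in the top-left cell, ship as much as possible, and move right or down as a dealer is satisfied or a warehouse exhausted. A sketch on a hypothetical balanced instance (the supplies, demands, and costs are invented, not the paper's Bangalore data):

```python
def northwest_corner(supply, demand):
    """Initial basic feasible solution by the North-West corner rule.

    Assumes total supply == total demand; returns the allocation matrix.
    """
    s, d = supply[:], demand[:]
    alloc = [[0] * len(d) for _ in s]
    i = j = 0
    while i < len(s) and j < len(d):
        q = min(s[i], d[j])   # ship as much as possible at cell (i, j)
        alloc[i][j] = q
        s[i] -= q
        d[j] -= q
        if s[i] == 0:
            i += 1            # warehouse i exhausted: move down
        else:
            j += 1            # dealer j satisfied: move right
    return alloc

# hypothetical instance: 3 warehouses, 4 dealers (balanced: 75 units)
supply = [20, 30, 25]
demand = [10, 25, 15, 25]
alloc = northwest_corner(supply, demand)

cost = [[4, 6, 8, 13], [13, 11, 10, 8], [14, 4, 10, 13]]
total = sum(c * a for row_c, row_a in zip(cost, alloc)
            for c, a in zip(row_c, row_a))
```

Note that the rule ignores costs entirely; Vogel's approximation method usually starts closer to optimal, and either starting solution is then improved (e.g. by MODI/stepping-stone) to reach the optimal cost the paper reports.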
Procedia PDF Downloads 438
2566 Improved Computational Efficiency of Machine Learning Algorithm Based on Evaluation Metrics to Control the Spread of Coronavirus in the UK
Authors: Swathi Ganesan, Nalinda Somasiri, Rebecca Jeyavadhanam, Gayathri Karthick
Abstract:
The COVID-19 crisis presents a substantial and critical hazard to worldwide health. Since the occurrence of the disease in the UK in late January 2020, the number of people confirmed to have acquired the illness has increased tremendously across the country, and the number of individuals affected is undoubtedly considerably high. The purpose of this research is to develop a predictive machine learning model that can forecast COVID-19 cases within the UK. This study concentrates on statistical data collected from 31st January 2020 to 31st March 2021 in the United Kingdom. Information on total COVID cases registered, new cases encountered daily, total deaths registered, and deaths per day due to Coronavirus is collected from the World Health Organisation (WHO). Data preprocessing is carried out to identify any missing values, outliers, or anomalies in the dataset. The data is split in an 8:2 ratio for training and testing purposes to forecast future new COVID cases. Support Vector Machine (SVM), Random Forest, and linear regression algorithms are chosen to study the model performance in the prediction of new COVID-19 cases. From evaluation metrics such as the r-squared value and mean squared error, the statistical performance of the model in predicting new COVID cases is evaluated. Random Forest outperformed the other two machine learning algorithms with a training accuracy of 99.47% and a testing accuracy of 98.26% when n=30. The mean squared error obtained for Random Forest is 4.05e11, which is lower than that of the other predictive models used in this study. From the experimental analysis, the Random Forest algorithm performs more effectively and efficiently in predicting new COVID cases, which could help the health sector take relevant control measures against the spread of the virus. Keywords: COVID-19, machine learning, supervised learning, unsupervised learning, linear regression, support vector machine, random forest
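A minimal stdlib sketch of the chronological 8:2 split and evaluation by mean squared error, using an ordinary least-squares trend as a stand-in for the paper's SVM/Random Forest models (the case counts are toy numbers, not the WHO data):

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx
    return my - b * mx, b

# toy daily case counts; chronological 8:2 train/test split
days = list(range(10))
cases = [2, 5, 9, 13, 18, 22, 27, 31, 36, 41]
split = int(0.8 * len(days))          # first 80% of days for training
a, b = fit_linear(days[:split], cases[:split])
preds = [a + b * d for d in days[split:]]
mse = sum((y - p) ** 2 for y, p in zip(cases[split:], preds)) / len(preds)
print(round(b, 2), round(mse, 3))  # fitted slope and held-out MSE
```

Note the split is chronological rather than random, since the task is forecasting future days from past ones.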
Procedia PDF Downloads 121
2565 Response Regimes and Vibration Mitigation in Equivalent Mechanical Model of Strongly Nonlinear Liquid Sloshing
Authors: Maor Farid, Oleg Gendelman
Abstract:
An equivalent mechanical model of liquid sloshing in a partially filled cylindrical vessel is treated in the cases of free oscillations and of horizontal base excitation. The model is designed to cover both the linear and the essentially nonlinear sloshing regimes. The latter fluid behaviour might involve hydraulic impacts interacting with the inner walls of the tank. These impulsive interactions are often modeled by high-power potential and dissipation functions. For the sake of analytical description, we use the traditional approach of modeling the impacts with a velocity-dependent restitution coefficient. This modelling is similar to the vibro-impact nonlinear energy sink (VI NES), which was recently explored for its vibration mitigation performance and nonlinear response regimes. Steady-state periodic regimes and chaotic strongly modulated responses (CSMR) are detected. These dynamical regimes are described by the system's slow motion on the slow invariant manifold (SIM). There is good agreement between the analytical results and numerical simulations. Subsequently, the Finite Element (FE) method is used to determine and verify the model parameters and to identify dominant dynamical regimes, natural modes and frequencies. The tank failure modes and critical locations are identified. A mathematical relation is found between the degrees-of-freedom (DOFs) motion and the mechanical stress applied in the tank's critical section. This is, to the authors' knowledge, the first attempt to take large-amplitude nonlinear sloshing and tank structure elasticity effects into consideration for design, regulation definition and resistance analysis purposes. The contribution of both linear (tuned mass damper, TMD) and nonlinear (nonlinear energy sink, NES) passive energy absorbers to the overall system mitigation is examined, in terms of both stress reduction and time for vibration decay. Keywords: nonlinear energy sink (NES), reduced-order modelling, liquid sloshing, vibration mitigation, vibro-impact dynamics
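The velocity-dependent restitution modelling can be caricatured in a few lines; the functional form r(v) below is hypothetical, chosen only to show how successive impact speeds decay when each wall impact reverses and scales the velocity.

```python
def restitution(v, r0=0.95, c=0.02):
    """Hypothetical velocity-dependent restitution coefficient: faster
    impacts dissipate more energy, clamped to stay physically sensible."""
    return max(r0 - c * abs(v), 0.5)

# particle bouncing between rigid walls; track the impact speeds
v = 4.0
speeds = [v]
for _ in range(10):
    v = restitution(v) * v   # each impact scales the rebound speed
    speeds.append(v)
print(all(a > b for a, b in zip(speeds, speeds[1:])))  # energy decays
```

In the actual model this impact map is coupled to the sloshing mass's free flight between the tank walls; here only the dissipative impact step is shown.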
Procedia PDF Downloads 145
2564 How to Use E-Learning to Increase Job Satisfaction in Large Commercial Bank in Bangkok
Authors: Teerada Apibunyopas, Nithinant Thammakoranonta
Abstract:
Many organizations adopt e-Learning as a tool in their training and human development departments. It is growing in popularity because it offers access to knowledge at any time and provides rich content that can develop employees' skills efficiently. This study focused on the factors that affect using e-Learning efficiently, so that job satisfaction is increased. Questionnaires were sent to employees of large commercial banks in Bangkok that use e-Learning. The results from multiple linear regression analysis showed that employees' characteristics, characteristics of e-Learning, and learning and growth have an influence on job satisfaction. Keywords: e-Learning, job satisfaction, learning and growth, Bangkok
Procedia PDF Downloads 491
2563 Intelligent Staff Scheduling: Optimizing the Solver with Tabu Search
Authors: Yu-Ping Chiu, Dung-Ying Lin
Abstract:
Traditional staff scheduling methods, relying on employee experience, often lead to inefficiencies and resource waste. The challenges of transferring scheduling expertise and adapting to changing labor regulations further complicate this process. Manual approaches become increasingly impractical as companies accumulate complex scheduling rules over time. This study proposes an algorithmic optimization approach to address these issues, aiming to expedite scheduling while ensuring strict compliance with labor regulations and company policies. The method focuses on generating optimal schedules that minimize weighted company objectives within a compressed timeframe. Recognizing the limitations of conventional commercial software in modeling and solving complex real-world scheduling problems efficiently, this research employs Tabu Search with both long-term and short-term memory structures. The study will present numerical results and managerial insights to demonstrate the effectiveness of this approach in achieving intelligent and efficient staff scheduling. Keywords: intelligent memory structures, mixed integer programming, meta-heuristics, staff scheduling problem, tabu search
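A generic sketch of tabu search with a short-term memory of recently visited solutions, applied to a toy shift-assignment instance (the penalty matrix, tenure, and iteration budget are invented, not the study's rules):

```python
def tabu_search(cost, start, neighbors, tenure=5, iters=200):
    """Generic tabu search: moves to the best non-tabu neighbor each step,
    keeping a short-term memory (tabu list) of recent solutions."""
    current = best = start
    best_cost = cost(best)
    tabu = []                       # short-term memory structure
    for _ in range(iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = min(candidates, key=cost)   # may be uphill: escapes optima
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)
        if cost(current) < best_cost:
            best, best_cost = current, cost(current)
    return best, best_cost

# toy instance: assign 4 staff to 4 shifts, minimizing preference penalties
penalty = [[3, 1, 4, 2], [2, 3, 1, 4], [4, 2, 3, 1], [1, 4, 2, 3]]
cost = lambda p: sum(penalty[s][p[s]] for s in range(4))
def neighbors(p):
    out = []
    for i in range(4):
        for j in range(i + 1, 4):
            q = list(p)
            q[i], q[j] = q[j], q[i]   # swap two staff members' shifts
            out.append(tuple(q))
    return out

best, best_cost = tabu_search(cost, (0, 1, 2, 3), neighbors)
print(best_cost)  # each staff member ends on a penalty-1 shift: total 4
```

The study's long-term memory (e.g. frequency-based diversification) would sit alongside this short-term list; only the short-term mechanism is sketched here.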
Procedia PDF Downloads 25
2562 Numerical Study of Fatigue Crack Growth at a Web Stiffener of Ship Structural Details
Authors: Wentao He, Jingxi Liu, De Xie
Abstract:
It is necessary to manage fatigue crack growth (FCG) once cracks are detected during in-service inspections. In this paper, a simulation program (FCG-System) is developed utilizing the commercial software ABAQUS with its object-oriented programming interface to simulate the fatigue crack path and to compute the corresponding fatigue life. In order to apply FCG-System to large-scale marine structures, the substructure modeling technique is integrated into the system, taking into consideration structural details and load shedding during crack growth. Based on the nodal forces and nodal displacements obtained from finite element analysis, a formula for shell elements to compute stress intensity factors is proposed in view of the virtual crack closure technique. Cracks initiating from the intersection of the flange and the end of the web stiffener are investigated for fatigue crack paths and growth lives under water pressure loading and axial force loading, separately. It is found that the FCG-System developed by the authors can be an efficient tool for performing fatigue crack growth analysis of marine structures. Keywords: crack path, fatigue crack, fatigue life, FCG-System, virtual crack closure technique
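The virtual crack closure step can be sketched for the mode-I case: the nodal force at the crack tip and the opening displacement behind it give the energy release rate, from which the stress intensity factor follows (plane-stress relation K_I = sqrt(E·G_I)). This is the textbook VCCT relation, not the authors' shell-element formula; the numbers are illustrative only.

```python
from math import sqrt

def vcct_mode_i(F_y, du_y, da, t, E):
    """Mode-I energy release rate and stress intensity factor via the
    virtual crack closure technique (plane-stress form, SI units).

    F_y  : nodal force normal to the crack plane at the tip [N]
    du_y : crack opening displacement one element behind the tip [m]
    da   : crack extension (element length) [m]
    t    : thickness [m];  E : Young's modulus [Pa]
    """
    G = F_y * du_y / (2.0 * da * t)  # work to close the crack over area da*t
    K = sqrt(G * E)                  # plane stress: K_I = sqrt(E * G_I)
    return G, K

# illustrative values, not the paper's data
G, K = vcct_mode_i(F_y=1000.0, du_y=2e-5, da=1e-3, t=5e-3, E=210e9)
print(round(G), round(K / 1e6, 1))  # ~2000 J/m^2, ~20.5 MPa*sqrt(m)
```

In the FCG loop, K computed this way would feed a growth law (e.g. Paris-type) to advance the crack and accumulate fatigue life.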
Procedia PDF Downloads 568
2561 A Robust Optimization Model for Multi-Objective Closed-Loop Supply Chain
Authors: Mohammad Y. Badiee, Saeed Golestani, Mir Saman Pishvaee
Abstract:
In recent years, consumers and governments have increasingly pushed companies to design their activities in such a way as to reduce negative environmental impacts, for example by producing recyclable products or adopting hazard-free disposal policies. It is therefore important to optimize the various aspects of the total supply chain more accurately. Modeling a supply chain can be a challenging process due to the fact that a large number of factors need to be considered in the model. The use of multi-objective optimization can help overcome those problems, since more information is used when designing the model. Uncertainty is inevitable in the real world. Considering uncertainty in the parameters, in addition to using multiple objectives, gives more flexibility to the decision-making process, since the process can take into account many more constraints and requirements. In this paper, we demonstrate a stochastic scenario-based robust model to cope with uncertainty in a closed-loop multi-objective supply chain. By applying the proposed model to a real-world case, the power of the proposed model in handling data uncertainty is shown. Keywords: supply chain management, closed-loop supply chain, multi-objective optimization, goal programming, uncertainty, robust optimization
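A stripped-down sketch of scenario-based robust evaluation in the spirit of the paper: multiple objectives are aggregated by weights per scenario, and a penalty on the worst-case deviation from the expected outcome makes the choice robust. All names, weights, and figures below are invented; the actual model is a mathematical program, not an enumeration.

```python
# Each candidate design has (cost, emissions) under three demand scenarios.
scenarios = {"low": 0.3, "base": 0.5, "high": 0.2}   # scenario probabilities
outcomes = {
    "design_A": {"low": (100, 8), "base": (120, 9), "high": (160, 11)},
    "design_B": {"low": (110, 6), "base": (115, 7), "high": (130, 8)},
}
w_cost, w_emis, w_robust = 1.0, 10.0, 0.5            # objective weights

def robust_score(design):
    """Expected weighted objective plus a penalty on worst-case deviation."""
    vals = {s: w_cost * c + w_emis * e
            for s, (c, e) in outcomes[design].items()}
    expected = sum(scenarios[s] * v for s, v in vals.items())
    worst_dev = max(v - expected for v in vals.values())
    return expected + w_robust * worst_dev           # robustness penalty

best = min(outcomes, key=robust_score)
print(best)  # design_B: slightly dearer on average, far less scenario risk
```

The penalty term is what distinguishes the robust formulation from a plain stochastic (expected-value) one: a design that is cheap on average but fragile in one scenario is ranked down.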
Procedia PDF Downloads 416
2560 Analysis of the Inverse Kinematics for 5 DOF Robot Arm Using D-H Parameters
Authors: Apurva Patil, Maithilee Kulkarni, Ashay Aswale
Abstract:
This paper proposes an algorithm to develop the kinematic model of a 5 DOF robot arm. The formulation of the problem is based on finding the D-H parameters of the arm. A brute-force iterative method is employed to solve the system of nonlinear equations. The focus of the paper is to obtain accurate solutions by reducing the root mean square error. The solutions obtained are then used to grip objects. The trajectories followed by the end effector for the required workspace coordinates are plotted. The methodology used here can be applied to any other kinematic chain of up to six DOF. Keywords: 5 DOF robot arm, D-H parameters, inverse kinematics, iterative method, trajectories
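The forward map underlying the D-H formulation can be sketched as a chain of standard link transforms; inverse kinematics then iterates on the joint angles until this map hits the target pose. The two-link parameters below are illustrative, not the 5 DOF arm's actual D-H table.

```python
from math import cos, sin, pi

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg link transform (4x4, row-major)."""
    return [
        [cos(theta), -sin(theta) * cos(alpha),  sin(theta) * sin(alpha), a * cos(theta)],
        [sin(theta),  cos(theta) * cos(alpha), -cos(theta) * sin(alpha), a * sin(theta)],
        [0.0,         sin(alpha),               cos(alpha),              d],
        [0.0,         0.0,                      0.0,                     1.0],
    ]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# forward kinematics for a toy 2-link planar arm: (theta, d, a, alpha)
links = [(pi / 2, 0.0, 1.0, 0.0), (0.0, 0.0, 1.0, 0.0)]
T = [[float(i == j) for j in range(4)] for i in range(4)]  # identity
for params in links:
    T = matmul(T, dh_matrix(*params))
print([round(v, 3) for v in (T[0][3], T[1][3], T[2][3])])  # end-effector xyz
```

With the first joint at 90 degrees and both unit links straight, the end effector sits at (0, 2, 0), which the iterative solver would recover from the target coordinates.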
Procedia PDF Downloads 202
2559 Nonlinear Observer Canonical Form for Genetic Regulation Process
Authors: Bououden Soraya
Abstract:
This paper aims to study the existence of a change of coordinates that transforms a class of nonlinear dynamical systems into the so-called nonlinear observer canonical form (NOCF). Moreover, an algorithm to construct such a change of coordinates is given. Based on this form, we can design an observer with a linear error dynamic, which enables us to estimate the state of a nonlinear dynamical system. A concrete example (a biological model) is provided to illustrate the feasibility of the proposed results. Keywords: nonlinear observer canonical form, observer design, gene regulation, gene expression
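The payoff of the canonical form, a linear error dynamic, can be illustrated with a Luenberger observer for a toy discrete-time linear system (matrices and gain are invented): the estimation error obeys e_{k+1} = (A - LC) e_k, so choosing L to place the eigenvalues of A - LC inside the unit circle drives the error to zero regardless of the initial state.

```python
# toy plant: A = state matrix, C = output row, L = observer gain (invented)
A = [[1.0, 0.1], [0.0, 0.9]]
C = [1.0, 0.0]
L = [0.8, 0.5]  # places the eigenvalues of A - L*C inside the unit circle

def step(M, x):  # 2x2 matrix-vector product
    return [M[0][0] * x[0] + M[0][1] * x[1],
            M[1][0] * x[0] + M[1][1] * x[1]]

x, xhat = [1.0, -1.0], [0.0, 0.0]   # true state and (wrong) initial estimate
for _ in range(50):
    y = C[0] * x[0] + C[1] * x[1]                    # measurement
    innov = y - (C[0] * xhat[0] + C[1] * xhat[1])    # innovation
    x = step(A, x)
    xhat = [v + L[i] * innov for i, v in enumerate(step(A, xhat))]
err = abs(x[0] - xhat[0]) + abs(x[1] - xhat[1])
print(err < 1e-3)  # the linear error dynamic has driven e_k to ~zero
```

For the nonlinear systems the paper treats, the change of coordinates into NOCF is exactly what makes the error equation linear like this one.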
Procedia PDF Downloads 433
2558 An Unexpected Helping Hand: Consequences of Redistribution on Personal Ideology
Authors: Simon B.A. Egli, Katja Rost
Abstract:
Literature on redistributive preferences has proliferated in recent decades. A core assumption behind it is that variation in redistributive preferences can explain different levels of redistribution. In contrast, this paper considers the reverse: what if it is redistribution that changes redistributive preferences? The core assumption behind the argument is that if self-interest (which we label concrete preferences) and ideology (which we label abstract preferences) come into conflict, the former will prevail and lead to an adjustment of the latter. To test the hypothesis, data from a survey conducted in Switzerland during the first wave of the COVID-19 crisis is used. A significant portion of the workforce at the time unexpectedly received state money through the short-time working program. Short-time work was used as a proxy for self-interest and was tested (1) on the support given to hypothetical, ailing firms during the crisis and (2) on the prioritization of justice principles guiding state action. In a first step, several models using OLS regressions on political orientation were estimated to test our hypothesis as well as to check for non-linear effects. We expected support for ailing firms to be the same regardless of ideology, but only for people on short-time work. The results both confirm our hypothesis and suggest a non-linear effect: far-right individuals on short-time work were disproportionately supportive compared to moderate ones. In a second step, ordered logit models were estimated to test the impact of short-time work and political orientation on the rankings of the distributive justice principles of need, performance, entitlement, and equality. The results show that being on short-time work significantly alters the prioritization of justice principles: right-wing individuals are much more likely to prioritize need and equality over performance and entitlement when they receive government assistance. No such effect is found among left-wing individuals. In conclusion, we provide moderate to strong evidence that unexpectedly finding oneself at the receiving end changes redistributive preferences if personal ideology is antithetical to redistribution. The implications of our findings for the study of populism, personal ideologies, and political change are discussed. Keywords: COVID-19, ideology, redistribution, redistributive preferences, self-interest
Procedia PDF Downloads 140
2557 A Multi-Tenant Problem Oriented Medical Record System for Representing Patient Care Cases using SOAP (Subjective-Objective-Assessment-Plan) Note
Authors: Sabah Mohammed, Jinan Fiaidhi, Darien Sawyer
Abstract:
Describing clinical cases according to a clinical charting standard that enforces interoperability and enables connected care services can save lives in the event of a medical emergency and can provide efficient and effective interventions for the benefit of patients through the integration of bedside and benchside clinical research. This article presents a multi-tenant extension to the problem-oriented medical record that we previously prototyped using the GraphQL Application Programming Interface to represent the notion of a problem list. Our extension enables physicians and patients to collaboratively describe the patient case via multiple chatbots using the SOAP charting standard. It also connects the described SOAP patient case with the HL7 FHIR (Fast Healthcare Interoperability Resources) medical record, linking the patient case to the bench data. Keywords: problem-oriented medical record, GraphQL, chatbots, SOAP
Procedia PDF Downloads 91
2556 Optimizing Logistics for Courier Organizations with Considerations of Congestions and Pickups: A Courier Delivery System in Amman as Case Study
Authors: Nader A. Al Theeb, Zaid Abu Manneh, Ibrahim Al-Qadi
Abstract:
The traveling salesman problem (TSP) is a combinatorial integer optimization problem that asks: "What is the optimal route for a vehicle to traverse in order to deliver requests to a given set of customers?" It is widely used by package carrier companies' distribution centers. The main goal of applying the TSP in courier organizations is to minimize the time it takes the courier on each trip to deliver or pick up the shipments during a day. In this article, an optimization model is constructed to create a new TSP variant that optimizes the routing in a courier organization with a consideration of congestion in Amman, the capital of Jordan. Real data were collected by different methods and analyzed. Then, Concert Technology (CPLEX) was used to solve the proposed model for some randomly generated data instances and for the real collected data. In the end, the results showed a great improvement in time compared with the current trip times, and an economic study was conducted afterwards to assess the impact of using such models. Keywords: traveling salesman problem, congestion, pick-up, integer programming, package carriers, service engineering
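For small stop counts, a single courier trip with congestion-dependent travel times can be solved exhaustively; note the times are asymmetric (A to B need not equal B to A), which is how congestion enters the cost. The depot, stops, and minute values below are invented, and the paper's CPLEX model of course handles far larger instances than brute force can.

```python
from itertools import permutations

def best_route(depot, stops, travel_time):
    """Exhaustive TSP for one courier trip (fine for a handful of stops)."""
    best_order, best_time = None, float("inf")
    for order in permutations(stops):
        route = (depot,) + order + (depot,)          # leave and return
        t = sum(travel_time(a, b) for a, b in zip(route, route[1:]))
        if t < best_time:
            best_order, best_time = order, t
    return best_order, best_time

# illustrative asymmetric minutes: congestion makes A->B differ from B->A
times = {
    ("D", "A"): 10, ("A", "D"): 14, ("D", "B"): 12, ("B", "D"): 12,
    ("D", "C"): 9,  ("C", "D"): 11, ("A", "B"): 5,  ("B", "A"): 8,
    ("A", "C"): 7,  ("C", "A"): 7,  ("B", "C"): 6,  ("C", "B"): 10,
}
order, total = best_route("D", ("A", "B", "C"), lambda a, b: times[(a, b)])
print(order, total)  # best tour D-A-B-C-D in 32 minutes
```

Brute force is O(n!); the integer-programming formulation solved with CPLEX is what makes city-scale instances with congestion and pickups tractable.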
Procedia PDF Downloads 429