Search results for: probabilistic methods
14776 The Anatomy and Characteristics of Online Romance Scams
Authors: Danuvasin Charoen
Abstract:
Online romance scams are conducted by criminals using social networks and dating sites. These criminals exploit love to deceive victims into sending them money. The victims not only lose money to the criminals but are also heartbroken. This study investigates how online romance scams work and why people fall victim to them. The researcher also identifies the characteristics of the perpetrators and victims. The data were collected from in-depth interviews with former victims and with police officers responsible for the cases. By studying the methods and characteristics of the online romance scam, we can develop effective methods and policies to reduce the rates of such crimes.
Keywords: romance scam, online scam, phishing, cybercrime
Procedia PDF Downloads 157
14775 A Combinatorial Approach of Treatment for Landfill Leachate
Authors: Anusha Atmakuri, R. D. Tyagi, Patrick Drogui
Abstract:
Landfilling is the most familiar and easiest way to dispose of solid waste. A landfill generally receives municipal waste from areas near the site, collected from commercial, industrial, and residential sources, among others. Landfill leachate (LFL) is formed when rainwater passes through the waste placed in landfills and consists of several dissolved organic materials, for instance, aquatic humic substances (AHS), volatile fatty acids (VFAs), heavy metals, inorganic macro components, and xenobiotic organic matter, which are highly toxic to the environment. These components place a heavy load on the LFL, so it must be treated before its discharge into the environment. Various methods have been used to treat LFL over the years, such as physical, chemical, biological, physicochemical, electrical, and advanced oxidation methods. This study focuses on the combination of biological and electrochemical methods: extracellular polymeric substances and electrocoagulation (EC). Coupling the electrocoagulation process with extracellular polymeric substances (EPS) as a flocculant, in a pre- and/or post-treatment strategy, provides an efficient and economical process for the decontamination of landfill leachate contaminated with suspended matter, metals (e.g., Fe, Mn), and ammoniacal nitrogen. The electrocoagulation and EPS-mediated coagulation approach could be economically viable for the treatment of landfill leachate, and it has several other advantages over other methods. This study utilised waste substrates such as activated sludge, crude glycerol, and waste cooking oil for the production of EPS using fermentation technology. A comparison of different scenarios for the treatment of landfill leachate is presented: using EPS alone as a bioflocculant, EPS followed by EC, and EC followed by EPS. The work establishes the use of crude EPS as a bioflocculant for the treatment of landfill leachate and of wastewater from a site near a landfill, with EC successfully removing major pollutants such as COD, turbidity, and total suspended solids. The combination of these two methods is to be explored further for the complete removal of all pollutants from landfill leachate.
Keywords: landfill leachate, extracellular polymeric substances, electrocoagulation, bioflocculant
Procedia PDF Downloads 86
14774 CMMI Key Process Areas and FDD Practices
Authors: Rituraj Deka, Nomi Baruah
Abstract:
The development of information technology during the past few years has resulted in the design of more and more complex software. The outsourcing of software development places higher requirements on the management of software development projects. Various software enterprises follow various paths in their pursuit of excellence, applying various principles, methods and techniques along the way. Recent research shows that CMMI and Agile methodologies can complement each other, and that organizations using both methods have the potential to dramatically improve business performance. The paper describes a mapping between CMMI key process areas (KPAs) and Feature-Driven Development (FDD) practices from a communication perspective, so as to increase the understanding of how improvements can be made in the software development process.
Keywords: Agile, CMMI, FDD, KPAs
Procedia PDF Downloads 459
14773 Historical Studies on Gilt Decorations on Glazed Surfaces
Authors: Sabra Saeidi
Abstract:
This research focuses on the historical techniques associated with the Lajevardina and Haft-Rangi methods of tile production, with emphasis on identifying the techniques of applying gold sheets to the surface of such historical glazed tiles. In this regard, the history of the production of enamelled, gold-plated, and Lajevardina-glazed pottery made during the Khwarizmanshahid and Mongol eras (eleventh to thirteenth centuries) is first assessed to reach a better understanding of the background and history of historical glazing methods. After the historical overview of the production techniques of glazed pottery and an introduction to the civilizations that used them, the focus turns to the niche production methods of enamel and Lajevardina glazing, which are two categories of decoration usually found on tiles. Next, a general classification of the various types of gilt tiles is introduced, which is applicable to tile works up to the Safavid period (sixteenth to seventeenth centuries). Gilded Lajevardina-glazed tiles, gilt Haft-Rangi tiles, monolithic glazed gilt tiles, and gilt mosaic tiles are included in the categories.
Keywords: gilt tiles, Islamic art, Iranian art, historical studies, gilding
Procedia PDF Downloads 123
14772 New Method for the Synthesis of Different Pyrroloquinazolinoquinolin Alkaloids
Authors: Abdulkareem M. Hamid, Yaseen Elhebshi, Adam Daïch
Abstract:
Luotonins and their derivatives (isoluotonins) are alkaloids from the aerial parts of Peganum nigellastrum Bunge that display three major skeleton types. Luotonins A, B, and E are pyrroloquinazolinoquinoline alkaloids. Only a few methods are known for the synthesis of isoluotonin. All luotonins have shown promising cytotoxicities towards selected human cancer cell lines, especially against leukemia P-388 cells. Luotonin A is the most active one, with its activity stemming from topoisomerase I-dependent DNA cleavage. Such intriguing biological activities and unique structures have led not only to the development of synthetic methods for the efficient synthesis of these compounds, but also to interest in structural modifications for improving the biological properties. Recent progress in the study of luotonins is covered.
Keywords: luotonin A, isoluotonin, pyrroloquinolines, alkaloids
Procedia PDF Downloads 417
14771 Alternative Animal Feed Additive Obtained with Different Drying Methods from Carrot Unsuitable for Human Consumption
Authors: Rabia Göçmen, Gülşah Kanbur, Sinan Sefa Parlat
Abstract:
This study was conducted to determine whether carrot powder obtained by different drying methods (oven and vacuum-freeze dryer) from carrots unfit for human consumption can serve as a feed additive in animal nutrition. Carrots were randomly divided into two groups. The first group was dried in an oven, and the second group was dried with the vacuum freeze dryer method. The carrot powder prepared from fresh carrot was analysed for nutrient content (energy, crude protein, crude oil, crude ash, beta carotene, mineral concentration and colour). The differences between groups in terms of energy, crude protein, ash, Ca and Mg were not significant (P > 0.05). Crude oil, P and beta carotene contents and colour values (L, a, b) were greater in the vacuum-freeze dryer group than in the oven group (P < 0.05). Consequently, carrot powder obtained by drying with the vacuum-freeze dryer method can be used as a source of carotene.
Keywords: carrot, vacuum freeze dryer, oven, beta carotene
Procedia PDF Downloads 324
14770 Theoretical Studies on the Formation Constant, Geometry, Vibrational Frequencies and Electronic Properties of Dinuclear Molybdenum Complexes
Authors: Mahboobeh Mohadeszadeh, Behzad Padidaran Moghaddam
Abstract:
In order to measure the formation constant of dinuclear molybdenum complexes, the reactants and the products were first optimized separately and then their frequencies were calculated. Next, using Hartree-Fock (HF) and density functional theory (DFT) methods, theoretical studies on the geometrical parameters, electronic properties and vibrational frequencies of the dinuclear molybdenum complex [C40H44Mo2N2O20] were carried out. These calculations were performed with the B3LYP, BPV86, B3PW91 and HF methods using the LANL2DZ (for Mo) + 6-311G (for the other atoms) basis sets. To estimate the deviation between theoretical and experimental data, R-square, S-error and RMS values were computed; according to these values, the DFT methods agree more closely with the experimental data than the HF method. In addition, from the electronic structure of the compounds, the percentage contribution of atomic orbitals to the molecular orbitals, the atomic charges, the stabilization energies, and the HOMO and LUMO orbital energies were obtained.
Keywords: geometrical parameters, hydrogen bonding, electronic properties, vibrational frequencies
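As an illustration of how agreement between calculated and experimental parameters can be quantified, a minimal sketch of the RMS and R-square measures mentioned above (the numbers are hypothetical, not values from the study):

```python
import numpy as np

# Hypothetical bond lengths (angstrom): experimental values vs. values from one of the DFT runs
experimental = np.array([2.10, 1.72, 1.68, 2.35])
calculated = np.array([2.12, 1.70, 1.69, 2.31])

residuals = calculated - experimental
rms = np.sqrt(np.mean(residuals ** 2))                     # root-mean-square deviation
ss_res = np.sum(residuals ** 2)
ss_tot = np.sum((experimental - experimental.mean()) ** 2)
r_square = 1.0 - ss_res / ss_tot                           # closer to 1 means better agreement

print(f"RMS = {rms:.4f} angstrom, R-square = {r_square:.4f}")
```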
Procedia PDF Downloads 274
14769 Effect of Extraction Methods on the Fatty Acids and Physicochemical Properties of Serendipity Berry Seed Oil
Authors: Olufunmilola A. Abiodun, Adegbola O. Dauda, Ayobami Ojo, Samson A. Oyeyinka
Abstract:
Serendipity berry (Dioscoreophyllum cumminsii Diels) is a tropical dioecious rainforest vine native to tropical Africa. The vine grows during the rainy season and is used mainly as a sweetener. The sweetener in the berry is known as monellin, which is sweeter than sucrose. The sweetener is extracted from the fruits and the seed is discarded. The discarded seeds contain bitter principles but give a high yield of oil. Serendipity oil was extracted using three methods (n-hexane, expression, and expression/n-hexane), and the fatty acids and physicochemical properties of the oil obtained were determined. The oil obtained was clear and liquid and had an odour similar to hydrocarbons. The oil yields were 38.59, 12.34 and 49.57% for the hexane, expression and expression-hexane methods, respectively. The seed contained a high percentage of oil, especially when the combination of expression and hexane was used; a low percentage of oil was obtained using the expression method alone. The refractive index values obtained were 1.443, 1.442 and 1.478 for the hexane, expression and expression-hexane methods, respectively. The peroxide value obtained for expression-hexane was higher than those for hexane and expression. The viscosities of the oil were 125.8, 128.76 and 126.87 cm³/s for the hexane, expression and expression-hexane methods, respectively, which shows that the oil from the expression method was more viscous than the other oils. The major fatty acids in serendipity seed oil were oleic acid (62.81%), linoleic acid (22.65%), linolenic acid (6.11%), palmitic acid (5.67%) and stearic acid (2.21%), in decreasing order. Oleic acid, a monounsaturated fatty acid, had the highest value. Total unsaturated fatty acids were 91.574, 92.256 and 90.426% for hexane, expression and expression-hexane, respectively. The combination of expression and hexane for the extraction of serendipity oil produced a high yield of oil. The oil could be refined for food and non-food applications.
Keywords: serendipity seed oil, expression method, fatty acid, hexane
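As a quick consistency check, the total unsaturated fraction reported for the hexane extract follows directly from the individual unsaturated acids quoted above (a sketch using the rounded values):

```python
# Unsaturated fatty acids reported for the hexane extract (%)
oleic, linoleic, linolenic = 62.81, 22.65, 6.11
total_unsaturated = oleic + linoleic + linolenic
print(total_unsaturated)  # 91.57, consistent with the reported 91.574 % total
```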
Procedia PDF Downloads 273
14768 Semi-Automatic Segmentation of Mitochondria on Transmission Electron Microscopy Images Using Live-Wire and Surface Dragging Methods
Authors: Mahdieh Farzin Asanjan, Erkan Unal Mumcuoglu
Abstract:
Mitochondria are cytoplasmic organelles of the cell, which have a significant role in a variety of cellular metabolic functions. Mitochondria act as the power plants of the cell and are surrounded by two membranes. Significant morphological alterations are often due to changes in mitochondrial functions. A powerful technique for studying the three-dimensional (3D) structure of mitochondria and its alterations in disease states is electron microscope tomography. Detection of mitochondria in electron microscopy images is a challenging problem due to the presence of various subcellular structures and imaging artifacts. Another challenge is that each image typically contains more than one mitochondrion. Hand segmentation of mitochondria is tedious and time-consuming and also requires specialist knowledge about mitochondria. Fully automatic segmentation methods lead to over-segmentation, and mitochondria are not segmented properly. Therefore, semi-automatic segmentation methods with minimum manual effort are required to edit the results of fully automatic segmentation methods. Here, two editing tools were implemented by applying spline surface dragging and interactive live-wire segmentation tools. These editing tools were applied separately to the results of fully automatic segmentation. The 3D extension of these tools was also studied and tested. Dice coefficients in 2D and 3D for surface dragging using splines were 0.93 and 0.92; for the live-wire method they were 0.94 and 0.91, respectively. The root mean square symmetric surface distance values in 2D and 3D for surface dragging were measured as 0.69 and 0.93; the same metrics for the live-wire tool were 0.60 and 2.11. Comparing the results of these editing tools with the results of the automatic segmentation method shows that these editing tools led to better results that were more similar to the ground truth image, but the required time was higher than the hand-segmentation time.
Keywords: medical image segmentation, semi-automatic methods, transmission electron microscopy, surface dragging using splines, live-wire
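The Dice coefficient used to score the editing tools can be computed directly from binary masks; a minimal sketch with toy arrays standing in for a segmentation result and its ground truth:

```python
import numpy as np

def dice_coefficient(seg: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2 * |A intersect B| / (|A| + |B|) for binary masks."""
    seg = seg.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(seg, truth).sum()
    return 2.0 * intersection / (seg.sum() + truth.sum())

# Toy 2D masks standing in for a segmented mitochondrion and its ground truth
seg = np.array([[0, 1, 1], [0, 1, 1], [0, 0, 0]])
truth = np.array([[0, 1, 1], [0, 1, 0], [0, 0, 0]])
print(dice_coefficient(seg, truth))  # 6/7 = 0.857...
```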
Procedia PDF Downloads 169
14767 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems
Authors: Mergim Gasia, Bojan Milovanovica, Sanjin Gumbarevic
Abstract:
Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved in the future. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands compared to stationary calculations, which do not take the building's thermal mass into account. Software used for these dynamic simulations relies on methods based on analytical models, since numerical models are inefficient for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). Two methods for calculating the CTFs covered by this research are the Laplace method and the state-space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter sampling times used in the calculation. The algorithms for both the Laplace and state-space methods are implemented in Mathematica, and the results are compared to the results from EnergyPlus and TRNSYS, since these software packages use similar algorithms for the calculation of a building's energy demand. This research aims to check the efficiency of the Laplace and state-space methods for calculating the building's energy demand for heavyweight building elements and shorter sampling times, and it also provides the means for improving the algorithms used by these methods. As the reference point for the boundary heat flux density, the finite difference method (FDM) is used. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not used properly.
Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method
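As an illustration of the FDM reference mentioned above, a minimal 1D explicit finite difference sketch for transient conduction through a homogeneous heavyweight wall (the material properties, discretisation and boundary temperatures are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Illustrative heavyweight wall: 0.30 m thick, k = 1.6 W/mK, rho*c = 2.0e6 J/m3K
L, k, rho_c = 0.30, 1.6, 2.0e6
alpha = k / rho_c                      # thermal diffusivity (m^2/s)
nx = 31
dx = L / (nx - 1)
dt = 0.4 * dx ** 2 / alpha             # satisfies the explicit stability limit (Fourier number <= 0.5)

T = np.full(nx, 20.0)                  # initial temperature (deg C)
T_in, T_out = 20.0, 0.0                # fixed surface temperatures

for _ in range(5000):                  # march forward in time
    T_new = T.copy()
    T_new[1:-1] = T[1:-1] + alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    T_new[0], T_new[-1] = T_in, T_out
    T = T_new

q_inner = -k * (T[1] - T[0]) / dx      # boundary heat flux density (W/m^2)
print(f"inner-surface flux = {q_inner:.1f} W/m2")
```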
Procedia PDF Downloads 133
14766 Assessing Usability of Behavior Coaching Organizer
Authors: Nathaniel A. Hoston
Abstract:
Teacher coaching is necessary for improving student behaviors. While coaching technologies (e.g., bug-in-ear coaching, video coaching) can assist the coaching process, little is known about the usability of those tools. This study assessed the usability and perceived efficacy of the Behavior Coaching Organizer (BCO) using usability testing methods (i.e., concurrent think-aloud and retrospective probing) in a simulated learning environment. Participants found the BCO moderately usable while perceiving the tool as highly effective for addressing concerning student behaviors. Additionally, participants noted a general need for continued coaching support. The results indicate a need for further usability testing within education research.
Keywords: behavioral interventions, Behavior Coaching Organizer, coaching technologies, usability methods
Procedia PDF Downloads 124
14765 Status of Bio-Graphene Extraction from Biomass: A Review
Authors: Simon Peter Wafula, Ziporah Nakabazzi Kitooke
Abstract:
Graphene is a two-dimensional carbon allotrope. This material has attracted the interest of many materials researchers due to its properties, which are exceptional compared to ordinary materials. Graphene is thought to enhance a number of material properties in the manufacturing, energy, and construction industries. Many studies consider graphene to be a wonder material, just like plastic in the 21st century, which shows how much should be invested in graphene research. This review highlights the status of graphene extracted from various biomass sources, together with the appropriate extraction techniques, including the pretreatment methods needed for a better product. The functional groups and structure of graphene extracted using several common synthesis methods are covered in this paper as well. The review explores methods such as chemical vapor deposition (CVD), hydrothermal synthesis, chemical exfoliation, liquid exfoliation, and the Hummers method. A comparative analysis of the various extraction techniques gives an insight into their advantages, challenges, and potential scalability. The review also highlights the pretreatment of biomass before carbonization for better-quality bio-graphene. The various graphene modes, as well as their applications, are included in this study. Recommendations for future research on improving the efficiency and sustainability of bio-graphene are highlighted.
Keywords: exfoliation, nanomaterials, biochar, large-scale, two-dimension
Procedia PDF Downloads 49
14764 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting
Authors: Divya Haridas, C. B. Sobhan
Abstract:
Laser interferometric methods have been utilized for the measurement of natural convection heat transfer from a heated vertical flat plate in the investigation presented here. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of a Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability in heat transfer measurements. The interference fringe patterns are analyzed digitally using the MATLAB 7 and MOTIC Plus software packages, which ensure improved efficiency in fringe analysis, hence reducing the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used.
Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, vertical flat plate
Procedia PDF Downloads 364
14763 Elastohydrodynamic Lubrication Study Using Discontinuous Finite Volume Method
Authors: Prawal Sinha, Peeyush Singh, Pravir Dutt
Abstract:
Problems in elastohydrodynamic lubrication have attracted a lot of attention in the last few decades. Solving a two-dimensional problem has always been a big challenge. In this paper, a new discontinuous finite volume method (DVM) for the two-dimensional point-contact elastohydrodynamic lubrication (EHL) problem has been developed and analyzed. A complete algorithm has been presented for solving such a problem. The method presented is robust and easily parallelized in an MPI architecture. The GMRES technique is implemented to solve the matrix obtained after the formulation. A new approach is followed in which discontinuous piecewise polynomials are used for the trial functions. It is natural to assume that the advantages of using discontinuous functions in finite element methods should also apply to finite volume methods. The nature of the discontinuity of the trial function is such that the elements in the corresponding dual partition have the smallest support compared with classical finite volume methods. The film thickness calculation is done using a singular quadrature approach. The results obtained have been presented graphically and discussed. This method is well suited for solving the EHL point-contact problem and can probably be used in commercial software.
Keywords: elastohydrodynamic, lubrication, discontinuous finite volume method, GMRES technique
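As a small illustration of the GMRES step named above, a SciPy sketch of solving a sparse linear system iteratively; the matrix here is a generic placeholder, not the DVM-discretised EHL system:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Placeholder sparse system standing in for the assembled discretised equations
n = 200
A = diags([-1.0, 2.5, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = gmres(A, b, restart=50, maxiter=1000)   # info == 0 means the iteration converged
print("converged:", info == 0, " residual:", np.linalg.norm(A @ x - b))
```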
Procedia PDF Downloads 257
14762 Age–Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years
Authors: Yu. I. Pigolkin, M. A. Garcia Corro
Abstract:
Age determination of unknown dead bodies in forensic personal identification is a complicated process that involves the application of numerous methods and techniques. Skeletal remains are less exposed to the influence of environmental factors. In order to enhance the accuracy of forensic age estimation, additional properties of bones correlating with age need to be revealed. Material and methods: Dimensional examination of the sella turcica was carried out on cadavers with the cranium opened by a circular vibrating saw. The sample consisted of a total of 90 Russian subjects, ranging in age from two months to 87 years. Results: A tendency of dimensional variation throughout life was detected. There were no observed gender differences in the morphometry of the sella turcica. The combined use of the sella turcica depth and length values revealed the possibility of assigning an examined sample to a certain age period. Conclusions: Based on the results of existing methods of age determination, the morphometry of the sella turcica can be an additional characteristic, supplementing the obtained values and, accordingly, increasing the accuracy of forensic biological age diagnosis.
Keywords: age-related changes in bone structures, forensic personal identification, sella turcica morphometry, body identification
Procedia PDF Downloads 275
14761 Local Radial Basis Functions for Helmholtz Equation in Seismic Inversion
Authors: Hebert Montegranario, Mauricio Londoño
Abstract:
Solutions of the Helmholtz equation are essential in seismic imaging methods like full waveform inversion, which needs to solve the wave equation many times. Traditional methods like the Finite Element Method (FEM) or Finite Differences (FD) produce sparse matrices but may suffer the so-called pollution effect in the numerical solutions of the Helmholtz equation for large values of the wave number. On the other side, global radial basis functions have better accuracy but produce full matrices that become unstable. In this research, we combine the virtues of both approaches to find numerical solutions of the Helmholtz equation by applying a meshless method that produces sparse matrices through local radial basis functions. We solve the equation with absorbing boundary conditions of the Clayton-Engquist and PML (Perfectly Matched Layer) kinds and compare with results in the standard literature, showing a promising performance by tackling both the pollution effect and matrix instability.
Keywords: Helmholtz equation, meshless methods, seismic imaging, wavefield inversion
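A minimal sketch of global RBF collocation for a 1D model Helmholtz problem u'' + k²u = 0 with Dirichlet data; the resulting matrix is full and becomes ill-conditioned, which is exactly the issue the local, sparse variant described above addresses (all parameters here are illustrative, not the paper's setup):

```python
import numpy as np

k = 6.0                                    # wave number
n = 41
x = np.linspace(0.0, 1.0, n)               # collocation nodes
eps = 8.0                                  # Gaussian shape parameter (accuracy and conditioning depend strongly on it)

d = x[:, None] - x[None, :]                # pairwise node differences
phi = np.exp(-(eps * d) ** 2)              # Gaussian RBF matrix
phi_xx = phi * (4 * eps ** 4 * d ** 2 - 2 * eps ** 2)   # second derivative of each RBF

# Helmholtz operator u'' + k^2 u at interior rows, Dirichlet rows at the two boundary nodes
A = phi_xx + k ** 2 * phi
A[0, :], A[-1, :] = phi[0, :], phi[-1, :]
rhs = np.zeros(n)
rhs[0], rhs[-1] = 0.0, np.sin(k)           # boundary data of the exact solution sin(kx)

c = np.linalg.solve(A, rhs)                # expansion coefficients (full, possibly ill-conditioned system)
u = phi @ c
print("max error vs sin(kx):", np.max(np.abs(u - np.sin(k * x))))
```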
Procedia PDF Downloads 547
14760 Maintaining User-Level Security in Short Message Service
Authors: T. Arudchelvam, W. W. E. N. Fernando
Abstract:
The mobile phone has become an essential thing in our lives; therefore, security is the most important thing to be considered in mobile communication. The short message service is the cheapest way of communicating via mobile phones, so security is very important in the short message service as well. This paper presents a method to maintain security at the user level. Different types of encryption methods are used to implement user-level security in mobile phones. The Caesar cipher, Rail Fence, Vigenère cipher and RSA are used as encryption methods in this work. The Caesar cipher and Rail Fence methods are enhanced and implemented. The beauty of this work is that the user can select the encryption method and the key. Therefore, by changing the encryption method and the key from time to time, the user can ensure the security of messages. With this work, users can safely send and receive messages and can also protect the information in their own mobile phones from unauthorised and unwanted people.
Keywords: SMS, user level security, encryption, decryption, short message service, mobile communication
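A minimal sketch of the textbook Caesar cipher with a user-chosen key, one of the methods listed above (the paper's enhanced variant and the other ciphers are not detailed in the abstract, so this is only the standard form):

```python
def caesar_encrypt(message: str, key: int) -> str:
    """Shift alphabetic characters by `key`; leave digits and punctuation untouched."""
    out = []
    for ch in message:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + key) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

def caesar_decrypt(ciphertext: str, key: int) -> str:
    return caesar_encrypt(ciphertext, -key)

sms = "Meet me at 7pm"
secret = caesar_encrypt(sms, key=5)
print(secret)                      # 'Rjjy rj fy 7ur'
print(caesar_decrypt(secret, 5))   # 'Meet me at 7pm'
```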
Procedia PDF Downloads 396
14759 The Effect of Artificial Intelligence on Decoration Designs
Authors: Ayed Mouris Gad Elsayed Khalil
Abstract:
This research focuses on historical techniques associated with the Lajevardin and Haft-Rangi methods of tile production, with particular attention to identifying techniques for applying gold leaf to the surface of these historical glazed tiles. In this context, the history of the production of glazed, gilded and Lajevardin-glazed ceramics from the Khwarizmanshahid and Mongol periods (11th to 13th centuries) was first evaluated in order to better understand the context and history of historical enamelling methods. After a historical overview of glazed ceramic production techniques and the adoption of these techniques by civilizations, we focused on the niche production methods of enamel and Lajevardin glazes, two categories of decoration commonly found on tiles. A general method for classifying the different types of gold tiles was then introduced, applicable to tiles up to the Safavid period (16th-17th centuries). These categories include gold-glazed Lajevardina tiles, gold Haft-Rangi tiles, gold-glazed monolithic tiles and gold mosaic tiles.
Keywords: ethnicity, multi-cultural, jewelry, craft technique, Mycenaean, ceramic, provenance, pigment, Amorium, glass bracelets, image, Byzantine empire
Procedia PDF Downloads 56
14758 Net Fee and Commission Income Determinants of European Cooperative Banks
Authors: Karolína Vozková, Matěj Kuc
Abstract:
Net fee and commission income is one of the key elements of a bank's core income. In the current low-interest rate environment, this type of income is gaining importance relative to net interest income. This paper analyses the effects of bank- and country-specific determinants of net fee and commission income on a set of cooperative banks from European countries in the 2007-2014 period. In order to do that, dynamic panel data methods (system Generalized Method of Moments) were employed. Subsequently, alternative panel data methods were run as robustness checks of the analysis. A strong positive impact of bank concentration on the share of net fee and commission income was found, which shows that cooperative banks tend to display a higher share of fee income in less competitive markets. This is probably connected with the fact that they stick with their traditional deposit-taking and loan-providing model, and fees on these services are driven down by competitors. Moreover, compared to commercial banks, cooperatives do not expand heavily into non-traditional fee-bearing services under competition, and their overall fee income share therefore decreases with the increased competitiveness of the sector.
Keywords: cooperative banking, dynamic panel data models, net fee and commission income, system GMM
Procedia PDF Downloads 330
14757 Parallel Asynchronous Multi-Splitting Methods for Differential Algebraic Systems
Authors: Malika Elkyal
Abstract:
We consider an iterative parallel multi-splitting method for differential algebraic equations. The main feature of the proposed idea is the use of the asynchronous form. We prove that the multi-splitting technique can effectively accelerate the convergence of the iterative process. The main characteristic of an asynchronous mode is that the local algorithm does not have to wait for predetermined messages to become available. We allow some processors to communicate more frequently than others, and we allow the communication delays to be substantial and unpredictable. Accordingly, we note that synchronous algorithms in the computer science sense are particular cases of our formulation of asynchronous ones.
Keywords: parallel methods, asynchronous mode, multisplitting, differential algebraic equations
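To illustrate the multi-splitting idea in its simplest synchronous (block-Jacobi style) form on a linear system: each block solves its local sub-problem using the current values of the other blocks; the asynchronous mode described above lets each block use whatever iterate of the others is currently available instead of waiting at the exchange barrier. The matrix and splitting below are illustrative, not the paper's DAE formulation:

```python
import numpy as np

# Small diagonally dominant test system Ax = b
n = 8
A = 4.0 * np.eye(n) + np.diag(-np.ones(n - 1), 1) + np.diag(-np.ones(n - 1), -1)
b = np.ones(n)

blocks = [slice(0, 4), slice(4, 8)]    # two sub-problems, one per "processor"
x = np.zeros(n)

for _ in range(100):
    x_new = x.copy()
    for blk in blocks:
        A_blk = A[blk, blk]
        # local solve: A_ii x_i = b_i - sum_{j != i} A_ij x_j
        rhs = b[blk] - A[blk, :] @ x + A_blk @ x[blk]
        x_new[blk] = np.linalg.solve(A_blk, rhs)
    x = x_new                          # synchronous exchange; an asynchronous mode would drop this barrier

print("residual:", np.linalg.norm(A @ x - b))
```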
Procedia PDF Downloads 560
14756 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods
Authors: Devatha Kalyan Kumar, R. Poovarasan
Abstract:
In this paper, we take certain important factors and health parameters of diabetes patients, especially among children who are diabetic from birth (pediatric congenital diabetes), and use the above three metrics to assess the importance of each attribute in the dataset, thereby determining the most responsible and correlated attributes causing diabetes among young patients. We use cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is the correlation methodology used in the software development process to identify the complexity between the various modules of the software; identifying the complexity is important because if the complexity is higher, then there is a higher chance of occurrence of risk in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in the diabetes dataset.
Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric
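A minimal sketch of two of the statistical steps named above, Spearman rank correlation between an attribute and the outcome, and control-chart limits from the mean and standard deviation; the data are placeholders, not the study's dataset:

```python
import numpy as np
from scipy.stats import spearmanr

# Placeholder measurements for one attribute (e.g. a glucose reading) and a severity score
attribute = np.array([5.1, 6.8, 7.9, 5.6, 9.2, 8.4, 6.1, 7.3])
outcome = np.array([1, 3, 4, 2, 5, 5, 2, 3])

rho, p_value = spearmanr(attribute, outcome)   # rank-based (monotonic) association
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")

# Shewhart-style control limits for the attribute
mean, sd = attribute.mean(), attribute.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd
print(f"centre = {mean:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
```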
Procedia PDF Downloads 256
14755 Modern Trends in Foreign Direct Investments in Georgia
Authors: Rusudan Kinkladze, Guguli Kurashvili, Ketevan Chitaladze
Abstract:
Foreign direct investment is a driving force in the development of interdependent national economies, and the study and analysis of investments is an urgent problem. It is particularly important for transitional economies such as Georgia. Consequently, the goal of the research is the study and analysis of foreign direct investments in Georgia and the identification and forecasting of modern trends, and it covers the period of 2006-2015. The study uses the methods of statistical observation, grouping and analysis, the methods of analytical indicators of time series and trend identification, and the predicted values are calculated; various literary and Internet sources relevant to the research are also used. The findings showed that modern investment policy in Georgia is favorable for domestic as well as foreign investors. Georgia is still a net importer of investments. In 2015, the top 10 investing countries were led by Azerbaijan, the United Kingdom and the Netherlands, and the largest share of FDI was allocated to the transport and communication sector; the financial sector was second, followed by the health and social work sector, and the same trend will continue in the future.
Keywords: foreign direct investments, methods, statistics, analysis
Procedia PDF Downloads 331
14754 Data Mining in Medicine Domain Using Decision Trees and Vector Support Machine
Authors: Djamila Benhaddouche, Abdelkader Benyettou
Abstract:
In this paper, we used data mining to extract biomedical knowledge. In general, complex biomedical data collected in studies of populations are treated by statistical methods; although these are robust, they are not sufficient in themselves to harness the potential wealth of the data. For that reason, two learning algorithms were used in a second step: decision trees and the Support Vector Machine (SVM). These supervised classification methods are used to make the diagnosis of thyroid disease. In this context, we propose to promote the study and use of symbolic data mining techniques.
Keywords: biomedical data, learning, classifier, decision tree algorithms, knowledge extraction
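A minimal sketch of training the two classifiers named above with scikit-learn; the thyroid data itself is not available here, so a bundled toy biomedical dataset stands in for it:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Stand-in tabular biomedical dataset; the paper's thyroid data would be loaded here instead
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, model in [("Decision tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                    ("SVM", SVC(kernel="rbf", C=1.0, gamma="scale"))]:
    model.fit(X_tr, y_tr)
    print(name, "accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```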
Procedia PDF Downloads 559
14753 Prediction Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy
Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie
Abstract:
In recent years, there has been an explosion in the rate of using technology that helps in discovering diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnosis and to help in finding the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia disease. In this study, we compared the classification accuracy and response time among eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the lowest time to build the model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best algorithm due to its high accuracy, and it takes the lowest time to build the model.
Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data
Procedia PDF Downloads 321
14752 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques
Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah
Abstract:
Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction cost. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies proposed.
Keywords: BIM, construction projects, cost estimation, NRM, ontology
Procedia PDF Downloads 551
14751 Stress Analysis of Vertebra Using Photoelastic and Finite Element Methods
Authors: Jamal A. Hassan, Ali Q. Abdulrazzaq, Sadiq J. Abass
Abstract:
In this study, both the photoelastic and the finite element methods are used to study the stress distribution within a human vertebra (L4) under forces similar to those that occur during normal life. Two- and three-dimensional models of the vertebra were created with the software AutoCAD. The coordinates obtained were fed into a computer numerical control (CNC) machine to fabricate the models from photoelastic sheets. Completed models were placed in a transmission polariscope and loaded with a static force (up to 1500 N). Stresses can be quantified and localized by counting the number of fringes. In both methods, the principal stresses were calculated at different regions. The results show the maximum von Mises stress on the extreme superior vertebral body surface and on the facet surface, with high normal stress (σ) and shear stress (τ). The facets and other posterior elements have a load-bearing function that helps support the weight of the upper body and anything it carries, and they are also acted upon by spinal muscle forces. The numerical FE results have been compared with the experimental photoelasticity results, showing good agreement between experiment and simulation.
Keywords: photoelasticity, stress, load, finite element
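The von Mises equivalent stress quoted above can be recovered from the principal stresses obtained photoelastically; a small sketch of that formula under a plane-stress assumption, with illustrative numbers rather than the study's measurements:

```python
import math

def von_mises(s1: float, s2: float, s3: float = 0.0) -> float:
    """Von Mises equivalent stress from principal stresses (same units as the inputs)."""
    return math.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))

# Illustrative principal stresses at a point on the superior vertebral body surface (MPa)
print(von_mises(12.0, 4.0))   # plane-stress case, s3 = 0  ->  about 10.6 MPa
```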
Procedia PDF Downloads 286
14750 Curvature-Based Methods for Automatic Coarse and Fine Registration in Dimensional Metrology
Authors: Rindra Rantoson, Hichem Nouira, Nabil Anwer, Charyar Mehdi-Souzani
Abstract:
Multiple measurements by means of various data acquisition systems are generally required to measure the shape of freeform workpieces for accuracy, reliability and completeness. The obtained data are aligned and fused into a common coordinate system by a registration technique involving coarse and fine registration. Standardized iterative methods have been established for fine registration, such as Iterative Closest Point (ICP) and its variants. For coarse registration, no conventional method has been adopted yet, despite a significant number of techniques developed in the literature to supply an automatic rough matching between data sets. Two main issues are addressed in this paper: coarse registration and fine registration. For coarse registration, two novel automated methods based on the exploitation of discrete curvatures are presented: an enhanced Hough Transformation (HT) and an improved RANSAC transformation. The use of curvature features in both methods aims to reduce computational cost. For fine registration, a new variant of the ICP method is proposed in order to reduce the registration error using curvature parameters. A specific distance considering curvature similarity has been combined with the Euclidean distance to define the distance criterion used for correspondence searching. Additionally, the objective function has been improved by combining the point-to-point (P-P) minimization and the point-to-plane (P-Pl) minimization with automatic weights; these weights are determined from the curvature features calculated beforehand at each point of the workpiece surface. The algorithms are applied to simulated and real data acquired by a computed tomography (CT) system. The obtained results reveal the benefit of the proposed novel curvature-based registration methods.
Keywords: discrete curvature, RANSAC transformation, Hough transformation, coarse registration, ICP variant, point-to-point and point-to-plane minimization combination, computed tomography
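A small sketch of the correspondence-distance idea only: Euclidean point distance blended with a curvature-similarity penalty when searching for matches. The blending weight, points and curvature values are illustrative, and the paper's full objective (weighted point-to-point plus point-to-plane minimization) is not reproduced here:

```python
import numpy as np

def combined_distance(p, q, kp, kq, w_curv=0.3):
    """Euclidean point distance plus a penalty on discrete-curvature mismatch."""
    d_euclid = np.linalg.norm(p - q)
    d_curv = abs(kp - kq)                    # difference of discrete curvature values
    return (1.0 - w_curv) * d_euclid + w_curv * d_curv

# Candidate correspondence: source point p with curvature kp, target point q with curvature kq
p, q = np.array([1.0, 2.0, 0.5]), np.array([1.1, 2.1, 0.4])
print(combined_distance(p, q, kp=0.82, kq=0.79))
```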
Procedia PDF Downloads 424
14749 KCBA, A Method for Feature Extraction of Colonoscopy Images
Authors: Vahid Bayrami Rad
Abstract:
In recent years, the use of artificial intelligence techniques, tools, and methods in processing medical images and health-related applications has been highlighted, and a lot of research has been done in this regard. For example, colonoscopy and the diagnosis of colon lesions are cases in which the process of diagnosing lesions can be improved by using image processing and artificial intelligence algorithms, which help doctors a lot. Due to the lack of accurate measurements and the variety of lesions in colonoscopy images, the process of diagnosing the type of lesion is somewhat difficult even for expert doctors. Therefore, by using suitable software and image processing, doctors can be helped to increase the accuracy of their observations and ultimately improve their diagnoses. Also, by using automatic methods, the process of diagnosing the type of disease can be improved. Therefore, in this paper, a deep learning framework called KCBA is proposed to classify colonoscopy lesions; it is composed of several methods such as K-means clustering, a bag of features and a deep auto-encoder. Finally, the experimental results depict the proposed method's performance in classifying colonoscopy images according to the accuracy criterion.
Keywords: colorectal cancer, colonoscopy, region of interest, narrow band imaging, texture analysis, bag of features
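A minimal sketch of the K-means / bag-of-features stage of such a pipeline: local descriptors are clustered into a visual vocabulary, and each image is represented as a histogram of cluster assignments. The descriptors below are random placeholders, not features extracted from colonoscopy images:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
descriptors = rng.normal(size=(500, 32))           # placeholder local descriptors pooled from many images
codebook = KMeans(n_clusters=16, n_init=10, random_state=0).fit(descriptors)

def bag_of_features(image_descriptors: np.ndarray) -> np.ndarray:
    """Histogram of visual-word assignments, normalised to sum to 1."""
    words = codebook.predict(image_descriptors)
    hist = np.bincount(words, minlength=codebook.n_clusters).astype(float)
    return hist / hist.sum()

one_image = rng.normal(size=(40, 32))               # descriptors from a single image
print(bag_of_features(one_image))                   # fixed-length feature vector for a classifier
```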
Procedia PDF Downloads 57
14748 Domain Adaptive Dense Retrieval with Query Generation
Authors: Rui Yin, Haojie Wang, Xun Li
Abstract:
Recently, mainstream dense retrieval methods have obtained state-of-the-art results on some datasets and tasks. However, they require large amounts of training data, which are not available in most domains. The severe performance degradation of dense retrievers on new data domains has limited the use of dense retrieval methods to only a few domains with large training datasets. In this paper, we propose an unsupervised domain-adaptation approach based on query generation. First, a generative model is used to generate relevant queries for each passage in the target corpus; then, the generated queries are used for mining negative passages. Finally, the query-passage pairs are labeled with a cross-encoder and used to train a domain-adapted dense retriever. We also explore contrastive learning as a method for training domain-adapted dense retrievers and show that it leads to strong performance in various retrieval settings. Experiments show that our approach is more robust than previous methods in target domains while requiring less unlabeled data.
Keywords: dense retrieval, query generation, contrastive learning, unsupervised training
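As a sketch of the contrastive-learning component only, an in-batch InfoNCE-style loss over (generated query, passage) embedding pairs; PyTorch and the toy random embeddings are assumptions for illustration, not the paper's exact training recipe or models:

```python
import torch
import torch.nn.functional as F

def info_nce_loss(query_emb, passage_emb, temperature=0.05):
    """In-batch contrastive loss: each query's positive is the passage at the same index;
    all other passages in the batch act as negatives."""
    q = F.normalize(query_emb, dim=-1)
    p = F.normalize(passage_emb, dim=-1)
    logits = q @ p.T / temperature                   # similarity of every query to every passage
    labels = torch.arange(q.size(0), device=q.device)
    return F.cross_entropy(logits, labels)

# Toy embeddings standing in for encoder outputs on generated (query, passage) pairs
queries = torch.randn(8, 128)
passages = torch.randn(8, 128)
print(info_nce_loss(queries, passages).item())
```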
Procedia PDF Downloads 104
14747 Statistical Characteristics of Code Formula for Design of Concrete Structures
Authors: Inyeol Paik, Ah-Ryang Kim
Abstract:
In this research, a statistical analysis is carried out to examine the statistical properties of the formulas given in design codes for concrete structures. The design formulas of the Korea Highway Bridge Design Code - limit state design method (KHBDC), which is the current national bridge design code, and of the design code for concrete structures by the Korea Concrete Institute (KCI) are applied in the analysis. The safety levels provided by the strength formulas of the design codes are defined based on probabilistic and statistical theory. KHBDC is a reliability-based design code; the load and resistance factors of this code were calibrated to attain the target reliability index, and it is essential to define the statistical properties of the design formulas in this calibration process. In general, the statistical characteristics of a member strength are due to the following three factors. The first is the difference between the material strength of the actual construction and that used in the design calculation. The second is the difference between the actual dimensions of the constructed sections and those used in the design calculation. The third is the difference between the strength of the actual member and the formula simplified for the design calculation. In this paper, the statistical study is focused on the third difference. The formulas for calculating the shear strength of concrete members are presented in different ways in KHBDC and KCI. In this study, the statistical properties of the design formulas were obtained through comparison with a database comprising experimental results from the reference publications. The test specimens were either reinforced with shear stirrups or not. For the applied database, the bias factor was about 1.12 and the coefficient of variation was about 0.18. By applying the statistical properties of the design formula to the reliability analysis, it is shown that the resistance factors of the current design codes satisfy the target reliability indexes of both codes. Also, the minimum resistance factors of the KHBDC, which is written in the material resistance factor format, and of the KCI code, which is in the member resistance factor format, are obtained and the results are presented. Further research is underway to calibrate the resistance factors of the high-strength and high-performance concrete design guide.
Keywords: concrete design code, reliability analysis, resistance factor, shear strength, statistical property
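A small sketch of how the statistical properties of a strength formula are extracted from a test database: the bias factor is the mean ratio of measured to predicted strength, and the coefficient of variation is its relative scatter (the study reports about 1.12 and 0.18; the ratios below are illustrative, not the paper's database):

```python
import numpy as np

# Illustrative ratios of measured shear strength to the code-formula prediction
ratio = np.array([1.05, 1.22, 0.98, 1.18, 1.10, 1.31, 1.02, 1.09])

bias = ratio.mean()                # bias factor: mean test-to-prediction ratio
cov = ratio.std(ddof=1) / bias     # coefficient of variation of the ratio
print(f"bias factor = {bias:.3f}, CoV = {cov:.3f}")
```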
Procedia PDF Downloads 319