Search results for: predicting model
15132 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services
Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen
Abstract:
Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services regarding the security and privacy of the delivered data is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source code certifying authority. The proposed design is based on the use of the multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insider attacks from tampering with the application service before and after it is launched on a cloud provider platform.
Keywords: cloud computing, SaaS platform, TPM, trustiness, source code certification, multi-signature schemes
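As an illustrative aside, the sketch below shows the co-signing idea in its most naive form: a "multisignature" is represented as a set of independent Ed25519 signatures over the hash of the service's source code, all of which must verify. This is an assumption-laden stand-in, not the paper's scheme; true multisignature schemes aggregate the individual signatures into one compact signature, and the roles and keys here are invented.

```python
# Naive co-signing sketch (not the paper's scheme): each party signs the
# hash of the application service's source code, and verification
# requires every signature to check out. Roles and keys are illustrative.
import hashlib
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

source_code = b"def service(): ...\n"            # application service code
digest = hashlib.sha256(source_code).digest()

# Hypothetical signers: SaaS vendor, certifying authority, cloud provider.
signers = [Ed25519PrivateKey.generate() for _ in range(3)]
multisig = [sk.sign(digest) for sk in signers]   # one signature per signer

def verify_all(pub_keys, signatures, digest):
    try:
        for pk, sig in zip(pub_keys, signatures):
            pk.verify(sig, digest)               # raises if code was tampered with
        return True
    except InvalidSignature:
        return False

pub_keys = [sk.public_key() for sk in signers]
print(verify_all(pub_keys, multisig, digest))                                # True
print(verify_all(pub_keys, multisig, hashlib.sha256(b"tampered").digest()))  # False
```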
Procedia PDF Downloads 277
15131 A Comparative Study on the Dimensional Error of 3D CAD Model and SLS RP Model for Reconstruction of Cranial Defect
Authors: L. Siva Rama Krishna, Sriram Venkatesh, M. Sastish Kumar, M. Uma Maheswara Chary
Abstract:
Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created from 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modelling (FDM), and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns of custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering, it is helpful for product design, tooling, and manufacture; in biomedicine, it supports the design and development of medical devices, instruments, prosthetics, and implants, and it is also helpful in planning complex surgical operations. The traditional approach limits full appreciation of the movements of various bony structures, and with the custom implants it produces, it is difficult to measure the anatomy of parts and to analyse changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper presents a comparative study of the dimensional error of 3D CAD and SLS RP models for reconstruction of a cranial defect, comparing the virtual CAD model with the physical RP model of the defect.
Keywords: rapid prototyping, selective laser sintering, cranial defect, dimensional error
Procedia PDF Downloads 325
15130 A Pattern Recognition Neural Network Model for Detection and Classification of SQL Injection Attacks
Authors: Naghmeh Moradpoor Sheykhkanloo
Abstract:
Structured Query Language Injection (SQLI) is a code injection technique in which malicious SQL statements are inserted into a given SQL database, often simply through a web browser. Data loss, disclosure of confidential information, or even modification of data are among the severe damages that an SQLI attack can cause to a database. SQLI has also been rated the number-one attack among the top ten web application threats by the Open Web Application Security Project (OWASP), an open community dedicated to enabling organisations to conceive, develop, acquire, operate, and maintain applications that can be trusted. In this paper, we propose an effective pattern recognition neural network model for detection and classification of SQLI attacks. The proposed model is built from three main elements: a Uniform Resource Locator (URL) generator, which generates thousands of malicious and benign URLs; a URL classifier, which 1) labels each generated URL as benign or malicious and 2) assigns the malicious URLs to different SQLI attack categories; and an NN model, which 1) detects whether a given URL is malicious or benign and 2) identifies the type of SQLI attack for each malicious URL. The model is first trained and then evaluated using thousands of benign and malicious URLs. The results of the experiments are presented to demonstrate the effectiveness of the proposed approach.
Keywords: neural networks, pattern recognition, SQL injection attacks, SQL injection attack classification, SQL injection attack detection
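To make the pipeline concrete, here is a minimal sketch of a neural-network URL classifier in the spirit of the abstract. It is not the authors' implementation: the feature extraction (character n-gram TF-IDF), the network size, and the example URLs are all assumptions made for illustration.

```python
# Hedged sketch: character n-gram features feed a small MLP that labels
# URLs as benign or malicious. Example URLs are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

urls = [
    "http://shop.example.com/item?id=42",                       # benign
    "http://shop.example.com/item?id=42 OR 1=1--",              # tautology SQLI
    "http://shop.example.com/item?id=42 UNION SELECT user,pw",  # union-based SQLI
    "http://shop.example.com/search?q=shoes",                   # benign
]
labels = ["benign", "malicious", "malicious", "benign"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # character n-grams
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0),
)
model.fit(urls, labels)
print(model.predict(["http://shop.example.com/item?id=1; DROP TABLE users--"]))
```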
Procedia PDF Downloads 470
15129 CFD Study on the Effect of Primary Air on Combustion of Simulated MSW Process in the Fixed Bed
Authors: Rui Sun, Tamer M. Ismail, Xiaohan Ren, M. Abd El-Salam
Abstract:
Incineration of municipal solid waste (MSW) is one of the key elements of the global clean energy strategy. A computational fluid dynamics (CFD) model was established in order to reveal the features of the combustion process in a fixed porous bed of MSW. Transport equations and process rate equations of the waste bed were set up to describe the incineration process according to the local thermal conditions and waste properties. Gas-phase turbulence was modeled using the k-ε turbulence model, and the particle phase was modeled using the kinetic theory of granular flow. The heterogeneous reaction rates were determined using the Arrhenius eddy dissipation and the Arrhenius-diffusion reaction rates. The effects of primary air flow rate and temperature on the burning process of simulated MSW are investigated experimentally and numerically. The simulated in-bed results agree well with the experimental data. The model provides detailed information on the burning processes in the fixed bed, which is otherwise very difficult to obtain by conventional experimental techniques.
Keywords: computational fluid dynamics (CFD) model, waste incineration, municipal solid waste (MSW), fixed bed, primary air
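The heterogeneous rate treatment named above lends itself to a short sketch: the effective rate is taken as limited by the slower of a kinetic (Arrhenius) rate and a diffusion rate. The constants below are illustrative assumptions, not the study's kinetic parameters.

```python
# Hedged sketch of a kinetics-vs-diffusion limited reaction rate; the
# slower step controls. All coefficients are illustrative only.
import math

R_GAS = 8.314  # J/(mol K)

def arrhenius_rate(T, A=1.0e4, E=8.0e4):
    # Kinetic rate k = A * exp(-E / (R T)), in 1/s
    return A * math.exp(-E / (R_GAS * T))

def diffusion_rate(T, C_d=5.0e-6):
    # Crude film-diffusion rate with a weak temperature dependence
    return C_d * T**0.75

def effective_rate(T):
    return min(arrhenius_rate(T), diffusion_rate(T))

for T in (600.0, 900.0, 1200.0):  # kinetics-limited cold, diffusion-limited hot
    print(f"T = {T:6.0f} K  rate = {effective_rate(T):.3e} 1/s")
```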
Procedia PDF Downloads 403
15128 A Machine Learning Approach for Efficient Resource Management in Construction Projects
Authors: Soheila Sadeghi
Abstract:
Construction projects are complex and often subject to significant cost overruns due to the multifaceted nature of the activities involved. Accurate cost estimation is crucial for effective budget planning and resource allocation. Traditional methods for predicting overruns often rely on expert judgment or analysis of historical data, which can be time-consuming, subjective, and may fail to consider important factors. However, with the increasing availability of data from construction projects, machine learning techniques can be leveraged to improve the accuracy of overrun predictions. This study applied machine learning algorithms to enhance the prediction of cost overruns in a case study of a construction project. The methodology involved the development and evaluation of two machine learning models: Random Forest and Neural Networks. Random Forest can handle high-dimensional data, capture complex relationships, and provide feature importance estimates. Neural Networks, particularly Deep Neural Networks (DNNs), are capable of automatically learning and modeling complex, non-linear relationships between input features and the target variable. These models can adapt to new data, reduce human bias, and uncover hidden patterns in the dataset. The findings of this study demonstrate that both Random Forest and Neural Networks can significantly improve the accuracy of cost overrun predictions compared to traditional methods. The Random Forest model also identified key cost drivers and risk factors, such as changes in the scope of work and delays in material delivery, which can inform better project risk management. However, the study acknowledges several limitations. First, the findings are based on a single construction project, which may limit the generalizability of the results to other projects or contexts. Second, the dataset, although comprehensive, may not capture all relevant factors influencing cost overruns, such as external economic conditions or political factors. Third, the study focuses primarily on cost overruns, while schedule overruns are not explicitly addressed. Future research should explore the application of machine learning techniques to a broader range of projects, incorporate additional data sources, and investigate the prediction of both cost and schedule overruns simultaneously.
Keywords: resource allocation, machine learning, optimization, data-driven decision-making, project management
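By way of illustration, the sketch below shows the Random Forest side of such an approach: predict the overrun fraction from project features and inspect feature importances. The feature names echo the cost drivers named in the abstract (scope changes, material delays), but the data are synthetic stand-ins, not the case-study dataset.

```python
# Hedged sketch: Random Forest regression of cost overrun on invented
# project features, with feature importances as "key cost drivers".
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([
    rng.integers(0, 5, n),       # scope_changes (count)
    rng.integers(0, 30, n),      # material_delay_days
    rng.uniform(0.5, 5.0, n),    # budget_musd
])
# Synthetic target: overrun grows with scope changes and delivery delays.
y = 0.08 * X[:, 0] + 0.01 * X[:, 1] + rng.normal(0, 0.02, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out data:", rf.score(X_te, y_te))
for name, imp in zip(["scope_changes", "material_delay_days", "budget_musd"],
                     rf.feature_importances_):
    print(f"{name}: {imp:.2f}")
```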
Procedia PDF Downloads 41
15127 Predicting and Obtaining New Solvates of Curcumin, Demethoxycurcumin and Bisdemethoxycurcumin Based on the CCDC Statistical Tools and Hansen Solubility Parameters
Authors: J. Ticona Chambi, E. A. De Almeida, C. A. Andrade Raymundo Gaiotto, A. M. Do Espírito Santo, L. Infantes, S. L. Cuffini
Abstract:
The solubility of active pharmaceutical ingredients (APIs) is challenging for the pharmaceutical industry. New multicomponent crystalline forms, such as cocrystals and solvates, present an opportunity to improve the solubility of APIs. Commonly, the procedure to obtain multicomponent crystalline forms of a drug starts by screening the drug molecule against different coformers/solvents. However, it is necessary to develop methods to obtain multicomponent forms efficiently and with the least possible environmental impact. The Hansen Solubility Parameters (HSPs) are considered a tool to obtain theoretical knowledge of the solubility of the target compound in a chosen solvent. H-Bond Propensity (HBP), Molecular Complementarity (MC), and Coordination Values (CV) are tools for the statistical prediction of cocrystals developed by the Cambridge Crystallographic Data Centre (CCDC). Both the HSPs and the CCDC tools are based on inter- and intra-molecular interactions. Curcumin (Cur), the target molecule, is commonly used as an anti-inflammatory; demethoxycurcumin (Demcur) and bisdemethoxycurcumin (Biscur) are natural analogues of Cur from turmeric, and the three molecules differ in their solubilities. This work therefore aimed to analyze and compare different tools for predicting multicomponent forms (solvates) of Cur, Demcur and Biscur. The HSP values were calculated for Cur, Demcur and Biscur using chemical group contribution methods and statistical optimization from experimental data, with the HSPmol software. From the HSPs of the target molecules and fifty solvents (listed in the HSP books), the relative energy difference (RED) was determined. The probability that the target molecules would interact with each solvent molecule was determined using the CCDC tools. A dataset of fifty organic solvents was ranked for each prediction method and by a consensus ranking of different combinations of the HSP, CV, HBP and MC values. Based on the predictions, 15 solvents were selected, including Dimethyl Sulfoxide (DMSO), Tetrahydrofuran (THF), Acetonitrile (ACN) and 1,4-Dioxane (DOX). In an initial analysis, the slow evaporation technique, from 50°C to room temperature and to 4°C, was used to obtain solvates. Single crystals were collected using a Bruker D8 Venture diffractometer with a Photon 100 detector. Data processing and crystal structure determination were performed using the APEX3 and Olex2-1.5 software. As a result, the HSPs (theoretical and optimized) and the Hansen solubility spheres for Cur, Demcur and Biscur were obtained. With respect to the prediction analyses, the predicting methods were evaluated through the ranking and consensus-ranking positions of solvates already reported in the literature; the HSP-CV combination obtained the best results compared to the other methods. Furthermore, from the selected solvents, six new solvates (Cur-DOX, Cur-DMSO, Biscur-DOX, Biscur-THF, Demcur-DOX, Demcur-ACN) and a new Biscur hydrate were obtained. Crystal structures were determined for Cur-DOX, Biscur-DOX, Demcur-DOX and Biscur-Water, and unit-cell parameters were obtained for Cur-DMSO, Biscur-THF and Demcur-ACN. These preliminary results show that the prediction method is a promising strategy for evaluating the possibility of forming multicomponent crystals. Work is currently underway on obtaining further multicomponent single crystals.
Keywords: curcumin, HSPs, prediction, solvates, solubility
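The RED screening step lends itself to a short worked example. The standard Hansen distance is Ra^2 = 4(dD1-dD2)^2 + (dP1-dP2)^2 + (dH1-dH2)^2 with RED = Ra/R0, and RED < 1 suggests the solvent lies inside the solubility sphere. In the sketch below, the curcumin parameters and interaction radius R0 are assumed for illustration, not taken from the study.

```python
# Hedged sketch of RED-based solvent ranking; solute HSPs and R0 are
# illustrative assumptions, solvent HSPs are typical tabulated values.
import math

def red(solute, solvent, r0):
    dd, dp, dh = (a - b for a, b in zip(solute, solvent))
    ra = math.sqrt(4 * dd**2 + dp**2 + dh**2)  # Hansen distance
    return ra / r0

curcumin = (19.2, 10.0, 12.5)   # (dD, dP, dH) in MPa^0.5 -- assumed values
r0 = 9.0                        # assumed interaction radius
solvents = {"DMSO": (18.4, 16.4, 10.2), "THF": (16.8, 5.7, 8.0),
            "Water": (15.5, 16.0, 42.3)}
for name, hsp in sorted(solvents.items(), key=lambda kv: red(curcumin, kv[1], r0)):
    print(f"{name}: RED = {red(curcumin, hsp, r0):.2f}")  # RED < 1: likely good
```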
Procedia PDF Downloads 64
15126 Developing Cucurbitacin A Minimum Inhibition Concentration of Meloidogyne incognita Using a Computer-Based Model
Authors: Zakheleni P. Dube, Phatu W. Mashela
Abstract:
Minimum inhibition concentration (MIC) is the lowest concentration of a chemical that brings about significant inhibition of the target organism. The conventional method for establishing the MIC for phytonematicides is tedious. The objective of this study was to use the Curve-fitting Allelochemical Response Data (CARD) model to determine the MIC of pure cucurbitacin A for Meloidogyne incognita second-stage juvenile (J2) hatch, immobility and mortality. Meloidogyne incognita eggs and freshly hatched J2 were separately exposed to a series of pure cucurbitacin A concentrations of 0.00, 0.25, 0.50, 0.75, 1.00, 1.25, 1.50, 1.75, 2.00, 2.25 and 2.50 μg.mL⁻¹ for 12, 24, 48 and 72 h in an incubator set at 25 ± 2°C. Meloidogyne incognita J2 hatch, immobility and mortality counts were determined using a stereomicroscope, and the significant means were subjected to the CARD model. The model exhibited density-dependent growth (DDG) patterns of J2 hatch, immobility and mortality with increasing concentrations of cucurbitacin A. The average MICs of cucurbitacin A for M. incognita J2 hatch, immobility and mortality were 2.2, 0.58 and 0.63 μg.mL⁻¹, respectively; J2 hatch had the highest average MIC value, followed by mortality, while immobility had the lowest. In conclusion, the CARD model was able to generate MICs for cucurbitacin A, hence it could serve as a valuable tool in chemical-nematode bioassay studies.
Keywords: inhibition concentration, phytonematicide, sensitivity index, threshold stimulation, triterpenoids
Procedia PDF Downloads 192
15125 Urban Energy Demand Modelling: Spatial Analysis Approach
Authors: Hung-Chu Chen, Han Qi, Bauke de Vries
Abstract:
Energy consumption in the urban environment has attracted numerous studies in recent decades. However, it is comparatively rare to find work that investigates 3D spatial analysis in urban energy demand modelling. In order to analyze the spatial correlation between urban morphology and energy demand comprehensively, this paper investigates their relation using spatial regression tools, namely the ordinary least squares (OLS) regression and geographically weighted regression (GWR) models. The Normalized Difference Built-up Index (NDBI), the Normalized Difference Vegetation Index (NDVI), and building volume describe urban morphology and act as the independent variables of the Energy-Land use (E-L) model. NDBI and NDVI are used as indices to describe five types of land use: urban area (U), open space (O), artificial green area (G), natural green area (V), and water body (W). Annual electricity, gas, and total energy demand are the dependent variables of the E-L model. The analytical results of the E-L model reveal that energy demand and urban morphology are closely connected; the possible causes and practical uses are discussed. In addition, the spatial analysis methods OLS and GWR are compared.
Keywords: energy demand model, geographically weighted regression, normalized difference built-up index, normalized difference vegetation index, spatial statistics
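The OLS-versus-GWR contrast can be sketched directly: GWR refits a weighted least-squares model at each location, down-weighting distant observations with a kernel, so coefficients may vary across space where a single global OLS fit cannot. The data, bandwidth, and variables below are illustrative assumptions, not the study's calibration.

```python
# Hedged sketch: global OLS vs. geographically weighted regression with a
# Gaussian kernel, on synthetic data whose NDBI effect drifts in space.
import numpy as np

rng = np.random.default_rng(1)
n = 300
coords = rng.uniform(0, 10, (n, 2))              # grid-cell locations
ndbi = rng.uniform(-0.2, 0.6, n)                 # built-up index
volume = rng.uniform(1e3, 1e5, n)                # building volume (m^3)
energy = (50 + (5 + coords[:, 0]) * ndbi * 100 + 0.002 * volume
          + rng.normal(0, 10, n))                # spatially nonstationary effect

X = np.column_stack([np.ones(n), ndbi, volume])
beta_ols, *_ = np.linalg.lstsq(X, energy, rcond=None)  # one global fit

def gwr_at(point, bandwidth=2.0):
    d = np.linalg.norm(coords - point, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)            # Gaussian kernel weights
    W = np.sqrt(w)[:, None]                      # weighted least squares
    beta, *_ = np.linalg.lstsq(X * W, energy * W[:, 0], rcond=None)
    return beta

print("OLS NDBI coefficient:", beta_ols[1])
print("GWR NDBI coefficient at (1,5):", gwr_at(np.array([1.0, 5.0]))[1])
print("GWR NDBI coefficient at (9,5):", gwr_at(np.array([9.0, 5.0]))[1])
```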
Procedia PDF Downloads 150
15124 Elastic and Plastic Collision Comparison Using Finite Element Method
Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier
Abstract:
The prediction of post-impact conditions and of the behavior of bodies during impact has been the object of several collision models. The formulation from Hertz's theory, dating from the 19th century, is generally used. These models consider the repulsive force as proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact, and the deformation of the bodies. An advantage of the FEM approach is the possibility of applying plastic deformation to the model according to the material definition: the Johnson-Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
Keywords: collision, impact models, finite element method, Hertz theory
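The Johnson-Cook flow stress that drives the plastic material model has the closed form sigma = (A + B*eps^n)(1 + C*ln(rate/rate0))(1 - T*^m), with T* the homologous temperature. A minimal sketch follows; the parameters below are commonly quoted for OFHC copper but should be treated as illustrative stand-ins for the empirically fitted values the abstract mentions.

```python
# Hedged sketch of the Johnson-Cook flow stress; parameters are
# illustrative (often cited for OFHC copper), not this study's fits.
import math

def johnson_cook(eps_p, rate, T, A=90e6, B=292e6, n=0.31,
                 C=0.025, m=1.09, rate0=1.0, T_room=294.0, T_melt=1356.0):
    t_star = (T - T_room) / (T_melt - T_room)      # homologous temperature
    return (A + B * eps_p**n) \
        * (1 + C * math.log(max(rate / rate0, 1e-12))) \
        * (1 - t_star**m)

# Flow stress at 5% plastic strain, quasi-static rate, room temperature:
print(f"{johnson_cook(0.05, 1.0, 294.0) / 1e6:.1f} MPa")
```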
Procedia PDF Downloads 175
15123 A Hybrid Traffic Model for Smoothing Traffic Near Merges
Authors: Shiri Elisheva Decktor, Sharon Hornstein
Abstract:
Highway merges and unmarked junctions are key components of any urban road network and can act as bottlenecks and create traffic disruption. Inefficient highway merges may trigger traffic instabilities such as stop-and-go waves, pose safety risks, and lead to longer journey times. These phenomena occur spontaneously if the average vehicle density exceeds a certain critical value. This study focuses on modeling the traffic using a microscopic traffic flow model. A hybrid traffic model, which combines human-driven and controlled vehicles, is assumed. The controlled vehicles obey different driving policies when approaching the merge or in the vicinity of other vehicles. We developed a co-simulation model in SUMO (Simulation of Urban Mobility), in which the human-driven cars are modeled using the Intelligent Driver Model (IDM) and the controlled cars are modeled using a dedicated controller. The scenario chosen for this study is a closed track with one merge and one exit, which could later be implemented using scaled infrastructure in our lab setup. This will enable us to benchmark the simulation results of this study against comparable results obtained under similar conditions in the lab. The metrics chosen for comparing the performance of our algorithm on the overall traffic conditions include the average speed, the wait time near the merge, and the throughput after the merge, measured under different travel demand conditions (low, medium, and heavy traffic).
Keywords: highway merges, traffic modeling, SUMO, driving policy
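The IDM used for the human-driven cars has a compact closed form: acceleration depends on speed, gap, and closing speed via a = a_max [1 - (v/v0)^delta - (s*/s)^2] with desired gap s* = s0 + vT + v*dv/(2*sqrt(a*b)). A minimal sketch follows; the parameter values are typical IDM defaults, not the study's tuning.

```python
# Hedged sketch of the Intelligent Driver Model (IDM); parameters are
# common defaults, not this study's calibration.
import math

def idm_acceleration(v, gap, dv, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0, delta=4):
    """v: speed (m/s); gap: bumper-to-bumper distance to leader (m);
    dv: approach rate v - v_leader (m/s)."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired gap
    return a * (1 - (v / v0) ** delta - (s_star / gap) ** 2)

# A car at 25 m/s, 30 m behind a leader it is closing on at 3 m/s:
print(f"{idm_acceleration(25.0, 30.0, 3.0):.2f} m/s^2")  # braking expected
```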
Procedia PDF Downloads 108
15122 Construction of a Dynamic Migration Model of Extracellular Fluid in Brain for Future Integrated Control of Brain State
Authors: Tomohiko Utsuki, Kyoka Sato
Abstract:
In emergency medicine, it is recognized that brain resuscitation is very important for reducing the mortality rate and neurological sequelae. In particular, control of brain temperature (BT), intracranial pressure (ICP), and cerebral blood flow (CBF) is most required for stabilizing the brain's physiological state in the treatment of conditions such as brain injury, stroke, and encephalopathy. However, manual control of BT, ICP, and CBF frequently requires decisions and operations by medical staff concerning medication and the settings of therapeutic apparatus. Thus, integrating and automating such control is very effective not only for improving the therapeutic effect but also for reducing staff burden and medical cost. To realize such integration and automation, a mathematical model of the brain's physiological state is necessary as the controlled object in simulations, because performance testing of a control-system prototype on patients is not ethically allowed. A model of cerebral blood circulation, the most basic part of the brain's physiological state, has already been constructed. A migration model of extracellular fluid in the brain has also been constructed; however, that model did not consider the condition that the total volume of the intracranial cavity is almost constant due to the rigidity of the cranial bone. Therefore, in this research, a dynamic migration model of extracellular fluid in the brain was constructed taking the constancy of the intracranial cavity's total volume into account. This model can be connected to the cerebral blood circulation model. The constructed model consists of fourteen compartments, twelve of which correspond to the perfused areas of the bilateral anterior, middle, and posterior cerebral arteries, while the others correspond to the cerebral ventricles and the subarachnoid space. The model enables calculation of the migration of tissue fluid from capillaries to gray matter and white matter, the flow of tissue fluid between compartments, the production and absorption of cerebrospinal fluid at the choroid plexus and arachnoid granulations, and the production of metabolic water. Furthermore, the volume, colloid concentration, and tissue pressure of each compartment can be calculated by solving 40-dimensional non-linear simultaneous differential equations. The obtained model was analyzed for validation under four conditions: a normal adult, an adult with higher cerebral capillary pressure, an adult with lower cerebral capillary pressure, and an adult with lower colloid concentration in the cerebral capillaries. In the results, the calculated fluid flow, tissue volume, colloid concentration, and tissue pressure all converged to values suitable for the set condition within at most 60 minutes. Because these results did not conflict with prior knowledge, the model can adequately represent the physiological state of the brain, at least under such limited conditions. One of the next challenges is to integrate this model with the already constructed cerebral blood circulation model. This modification will enable more precise simulation of CBF and ICP, since the effect of blood pressure changes on extracellular fluid migration and that of ICP changes on CBF can then be calculated.
Keywords: dynamic model, cerebral extracellular migration, brain resuscitation, automatic control
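To make the compartment-ODE idea tangible, here is a toy two-compartment version: flow between compartments is driven by pressure differences, with pressure taken as proportional to volume excess. This is a crude stand-in for the paper's 40-dimensional system, and all coefficients are illustrative assumptions.

```python
# Hedged two-compartment sketch of pressure-driven fluid migration,
# integrated with scipy; a stand-in for the 40-dimensional model.
from scipy.integrate import solve_ivp

K_FLOW = 0.5                  # conductance between compartments (mL/min/mmHg)
E1, E2 = 2.0, 1.0             # compartment elastances (mmHg/mL)
V1_REF, V2_REF = 10.0, 20.0   # reference volumes (mL)
Q_IN = 0.2                    # capillary filtration into compartment 1 (mL/min)

def rhs(t, v):
    v1, v2 = v
    p1, p2 = E1 * (v1 - V1_REF), E2 * (v2 - V2_REF)  # tissue pressures
    q12 = K_FLOW * (p1 - p2)      # inter-compartment flow
    q_abs = 0.1 * max(p2, 0.0)    # absorption (e.g., to venous blood)
    return [Q_IN - q12, q12 - q_abs]

sol = solve_ivp(rhs, (0.0, 60.0), [10.0, 20.0])  # 60 minutes
print("volumes after 60 min:", sol.y[:, -1])      # converge to steady state
```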
Procedia PDF Downloads 157
15121 A Fuzzy Linear Regression Model Based on Dissemblance Index
Authors: Shih-Pin Chen, Shih-Syuan You
Abstract:
Fuzzy regression models are useful for investigating the relationship between explanatory variables and responses in fuzzy environments. To overcome the deficiencies of previous models and increase the explanatory power of fuzzy data, the graded mean integration (GMI) representation is applied to determine representative crisp regression coefficients. A fuzzy regression model is then constructed based on the modified dissemblance index (MDI), which can precisely measure the actual total error. In comparisons with previous studies based on the proposed MDI and a distance criterion, the results from commonly used test examples show that the proposed fuzzy linear regression model has higher explanatory power and forecasting accuracy.
Keywords: dissemblance index, fuzzy linear regression, graded mean integration, mathematical programming
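For a triangular fuzzy number (a, b, c), the graded mean integration gives the crisp value (a + 4b + c)/6, which is the kind of representative crisp coefficient the abstract refers to. A minimal sketch, with made-up fuzzy coefficients:

```python
# GMI defuzzification of a triangular fuzzy number (a, b, c); the example
# coefficients are illustrative, not from the paper's test examples.
def gmi_triangular(a, b, c):
    return (a + 4 * b + c) / 6

print(gmi_triangular(1.0, 2.0, 3.0))  # symmetric "about 2" -> 2.0
print(gmi_triangular(1.0, 2.0, 5.0))  # right-skewed -> 2.33..., pulled upward
```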
Procedia PDF Downloads 443
15120 Mathematical Model of Corporate Bond Portfolio and Effective Border Preview
Authors: Sergey Podluzhnyy
Abstract:
One of the most important tasks of investment and pension fund management is building a decision support system that helps to make the right decisions on corporate bond portfolio formation. Today, there are several basic methods of bond portfolio management: duration management, immunization, and convexity management. These methods have a serious disadvantage: they do not take into account the credit or insolvency risk of the issuer, so they can be applied only to the management and evaluation of high-quality sovereign bonds. This article proposes a mathematical model for building a corporate bond portfolio that is optimal in terms of risk and yield. The proposed model takes the default probability into account in the bond valuation formula, which results in a more correct evaluation of bond prices. Moreover, the model provides tools for visualizing the efficient frontier of a corporate bond portfolio taking into account the exposure to credit risk, which will increase the quality of portfolio managers' investment decisions.
Keywords: corporate bond portfolio, default probability, effective boundary, portfolio optimization task
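In the spirit of the abstract, default-adjusted valuation can be sketched by weighting each cash flow by the issuer's survival probability and paying a recovery amount on default. This is a generic reduced-form sketch under assumed numbers, not the article's calibration.

```python
# Hedged sketch: bond price with a constant annual default probability
# and a recovery fraction paid at the year of default. Numbers invented.
def default_adjusted_price(face, coupon_rate, years, ytm, p_default, recovery=0.4):
    price, survival = 0.0, 1.0
    for t in range(1, years + 1):
        cf = face * coupon_rate + (face if t == years else 0.0)
        # Survive year t and receive the scheduled cash flow:
        price += survival * (1 - p_default) * cf / (1 + ytm) ** t
        # Default in year t and receive the recovery value:
        price += survival * p_default * recovery * face / (1 + ytm) ** t
        survival *= (1 - p_default)
    return price

print(default_adjusted_price(100, 0.07, 5, 0.06, p_default=0.0))   # no credit risk
print(default_adjusted_price(100, 0.07, 5, 0.06, p_default=0.02))  # 2%/yr default
```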
Procedia PDF Downloads 319
15119 Human Brain Organoids-on-a-Chip Systems to Model Neuroinflammation
Authors: Feng Guo
Abstract:
Human brain organoids, 3D brain tissue cultures derived from human pluripotent stem cells, hold promising potential in modeling neuroinflammation for a variety of neurological diseases. However, challenges remain in generating standardized human brain organoids that can recapitulate key physiological features of a human brain. Here, this study presents a series of organoids-on-a-chip systems to generate better human brain organoids and model neuroinflammation. By employing 3D printing and microfluidic 3D cell culture technologies, the study’s systems enable the reliable, scalable, and reproducible generation of human brain organoids. Compared with conventional protocols, this study’s method increased neural progenitor proliferation and reduced heterogeneity of human brain organoids. As a proof-of-concept application, the study applied this method to model substance use disorders.
Keywords: human brain organoids, microfluidics, organ-on-a-chip, neuroinflammation
Procedia PDF Downloads 204
15118 Computer-Based Model for Design Selection of Lightning Arrester for 132/33kV Substation
Authors: Uma U. Uma, Uzoechi Laz
Abstract:
Protection of equipment insulation against lightning overvoltages and selection of a lightning arrester that will discharge at a lower voltage level than the voltage required to break down the electrical equipment insulation are examined. The objective of this paper is to design a computer-based model, using standard equations, for selecting the appropriate lightning arrester: the lowest-rated surge arrester that will provide adequate protection of the equipment insulation and have a satisfactory service life when connected to a specified line voltage in the power system network. The effectiveness or otherwise of the substation's earthing system determines the arrester properties. A MATLAB program with a GUI (graphical user interface) subprogram is used to develop the model, which determines the required parameters, such as voltage rating, impulse sparkover voltage, power-frequency sparkover voltage, discharge current, current rating, and protection level, of a lightning arrester for a specified voltage level of a particular line.
Keywords: lightning arrester, GUIs, MATLAB program, computer-based model
Procedia PDF Downloads 420
15117 An Optimal Bayesian Maintenance Policy for a Partially Observable System Subject to Two Failure Modes
Authors: Akram Khaleghei Ghosheh Balagh, Viliam Makis, Leila Jafari
Abstract:
In this paper, we present a new maintenance model for a partially observable system subject to two failure modes, namely a catastrophic failure and a failure due to the system degradation. The system is subject to condition monitoring and the degradation process is described by a hidden Markov model. A cost-optimal Bayesian control policy is developed for maintaining the system. The control problem is formulated in the semi-Markov decision process framework. An effective computational algorithm is developed and illustrated by a numerical example.
Keywords: partially observable system, hidden Markov model, competing risks, multivariate Bayesian control
Procedia PDF Downloads 458
15116 Target and Equalizer Design for Perpendicular Heat-Assisted Magnetic Recording
Authors: P. Tueku, P. Supnithi, R. Wongsathan
Abstract:
Heat-Assisted Magnetic Recording (HAMR) is one of the leading technologies identified to enable areal densities beyond 1 Tb/in² in magnetic recording systems. Key challenges in HAMR design include the accuracy of positioning, the timing and power of the firing laser, the thermo-magnetic head, the head-disk interface, and the cooling system. We study the effect of HAMR parameters on the transition center and transition width. The HAMR channel is modeled using the Thermal Williams-Comstock (TWC) equation and the microtrack model. The target and equalizer are designed by the minimum mean square error (MMSE) criterion. The results show that the unit energy constraint outperforms the other constraints.
Keywords: heat-assisted magnetic recording, thermal Williams-Comstock equation, microtrack model, equalizer
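For a fixed partial-response target g, MMSE equalizer design chooses FIR taps w minimizing E||w * r - g * a||^2; in sample form this is a least-squares problem. The sketch below uses an invented ISI channel, target, and lengths, not the paper's HAMR channel model.

```python
# Hedged sketch of MMSE (least-squares) equalizer design to a fixed
# PR target; channel, noise level, and tap counts are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n, n_taps = 5000, 11
a = rng.choice([-1.0, 1.0], size=n)                 # recorded bits
channel = np.array([0.2, 0.7, 1.0, 0.7, 0.2])       # assumed ISI response
r = np.convolve(a, channel, mode="same") + 0.1 * rng.standard_normal(n)

g = np.array([1.0, 2.0, 1.0])                       # fixed PR target
d = np.convolve(a, g, mode="same")                  # desired sequence

# Regression matrix of sliding equalizer-input windows.
X = np.lib.stride_tricks.sliding_window_view(r, n_taps)
y = d[n_taps // 2 : n_taps // 2 + X.shape[0]]       # align delays
w, *_ = np.linalg.lstsq(X, y, rcond=None)           # MMSE taps
print(f"residual MSE after equalization: {np.mean((X @ w - y) ** 2):.4f}")
```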
Procedia PDF Downloads 354
15115 Random Subspace Ensemble of CMAC Classifiers
Authors: Somaiyeh Dehghan, Mohammad Reza Kheirkhahan Haghighi
Abstract:
The rapid growth of domains whose data have a large number of features while the number of samples is limited has made it difficult to construct strong classifiers. Reducing the dimensionality of the feature space therefore becomes an essential step in the classification task. The random subspace method (or attribute bagging) is an ensemble classifier that consists of several base learners, each trained on a subset of the features. In the present paper, we introduce the Random Subspace Ensemble of CMAC neural networks (RSE-CMAC), in which each CMAC is trained on a subset of the features, and we use this model for the classification task. To evaluate the performance of our model, we compare it with the bagging algorithm on 36 UCI datasets. The results reveal that the new model performs better.
Keywords: classification, random subspace, ensemble, CMAC neural network
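The random subspace mechanic itself is easy to sketch: train each base learner on a random subset of the features and combine predictions by majority vote. In the sketch below a decision tree stands in for the CMAC network, since CMAC is not available in scikit-learn; everything else (subset size, ensemble size) is an illustrative choice.

```python
# Hedged sketch of the random subspace method; decision trees replace
# CMAC base learners for availability, not fidelity to the paper.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
n_features = X.shape[1]
members = []
for i in range(25):
    cols = rng.choice(n_features, size=n_features // 2, replace=False)
    tree = DecisionTreeClassifier(random_state=i).fit(X_tr[:, cols], y_tr)
    members.append((cols, tree))

votes = np.array([tree.predict(X_te[:, cols]) for cols, tree in members])
y_pred = (votes.mean(axis=0) > 0.5).astype(int)   # majority vote (binary labels)
print("ensemble accuracy:", (y_pred == y_te).mean())
```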
Procedia PDF Downloads 333
15114 ELD79-LGD2006 Transformation Techniques Implementation and Accuracy Comparison in Tripoli Area, Libya
Authors: Jamal A. Gledan, Othman A. Azzeidani
Abstract:
During the last decade, Libya established a new geodetic datum, called the Libyan Geodetic Datum 2006 (LGD2006), by using GPS, whereas the previous Libyan datum, the Europe Libyan Datum 79 (ELD79), was established by the ground traversing method. This paper introduces an ELD79-to-LGD2006 coordinate transformation technique and an accuracy comparison between transformation by multiple regression equations and by the three-parameter (Bursa-Wolf) model. The results obtained show that the overall accuracy of stepwise multiple regression equations is better than that achievable with the Bursa-Wolf transformation model.
Keywords: geodetic datum, horizontal control points, traditional similarity transformation model, unconventional transformation techniques
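The three-parameter model is a pure translation between geocentric Cartesian frames: X' = X + dX, Y' = Y + dY, Z' = Z + dZ, with the shifts estimated by least squares over common control points (for a translation-only model, this reduces to the mean coordinate difference). The coordinates below are illustrative, not the actual Libyan control network.

```python
# Hedged sketch of a three-parameter (translation-only) datum shift;
# control-point coordinates and the "true" shift are invented.
import numpy as np

eld79 = np.array([[5_000_100.0, 1_200_050.0, 3_400_020.0],
                  [5_010_120.0, 1_210_045.0, 3_390_015.0],
                  [4_990_110.0, 1_190_055.0, 3_410_025.0]])
lgd2006 = eld79 + np.array([-110.0, 45.0, 25.0]) \
          + np.random.default_rng(0).normal(0, 0.5, eld79.shape)  # obs noise

shift = (lgd2006 - eld79).mean(axis=0)   # least-squares translation estimate
print("estimated (dX, dY, dZ):", np.round(shift, 2))

new_point = np.array([5_005_000.0, 1_205_000.0, 3_395_000.0])
print("transformed:", np.round(new_point + shift, 2))
```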
Procedia PDF Downloads 308
15113 First Digit Lucas, Fibonacci and Benford Number in Financial Statement
Authors: Teguh Sugiarto, Amir Mohamadian Amiri
Abstract:
Background: This study aims to explore whether there is fraud in a company's financial report by examining the distribution of first digits against the Lucas, Fibonacci and Benford numbers. Research methods: The author uses first-digit models based on the Lucas, Fibonacci and Benford numbers, distinguishing cases in which the rate of occurrence differs from the reference digit frequencies by more or less than 5%. If there is a significant difference beyond the 5% threshold, follow-up and detection of possible fraud in the financial statements can be carried out. Findings: From the research conducted, it can be concluded that the first-digit frequencies in the financial statements of PT Bank BRI Tbk over a year give consistent results for the Lucas, Fibonacci and Benford models.
Keywords: Lucas, Fibonacci, Benford, first digit
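The Benford side of the test is fully specified by P(d) = log10(1 + 1/d) for leading digit d. The sketch below compares observed first-digit frequencies of statement amounts against this law and flags digits deviating by more than 5 percentage points, echoing the abstract's threshold; the amounts are made up, not PT Bank BRI Tbk data.

```python
# Hedged sketch of a Benford first-digit screen; amounts are invented.
import math
from collections import Counter

amounts = [1234.5, 1876.0, 2345.1, 1190.9, 3456.2, 1023.4,
           4567.8, 1654.3, 2789.0, 1345.6, 9876.5, 1111.1]

first_digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts]
counts = Counter(first_digits)
n = len(first_digits)

for d in range(1, 10):
    observed = counts.get(d, 0) / n
    benford = math.log10(1 + 1 / d)           # Benford's law
    flag = "CHECK" if abs(observed - benford) > 0.05 else "ok"
    print(f"digit {d}: observed {observed:.2f}, Benford {benford:.2f}  {flag}")
```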
Procedia PDF Downloads 275
15112 Image Classification with Localization Using Convolutional Neural Networks
Authors: Bhuyain Mobarok Hossain
Abstract:
Image classification and localization research is currently an important area in the field of computer vision. The evolution and advancement of deep learning and convolutional neural networks (CNN) have greatly improved the capabilities of object detection and image-based classification. Target detection is an important research topic in computer vision, especially for video surveillance systems. To solve this problem, we apply a convolutional neural network at multiple scales and multiple locations in the image within one sliding window. Most detection networks work with a bounding box around the area of interest; in contrast to this architecture, we consider the problem as a classification problem in which each pixel of the image is a separate section. Image classification is the method of predicting an individual category for an image or for a group of data points, assigning labels to the image as a whole. For example, an image can be classified as a day or night shot; likewise, images of cars and motorbikes can be automatically placed in their respective collections. Deep learning for image classification generally includes convolutional layers; a model built from them is referred to as a convolutional neural network (CNN).
Keywords: image classification, object detection, localization, particle filter
Procedia PDF Downloads 307
15111 Ensemble Methods in Machine Learning: An Algorithmic Approach to Derive Distinctive Behaviors of Criminal Activity Applied to the Poaching Domain
Authors: Zachary Blanks, Solomon Sonya
Abstract:
Poaching presents a serious threat to endangered animal species, environmental conservation, and human life. Additionally, some poaching activity has even been linked to supplying funds to support terrorist networks elsewhere around the world. Consequently, agencies dedicated to protecting wildlife habitats face a near-intractable task of adequately patrolling an entire area (spanning several thousand kilometers) given the limited resources, funds, and personnel at their disposal. Thus, agencies need predictive tools that are both high-performing and easily implementable by the user to help in learning how the significant features (e.g., animal population densities, topography, behavior patterns of the criminals within the area, etc.) interact with each other, in hopes of abating poaching. This research develops a classification model using machine learning algorithms to aid in forecasting future attacks that is both easy to train and performs well when compared to other models. We demonstrate how data imputation methods (specifically predictive mean matching, gradient boosting, and random forest multiple imputation) can be applied to analyze data and create significant predictions across a varied data set, and we apply these methods to improve the accuracy of adopted prediction models (logistic regression, support vector machine, etc.). Finally, we assess the performance of the model and the accuracy of our data imputation methods by learning on a real-world data set constituting four years of imputed data and testing on one year of non-imputed data. This paper provides three main contributions. First, we extend work done by the Teamcore and CREATE (Center for Risk and Economic Analysis of Terrorism Events) research groups at the University of Southern California (USC), working in conjunction with the Department of Homeland Security, to apply game theory and machine learning algorithms to develop more efficient ways of reducing poaching. This research introduces ensemble methods (random forests and stochastic gradient boosting) and applies them to real-world poaching data gathered by park rangers in the Ugandan rain forest. Next, we consider the effect of data imputation on both the performance of various algorithms and the general accuracy of the method itself when applied to a dependent variable in which a large number of observations are missing. Third, we provide an alternate approach to predicting the probability of observing poaching both by season and by month. The results from this research are very promising. We conclude that by using stochastic gradient boosting to predict observations of non-commercial poaching by season, we are able to produce statistically equivalent results while being orders of magnitude faster in computation time and complexity. Additionally, when predicting potential poaching incidents by individual month rather than by entire season, boosting techniques produce a mean area-under-the-curve increase of approximately 3% relative to previous prediction schedules by entire seasons.
Keywords: ensemble methods, imputation, machine learning, random forests, statistical analysis, stochastic gradient boosting, wildlife protection
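The imputation-plus-boosting pipeline can be sketched end to end: missing values are filled by iterative (random-forest-based) imputation, then a gradient boosting classifier predicts whether poaching is observed. Features, data, and missingness pattern below are invented stand-ins for the ranger patrol records.

```python
# Hedged sketch of iterative imputation followed by gradient boosting;
# synthetic data only, not the Ugandan patrol dataset.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.ensemble import GradientBoostingClassifier, RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.uniform(0, 50, n),       # animal density (illustrative)
    rng.uniform(0, 20, n),       # distance to road (km)
    rng.uniform(0, 2000, n),     # elevation (m)
])
y = (X[:, 0] / 50 - X[:, 1] / 40 + rng.normal(0, 0.2, n) > 0.3).astype(int)
X[rng.random(X.shape) < 0.2] = np.nan        # 20% missing at random

imputer = IterativeImputer(
    estimator=RandomForestRegressor(n_estimators=50, random_state=0),
    random_state=0)
X_imp = imputer.fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(X_imp, y, random_state=0)
gbm = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", gbm.score(X_te, y_te))
```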
Procedia PDF Downloads 294
15110 Meta Mask Correction for Nuclei Segmentation in Histopathological Image
Authors: Jiangbo Shi, Zeyu Gao, Chen Li
Abstract:
Nuclei segmentation is a fundamental task in digital pathology analysis and can be automated by deep learning-based methods. However, the development of such an automated method requires a large amount of data with precisely annotated masks, which are hard to obtain. Training with weakly labeled data is a popular solution for reducing the annotation workload. In this paper, we propose a novel meta-learning-based nuclei segmentation method that follows the label correction paradigm to leverage data with noisy masks. Specifically, we design a fully convolutional meta-model that can correct noisy masks by using a small amount of clean meta-data. The corrected masks are then used to supervise the training of the segmentation model. Meanwhile, a bi-level optimization method is adopted to alternately update the parameters of the main segmentation model and the meta-model. Extensive experimental results on two nuclei segmentation datasets show that our method achieves state-of-the-art results. In particular, in some noise scenarios, it even exceeds the performance of training on supervised data.
Keywords: deep learning, histopathological image, meta-learning, nuclei segmentation, weak annotations
Procedia PDF Downloads 142
15109 A Multi-Attribute Utility Model for Performance Evaluation of Sustainable Banking
Authors: Sonia Rebai, Mohamed Naceur Azaiez, Dhafer Saidane
Abstract:
In this study, we develop a performance evaluation model, based on a multi-attribute utility approach, aimed at reaching sustainable banking (SB) status. This model is built to account for the various stakeholders of banks in a win-win paradigm. In addition, it offers the opportunity to adopt a global measure of performance as an indication of a bank's degree of sustainability. This measure is referred to as the banking sustainability performance index (BSPI). This index may constitute a basis for ranking banks. Moreover, it may constitute a bridge between the assessment types of financial and extra-financial rating agencies. A real application is performed on three French banks.
Keywords: multi-attribute utility theory, performance, sustainable banking, financial rating
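A multi-attribute utility aggregation in this spirit maps each stakeholder attribute to a [0, 1] utility and combines them with weights summing to one. The attributes, weights, and utility shapes below are illustrative assumptions, not the paper's model.

```python
# Hedged sketch of a BSPI-style weighted utility score; all numbers and
# attribute names are hypothetical.
def linear_utility(x, worst, best):
    """Normalize attribute value x to a [0, 1] utility."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

attributes = {                      # (value, worst, best, weight)
    "return_on_equity":    (0.11, 0.0, 0.20, 0.35),
    "client_satisfaction": (0.78, 0.0, 1.00, 0.25),
    "green_lending_share": (0.15, 0.0, 0.50, 0.25),
    "employee_wellbeing":  (0.66, 0.0, 1.00, 0.15),
}
bspi = sum(w * linear_utility(v, lo, hi)
           for v, lo, hi, w in attributes.values())
print(f"BSPI = {bspi:.3f}")   # a basis for ranking banks, as in the abstract
```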
Procedia PDF Downloads 469
15108 Early Design Prediction of Submersible Maneuvers
Authors: Hernani Brinati, Mardel de Conti, Moyses Szajnbok, Valentina Domiciano
Abstract:
This study presents a mathematical model and examples for the numerical prediction of submersible maneuvers in the horizontal and vertical planes. The geometry of the submarine is taken as a body of revolution plus a sail, two horizontal rudders, and two vertical rudders. The model includes the representation of the hull resistance and of the propeller thrust and torque, which makes it possible to consider the variation of the longitudinal component of the ship's velocity when maneuvering. The hydrodynamic forces are represented through power series expansions of the acceleration and velocity components. The hydrodynamic derivatives for the body of revolution are mostly estimated based on fundamental principles applicable to the flow around airplane fuselages in the subsonic regime. The hydrodynamic forces for the sail and rudders are estimated based on finite-aspect-ratio wing theory. The objective of this study is to build an expeditious model for submarine maneuver prediction, based on fundamental principles, which may be convenient in the early stages of ship design. The model is tested against available numerical and experimental data.
Keywords: submarine maneuvers, submarine, maneuvering, dynamics
Procedia PDF Downloads 638
15107 Median-Based Nonparametric Estimation of Returns in Mean-Downside Risk Portfolio Frontier
Authors: H. Ben Salah, A. Gannoun, C. de Peretti, A. Trabelsi
Abstract:
The Downside Risk (DSR) model for portfolio optimization makes it possible to overcome the drawbacks of the classical mean-variance model concerning the asymmetry of returns and the risk perception of investors. The optimization of this model deals with a positive definite matrix that is endogenous with respect to the portfolio weights, an aspect that makes the problem far more difficult to handle. For this purpose, Athayde (2001) developed a new recursive minimization procedure that ensures convergence to the solution. However, when only a finite number of observations is available, the portfolio frontier is not very smooth. To overcome this, Athayde (2003) proposed a mean kernel estimation of the returns so as to create a smoother portfolio frontier; this technique provides an effect similar to the case in which continuous observations are available. In this paper, taking advantage of the robustness of the median, we replace the mean estimator in Athayde's model by a nonparametric median estimator of the returns. We then give a new version of the former algorithm (of Athayde (2001, 2003)), analyse the properties of the resulting improved portfolio frontier, and apply the new method to real examples.
Keywords: downside risk, kernel method, median, nonparametric estimation, semivariance
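The downside-risk objective itself is compact: the portfolio semivariance below a benchmark return. It is endogenous because the set of "downside" periods depends on the weights themselves, which is the difficulty the recursive procedure addresses. A minimal sketch on synthetic returns:

```python
# Hedged sketch of the downside semivariance objective; returns are
# synthetic, not market data.
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0.005, 0.04, size=(250, 3))   # 250 periods, 3 assets

def downside_semivariance(weights, returns, benchmark=0.0):
    port = returns @ weights
    shortfall = np.minimum(port - benchmark, 0.0)  # only below-benchmark periods
    return np.mean(shortfall**2)

w = np.array([0.4, 0.4, 0.2])
print(f"downside semivariance: {downside_semivariance(w, returns):.6f}")
```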
Procedia PDF Downloads 493
15106 Optimized Text Summarization Model on Mobile Screens for Sight-Interpreters: An Empirical Study
Authors: Jianhua Wang
Abstract:
To obtain key information quickly from long texts on the small screens of mobile devices, sight-interpreters need an optimized summarization model for fast information retrieval. Four summarization models based on previous studies were examined: title+key words (TKW), title+topic sentences (TTS), key words+topic sentences (KWTS), and title+key words+topic sentences (TKWTS). Psychological experiments were conducted on the four models for three different genres of interpreting texts to establish the optimized summarization model for sight-interpreters. This empirical study shows that the optimized summarization model for sight-interpreters to quickly grasp the key information of the texts they interpret is title+key words (TKW) for cultural texts, title+key words+topic sentences (TKWTS) for economic texts, and key words+topic sentences (KWTS) for political texts.
Keywords: different genres, mobile screens, optimized summarization models, sight-interpreters
Procedia PDF Downloads 316
15105 Overcoming the Impacts of COVID-19 Outbreak Using Value Integrated Project Delivery Model
Authors: G. Ramya
Abstract:
Value engineering is a systematic approach, widely used to optimize the design, process, or product in the design stage. It is used to meet the client's requirements by increasing functionality and attaining the target cost in cost planning. The effectiveness and benefits of value engineering decrease as the project progresses, since changes in the scope of work and design account for more cost over the lifecycle of the project. Integrating value engineering with other project management activities will promote cost minimization and client satisfaction and ensure early completion of the project. Previous research has suggested that value engineering can be integrated with other project delivery activities but has been unable to frame a model that combines project management activities with the job plan of the value engineering approach. This study analyzed various project management activities and the synergies between them. Project management activities and processes such as (a) risk analysis, (b) lifecycle cost analysis, (c) lean construction, (d) facility management, (e) building information modelling, and (f) contract administration were brought together, and a project delivery model was planned along the RIBA Plan of Work. The key outcome of the research is a value-driven project delivery model, which will succeed in dealing with the economic impact, constraints, and conflicts arising from the COVID-19 outbreak in the Indian construction sector. Benefits associated with the structured framework, such as construction project delivery that ensures early contractor involvement, mutual risk sharing, and bringing a project with cost overrun and delay back on track, are discussed.
Keywords: value-driven project delivery model, integration, RIBA plan of work
Procedia PDF Downloads 122
15104 A Numerical Method to Evaluate the Elastoplastic Material Properties of Fiber Reinforced Composite
Authors: M. Palizvan, M. H. Sadr, M. T. Abadi
Abstract:
The representative volume element (RVE) plays a central role in the mechanics of random heterogeneous materials with a view to predicting their effective properties. In this paper, a computational homogenization methodology, developed to determine the effective linear elastic properties of composite materials, is extended to predict the effective nonlinear elastoplastic response of long fiber reinforced composites. Finite element simulations of volumes of different sizes and fiber volume fractions are performed to calculate the overall response of the RVE. The dependence of the overall stress-strain curves on the number of fibers inside the RVE is studied in the 2D case. Volume-averaged stress-strain responses are generated from the RVEs and compared with finite element calculations available in the literature at moderate and high fiber volume fractions. For these materials, the existence of an RVE is demonstrated for RVE sizes corresponding to 10-100 times the diameter of the fibers. In addition, the response of small RVEs is found to be anisotropic, whereas averaging over all large ones recovers the isotropic material properties.
Keywords: homogenization, periodic boundary condition, elastoplastic properties, RVE
Procedia PDF Downloads 156
15103 Model for Calculating Traffic Mass and Deceleration Delays Based on Traffic Field Theory
Authors: Liu Canqi, Zeng Junsheng
Abstract:
This study identifies two typical bottlenecks that occur when a vehicle cannot change lanes: car following and car stopping. The ideas of the traffic field and traffic mass are presented in this work. When there are other vehicles in front of the target vehicle within a particular distance, a force is created that affects the target vehicle's driving speed. The characteristics of the driver and the vehicle collectively determine the traffic mass; the driving speed of the vehicle and external variables have no bearing on it. At the physical level, this study examines the vehicle's bottleneck when following a car, identifies the outside factors that affect how it drives, takes into account that the vehicle transforms kinetic energy into potential energy during deceleration, and builds a calculation model for traffic mass. From an economic standpoint, the energy-time conversion coefficient is constructed using the social average wage level and the average cost of motor fuel. The Vissim simulation program is used to measure the vehicle's deceleration distance and delay under the Wiedemann car-following model. Using the conversion model between traffic mass and deceleration delay, the theoretical deceleration delay calculated by the model is compared with the value measured in simulation. The experimental data demonstrate that the model is reliable, since the error rate between the theoretical value of the deceleration delay obtained by the model and the measured value from the simulation results is less than 10%. The article concludes that the traffic field has an impact on moving cars on the road and that physical and socioeconomic factors should be taken into account when studying vehicle-following behavior. The deceleration delay of a vehicle and its traffic mass have a socioeconomic relationship that can be utilized to calculate the energy-time conversion coefficient when dealing with the bottleneck of cars stopping and starting.
Keywords: traffic field, social economics, traffic mass, bottleneck, deceleration delay
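One reading of the energy-time accounting is that a deceleration event loses kinetic energy (monetized through fuel cost) and adds delay (monetized through the average wage), with the ratio of the two unit costs playing the role of an energy-time conversion coefficient. The sketch below is heavily hedged: it is an interpretation of the abstract, and all constants are illustrative assumptions.

```python
# Heavily hedged sketch of energy-time accounting for a deceleration
# event; unit costs and vehicle figures are invented placeholders.
def deceleration_cost(mass_kg, v0, v1, delay_s,
                      fuel_cost_per_mj=0.05, wage_per_hour=8.0):
    kinetic_loss_mj = 0.5 * mass_kg * (v0**2 - v1**2) / 1e6  # KE lost (MJ)
    energy_cost = kinetic_loss_mj * fuel_cost_per_mj         # monetized energy
    time_cost = delay_s / 3600 * wage_per_hour               # monetized delay
    # Energy "worth" one second of delay, in MJ per second:
    coeff = (wage_per_hour / 3600) / fuel_cost_per_mj
    return energy_cost + time_cost, coeff

total, k = deceleration_cost(mass_kg=1500, v0=16.7, v1=0.0, delay_s=25)
print(f"event cost: {total:.4f} currency units; energy-time coeff: {k:.5f} MJ/s")
```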
Procedia PDF Downloads 69