Search results for: conceptual and funnel methods
14573 Co-Creating Value between Public Financial Management Institutions: An Integrated Approach towards Financial Sustainability
Authors: Pascal Horni, Sandro Fuchs
Abstract:
In the presence of increasing deficits and public debt among OECD countries, the debate on fiscal discipline and mechanisms to constrain public spending policy has heated up and given rise to the institutionalization of fiscal rules. Considering the notions from political economy literature and the therein advocated axiom of vote maximization, the introduction of institutional mechanisms and rules to govern public spending is likely to be coined by electoral motives. While there is a series of research concerned with the rise of creative accounting in the presence of fiscal rules, the implementation of accrual government accounting and its impact on the biting of fiscal rules has, to the authors’ best knowledge, never been explored. This paper illuminates the connection between debt brake mechanisms and the adoption of accrual public sector accounting standards such as the IPSAS at the interface of political economy in the Swiss context. By explicitly considering the technical accounting dimension, this paper develops an integrated conceptual view on well-established Public Financial Management (PFM) institutions and elaborates how their interdependencies can co-create value with regard to the contemporary challenge of fiscal sustainability. Derivation of this integrated view follows an explorative approach, taking into account expert interviews with director-level staff from cantonal finance administrations and policy documents, as well as literature from both research areas – public sector accounting and political economy.
Keywords: accounting, fiscal rules, International Public Sector Accounting Standards (IPSAS), public financial management
Procedia PDF Downloads 159
14572 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-dimension video/image to a high-dimension video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides better edge preservation at very low complexity, so that real-time processing of video frames becomes possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, which places them very far from real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions and edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculation of slopes using the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges.
Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
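The split described above between simple line averaging for uniform regions and slope-based interpolation for edges can be illustrated with the first half of that pipeline. The sketch below is a hypothetical minimal implementation, not taken from the paper: it doubles a grayscale image using plain line averaging only, and the Prewitt edge separation and gradient-orientation slope step are omitted.

```python
def upscale_2x_line_average(img):
    """Upscale a grayscale image (nested lists) to roughly double size.

    Known pixels are copied onto even coordinates; the unknown pixels in
    between are filled by simple line averaging, the low-complexity step
    the abstract applies to uniform regions.
    """
    h, w = len(img), len(img[0])
    H, W = 2 * h - 1, 2 * w - 1
    out = [[0.0] * W for _ in range(H)]
    # Copy the known pixels onto even (y, x) coordinates.
    for y in range(h):
        for x in range(w):
            out[2 * y][2 * x] = float(img[y][x])
    # Fill unknown pixels between known ones along each known row.
    for y in range(0, H, 2):
        for x in range(1, W, 2):
            out[y][x] = (out[y][x - 1] + out[y][x + 1]) / 2
    # Fill the remaining rows by averaging the rows above and below.
    for y in range(1, H, 2):
        for x in range(W):
            out[y][x] = (out[y - 1][x] + out[y + 1][x]) / 2
    return out
```

For example, upscaling the 2x2 image `[[0, 2], [4, 6]]` yields a 3x3 image whose middle row and column are averages of their neighbors.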
Procedia PDF Downloads 260
14571 Comparison of Receiver Operating Characteristic Curve Smoothing Methods
Authors: D. Sigirli
Abstract:
The Receiver Operating Characteristic (ROC) curve is a commonly used statistical tool for evaluating the diagnostic performance of screening and diagnostic tests with continuous or ordinal scale results which aim to predict the presence or absence of a condition, usually a disease. When the test results are measured as numeric values, sensitivity and specificity can be computed across all possible threshold values which discriminate the subjects as diseased or non-diseased. There are infinitely many possible decision thresholds along the continuum of the test results. The ROC curve presents the trade-off between sensitivity and 1-specificity as the threshold changes. The empirical ROC curve, a non-parametric estimator of the ROC curve, is robust and represents the data accurately. However, especially for small sample sizes, it suffers from variability, and since it is a step function, there can be different false positive rates for one true positive rate value and vice versa. Moreover, because the estimated ROC curve takes a jagged form while the true ROC curve is smooth, it underestimates the true ROC curve. Since the true ROC curve is assumed to be smooth, several smoothing methods have been explored. These include using kernel estimates, using log-concave densities, fitting the parameters of a specified density function to the data with maximum-likelihood fitting of univariate distributions, or creating a probability distribution by fitting the specified distribution to the data and using smooth versions of the empirical distribution functions. In the present paper, we aimed to propose a smooth ROC curve estimation based on a boundary-corrected kernel function and to compare the performances of ROC curve smoothing methods for diagnostic test results coming from different distributions in different sample sizes.
We performed a simulation study to compare the performances of the different methods for different scenarios with 1000 repetitions. The performance of the proposed method was typically better than that of the empirical ROC curve and only slightly worse than that of the binormal model when the underlying samples were in fact generated from the normal distribution.
Keywords: empirical estimator, kernel function, smoothing, receiver operating characteristic curve
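The contrast between the empirical step-function ROC and a kernel-smoothed estimate can be sketched as follows. This is a minimal illustration using a plain Gaussian kernel, not the boundary-corrected kernel the paper proposes; the scores and bandwidth below are hypothetical.

```python
import math


def empirical_roc(neg, pos, thresholds):
    """Empirical (FPR, TPR) pairs: indicator counts give a step-function ROC."""
    pts = []
    for t in thresholds:
        tpr = sum(s > t for s in pos) / len(pos)  # sensitivity
        fpr = sum(s > t for s in neg) / len(neg)  # 1 - specificity
        pts.append((fpr, tpr))
    return pts


def kernel_survival(x, sample, h):
    """Smoothed P(score > x): average of Gaussian kernel CDFs, subtracted from 1."""
    phi = lambda z: 0.5 * (1 + math.erf(z / math.sqrt(2)))
    return 1 - sum(phi((x - s) / h) for s in sample) / len(sample)


def smooth_roc(neg, pos, thresholds, h=0.5):
    """Kernel-smoothed ROC: replace indicator counts with kernel CDF estimates."""
    return [(kernel_survival(t, neg, h), kernel_survival(t, pos, h))
            for t in thresholds]
```

At a threshold between the two groups, the empirical curve jumps straight to (0, 1), while the smoothed estimate gives intermediate FPR and TPR values that vary continuously with the threshold.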
Procedia PDF Downloads 152
14570 The Economic Limitations of Defining Data Ownership Rights
Authors: Kacper Tomasz Kröber-Mulawa
Abstract:
This paper will address the topic of data ownership from an economic perspective; examples of the economic limitations of data property rights, identified using the methods and approaches of the economic analysis of law, will be provided. To properly build a background for the economic focus, a short perspective on data and data ownership in the EU’s legal system will first be provided. It will include a short introduction to their political and social importance and highlight relevant viewpoints. This will stress the importance of a Single Market for data but also far-reaching regulations of data governance and privacy (including the distinction between personal and non-personal data, and between data held by public bodies and private businesses). The main discussion of this paper will build upon this legal basis as well as the methods and approaches of the economic analysis of law.
Keywords: antitrust, data, data ownership, digital economy, property rights
Procedia PDF Downloads 82
14569 Cultural and Historical Roots of Plagiarism in Georgia
Authors: Lali Khurtsia, Vano Tsertsvadze
Abstract:
The purpose of the study was to find out the incentives and expectations, methods and ways that influence students while working on their theses. The research findings show that the use of plagiarism has cultural links deep in history: on the one hand, the tradition of sharing knowledge orally, with its different interpretations, and on the other hand, the lack of fair and honest methods in the academic process. The research results allow us to outline a general preventive policy to reduce the use of plagiarism. We conducted surveys in three different groups: we interviewed so-called diploma writers, students at the bachelor’s and master’s levels, and a focus group of lecturers. We found that the problem of plagiarism in Georgia has a cultural and mental character. We think that the main task for the coming years should be breaking down the barriers that exist between lecturers and students and establishing honest principles of the study process among students and pupils.
Keywords: education, Georgia, plagiarism, study process, school, university
Procedia PDF Downloads 229
14568 Advanced Numerical and Analytical Methods for Assessing Concrete Sewers and Their Remaining Service Life
Authors: Amir Alani, Mojtaba Mahmoodian, Anna Romanova, Asaad Faramarzi
Abstract:
Pipelines are extensively used engineering structures which convey fluid from one place to another. Most of the time, pipelines are placed underground and are loaded by soil weight and traffic loads. Corrosion of the pipe material is the most common form of pipeline deterioration and should be considered in both the strength and serviceability analysis of pipes. This research focuses on concrete pipes in sewage systems (concrete sewers). It first investigates how to incorporate the effect of corrosion as a time-dependent deterioration process into the structural and failure analysis of this type of pipe. Then three probabilistic time-dependent reliability analysis methods, namely the first passage probability theory, the gamma distributed degradation model and the Monte Carlo simulation technique, are discussed and developed. Sensitivity analysis indexes which can be used to identify the most important parameters affecting pipe failure are also discussed. The reliability analysis methods developed in this paper serve as rational tools for decision makers with regard to the strengthening and rehabilitation of existing pipelines. The results can be used to obtain a cost-effective strategy for the management of the sewer system.
Keywords: reliability analysis, service life prediction, Monte Carlo simulation method, first passage probability theory, gamma distributed degradation model
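The combination of a gamma distributed degradation model with Monte Carlo simulation can be sketched as below. All parameters (wall thickness, gamma shape and scale per year) are hypothetical placeholders, not values from the study; the sketch only illustrates how a time-dependent failure probability is estimated.

```python
import random


def failure_probability(t_years, wall_mm=50.0, shape_per_year=2.0,
                        scale_mm=0.6, n_sims=20000, seed=42):
    """Monte Carlo estimate of P(corrosion depth >= wall thickness by time t).

    Cumulative corrosion is modelled as a gamma process: the depth at time t
    follows Gamma(shape_per_year * t, scale_mm), so the mean loss grows
    linearly in time while the uncertainty grows with it.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible estimate
    failures = 0
    for _ in range(n_sims):
        depth = rng.gammavariate(shape_per_year * t_years, scale_mm)
        if depth >= wall_mm:
            failures += 1
    return failures / n_sims
```

With these placeholder parameters the mean corrosion depth is 1.2 mm per year, so the estimated failure probability is near zero at 10 years and close to one at 60 years; sweeping `t_years` traces out the time-dependent reliability curve that such methods produce.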
Procedia PDF Downloads 457
14567 A Methodology for Automatic Diversification of Document Categories
Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, numerous documents including unstructured data and text have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, this categorization was performed manually. However, manual categorization not only fails to guarantee accuracy but also requires a large amount of time and huge costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics, because they assume that one document can be categorized into only one category. To overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, they are also limited in that their learning process involves training on a multi-categorized document set. These methods therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To overcome the traditional multi-categorization algorithms’ requirement of a multi-categorized training set, we previously proposed a new methodology that can extend the category of a single-categorized document to multiple categories by analyzing the relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Keywords: big data analysis, document classification, multi-category, text mining, topic analysis
Procedia PDF Downloads 272
14566 The Differences in Skill Performance Between Online and Conventional Learning Among Nursing Students
Authors: Nurul Nadrah
Abstract:
As a result of the COVID-19 pandemic, a movement control order was implemented, leading to the adoption of online learning as a substitute for conventional classroom instruction. This study aims to determine the differences in skill performance between online learning and conventional methods among nursing students. We employed a quasi-experimental design with purposive sampling, involving a total of 59 nursing students, and used online learning as the intervention. The study found a significant difference in student skill performance between online learning and conventional methods. In conclusion, in times of hardship, it is necessary to implement alternative pedagogical approaches, especially in critical fields like nursing, to ensure the uninterrupted progression of educational programs. This study suggests that online learning can be effectively employed as a means of imparting knowledge to nursing students during their training.
Keywords: nursing education, online learning, skill performance, conventional learning method
Procedia PDF Downloads 49
14565 Advanced Concrete Crack Detection Using Light-Weight MobileNetV2 Neural Network
Authors: Li Hui, Riyadh Hindi
Abstract:
Concrete structures frequently suffer from crack formation, a critical issue that can significantly reduce their lifespan by allowing damaging agents to enter. Traditional methods of crack detection depend on manual visual inspections, which rely heavily on the experience and expertise of inspectors and their tools. In this study, a more efficient, computer vision-based approach is introduced using the lightweight MobileNetV2 neural network. A dataset of 40,000 images was used to develop a specialized crack evaluation algorithm. The analysis indicates that MobileNetV2 matches the accuracy of traditional CNN methods but is more efficient due to its smaller size, making it well suited for mobile device applications. The effectiveness and reliability of this new method were validated through experimental testing, highlighting its potential as an automated solution for crack detection in concrete structures.
Keywords: concrete crack, computer vision, deep learning, MobileNetV2 neural network
Procedia PDF Downloads 66
14564 Modelling and Simulation of Biomass Pyrolysis
Authors: P. Ahuja, K. S. S. Sai Krishna
Abstract:
There is a concern over energy shortage in modern societies, as energy is one of the primary necessities. Renewable energy, mainly biomass, is found to be a feasible solution, as it is an inexhaustible and clean energy source available all over the world. Of the various methods, thermo-chemical conversion is considered the most common and convenient way to extract energy from biomass. The thermo-chemical methods employed are gasification, liquefaction and combustion. On gasification, biomass yields biogas; on liquefaction, biomass yields bio-oil; and on combustion, biomass yields bio-char. Any attempt at biomass gasification, liquefaction or combustion calls for a good understanding of biomass pyrolysis, so irrespective of the method used, the first step towards the thermo-chemical treatment of biomass is pyrolysis. Pyrolysis mainly converts the solid mass into liquid, with gas and residual char as the byproducts. The liquid is used for the production of heat, power and many other chemicals, whereas the gas and char can be used as fuels to generate heat.
Keywords: biomass, fluidisation, pyrolysis, simulation
Procedia PDF Downloads 342
14563 Analysis and Forecasting of Bitcoin Price Using Exogenous Data
Authors: J-C. Leneveu, A. Chereau, L. Mansart, T. Mesbah, M. Wyka
Abstract:
Extracting and interpreting information from Big Data will remain a challenge for years to come in several sectors, such as finance. Currently, numerous methods (such as Technical Analysis) are used to try to understand and anticipate market behavior, with mixed results, because it still seems impossible to exactly predict a financial trend. The increase of available data on the Internet and their diversity represent a great opportunity for the financial world. Indeed, along with standard financial data, it is possible to focus on exogenous data to take more macroeconomic factors into account. Coupling the interpretation of these data with standard methods could allow more precise trend predictions. In this paper, in order to observe the influence of exogenous data on price, independent of other usual effects occurring in classical markets, the behaviors of Bitcoin users are introduced into a model reconstituting Bitcoin value, which is elaborated and tested for prediction purposes.
Keywords: big data, bitcoin, data mining, social network, financial trends, exogenous data, global economy, behavioral finance
Procedia PDF Downloads 355
14562 Knowledge, Attitude and Practice of the Congolese Population from Basic Territorial Entities on Family Planning: A Forgotten Issue. Case of Murara Sector (City of Goma, Democratic Republic of Congo)
Authors: Mwamba Mwamini Ruth
Abstract:
For many authors, the percentage of married or in-union persons using family planning methods has increased significantly since the 1960s. Despite this progress, important differences across regions are observed. These differences become even greater, to the point of presenting a paradox, when studying the issue in the smallest territorial entities in developing countries. In line with the above, the general objective of this research is to investigate the knowledge, attitude and practice of households from a basic territorial entity, here the Murara Sector (in the city of Goma, province of North Kivu, Democratic Republic of Congo, Africa), regarding family planning (as defined and provisioned by the four key World Health Organization (WHO) texts on the matter).
Keywords: DRC, family planning methods, information technology, Murara
Procedia PDF Downloads 139
14561 Research on the Planning and Design of National Park Gateway Communities from the Perspective of Nature Education
Authors: Yulin Liang
Abstract:
Against the background of ecological protection, nature education is an effective way for people to understand nature. At the same time, it is a new means of sustainable development for eco-tourism, which can improve the functions of China's protected areas and develop new business formats for national parks. This study takes national park gateway communities as the research object and uses literature review, inductive reasoning and other research methods to trace the development process of nature education in China and the research progress on nature education design in national park gateway communities. Finally, we discuss how gateway communities can use nature education to transform their development methods, and we provide a theoretical and practical basis for the development of gateway communities in national parks.
Keywords: natural education, gateway communities, national parks, sustainable development
Procedia PDF Downloads 66
14560 Role of Social Media for Institutional Branding: Ethics of Communication Review
Authors: Iva Ariani, Mohammad Alvi Pratama
Abstract:
Currently, the world of communication is experiencing rapid development. Many means of communication are utilized in line with the development of science, which has created many technologies that encourage the rapid development of communication systems. However, despite giving convenience to society, the development of communication systems has not been accompanied by the development of applicable values and regulations. Therefore, it raises many issues regarding false information, or hoaxes, which can influence society’s mindset. This research aims to understand the role of social media in the reputation of an institution through a communication ethics study. It is qualitative research using interviews, observation, and literature study for collecting data. The data will then be analyzed using philosophical methods, namely hermeneutics and deduction. This research is expected to show the role of social media in developing an institutional reputation from an ethical review perspective.
Keywords: social media, ethics, communication, reputation
Procedia PDF Downloads 207
14559 Morphological Characteristics and Pollination Requirement in Red Pitaya (Hylocereus Spp.)
Authors: Dinh Ha, Tran, Chung-Ruey Yen
Abstract:
This study explored the morphological characteristics and the effects of pollination methods on fruit set and fruit characteristics in four red pitaya (Hylocereus spp.) clones. Distinctive morphological recognition and classification among the pitaya clones were confirmed by stem, flower and fruit features. The fruit production season ran from the beginning of May to the end of August or the beginning of September, with 6-7 flowering cycles per year. The floral stage lasted 15-19 days and fruit development took 30-32 days. VN White, which is fully self-compatible, obtained high fruit set rates (80.0-90.5%) in all pollination treatments and the maximum fruit weight (402.6 g) under hand self-pollination and (403.4 g) under open pollination. Chaozhou 5 was partially self-compatible, while Orejona and F11 were completely self-incompatible. Hand cross-pollination significantly increased fruit set (95.8, 88.4 and 90.2%) and fruit weight (374.2, 281.8 and 416.3 g) in Chaozhou 5, Orejona, and F11, respectively. TSS contents were not much influenced by the pollination methods.
Keywords: Hylocereus spp., morphology, floral phenology, pollination requirement
Procedia PDF Downloads 304
14558 The Mechanical Properties of a Small-Size Seismic Isolation Rubber Bearing for Bridges
Authors: Yi F. Wu, Ai Q. Li, Hao Wang
Abstract:
Taking a novel type of bridge bearing with a diameter of 100 mm as an example, theoretical analysis, experimental research and numerical simulation of the bearing were conducted. Since normal compression-shear machines cannot be applied to such a small-size bearing, an improved device to test the properties of the bearing was proposed and fabricated. In addition, the simulation of the bearing was conducted on the basis of the explicit finite element software ANSYS/LS-DYNA, and some parameters of the bearing were modified in the finite element model to effectively reduce the computation cost. Results show that all the research methods are capable of revealing the fundamental properties of small-size bearings, and a combined use of these methods can better capture both the integral properties and the detailed inner mechanical behaviors of the bearing.
Keywords: ANSYS/LS-DYNA, compression shear, contact analysis, explicit algorithm, small-size
Procedia PDF Downloads 181
14557 A Review of Data Visualization Best Practices: Lessons for Open Government Data Portals
Authors: Bahareh Ansari
Abstract:
Background: The Open Government Data (OGD) movement in the last decade has encouraged many government organizations around the world to make their data publicly available to advance democratic processes. However, current open data platforms have not yet reached their full potential in supporting all interested parties. To make the data useful and understandable for everyone, scholars have suggested that opening the data should be supplemented by visualization. However, different visualizations of the same information can dramatically change an individual’s cognitive and emotional experience in working with the data. This study reviews the data visualization literature to create a list of the methods empirically tested to enhance users’ performance and experience in working with a visualization tool. This list can be used in evaluating OGD visualization practices and informing future open data initiatives. Methods: Previous reviews of the visualization literature categorized visualization outcomes into four categories: recall/memorability, insight/comprehension, engagement, and enjoyment. To identify the papers, a search for these outcomes was conducted in the abstracts of the publications of top-tier visualization venues, including IEEE Transactions on Visualization and Computer Graphics, Computer Graphics, and the proceedings of the CHI Conference on Human Factors in Computing Systems. The search results were complemented with a search in the references of the identified articles, and a search for the keywords 'open data visualization' and 'visualization evaluation' in the IEEE Xplore and ACM digital libraries. Articles are included if they provide empirical evidence through controlled user experiments or provide a review of such empirical studies. The qualitative synthesis of the studies focuses on identifying and classifying the methods, and the conditions under which they are shown to positively affect the visualization outcomes.
Findings: The keyword search yields 760 studies, of which 30 are included after the title/abstract review. The classification of the included articles shows five distinct methods: interactive design, aesthetic (artistic) style, storytelling, decorative elements that do not provide extra information (including text, images, and embellishments on the graphs), and animation. Studies on decorative elements consistently show positive effects of these elements on user engagement and recall, but are less consistent in their examination of user performance. This inconsistency could be attributable to the particular data type or specific design method used in each study. The interactive design studies are consistent in their findings of a positive effect on the outcomes. Storytelling studies show some inconsistencies regarding the design effect on user engagement, enjoyment, recall, and performance, which could be indicative of the specific conditions required for the use of this method. The last two methods, aesthetics and animation, are less frequent in the included articles and provide consistent positive results on some of the outcomes. Implications for e-government: The review of visualization best-practice methods shows that each of these methods is beneficial under specific conditions. By using these methods under potentially beneficial conditions, OGD practices can encourage a wide range of individuals to engage with government data and ultimately take part in government policy-making procedures.
Keywords: best practices, data visualization, literature review, open government data
Procedia PDF Downloads 106
14556 Mining Multicity Urban Data for Sustainable Population Relocation
Authors: Xu Du, Aparna S. Varde
Abstract:
In this research, we propose to conduct diagnostic and predictive analysis of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models extract urban development trends as land use change patterns from a variety of data sources. The results are treated as part of urban big data together with other information such as population change and economic conditions. Multiple data mining methods are deployed on these data to analyze nonlinear relationships between parameters. The result determines the driving forces of population relocation with respect to urban sprawl and urban sustainability and their related parameters. Experiments so far reveal that the data mining methods discover useful knowledge from the multicity urban data. This work sets the stage for developing a comprehensive urban simulation model catering to specific questions by targeted users. It contributes towards achieving sustainability as a whole.
Keywords: data mining, environmental modeling, sustainability, urban planning
Procedia PDF Downloads 308
14555 Innovative Technologies Functional Methods of Dental Research
Authors: Sergey N. Ermoliev, Margarita A. Belousova, Aida D. Goncharenko
Abstract:
Application of a diagnostic complex of highly informative functional methods (electromyography, reodentography, laser Doppler flowmetry, reoperiodontography, vital computer capillaroscopy, optical tissue oximetry, laser fluorescence diagnosis) makes it possible to perform a multifactorial analysis of the dental status and to prescribe complex etiopathogenetic treatment. Introduction. It is necessary to create a complex of innovative, highly informative and safe functional diagnostic methods to improve the quality of patient treatment through the early detection of stomatologic diseases. The purpose of the present study was to investigate the etiology and pathogenesis of functional disorders identified in the pathology of hard tissue, dental pulp, periodontium, oral mucosa and chewing function, and to create new approaches to the diagnosis of dental diseases. Material and methods. 172 patients were examined. The density of the hard tissues of the teeth and jaw bone was studied by intraoral ultrasonic densitometry (USD). The electromyographic activity of the masticatory muscles was assessed by electromyography (EMG). The functional state of the dental pulp vessels was assessed by reodentography (RDG) and laser Doppler flowmetry (LDF). The reoperiodontography method (RPG) was used to study regional blood flow in the periodontal tissues. The periodontal microcirculation was studied by vital computer capillaroscopy (VCC) and laser Doppler flowmetry (LDF). The metabolic level of the mucous membrane was determined by optical tissue oximetry (OTO) and laser fluorescence diagnosis (LFD). Results and discussion. The results obtained revealed changes in the mineral density of the hard tissues of the teeth and jaw bone, the bioelectric activity of the masticatory muscles, and the regional blood flow and microcirculation in the dental pulp and periodontal tissues. The LDF and OTO methods estimated fluctuations of the saturation level and oxygen transport in the microvasculature of the periodontal tissues.
With LFD, changes were identified in the concentration of enzymes (nicotinamide, flavins, lipofuscin, porphyrins) involved in metabolic processes. Conclusion. Our preliminary results confirmed the feasibility and safety of the intraoral ultrasound densitometry technique for assessing the density of periodontal bone tissue. Application of the diagnostic complex of the above-mentioned highly informative functional methods allows a multifactorial analysis of the dental status and the prescription of complex etiopathogenetic treatment.
Keywords: electromyography (EMG), reodentography (RDG), laser Doppler flowmetry (LDF), reoperiodontography method (RPG), vital computer capillaroscopy (VCC), optical tissue oximetry (OTO), laser fluorescence diagnosis (LFD)
Procedia PDF Downloads 280
14554 Multiscale Modelling of Textile Reinforced Concrete: A Literature Review
Authors: Anicet Dansou
Abstract:
Textile reinforced concrete (TRC)is increasingly used nowadays in various fields, in particular civil engineering, where it is mainly used for the reinforcement of damaged reinforced concrete structures. TRC is a composite material composed of multi- or uni-axial textile reinforcements coupled with a fine-grained cementitious matrix. The TRC composite is an alternative solution to the traditional Fiber Reinforcement Polymer (FRP) composite. It has good mechanical performance and better temperature stability but also, it makes it possible to meet the criteria of sustainable development better.TRCs are highly anisotropic composite materials with nonlinear hardening behavior; their macroscopic behavior depends on multi-scale mechanisms. The characterization of these materials through numerical simulation has been the subject of many studies. Since TRCs are multiscale material by definition, numerical multi-scale approaches have emerged as one of the most suitable methods for the simulation of TRCs. They aim to incorporate information pertaining to microscale constitute behavior, mesoscale behavior, and macro-scale structure response within a unified model that enables rapid simulation of structures. The computational costs are hence significantly reduced compared to standard simulation at a fine scale. The fine scale information can be implicitly introduced in the macro scale model: approaches of this type are called non-classical. A representative volume element is defined, and the fine scale information are homogenized over it. Analytical and computational homogenization and nested mesh methods belong to these approaches. On the other hand, in classical approaches, the fine scale information are explicitly introduced in the macro scale model. Such approaches pertain to adaptive mesh refinement strategies, sub-modelling, domain decomposition, and multigrid methods This research presents the main principles of numerical multiscale approaches. 
Advantages and limitations are identified according to several criteria: the assumptions made (fidelity), the number of input parameters required, the calculation costs (efficiency), etc. A bibliographic study of recent results and advances, and of the scientific obstacles to be overcome in order to achieve an effective simulation of textile reinforced concrete in civil engineering, is presented. A comparative study is further carried out between several methods for the simulation of TRCs used for the structural reinforcement of reinforced concrete structures.
Keywords: composite structures, multiscale methods, numerical modeling, textile reinforced concrete
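As a minimal illustration of the analytical homogenization mentioned in the abstract, the sketch below computes the classical Voigt and Reuss bounds on effective stiffness over a representative volume element. The moduli and the reinforcement volume fraction are illustrative assumptions, not values from the study:

```python
def voigt_reuss_bounds(E_f, E_m, v_f):
    # Analytical homogenization over a representative volume element:
    # Voigt (iso-strain) gives an upper bound and Reuss (iso-stress) a
    # lower bound on the effective stiffness of a textile/matrix
    # composite with reinforcement volume fraction v_f.
    v_m = 1.0 - v_f
    E_voigt = v_f * E_f + v_m * E_m          # rule of mixtures
    E_reuss = 1.0 / (v_f / E_f + v_m / E_m)  # inverse rule of mixtures
    return E_voigt, E_reuss

# Hypothetical inputs: 70 GPa textile, 30 GPa matrix, 5% reinforcement.
E_upper, E_lower = voigt_reuss_bounds(70.0, 30.0, 0.05)
```

The true effective modulus of the anisotropic composite lies between these two bounds; full multiscale methods narrow this bracket by resolving the meso-scale geometry.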
Procedia PDF Downloads 108
14553 Comparative Study and Parallel Implementation of Stochastic Models for Pricing of European Options Portfolios using Monte Carlo Methods
Authors: Vinayak Bassi, Rajpreet Singh
Abstract:
Over the years, with the emergence of sophisticated computers and algorithms, finance has been quantified using computational prowess. Asset valuation has been one of the key components of quantitative finance. In fact, it has become one of the embryonic steps in determining the risk related to a portfolio, the main goal of quantitative finance. This study draws a comparison between the valuation output generated by two stochastic dynamic models, namely the Black-Scholes model and Dupire's bi-dimensional model. Both of these models are formulated to compute the valuation function for a portfolio of European options using Monte Carlo simulation methods. Although Monte Carlo algorithms have a slower convergence rate than calculus-based simulation techniques (like FDM), they work quite effectively for high-dimensional dynamic models. A fidelity gap is analyzed between the static (historical) and stochastic inputs for a sample portfolio of underlying assets. In order to enhance the performance efficiency of the model, the study emphasizes the use of variance reduction methods and customized random number generators to implement parallelization. An attempt has been made to further implement Dupire's model on a GPU to achieve higher computational performance. Furthermore, ideas are discussed around performance enhancement and bottleneck identification related to the implementation of options-pricing models on GPUs.
Keywords: Monte Carlo, stochastic models, computational finance, parallel programming, scientific computing
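The Monte Carlo valuation and variance reduction ideas above can be sketched for a single European call under Black-Scholes dynamics. This is a minimal illustration, not the paper's portfolio implementation: antithetic variates stand in as one example of a variance reduction method, and the closed-form price is included only as a reference check. All parameter values are illustrative:

```python
import math
import random

def bs_call(S0, K, r, sigma, T):
    # Closed-form Black-Scholes price of a European call (reference value).
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def mc_call(S0, K, r, sigma, T, n_paths=100_000, seed=42):
    # Monte Carlo estimate with antithetic variates: each normal draw z
    # is paired with -z, which reduces the estimator's variance at
    # negligible extra cost.
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma**2) * T
    vol = sigma * math.sqrt(T)
    total = 0.0
    for _ in range(n_paths // 2):
        z = rng.gauss(0.0, 1.0)
        for zz in (z, -z):
            ST = S0 * math.exp(drift + vol * zz)     # terminal asset price
            total += max(ST - K, 0.0)                # call payoff
    return math.exp(-r * T) * total / (2 * (n_paths // 2))
```

With S0 = K = 100, r = 0.05, sigma = 0.2, T = 1, the Monte Carlo estimate lands close to the closed-form value of about 10.45; portfolio pricing simply sums such estimates over the options held.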
Procedia PDF Downloads 162
14552 Evaluation of Quasi-Newton Strategy for Algorithmic Acceleration
Authors: T. Martini, J. M. Martínez
Abstract:
An algorithmic acceleration strategy based on quasi-Newton (or secant) methods is presented to address the practical problem of accelerating the convergence of the Newton-Lagrange method in the case of convergence to critical multipliers. Since the Newton-Lagrange iteration converges locally at a linear rate, it is natural to conjecture that quasi-Newton methods, based on the so-called secant equation and some minimal variation principle, could converge superlinearly, thus restoring the convergence properties of Newton's method. This strategy can also be applied to accelerate the convergence of algorithms applied to fixed-point problems. Computational experience is reported, illustrating the efficiency of this strategy in solving fixed-point problems with a linear convergence rate.
Keywords: algorithmic acceleration, fixed-point problems, nonlinear programming, quasi-Newton method
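The core idea, replacing a derivative with a secant slope to accelerate a linearly convergent iteration, can be illustrated on a one-dimensional fixed-point problem. The sketch below is an illustrative toy, not the Newton-Lagrange setting of the paper: it contrasts plain fixed-point (Picard) iteration with a secant iteration on the residual f(x) = g(x) - x:

```python
def picard(g, x0, tol=1e-10, max_iter=1000):
    # Plain fixed-point iteration x_{k+1} = g(x_k):
    # converges linearly when |g'(x*)| < 1.
    x = x0
    for k in range(max_iter):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter

def secant_accelerated(g, x0, x1, tol=1e-10, max_iter=100):
    # Secant (quasi-Newton) iteration on the residual f(x) = g(x) - x:
    # the secant equation replaces f' with a finite-difference slope,
    # giving superlinear convergence without derivatives.
    f = lambda x: g(x) - x
    f0, f1 = f(x0), f(x1)
    for k in range(max_iter):
        if abs(f1 - f0) < 1e-300:
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2, k + 1
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
    return x1, max_iter
```

On g(x) = cos(x), whose fixed point is about 0.739085, the Picard iteration needs dozens of steps while the secant version converges in a handful, mirroring the linear-versus-superlinear contrast the abstract describes.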
Procedia PDF Downloads 489
14551 Electroencephalogram Based Alzheimer Disease Classification using Machine and Deep Learning Methods
Authors: Carlos Roncero-Parra, Alfonso Parreño-Torres, Jorge Mateo Sotos, Alejandro L. Borja
Abstract:
In this research, different methods based on machine/deep learning algorithms are presented for the classification and diagnosis of patients with mental disorders such as Alzheimer's disease. For this purpose, the signals obtained from 32 unipolar electrodes via non-invasive EEG were examined, and their basic properties were obtained. More specifically, several well-known machine learning classifiers have been used, i.e., support vector machine (SVM), Bayesian linear discriminant analysis (BLDA), decision tree (DT), Gaussian Naïve Bayes (GNB), K-nearest neighbor (KNN), and convolutional neural network (CNN). A total of 668 patients from five different hospitals were studied in the period from 2011 to 2021. The best accuracy obtained was around 93% in both ADM and ADA classifications. It can be concluded that such a classification will enable the training of algorithms that can be used to identify and classify different mental disorders with high accuracy.
Keywords: Alzheimer, machine learning, deep learning, EEG
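As a minimal illustration of one of the classifiers compared (KNN), the sketch below classifies two-feature vectors. The synthetic "band-power" features, class means, and class labels are invented stand-ins, not the study's EEG data or feature set:

```python
import math
import random

def knn_classify(train, query, k=3):
    # Minimal k-nearest-neighbor classifier: find the k training points
    # closest to the query and return the majority label among them.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    votes = [label for _, label in dists[:k]]
    return max(set(votes), key=votes.count)

def make_synthetic_eeg(n_per_class=100, seed=0):
    # Hypothetical stand-in for per-recording EEG features: two classes
    # whose mean feature values differ, with Gaussian scatter.
    rng = random.Random(seed)
    data = []
    for label, (mu1, mu2) in [("control", (1.0, 0.5)),
                              ("patient", (0.6, 0.9))]:
        for _ in range(n_per_class):
            x = (rng.gauss(mu1, 0.1), rng.gauss(mu2, 0.1))
            data.append((x, label))
    return data
```

In practice each electrode contributes several such features, and the real comparison in the paper additionally covers SVM, BLDA, DT, GNB, and CNN models.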
Procedia PDF Downloads 126
14550 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling
Authors: Florin Leon, Silvia Curteanu
Abstract:
Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters are involved, and sometimes the chemical and physical phenomena for mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely the regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge about its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free radical polymerization of methyl methacrylate achieved in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number-average molecular weight, and weight-average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression
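The adaptive sampling idea, placing more samples where the response varies most, can be sketched in one dimension. The bisection rule below is a simple illustrative choice and not necessarily the technique used in the study:

```python
def adaptive_sample(f, a, b, n_init=5, n_total=30):
    # Adaptive sampling sketch: start from a coarse uniform grid over
    # [a, b], then repeatedly bisect the interval whose endpoints show
    # the largest change in f, so regions of high variation (e.g. the
    # sharp gel-effect transition) receive more samples.
    xs = [a + (b - a) * i / (n_init - 1) for i in range(n_init)]
    ys = [f(x) for x in xs]
    while len(xs) < n_total:
        # pick the adjacent pair with the largest |delta y| ...
        i = max(range(len(xs) - 1), key=lambda j: abs(ys[j + 1] - ys[j]))
        # ... and sample its midpoint
        x_new = 0.5 * (xs[i] + xs[i + 1])
        xs.insert(i + 1, x_new)
        ys.insert(i + 1, f(x_new))
    return xs, ys
```

Applied to a response with a steep transition, the routine concentrates most of its sampling budget around the transition, which is exactly where a regression model trained on the samples needs the densest data.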
Procedia PDF Downloads 305
14549 Axial Load Capacity of Drilled Shafts from In-Situ Test Data at Semani Site, in Albania
Authors: Neritan Shkodrani, Klearta Rrushi, Anxhela Shaha
Abstract:
Generally, the design of the axial load capacity of deep foundations is based on data provided by field tests, such as the SPT (Standard Penetration Test) and CPT (Cone Penetration Test). This paper reports the results of an axial load capacity analysis of drilled shafts at a construction site at Semani, in Fier county, Fier prefecture, Albania. In this case, the axial load capacity analyses are based on the data of 416 SPT tests and 12 CPTU tests, which were carried out at this construction site using 12 boreholes (10 borings to a depth of 30.0 m and 2 borings to a depth of 80.0 m). The considered foundation widths range from 0.5 m to 2.5 m and the foundation embedment length is fixed at a value of 25 m. SPT-based analytical methods from the Japanese practice of design (Building Standard Law of Japan) and the CPT-based analytical method of Eslami and Fellenius are used for obtaining the axial ultimate load capacity of drilled shafts. The considered drilled shaft (25 m long and 0.5 m to 2.5 m in diameter) is analyzed for the soil conditions of each borehole. The values obtained from the sets of calculations are shown in different charts. Then the reported axial load capacity values acquired from SPT and CPTU data are compared and some conclusions are drawn related to the mentioned methods of calculation.
Keywords: deep foundations, drilled shafts, axial load capacity, ultimate load capacity, allowable load capacity, SPT test, CPTU test
Procedia PDF Downloads 104
14548 Research on Community-based Nature Education Design at the Gateway Communities of National Parks
Authors: Yulin Liang
Abstract:
Against the background of ecological protection, nature education is an effective way for people to understand nature. At the same time, it is a new means of sustainable development for eco-tourism, which can improve the functions of China's protected areas and develop new business formats for the development of national parks. This study takes national park gateway communities as the research object and uses literature review, inductive reasoning, and other research methods to sort out the development process of nature education in China and the research progress of nature education design in national park gateway communities. Finally, it discusses how gateway communities can use nature education to transform their development methods, providing a theoretical and practical basis for the development of gateway communities in national parks.
Keywords: nature education, gateway communities, national park, sustainable development
Procedia PDF Downloads 61
14547 Evaluation of Antioxidants in Medicinal plant Limoniastrum guyonianum
Authors: Assia Belfar, Mohamed Hadjadj, Messaouda Dakmouche, Zineb Ghiaba
Abstract:
Introduction: This study aims to carry out phytochemical screening, extract the active compounds, and estimate the antioxidant effectiveness of the desert medicinal plant Limoniastrum guyonianum (Zeïta) from southern Algeria. Methods: Total phenolic content and total flavonoid content were determined using the Folin-Ciocalteu and aluminum chloride colorimetric methods, respectively. The total antioxidant capacity was estimated by the following methods: DPPH (1,1-diphenyl-2-picrylhydrazyl radical) and reducing power assay. Results: Phytochemical screening of the plant part reveals the presence of phenols, saponins, flavonoids, and tannins, while alkaloids and terpenoids were absent. The acetonic extract of L. guyonianum was extracted successively with ethyl acetate and butanol. The extraction yield varied widely, ranging from 0.9425% to 11.131%. The total phenolic content ranged from 53.33 mg GAE/g DW to 672.79 mg GAE/g DW. The total flavonoid concentrations varied from 5.45 to 21.71 mg/100 g. IC50 values ranged from 0.02 ± 0.0004 to 0.13 ± 0.002 mg/ml. All extracts showed very good ferric reducing power activity; the highest power was in the butanol fraction (23.91 mM), more effective than BHA, BHT, and VC. Conclusions: This study demonstrated that the acetonic extract of L. guyonianum contains a considerable quantity of phenolic compounds and possesses good antioxidant activity. It can be used as an easily accessible source of natural antioxidants, as a possible food supplement, and in the pharmaceutical industry.
Keywords: Limoniastrum guyonianum, phenolic compounds, flavonoid compounds, antioxidant activity
Procedia PDF Downloads 348
14546 Challenges and Solutions to Human Capital Development in Thailand
Authors: Nhabhat Chaimongkol
Abstract:
Human capital is one of the factors that are vital for economic growth. This is especially true as humans will face increasingly many forms of disruptive technology in the near future. Therefore, there is a need to develop human capital in order to overcome the current uncertainty in the global economy and the future of jobs. In recent years, Thailand has devoted increasing attention to developing its human capital. The Thai government has raised this issue in its national agenda, which is part of its 20-year national strategy. Currently, there are multiple challenges and solutions regarding this issue. This study aims to find out the challenges of and solutions to human capital development in Thailand. The research in this study uses mixed methods consisting of quantitative and qualitative research methods. The results show that while Thailand has many plans to develop human capital, it still lacks the actions and integration required to achieve its goals. Finally, the challenges and solutions are discussed in detail.
Keywords: challenges, human capital, solutions, Thailand
Procedia PDF Downloads 172
14545 Determination of Non-CO2 Greenhouse Gas Emission in Electronics Industry
Authors: Bong Jae Lee, Jeong Il Lee, Hyo Su Kim
Abstract:
Both developed and developing countries adopted the decision to join the Paris Agreement to reduce greenhouse gas (GHG) emissions at the Conference of the Parties (COP) 21 meeting in Paris. As a result, developed and developing countries have to submit their Intended Nationally Determined Contributions (INDC) by 2020, and each country will be assessed for its performance in reducing GHG. After that, they shall propose a reduction target which is higher than the previous target every five years. Therefore, an accurate method for calculating greenhouse gas emissions is essential to be presented as a rationale for implementing GHG reduction measures based on the reduction targets. Non-CO2 GHGs (CF4, NF3, N2O, SF6, and so on) are widely used in the fabrication process of semiconductor manufacturing and the etching/deposition processes of display manufacturing. The Global Warming Potential (GWP) values of non-CO2 gases are much higher than that of CO2, which means they will have a greater effect on global warming than CO2. Therefore, GHG calculation methods for the electronics industry are provided by the Intergovernmental Panel on Climate Change (IPCC) and the U.S. Environmental Protection Agency (EPA), and they will be discussed at the ISO/TC 146 meeting. As discussed earlier, being precise and accurate in calculating non-CO2 GHG is becoming more important. Thus, this study aims to discuss the implications of the calculation methods through comparing the methods of the IPCC and the EPA. In conclusion, after analyzing the methods of the IPCC and the EPA, the method of the EPA is more detailed, and it also provides the calculation for N2O. In the case of the default emission factors (by the IPCC and the EPA), the IPCC provides more conservative results compared to those of the EPA; the factor of the IPCC was developed for calculating a national GHG emission, while the factor of the EPA was specifically developed for the U.S., which means it must have been developed to address the environmental issues of the U.S.
The semiconductor factory ‘A’ measured F-gas according to the EPA Destruction and Removal Efficiency (DRE) protocol and estimated its own DRE, and it was observed that its emission factor shows a higher DRE compared to the default DRE factors of the IPCC and the EPA. Therefore, each country can improve its GHG emission calculation by developing its own emission factor (if possible) at the time of reporting Nationally Determined Contributions (NDC). Acknowledgements: This work was supported by the Korea Evaluation Institute of Industrial Technology (No. 10053589).
Keywords: non-CO2 GHG, GHG emission, electronics industry, measuring method
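The shared skeleton of such F-gas accounting can be sketched as below. This is a heavily simplified, hypothetical formula: the actual IPCC and EPA methods use tiered equations with additional factors (gas heels, by-product formation, process-specific utilization rates), none of which are reproduced here. All parameter names and example values are illustrative assumptions:

```python
def fgas_co2e(fed_kg, use_rate, dre, abatement_uptime, gwp):
    # Simplified, hypothetical F-gas accounting sketch:
    # mass of gas fed to the process, reduced by the share destroyed by
    # abatement equipment (DRE scaled by its uptime), then converted to
    # tonnes of CO2-equivalent via the gas's GWP.
    emitted_kg = fed_kg * use_rate * (1.0 - dre * abatement_uptime)
    return emitted_kg * gwp / 1000.0  # tonnes CO2e
```

A higher measured DRE (as factory ‘A’ reported) directly lowers the computed emissions, which is why site-specific DRE measurement matters when default factors are conservative.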
Procedia PDF Downloads 289
14544 The Effectiveness of Concept Mapping as a Tool for Developing Critical Thinking in Undergraduate Medical Education: A BEME Systematic Review: BEME Guide No. 81
Authors: Marta Fonseca, Pedro Marvão, Beatriz Oliveira, Bruno Heleno, Pedro Carreiro-Martins, Nuno Neuparth, António Rendas
Abstract:
Background: Concept maps (CMs) visually represent hierarchical connections among related ideas. They foster logical organization and clarify idea relationships, potentially aiding medical students in critical thinking (thinking clearly and rationally about what to do or what to believe). However, there are inconsistent claims about the use of CMs in undergraduate medical education. Our three research questions are: 1) What studies have been published on concept mapping in undergraduate medical education? 2) What was the impact of CMs on students' critical thinking? 3) How and why have these interventions had an educational impact? Methods: Eight databases were systematically searched (and a manual and an additional search were also conducted). After eliminating duplicate entries, titles and abstracts, and then full texts, were independently screened by two authors. Data extraction and quality assessment of the studies were independently performed by two authors. Qualitative and quantitative data were integrated using a mixed-methods approach. The results were reported using the STORIES (structured approach to the reporting in healthcare education of evidence synthesis) statement and BEME guidance. Results: Thirty-nine studies were included from 26 journals (19 quantitative, 8 qualitative, and 12 mixed-methods studies). CMs were considered a tool to promote critical thinking, both in the perception of students and tutors and in the assessment of students' knowledge and/or skills. In addition to their role as facilitators of knowledge integration and critical thinking, CMs were considered both teaching and learning methods. Conclusions: CMs are teaching and learning tools which seem to help medical students develop critical thinking, owing to the flexibility of the tool as a facilitator of knowledge integration and as a learning and teaching method.
The wide range of contexts and purposes, and the variation in how CMs and the instruments used to assess critical thinking were applied, increase our confidence that the positive effects are consistent.
Keywords: concept map, medical education, undergraduate, critical thinking, meaningful learning
Procedia PDF Downloads 125