Search results for: geophysical methods
14770 Weighted G2 Multi-Degree Reduction of Bezier Curves
Authors: Salisu ibrahim, Abdalla Rababah
Abstract:
In this research, we use weighted G2 multi-degree reduction of a Bezier curve of degree n to a Bezier curve of degree m, m < n. Degree reduction of Bezier curves is used to represent a given Bezier curve of degree n by a Bezier curve of degree m, m < n. Exact degree reduction is not possible, and degree reduction is an approximate process by nature. We derive a weighted degree-reduction method that is geometrically continuous at the end points. Different norms are considered, and several error minimizations are given. The proposed methods produce error functions that are smaller than the errors of existing methods.
Keywords: Bezier curves, multiple degree reduction, geometric continuity, error function
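As a rough illustration of the degree-reduction idea only (plain L2 least squares with pinned endpoints, not the weighted G2 scheme of the abstract), here is a sketch assuming NumPy and an invented degree-5 control polygon:

```python
import numpy as np
from math import comb

def bernstein_matrix(n, t):
    # Rows: sample parameters t; columns: Bernstein basis B_{i,n}(t)
    return np.array([[comb(n, i) * (ti**i) * ((1 - ti)**(n - i))
                      for i in range(n + 1)] for ti in t])

def reduce_degree(ctrl_n, m, samples=200):
    """L2 least-squares reduction of a degree-n Bezier curve to degree m (m < n),
    keeping the two end control points fixed."""
    ctrl_n = np.asarray(ctrl_n, dtype=float)
    n = len(ctrl_n) - 1
    t = np.linspace(0.0, 1.0, samples)
    target = bernstein_matrix(n, t) @ ctrl_n            # points on the original curve
    B = bernstein_matrix(m, t)
    # Pin the end control points, solve only for the interior ones
    fixed = B[:, [0]] * ctrl_n[0] + B[:, [-1]] * ctrl_n[-1]
    interior, *_ = np.linalg.lstsq(B[:, 1:-1], target - fixed, rcond=None)
    return np.vstack([ctrl_n[0], interior, ctrl_n[-1]])

# Hypothetical degree-5 control polygon reduced to degree 3
P5 = [[0, 0], [1, 2], [2, 3], [4, 3], [5, 1], [6, 0]]
print(reduce_degree(P5, m=3))
```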
Procedia PDF Downloads 482
14769 Your First Step to Understanding Research Ethics: Psychoneurolinguistic Approach
Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari
Abstract:
Objective: This research aims at investigating research ethics in the field of science. Method: It is an exploratory study wherein the researchers attempted to cover the phenomenon at hand from all specialists' viewpoints. Results: The discussion is based upon the findings resulting from the analysis the researchers undertook. Concerning the prediction of results, the researcher first needs to seek highly qualified people in the field of research as well as in the field of statistics who share the philosophy of the research. Then s/he should make sure that s/he is adequately trained in the specific techniques, methods and statistical programs used in the study. S/he should also be committed to continually analyzing the data with the most current methods.
Keywords: research ethics, legal, rights, psychoneurolinguistics
Procedia PDF Downloads 45
14768 The Need for Embodiment Perspectives and Somatic Methods in Social Work Curriculum: Lessons Learned from a Decade of Developing a Program to Support College Students Who Exited the State Foster Care System
Authors: Yvonne A. Unrau
Abstract:
Social work education is a competency-based curriculum that relies mostly on cognitive frameworks and problem-solving models. Absent from the curriculum are knowledge and skills that draw from an embodiment perspective, especially somatic practice methods. Embodiment broadly encompasses the understanding that biological, political, historical, and social factors impact human development via changes to the nervous system. In the past 20 years, research has well established that unresolved traumatic events, especially during childhood, negatively impact long-term health and well-being. Furthermore, traumatic stress compromises cognitive processing and activates reflexive actions such as 'fight' or 'flight,' which are the focus of somatic methods. The main objective of this paper is to show how embodiment perspectives and somatic methods can enhance social work practice overall. Using an exploratory approach, the author shares a decade-long journey that involved creating an education-support program for college students who exited the state foster care system. Personal experience, program outcomes and case study narratives revealed that 'classical' social work methods were insufficient to fully address the complex needs of college students who were living with complex traumatic stressors. The paper chronicles select case study scenarios and key program development milestones over a 10-year period to show the benefit of incorporating embodiment perspectives in social work practice. The lessons reveal that there is an immediate need for social work curriculum to include embodiment perspectives so that social workers may be equipped to respond competently to their many clients who live with unresolved trauma.
Keywords: social work practice, social work curriculum, embodiment, traumatic stress
Procedia PDF Downloads 124
14767 A Comparative Study of Multi-SOM Algorithms for Determining the Optimal Number of Clusters
Authors: Imèn Khanchouch, Malika Charrad, Mohamed Limam
Abstract:
The interpretation of cluster quality and the determination of the optimal number of clusters is still a crucial problem in clustering. In this paper, we focus on the multi-SOM clustering method, which overcomes the problem of extracting the number of clusters from the SOM map through the use of a clustering validity index. We then tested multi-SOM using real and artificial data sets with different evaluation criteria not used previously, such as the Davies-Bouldin index, Dunn index and silhouette index. The developed multi-SOM algorithm is compared to the k-means and Birch methods. Results show that it is more efficient than classical clustering methods.
Keywords: clustering, SOM, multi-SOM, DB index, Dunn index, silhouette index
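A minimal sketch of the validity-index idea the abstract relies on, assuming scikit-learn and substituting k-means for the SOM layer; the data and the range of cluster counts are synthetic:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score, davies_bouldin_score

# Synthetic data standing in for the real/artificial data sets of the study
X, _ = make_blobs(n_samples=500, centers=4, random_state=0)

scores = {}
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Higher silhouette is better; lower Davies-Bouldin is better
    scores[k] = (silhouette_score(X, labels), davies_bouldin_score(X, labels))

best_k = max(scores, key=lambda k: scores[k][0])
print("silhouette-optimal number of clusters:", best_k)
```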
Procedia PDF Downloads 599
14766 Effect of Fresh Concrete Curing Methods on Its Compressive Strength
Authors: Xianghe Dai, Dennis Lam, Therese Sheehan, Naveed Rehman, Jie Yang
Abstract:
Concrete is one of the most used construction materials; it may be made on site as fresh concrete and then placed in formwork to produce the desired shapes of structures. It has been recognized that the raw materials and mix proportion of concrete dominate the mechanical characteristics of hardened concrete, and that the curing method and environment applied to the concrete in the early stages of hardening significantly influence concrete properties such as compressive strength, durability, permeability, etc. In construction practice, there are various curing methods to maintain the presence of mixing water throughout the early stages of concrete hardening. They are also beneficial to concrete in hot weather conditions as they provide cooling and prevent the evaporation of water. Such methods include ponding or immersion, spraying or fogging, saturated wet covering, etc. There are also curing methods that may be implemented to decrease the amount of water lost from the concrete surface, such as covering the concrete with a layer of impervious paper, plastic sheeting or membrane. In the concrete materials laboratory, accelerated strength-gain methods supply the concrete with heat and additional moisture by applying live steam, heating coils or electrically warmed pads. Currently, when determining the mechanical parameters of a concrete, the concrete is usually sampled from fresh concrete on site and then cured and tested in laboratories where standardized curing procedures are adopted. However, in engineering practice, curing procedures on construction sites after the placing of concrete might be very different from the laboratory criteria, since some standard curing procedures adopted in the laboratory cannot be applied on site. Sometimes the contractor compromises the curing methods in order to reduce construction costs. Obviously, the difference between curing procedures adopted in the laboratory and those used on construction sites might over- or under-estimate the real concrete quality. This paper presents the effect of three typical curing methods (air curing, water immersion curing, plastic film curing) and of maintaining concrete in steel moulds on the compressive strength development of normal concrete. In this study, Portland cement with 30% fly ash was used, and different curing periods, 7 days, 28 days and 60 days, were applied. It was found that the highest compressive strength was observed from concrete samples to which 7-day water immersion curing was applied and from samples maintained in steel moulds up to the testing date. The research results imply that concrete used as infill in steel tubular members might develop a higher strength than predicted by design assumptions based on air curing methods. Wrapping concrete with plastic film as a curing method might delay the concrete strength development in the early stages. Water immersion curing for 7 days might significantly increase the concrete compressive strength.
Keywords: compressive strength, air curing, water immersion curing, plastic film curing, maintaining in steel mould, comparison
Procedia PDF Downloads 294
14765 Extraction and Characterization of Ethiopian Hibiscus macranthus Bast Fiber
Authors: Solomon Tilahun Desisa, Muktar Seid Hussen
Abstract:
Hibiscus macranthus is a plant of the family Malvaceae and genus Hibiscus which grows mainly in the western part of Ethiopia. Hibiscus macranthus is the most adaptable and abundant plant in the nation; it is used as an ornamental plant, often as a hedge or fence plant, used as firewood after harvesting the stem together with the bark, and used as a fibre for tying different kinds of things by forming ropes. However, the Hibiscus macranthus plant fibre has not been commercially exploited and extracted properly. This work describes the possibility of mechanical and retting methods for Hibiscus macranthus fibre extraction and its characterization. Hibiscus macranthus fibre is a bast fibre obtained naturally from the stem or stalk of the dicotyledonous plant, since it is a natural cellulose plant fibre. The fibre was characterized by studying its physical and chemical properties. The physical characteristics investigated include a length of 100-190 mm, fineness of 1.0-1.2 tex, diameter under X100 microscopic view of 16-21 microns, moisture content of 12.46% and dry tenacity of 48-57 cN/tex along with a breaking extension of 0.9-1.6%. Hibiscus macranthus fibre productivity was observed to be 12-18% of the stem, out of which more than 65% is primary long fibres. The fibre separation methods prove to decrease non-cellulose ingredients in the order of mechanical, water and chemical methods. The colour measurement also shows that the raw Hibiscus macranthus fibre has a natural golden colour according to YID1925 and a paler look under both retting methods than mechanical separation. Finally, it is suggested that Hibiscus macranthus fibre can be used for manufacturing natural and organic crop and coffee packages as well as super absorbent, fine and high-tenacity textile products.
Keywords: Hibiscus macranthus, bast fiber, extraction, characterization
Procedia PDF Downloads 212
14764 Working Memory Growth from Kindergarten to First Grade: Considering Impulsivity, Parental Discipline Methods and Socioeconomic Status
Authors: Ayse Cobanoglu
Abstract:
Working memory can be defined as a workspace that holds and regulates active information in the mind. This study investigates individual changes in children's working memory from kindergarten to first grade. The main purpose of the study is to examine whether parental discipline methods and child impulsive/overactive behaviors affect children's working memory initial status and growth rate, controlling for gender, minority status, and socioeconomic status (SES). A linear growth curve model with the first four waves of the Early Childhood Longitudinal Study-Kindergarten Cohort of 2011 (ECLS-K:2011) is performed to analyze the individual growth of children's working memory longitudinally (N=3915). Results revealed that there is significant variation among students' initial status in the kindergarten fall semester as well as in the growth rate during the first two years of schooling. While minority status, SES, and children's overactive/impulsive behaviors influenced children's initial status, only SES and minority status were significantly associated with the growth rate of working memory. Parental discipline methods such as giving a warning and ignoring the child's negative behavior are also negatively associated with initial working memory scores. Following that, students' working memory growth rate was examined, and students with lower SES as well as minorities showed a faster growth pattern during the first two years of schooling. However, the findings on parental disciplinary methods and working memory growth rates were mixed. It can be concluded that schooling helps low-SES minority students to develop their working memory.
Keywords: growth curve modeling, impulsive/overactive behaviors, parenting, working memory
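A linear growth curve of this kind can be expressed as a mixed model with a random intercept and slope per child. The sketch below assumes the statsmodels library and uses fabricated data and variable names (wm, wave, ses); it only illustrates the model form, not the ECLS-K:2011 analysis itself:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Fabricated long-format data: one row per child per wave
rng = np.random.default_rng(0)
n_children, n_waves = 200, 4
df = pd.DataFrame({
    "child": np.repeat(np.arange(n_children), n_waves),
    "wave": np.tile(np.arange(n_waves), n_children),
    "ses": np.repeat(rng.normal(size=n_children), n_waves),
})
df["wm"] = 50 + 2 * df["wave"] + 3 * df["ses"] + rng.normal(scale=4, size=len(df))

# Random intercept and random slope on wave for each child; ses enters both the
# initial status (main effect) and the growth rate (ses:wave interaction)
model = smf.mixedlm("wm ~ wave + ses + ses:wave", df,
                    groups=df["child"], re_formula="~wave")
print(model.fit().summary())
```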
Procedia PDF Downloads 136
14763 Topology Optimization of the Interior Structures of Beams under Various Load and Support Conditions with Solid Isotropic Material with Penalization Method
Authors: Omer Oral, Y. Emre Yilmaz
Abstract:
Topology optimization is an approach that optimizes material distribution within a given design space for certain load and boundary conditions by setting performance goals. It uses various restrictions such as boundary conditions, sets of loads, and constraints to maximize the performance of the system. It is different from size and shape optimization methods, but it retains some features of both. In this study, the interior structures of parts were optimized by using the SIMP (Solid Isotropic Material with Penalization) method. The volume of the part was a preassigned parameter, and minimum deflection was the objective function. The basic idea behind the theory was considered, and different methods were discussed. The Rhinoceros 3D design tool was used with the Grasshopper and TopOpt plugins to create and optimize parts. A Grasshopper algorithm was designed and tested for different beams, sets of arbitrarily located forces and support types such as pinned, fixed, etc. Finally, 2.5D shapes were obtained and verified by observing the changes in the density function.
Keywords: Grasshopper, lattice structure, microstructures, Rhinoceros, solid isotropic material with penalization method, TopOpt, topology optimization
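Two ingredients of SIMP are the penalized material interpolation and a density update under a volume constraint. A minimal sketch of both, assuming NumPy and using placeholder compliance sensitivities in place of a finite element solve (so this is not the Grasshopper/TopOpt workflow of the study):

```python
import numpy as np

def simp_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation: intermediate densities are penalized so the
    optimizer is pushed toward 0/1 material distributions."""
    return Emin + rho**p * (E0 - Emin)

def oc_update(rho, dc, volfrac, move=0.2):
    """One optimality-criteria step. dc holds the (negative) compliance
    sensitivities; a real run would obtain them from an FE solve."""
    l1, l2 = 1e-9, 1e9
    while (l2 - l1) / (l1 + l2) > 1e-3:          # bisection on the multiplier
        lmid = 0.5 * (l1 + l2)
        rho_new = np.clip(rho * np.sqrt(-dc / lmid),
                          np.maximum(rho - move, 0.0),
                          np.minimum(rho + move, 1.0))
        if rho_new.mean() > volfrac:
            l1 = lmid
        else:
            l2 = lmid
    return rho_new

rho = np.full(100, 0.5)            # uniform initial density field
dc = -np.linspace(2.0, 0.1, 100)   # placeholder sensitivities, not from an FE model
rho = oc_update(rho, dc, volfrac=0.5)
print(simp_modulus(rho)[:5])
```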
Procedia PDF Downloads 138
14762 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods
Authors: Autcha Araveeporn
Abstract:
This paper compares parameter estimation of the mean of the normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is given by the average of the data, the Bayes estimator is derived from the prior distribution, and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After the parameter is estimated by each method, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes set to 10, 20, 30, and 50. From the results, it can be seen that the ML and MCMC estimates are perceivably different from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed from a prior distribution with mean 1 and variance 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution
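A compact sketch of the three estimators on simulated data, assuming NumPy; the prior mean 1 and variance 12 follow the abstract, while the inverse-gamma hyperparameters of the Gibbs sampler are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=20)   # simulated data: mean 2, variance 9
n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)

# 1) Maximum likelihood: the sample mean
mu_ml = xbar

# 2) Bayes with a conjugate normal prior N(mu0, tau0^2), variance treated as known
mu0, tau02, sigma2 = 1.0, 12.0, 9.0
post_var = 1.0 / (1.0 / tau02 + n / sigma2)
mu_bayes = post_var * (mu0 / tau02 + n * xbar / sigma2)

# 3) MCMC: Gibbs sampling for (mu, sigma^2) with N(mu0, tau0^2) and
#    inverse-gamma(a, b) priors (a, b chosen arbitrarily here)
a, b = 2.0, 2.0
mu, sig2, draws = xbar, s2, []
for it in range(5000):
    v = 1.0 / (1.0 / tau02 + n / sig2)
    mu = rng.normal(v * (mu0 / tau02 + n * xbar / sig2), np.sqrt(v))
    sig2 = 1.0 / rng.gamma(a + n / 2.0, 1.0 / (b + 0.5 * np.sum((x - mu) ** 2)))
    if it >= 1000:                       # discard burn-in draws
        draws.append(mu)

print(f"ML: {mu_ml:.3f}  Bayes: {mu_bayes:.3f}  MCMC: {np.mean(draws):.3f}")
```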
Procedia PDF Downloads 357
14761 Evaluation of Traditional Methods in Construction and Their Effects on Reinforced-Concrete Buildings Behavior
Authors: E. H. N. Gashti, M. Zarrini, M. Irannezhad, J. R. Langroudi
Abstract:
Using ETABS software, this study analyzed 23 buildings to evaluate the effects of mistakes made during the construction phase on the structural behavior of buildings. For modelling, two different loadings were assumed: 1) design loading and 2) loading due to the effects of mistakes in the construction phase. The results determined that traditional construction methods resulted in a significant increase in dead loads and consequently intensified the displacements and base shears of buildings under seismic loads.
Keywords: reinforced-concrete buildings, construction mistakes, base-shear, displacements, failure
Procedia PDF Downloads 270
14760 A New Method to Winner Determination for Economic Resource Allocation in Cloud Computing Systems
Authors: Ebrahim Behrouzian Nejad, Rezvan Alipoor Sabzevari
Abstract:
Cloud computing systems are large-scale distributed systems that focus on large-scale resource sharing, cooperation between several organizations, and their use in new applications. One of the main challenges in this realm is resource allocation. There are many different approaches to resource allocation in cloud computing, among which economic methods are common. Among these, the auction-based method has greater prominence compared with the fixed-price method. The double combinatorial auction is one of the proper ways of resource allocation in cloud computing. This method includes two phases: winner determination and resource allocation. In this paper, a new method is presented to determine winners in double combinatorial auction-based resource allocation using the Imperialist Competitive Algorithm (ICA). The experimental results show that with the proposed method the number of winning users is higher than with the genetic algorithm, whereas the number of winning providers is higher with the genetic algorithm.
Keywords: cloud computing, resource allocation, double auction, winner determination
Procedia PDF Downloads 360
14759 Personnel Selection Based on Step-Wise Weight Assessment Ratio Analysis and Multi-Objective Optimization on the Basis of Ratio Analysis Methods
Authors: Emre Ipekci Cetin, Ebru Tarcan Icigen
Abstract:
The personnel selection process is considered one of the most important and most difficult issues in human resources management. At the selection stage, applicants are evaluated according to certain criteria and efforts are made to select the most appropriate candidate. However, this process can be complicated for the managers who carry out the selection. Candidates should be evaluated according to different criteria such as work experience, education, foreign language level, etc. It is crucial that a rational selection process is carried out by considering all the criteria in an integrated structure. In this study, the problem of choosing the front office manager of a 5-star accommodation enterprise operating in Antalya is addressed by using multi-criteria decision-making methods. In this context, the SWARA (step-wise weight assessment ratio analysis) and MOORA (multi-objective optimization on the basis of ratio analysis) methods, which have relatively few applications compared with other methods, have been used together. Firstly, the SWARA method was used to calculate the weights of the criteria and subcriteria that were determined by the business. After the weights of the criteria were obtained, the MOORA method was used to rank the candidates using the ratio system and the reference point approach. Recruitment processes differ from sector to sector and from operation to operation. There are a number of criteria that must be taken into consideration by businesses in accordance with the structure of each sector. It is of utmost importance that all candidates are evaluated objectively within the framework of these criteria, after the criteria have been carefully selected for the identification of suitable candidates for employment. In the study, the staff selection process was handled by using the SWARA and MOORA methods together.
Keywords: accommodation establishments, human resource management, multi-objective optimization on the basis of ratio analysis, multi-criteria decision making, step-wise weight assessment ratio analysis
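Once the expert judgments and the decision matrix are available, SWARA and the MOORA ratio system reduce to simple arithmetic. The sketch below uses invented comparative-importance values and an invented candidate-by-criterion matrix purely to show the mechanics:

```python
import numpy as np

# --- SWARA: criteria ordered from most to least important ---
# s[j] is the expert's "comparative importance of the average value" of
# criterion j relative to criterion j-1 (invented numbers).
s = np.array([0.0, 0.30, 0.20, 0.15])      # first criterion has no comparison
k = 1.0 + s                                # coefficient k_j (k_1 = 1)
q = np.cumprod(1.0 / k)                    # recalculated weight q_j = q_{j-1} / k_j
w = q / q.sum()                            # final SWARA weights

# --- MOORA ratio system on an invented candidate x criterion matrix ---
X = np.array([[7.0, 8.0, 6.0, 5.0],        # candidate A
              [6.0, 9.0, 7.0, 4.0],        # candidate B
              [8.0, 6.0, 8.0, 6.0]])       # candidate C
benefit = np.array([True, True, True, False])   # last criterion is a cost

N = X / np.sqrt((X ** 2).sum(axis=0))      # vector normalization
y = (N[:, benefit] * w[benefit]).sum(axis=1) - (N[:, ~benefit] * w[~benefit]).sum(axis=1)
print("SWARA weights:", np.round(w, 3))
print("MOORA ranking (best first):", np.argsort(-y))
```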
Procedia PDF Downloads 344
14758 The Menu Planning Problem: A Systematic Literature Review
Authors: Dorra Kallel, Ines Kanoun, Diala Dhouib
Abstract:
This paper elaborates a Systematic Literature Review (SLR) to select the most outstanding studies that address the Menu Planning Problem (MPP) and to classify them according to the following three criteria: the methods used, the types of patients and the required constraints. At first, a set of 4165 studies was retrieved. After applying the SLR guidelines, this collection was filtered down to 13 studies using specific inclusion and exclusion criteria as well as an accurate analysis of each study. Second, the selected papers were examined to answer the proposed research questions. Finally, data synthesis and new perspectives for future work are incorporated in the closing section.
Keywords: Menu Planning Problem (MPP), Systematic Literature Review (SLR), classification, exact and approximate methods
Procedia PDF Downloads 281
14757 Comparison of the Boundary Element Method and the Method of Fundamental Solutions for Analysis of Potential and Elasticity
Authors: S. Zenhari, M. R. Hematiyan, A. Khosravifard, M. R. Feizi
Abstract:
The boundary element method (BEM) and the method of fundamental solutions (MFS) are well-known fundamental solution-based methods for solving a variety of problems. Both methods are boundary-type techniques and can provide accurate results. In comparison to the finite element method (FEM), which is a domain-type method, the BEM and the MFS need less manual effort to solve a problem. The aim of this study is to compare the accuracy and reliability of the BEM and the MFS. This comparison is made for 2D potential and elasticity problems with different boundary and loading conditions. In the comparisons, both convex and concave domains are considered. Both linear and quadratic elements are employed for boundary element analysis of the examples. The discretization of the problem domain in the BEM, i.e., converting the boundary of the problem into boundary elements, is relatively simple; however, in the MFS, obtaining appropriate locations of collocation and source points needs more attention to obtain reliable solutions. The results obtained from the presented examples show that both methods lead to accurate solutions for convex domains, whereas the BEM is more suitable than the MFS for concave domains.
Keywords: boundary element method, method of fundamental solutions, elasticity, potential problem, convex domain, concave domain
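For the 2D potential (Laplace) case, the MFS places source points on a fictitious boundary outside the domain and fits their strengths by collocation on the real boundary, which is exactly the placement issue the abstract highlights. A toy sketch on the unit disk with a known harmonic solution, assuming NumPy:

```python
import numpy as np

# Dirichlet problem on the unit disk with exact solution u = x^2 - y^2 (harmonic)
n_col, n_src, R_src = 60, 60, 2.0             # collocation points, sources, source radius
th = np.linspace(0, 2 * np.pi, n_col, endpoint=False)
col = np.c_[np.cos(th), np.sin(th)]           # collocation points on the real boundary
ph = np.linspace(0, 2 * np.pi, n_src, endpoint=False)
src = R_src * np.c_[np.cos(ph), np.sin(ph)]   # source points outside the domain

def G(p, q):
    # 2D Laplace fundamental solution: -(1/2*pi) ln|p - q|
    r = np.linalg.norm(p[:, None, :] - q[None, :, :], axis=2)
    return -np.log(r) / (2 * np.pi)

u_exact = lambda p: p[:, 0] ** 2 - p[:, 1] ** 2
coeffs, *_ = np.linalg.lstsq(G(col, src), u_exact(col), rcond=None)

# Check the approximation at a few interior points
test = np.array([[0.3, 0.1], [0.0, 0.5], [-0.4, -0.2]])
print(np.abs(G(test, src) @ coeffs - u_exact(test)))   # errors should be small
```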
Procedia PDF Downloads 91
14756 The Views of Health Care Professionals outside of the General Practice Setting on the Provision of Oral Contraception in Comparison to Long-Acting Reversible Contraception
Authors: Carri Welsby, Jessie Gunson, Pen Roe
Abstract:
Currently, there is limited research examining the views of health care professionals (HCPs) on long-acting reversible contraception (LARC) advice and prescription, particularly outside of the general practice (GP) setting. The aim of this study is to systematically review existing evidence on the barriers and enablers of oral contraception (OC) in comparison to LARC, as perceived by HCPs in non-GP settings. Five electronic databases were searched in April 2018 using terms related to LARC, OC, HCPs, and views, but not terms related to GPs. Studies were excluded if they concerned emergency oral contraception, male contraceptives, contraceptive use in conjunction with a health condition(s), developing countries, or GPs and GP settings, or were non-English or not published before 2013. A total of six studies were included in the systematic review. Five key areas emerged, under which themes were categorised, including (1) understanding HCP attitudes and counselling practices towards contraceptive methods; (2) assessment of HCP attitudes and beliefs about contraceptive methods; (3) misconceptions and concerns towards contraceptive methods; and (4) influences on views, attitudes, and beliefs of contraceptive methods. Limited education and training of HCPs exists around LARC provision, particularly compared to OC. The most common misconception inhibiting HCPs' delivery of contraceptive information to women was the belief that LARC is inappropriate for nulliparous women. In turn, by not providing correct information on a variety of contraceptive methods, HCP counselling practices were disempowering for women and restricted them from accessing reproductive justice. Educating HCPs to provide accurate and factual information to women on all contraception is vital to encourage a woman-centered approach during contraceptive counselling and promote informed choices by women.
Keywords: advice, contraceptives, health care professionals, long acting reversible contraception, oral contraception, reproductive justice
Procedia PDF Downloads 161
14755 A Method of Improving Output Using a Feedback Supply Chain System: Case Study Bramlima
Authors: Samuel Atongaba Danji, Veseke Moleke
Abstract:
Increasing globalization is a very important part of today's changing environment, and because of this, manufacturing industries must continuously improve their manufacturing methods in order to remain competitive; otherwise they may be left out of the market due to constantly changing customer requirements. What is therefore needed is an advanced supply chain system that prevents a number of issues that can keep a company from being competitive. In this work, we developed a feedback control supply chain system which streamlines the entire process in order to improve competitiveness, and the results show that when it is applied in different geographical areas, the output varies.
Keywords: globalization, supply chain, improvement, manufacturing
Procedia PDF Downloads 333
14754 Wavelets Contribution on Textual Data Analysis
Authors: Habiba Ben Abdessalem
Abstract:
The emergence of giant sets of textual data has encouraged researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to such data by providing various graphic visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table by keeping only the information carriers. It may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
Keywords: textual data, wavelet, denoising, contingency table
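A minimal sketch of the denoising step, assuming the PyWavelets library and a synthetic matrix standing in for a real contingency table; the wavelet family, decomposition level and threshold are arbitrary choices, not those of the study:

```python
import numpy as np
import pywt  # PyWavelets

# Toy "contingency table": smooth structure plus noise standing in for rare forms
rng = np.random.default_rng(0)
rows, cols = np.arange(32)[:, None], np.arange(32)[None, :]
table = 50 * np.exp(-((rows - 16) ** 2 + (cols - 16) ** 2) / 60.0)
noisy = table + rng.normal(scale=2.0, size=table.shape)

# 2D wavelet decomposition, soft-threshold the detail coefficients, reconstruct
coeffs = pywt.wavedec2(noisy, wavelet="db2", level=3)
thr = 3.0  # threshold chosen by hand for this sketch
denoised_coeffs = [coeffs[0]] + [
    tuple(pywt.threshold(d, thr, mode="soft") for d in detail) for detail in coeffs[1:]
]
denoised = pywt.waverec2(denoised_coeffs, wavelet="db2")[:32, :32]

print("mean abs. noise before:", np.abs(noisy - table).mean().round(3),
      " after:", np.abs(denoised - table).mean().round(3))
```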
Procedia PDF Downloads 278
14753 Use of Electrochemical Methods for the Inhibition of Scaling with Green Products
Authors: Samira Ghizellaoui, Manel Boumagoura
Abstract:
The municipality of Constantine in eastern Algeria draws water from the Hamma groundwater source. Its high fouling capacity is due to the high content of bicarbonate (442 mg/L) and calcium (136 mg/L). This work focuses on the use of three new green inhibitors for reducing calcium carbonate scale formation, gallic acid, quercetin and alginate, and on the comparison between them. These inhibitors have proven to be green antiscalants because they have no impact on the environment. Electrochemical methods (chronoamperometry and impedancemetry) were used to evaluate their performance. According to the study, these inhibitors are excellent green chemical inhibitors of scaling, and the best of them is quercetin because it gave a good result at a lower concentration (2 mg/L) compared to the other inhibitors.
Keywords: scaling, green inhibitor, chronoamperometry, impedancemetry
Procedia PDF Downloads 116
14752 Case Studies of Mitigation Methods against the Impacts of High Water Levels in the Great Lakes
Authors: Jennifer M. Penton
Abstract:
Record high lake levels in 2017 and 2019 (2017 max lake level = 75.81 m; 2018 max lake level = 75.26 m; 2019 max lake level = 75.92 m), combined with a number of severe storms in the Great Lakes region, have resulted in significant wave generation across Lake Ontario. The resulting large wave heights have led to erosion of the natural shoreline, overtopping of existing revetments, backshore erosion, and partial and complete failure of several coastal structures, which in turn have led to further erosion of the shoreline and damage to existing infrastructure. Such impacts can be seen all along the coast of Lake Ontario. Three specific locations have been chosen as case studies for this paper, each addressing erosion and/or flood mitigation methods, such as revetments and sheet piling with increased land levels. The varying site conditions and the resulting shoreline damage are compared herein. The results are reflected in the case-specific design components of the mitigation and adaptation methods and are presented in this paper.
Keywords: erosion mitigation, flood mitigation, great lakes, high water levels
Procedia PDF Downloads 174
14751 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies
Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk
Abstract:
Recently, the application of AI-powered algorithms in healthcare has continued to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person's information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has resulted in a significant concentration on the security risks and on measures to protect the security and privacy of healthcare data, leading to escalated analysis and enforcement. Since these challenges are brought about by the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods like federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners' privacy is preserved. Finally, it discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)
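One of the methods named above, differential privacy, can be illustrated with the classical Laplace mechanism on a made-up count query over patient records; this is a generic sketch, not a prescription for any particular healthcare system:

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng):
    """Release a query answer with epsilon-differential privacy by adding
    Laplace noise scaled to sensitivity / epsilon."""
    return true_value + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

rng = np.random.default_rng(42)
# Made-up count query: "how many patients have condition X?"
true_count = 128
# A counting query changes by at most 1 when one record is added or removed
sensitivity = 1.0

for eps in (0.1, 0.5, 1.0):
    noisy = laplace_mechanism(true_count, sensitivity, eps, rng)
    print(f"epsilon={eps}: released count = {noisy:.1f}")
```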
Procedia PDF Downloads 96
14750 Iterative Solver for Solving Large-Scale Frictional Contact Problems
Authors: Thierno Diop, Michel Fortin, Jean Deteix
Abstract:
Since the precise formulation of the elastic part is irrelevant for the description of the algorithm, we shall consider a generic case. In practice, however, we will have to deal with a nonlinear material (for instance, a Mooney-Rivlin model). We are interested in solving a finite element approximation of the problem, leading to large-scale nonlinear discrete problems and, after linearization, to large linear systems and ultimately to calculations needing iterative methods. This also implies that the penalty method, and therefore the augmented Lagrangian method, are to be ruled out because of their negative effect on the condition number of the underlying discrete systems and thus on the convergence of iterative methods. This is a break with the mainstream of contact methods, in which the augmented Lagrangian is the principal tool. We shall first present the problem and its discretization; this will lead us to describe a general solution algorithm relying on a preconditioner for saddle-point problems, which we shall describe in some detail as it is not entirely standard. We propose an iterative approach for solving three-dimensional frictional contact problems between elastic bodies, including contact with a rigid body, contact between two or more bodies, and also self-contact.
Keywords: frictional contact, three-dimensional, large-scale, iterative method
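A toy sketch of the kind of preconditioned iterative solve described here, assuming SciPy: a symmetric saddle-point system with a block-diagonal preconditioner, solved with MINRES. The matrices are small random stand-ins, not a contact discretization, and the frictional (nonsymmetric) aspects are ignored:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n, m = 200, 40                      # "displacement" dofs, "contact multiplier" dofs (toy sizes)

# Toy SPD "elasticity" block and a random full-rank constraint block
A = sp.diags([2.0 * np.ones(n), -np.ones(n - 1), -np.ones(n - 1)], [0, 1, -1]).tocsc()
B = sp.csr_matrix(rng.normal(size=(m, n)) / np.sqrt(n))

# Symmetric saddle-point system K [u; lam] = rhs
K = sp.bmat([[A, B.T], [B, None]], format="csr")
rhs = rng.normal(size=n + m)

# Block-diagonal preconditioner diag(A, S) with S ~ B diag(A)^{-1} B^T
Ainv_diag = 1.0 / A.diagonal()
S = (B @ sp.diags(Ainv_diag) @ B.T).toarray()
A_solve = spla.factorized(A)        # direct solve on the A block (small enough here)
S_inv = np.linalg.inv(S)

def apply_prec(r):
    return np.concatenate([A_solve(r[:n]), S_inv @ r[n:]])

M = spla.LinearOperator((n + m, n + m), matvec=apply_prec)
x, info = spla.minres(K, rhs, M=M)
print("converged:", info == 0, " residual:", np.linalg.norm(K @ x - rhs))
```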
Procedia PDF Downloads 212
14749 Building Bridges on Roads With Major Constructions
Authors: Mohamed Zaidour
Abstract:
In this summary, we look briefly at bridges and their construction on most roads. We have followed a simple method to explain each field clearly, because the geographical and climatic diversity of an area leads to different methods and types of roads and installation engineering than in other areas. In mountain areas, retaining walls need to be built; in areas of rain, culverts need to be constructed to discharge water from the roads; and in areas of temporary or permanent rivers, there is a need to build bridges and construct road installations. In the process of collecting the necessary information, such as soil type, the engineer needs this information when designing the structure. In this section, we will identify the types and calculation methods for bridge columns and retaining walls.
Keywords: bridges, buildings, concrete, constructions, roads
Procedia PDF Downloads 119
14748 On the Added Value of Probabilistic Forecasts Applied to the Optimal Scheduling of a PV Power Plant with Batteries in French Guiana
Authors: Rafael Alvarenga, Hubert Herbaux, Laurent Linguet
Abstract:
The uncertainty concerning the power production of intermittent renewable energy is one of the main barriers to the integration of such assets into the power grid. Efforts have thus been made to develop methods to quantify this uncertainty, allowing producers to ensure more reliable and profitable engagements related to their future power delivery. Even though a diversity of probabilistic approaches has been proposed in the literature with promising results, the added value of adopting such methods for scheduling intermittent power plants is still unclear. In this study, the profits obtained by a decision-making model used to optimally schedule an existing PV power plant connected to batteries are compared when the model is fed with deterministic and probabilistic forecasts generated with two of the most recent methods proposed in the literature. Moreover, deterministic forecasts with different accuracy levels were used in the experiments, testing the utility and the capability of probabilistic methods of modeling the progressively increasing uncertainty. Even though probabilistic approaches have unquestionably been developed in the recent literature, the results obtained through a case study show that deterministic forecasts still provide the best performance if accurate, ensuring a gain of 14% on final profits compared to the average performance of probabilistic models conditioned on the same forecasts. When the accuracy of deterministic forecasts progressively decreases, probabilistic approaches start to become competitive options until they completely outperform deterministic forecasts when these are very inaccurate, generating 73% more profits in the case considered compared to the deterministic approach.
Keywords: PV power forecasting, uncertainty quantification, optimal scheduling, power systems
Procedia PDF Downloads 87
14747 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties
Authors: Sammani Danwawu Abdullahi
Abstract:
Vertex enumeration algorithms explore the methods and procedures for generating the vertices of general polyhedra formed by systems of equations or inequalities. The problem of enumerating the extreme points (vertices) of general polyhedra is shown to be NP-hard. This leads to exploring how to count the vertices of general polyhedra without listing them, which is also shown to be #P-complete. Some fully polynomial randomized approximation schemes (fpras) for counting the vertices of some special classes of polyhedra associated with down-sets, independent sets, 2-knapsack problems and 2 x n transportation problems are presented together with some discovered open problems.
Keywords: counting with uncertainties, mathematical programming, optimization, vertex enumeration
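For intuition only, the brute-force route that counting methods try to avoid can be sketched in two dimensions: intersect every pair of constraints and keep the feasible points. The example polyhedron is a unit square; NumPy is assumed:

```python
import numpy as np
from itertools import combinations

def count_vertices_2d(A, b, tol=1e-9):
    """Brute-force vertex counting for {x in R^2 : A x <= b}: intersect every
    pair of constraint lines and keep the feasible, distinct intersection points.
    Listing scales badly in general, which is why counting without listing
    (the #P-complete question above) is the interesting problem."""
    vertices = []
    for i, j in combinations(range(len(b)), 2):
        M = A[[i, j]]
        if abs(np.linalg.det(M)) < tol:
            continue                                 # parallel constraints, no vertex
        x = np.linalg.solve(M, b[[i, j]])
        if np.all(A @ x <= b + tol) and not any(np.allclose(x, v, atol=1e-7) for v in vertices):
            vertices.append(x)
    return len(vertices)

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 0.0], [0.0, 1.0]])
b = np.array([0.0, 0.0, 1.0, 1.0])
print(count_vertices_2d(A, b))   # expected: 4
```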
Procedia PDF Downloads 359
14746 Molecular Electrostatic Potential in Z-3N(2-Ethoxyphenyl), 2-N'(2-Ethoxyphenyl) Imino Thiazolidin-4-one Molecule by Ab Initio and DFT Methods
Authors: Manel Boulakoud, Abdelkader Chouaih, Fodil Hamzaoui
Abstract:
In the present work, we are interested in the determination of the molecular electrostatic potential (MEP) of the Z-3N(2-Ethoxyphenyl), 2-N'(2-Ethoxyphenyl) imino thiazolidin-4-one molecule by ab initio and Density Functional Theory (DFT) methods in the ground state. The MEP is related to the electronic density and is a very useful descriptor for understanding sites of electrophilic attack and nucleophilic reactions as well as hydrogen-bonding interactions. First, geometry optimization was carried out using Hartree-Fock (HF) and DFT methods with the 6-311G(d,p) basis set. In order to get more information on the molecule, its stability was analyzed by natural bond orbital (NBO) analysis, and Mulliken population analyses were calculated. Finally, the molecular electrostatic potential (MEP) and HOMO-LUMO energy levels were computed. The calculated HOMO and LUMO energies also show the charge transfer within the molecule. The energy gap obtained is about 4 eV, which explains the stability of the studied compound. The molecular electrostatic potential obtained from the two methods confirms the nature of the electron charge transfer at the molecular shell and locates the electropositive and electronegative parts of the title compound at the molecular scale.
Keywords: DFT, ab initio, HOMO-LUMO, organic compounds
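The HOMO-LUMO part of such a calculation can be sketched with the PySCF package (an assumption; the authors do not state which code they used). A small stand-in molecule is used because the title compound's geometry is too long to inline; the workflow of building the geometry, running HF and DFT, and reading orbital energies is the same:

```python
from pyscf import gto, scf, dft

# Stand-in molecule (water); "6-311g**" corresponds to the 6-311G(d,p) basis of the abstract
mol = gto.M(atom="O 0 0 0; H 0 0 0.96; H 0.93 0 -0.24", basis="6-311g**")

def homo_lumo_gap_ev(mf):
    # Closed-shell case: HOMO is the highest doubly occupied orbital
    nocc = mf.mol.nelectron // 2
    return (mf.mo_energy[nocc] - mf.mo_energy[nocc - 1]) * 27.2114  # hartree -> eV

hf = scf.RHF(mol).run()
ks = dft.RKS(mol)
ks.xc = "b3lyp"
ks.run()
print(f"HF gap: {homo_lumo_gap_ev(hf):.2f} eV   B3LYP gap: {homo_lumo_gap_ev(ks):.2f} eV")
```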
Procedia PDF Downloads 537
14745 Mapping Methods to Solve a Modified Korteweg de Vries Type Equation
Authors: E. V. Krishnan
Abstract:
In this paper, we employ mapping methods to construct exact travelling wave solutions for a modified Korteweg-de Vries equation. We have derived periodic wave solutions in terms of Jacobi elliptic functions, kink solutions and singular wave solutions in terms of hyperbolic functions.
Keywords: travelling wave solutions, Jacobi elliptic functions, solitary wave solutions, Korteweg-de Vries equation
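For reference, one common normalization of the modified KdV equation (which may differ from the authors' form) together with the hyperbolic-function solutions that mapping/ansatz methods recover:

```latex
\begin{align}
  u_t + 6u^2 u_x + u_{xxx} &= 0, & u(x,t) &= k\,\operatorname{sech}\!\bigl(k(x - k^2 t)\bigr) \quad\text{(solitary wave)},\\
  u_t - 6u^2 u_x + u_{xxx} &= 0, & u(x,t) &= k\,\tanh\!\bigl(k(x + 2k^2 t)\bigr) \quad\text{(kink)}.
\end{align}
% Substituting u = k sech(xi) (resp. k tanh(xi)) with xi = k(x - ct) and using
% tanh^2(xi) = 1 - sech^2(xi) reduces each equation to an identity when c = k^2
% (resp. c = -2k^2). The periodic analogues replace sech/tanh by Jacobi cn/sn
% functions, which degenerate to these limits as the elliptic modulus tends to 1.
```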
Procedia PDF Downloads 332
14744 Emulation Model in Architectural Education
Authors: Ö. Şenyiğit, A. Çolak
Abstract:
It is of great importance for an architectural student to know the parameters through which he/she can conduct his/her design and make that design effective in architectural education. Therefore, an empirical application study was carried out through the designing activity using the emulation model to support the design and design approaches of architectural students. During the investigation period, studies were done on the basic design elements and principles of the fall semester, and the emulation model, one of the designing methods that constitute the subject of the study, was set up as a three-phase process of 'recognition-interpretation-application'. As a result of the study, it was observed that when students were given a key method during the design process, their awareness increased and their perspectives improved as well.
Keywords: basic design, design education, design methods, emulation
Procedia PDF Downloads 236
14743 Artificial Intelligence Technologies Used in Healthcare: Its Implication on the Healthcare Workforce and Applications in the Diagnosis of Diseases
Authors: Rowanda Daoud Ahmed, Mansoor Abdulhak, Muhammad Azeem Afzal, Sezer Filiz, Usama Ahmad Mughal
Abstract:
This paper discusses important aspects of AI in the healthcare domain. The increase in healthcare data, both in size and complexity, opens more room for artificial intelligence applications. Our focus is to review the main AI methods within the scope of the healthcare domain. The results of the review show that recommendations for diagnosis, recommendations for treatment, patient engagement, and administrative tasks are the key applications of AI in healthcare. Understanding the potential of AI methods in the domain of healthcare would benefit healthcare practitioners and will improve patient outcomes.
Keywords: AI in healthcare, technologies of AI, neural network, future of AI in healthcare
Procedia PDF Downloads 114
14742 Advancing in Cricket Analytics: Novel Approaches for Pitch and Ball Detection Employing OpenCV and YOLOV8
Authors: Pratham Madnur, Prathamkumar Shetty, Sneha Varur, Gouri Parashetti
Abstract:
In order to overcome conventional obstacles, this research paper investigates novel approaches for cricket pitch and ball detection that make use of cutting-edge technologies. The research integrates OpenCV for pitch inspection and modifies the YOLOv8 model for cricket ball detection in order to overcome the shortcomings of manual pitch assessment and traditional ball detection techniques. To ensure flexibility in a range of pitch environments, the pitch detection method leverages OpenCV's color space transformation, contour extraction, and precise color-range definition features. Regarding ball detection, the YOLOv8 model emphasizes the preservation of minor object details to improve accuracy and is specifically trained on the unique properties of cricket balls. The methods are more reliable because of the careful preparation of the datasets, which include novel ball and pitch information. These cutting-edge methods not only improve cricket analytics but also set the stage for flexible methods in more general sports technology applications.
Keywords: OpenCV, YOLOv8, cricket, custom dataset, computer vision, sports
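A minimal sketch of the HSV colour-range step for pitch localisation, assuming OpenCV; the HSV bounds and the synthetic test frame are placeholders, not values from the paper's dataset:

```python
import cv2
import numpy as np

def detect_pitch(frame_bgr):
    """Rough pitch localisation: convert to HSV, keep pixels inside a colour
    range typical of a dry pitch strip, then take the largest contour.
    The HSV bounds below are placeholders and would need tuning per venue."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower, upper = np.array([10, 30, 120]), np.array([35, 140, 255])
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((15, 15), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return cv2.boundingRect(max(contours, key=cv2.contourArea))  # x, y, w, h

# Synthetic test frame: green "outfield" with a pale-brown central strip
frame = np.full((360, 640, 3), (40, 120, 40), np.uint8)
cv2.rectangle(frame, (280, 40), (360, 320), (140, 170, 200), -1)   # BGR pitch colour
print(detect_pitch(frame))
```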
Procedia PDF Downloads 83
14741 Suitable Models and Methods for the Steady-State Analysis of Multi-Energy Networks
Authors: Juan José Mesas, Luis Sainz
Abstract:
The motivation for this paper lies in the need for energy networks to reduce losses, improve performance, optimize their operation and benefit from the interconnection capacity with other networks enabled for other energy carriers. These interconnections generate interdependencies between some energy networks and others, which requires suitable models and methods for their analysis. Traditionally, the modeling and study of energy networks have been carried out independently for each energy carrier. Thus, there are well-established models and methods for the steady-state analysis of electrical networks, gas networks, and thermal networks separately. What is intended is to extend and combine them adequately in order to face, in an integrated way, the steady-state analysis of networks with multiple energy carriers. Firstly, the added value of multi-energy networks, their operation, and the basic principles that characterize them are explained. In addition, two current aspects of great relevance are presented: the storage technologies and the coupling elements used to interconnect one energy network with another. Secondly, the characteristic equations of the different energy networks necessary to carry out the steady-state analysis are detailed. The electrical network, the natural gas network, and the thermal network of heat and cold are considered in this paper. After the presentation of the equations, a particular case of the steady-state analysis of a specific multi-energy network is studied. This network is represented graphically, the interconnections between the different energy carriers are described, their technical data are given, and the equations that have previously been presented theoretically are formulated and developed. Finally, the two iterative numerical resolution methods considered in this paper are presented, as well as the resolution procedure and the results obtained. The pros and cons of the application of both methods are explained. It is verified that the results obtained for the electrical network (voltages in modulus and angle), the natural gas network (pressures), and the thermal network (mass flows and temperatures) are correct, since they comply with the distribution, operation, consumption and technical characteristics of the multi-energy network under study.
Keywords: coupling elements, energy carriers, multi-energy networks, steady-state analysis
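As a small illustration of the iterative solution machinery, the sketch below applies Newton-Raphson to the nodal balances of a toy three-node natural gas sub-network with a Weymouth-type pressure-drop law; all parameters are invented, and the electrical and thermal carriers (and their coupling) are omitted:

```python
import numpy as np

# Toy 3-node gas network: node 0 is the slack (fixed pressure), pipes 0-1 and 1-2.
# Weymouth-type law: q_ij = C_ij * sign(p_i^2 - p_j^2) * sqrt(|p_i^2 - p_j^2|)
pipes = [(0, 1, 0.8), (1, 2, 0.6)]           # (from, to, invented pipe constant)
p_slack = 60.0                               # fixed pressure at node 0 [bar]
demand = np.array([0.0, 15.0, 10.0])         # nodal offtakes (node 0 entry unused)

def residual(p_unknown):
    p = np.array([p_slack, *p_unknown])
    r = -demand.astype(float)
    for i, j, C in pipes:
        dp2 = p[i] ** 2 - p[j] ** 2
        q = C * np.sign(dp2) * np.sqrt(abs(dp2))
        r[i] -= q                            # flow leaving node i
        r[j] += q                            # flow entering node j
    return r[1:]                             # balances at the non-slack nodes

# Newton-Raphson with a finite-difference Jacobian (kept simple for the sketch)
p = np.array([55.0, 50.0])                   # initial guess for nodes 1 and 2
for _ in range(30):
    F = residual(p)
    if np.linalg.norm(F) < 1e-8:
        break
    J = np.array([(residual(p + 1e-6 * np.eye(2)[k]) - F) / 1e-6 for k in (0, 1)]).T
    p -= np.linalg.solve(J, F)

print("nodal pressures [bar]:", np.round(np.array([p_slack, *p]), 3))
```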
Procedia PDF Downloads 81