Search results for: derivative and convoluted derivative methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15614


13004 Importance of Risk Assessment in Managers' Decision-Making Process

Authors: Mária Hudáková, Vladimír Míka, Katarína Hollá

Abstract:

Making decisions is the core of management and the result of conscious activity carried out in a particular environment under concrete conditions. Managers decide about goals and procedures and about how to respond to changes and to the problems that have developed. Their decisions affect the effectiveness, quality, economy and overall success of every organisation. In spite of this, managers do not pay sufficient attention to the individual steps of the decision-making process. They concentrate on mastering the individual methods and techniques of decision-making and neglect how to analyse the problem or assess the individual solution variants. In many cases, underestimating the analytical phase leads to an incorrect assessment of the problem, which can then negatively influence its further solution. Based on our analysis of the theoretical work of individual authors dealing with this area, and on research carried out in Slovakia and abroad, we find insufficient interest among managers in assessing risks in the decision-making process. The goal of this paper is to assess the risks in managers' decision-making that relate to the conditions of the environment, to the subject's activity (the manager's personality), to insufficient assessment of the individual variants for solving problems, and to situations in which an arisen problem is left unsolved. The benefit of this paper is its effort to increase managers' willingness to deal with risks during the decision-making process. It is important for every manager to assess the risks in his or her decision-making and to strive to take decisions that best reflect the basic conditions, states and development of the environment, and especially decisions that contribute as effectively as possible to achieving the organisation's goals.

Keywords: risk, decision-making, manager, process, analysis, source of risk

Procedia PDF Downloads 264
13003 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection

Authors: Mahshid Arabi

Abstract:

With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but simultaneously, vulnerabilities and security threats have significantly increased. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and Neural Networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, t-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity indicators of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the comprehensive proposed model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the obtained results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.
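The accuracy, sensitivity, and specificity indicators used to evaluate the intrusion-detection models are standard confusion-matrix metrics. A minimal sketch of how they are computed, with made-up counts rather than the study's data:

```python
# Accuracy, sensitivity and specificity from a binary confusion matrix,
# as used to evaluate an intrusion-detection classifier.
# The counts below are illustrative, not the study's data.

def confusion_metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # true-positive rate: attacks caught
    specificity = tn / (tn + fp)   # true-negative rate: normal traffic passed
    return accuracy, sensitivity, specificity

acc, sens, spec = confusion_metrics(tp=85, fp=5, tn=95, fn=15)
print(acc, sens, spec)  # → 0.9 0.85 0.95
```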

Keywords: data protection, digital technologies, information security, modern management

Procedia PDF Downloads 30
13002 African Swine Fever Situation and Diagnostic Methods in Lithuania

Authors: Simona Pileviciene

Abstract:

On 24 January 2014, Lithuania notified two primary cases of African swine fever (ASF) in wild boars. The animals tested positive for the ASF virus (ASFV) genome by real-time PCR at the National Reference Laboratory for ASF in Lithuania (NRL), and the results were confirmed by the European Union Reference Laboratory for African swine fever (CISA-INIA). An intensive monitoring program for wild and domestic animals was started. During 2014-2017, ASF was confirmed in two large commercial pig holdings with the highest biosecurity; the pigs were killed and destroyed. Since 2014, the ASF outbreak territory has expanded from the east and south to the middle of Lithuania. Diagnosis by PCR is one of the methods highly recommended by the World Organisation for Animal Health (OIE) for the diagnosis of ASF. The aim of the present study was to compare singleplex real-time PCR assays with a duplex assay allowing the identification of ASF and an internal control in a single PCR tube, and to compare the effectiveness of primers targeting the p72 gene (ASF 250 bp and ASF 75 bp). Multiplex real-time PCR assays prove to be less time-consuming and more cost-efficient and therefore have a high potential to be applied in routine analysis. It is important to have an effective and fast method that allows virus detection at the beginning of the disease in the wild boar population and in outbreaks among domestic pigs. For the experiments, we used reference samples (INIA, Spain) and positive samples from infected animals in Lithuania. The results show 100% sensitivity and specificity.

Keywords: African swine fever, real-time PCR, wild boar, domestic pig

Procedia PDF Downloads 166
13001 Prevalence and Genetic Determinant of Drug Resistant Tuberculosis among Patients Completing Intensive Phase of Treatment in a Tertiary Referral Center in Nigeria

Authors: Aminu Bashir Mohammad, Agwu Ezera, Abdulrazaq G. Habib, Garba Iliyasu

Abstract:

Background: Drug-resistant tuberculosis (DR-TB) continues to be a challenge in developing countries with poor resources. Routine screening for primary DR-TB before commencing treatment is not done in public hospitals in Nigeria, even with the large body of evidence showing a high prevalence of primary DR-TB. Data on drug resistance and its genetic determinants among follow-up TB patients are lacking in Nigeria. Hence, the aim of this study was to determine the prevalence and genetic determinants of drug resistance among follow-up TB patients in a tertiary hospital in Nigeria. Methods: This was a cross-sectional, laboratory-based study conducted on 384 sputum samples collected from consenting follow-up tuberculosis patients. Standard microbiology methods (Ziehl-Neelsen staining and microscopy) and PCR (Line Probe Assay) were used to analyze the samples collected, and Pearson's chi-square test was used to analyze the data generated. Results: Of the three hundred and eighty-four (384) sputum samples analyzed for Mycobacterium tuberculosis (MTB) and DR-TB, 25 (6.5%) were found to be AFB positive. These samples were subjected to PCR (Line Probe Assay), and 18 (72%) tested positive for DR-TB. Mutations conferring resistance to rifampicin (rpoB) and isoniazid (katG and/or inhA) were detected in 12/18 (66.7%) and 6/18 (33.3%), respectively. The transmission dynamics of DR-TB were not significantly (p>0.05) dependent on demographic characteristics. Conclusion: There is a need to strengthen laboratory capacity for the diagnosis of TB and drug-resistance testing and to make these services available, affordable, and accessible to the patients who need them.
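The chi-square test the abstract relies on compares observed counts in a contingency table with the counts expected under independence. A minimal sketch for a 2x2 table, using illustrative counts rather than the study's data:

```python
# Pearson's chi-square statistic for a 2x2 contingency table, the test
# used in the abstract. The counts below are illustrative only.

def chi_square_2x2(a, b, c, d):
    """Table rows: [a, b] and [c, d]; returns the chi-square statistic."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

print(round(chi_square_2x2(10, 20, 15, 15), 3))  # → 1.714
```

The statistic is then compared against the chi-square distribution with one degree of freedom to obtain the p-value.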

Keywords: drug-resistant tuberculosis, genetic determinant, intensive phase, Nigeria

Procedia PDF Downloads 285
13000 Using Seismic Base Isolation Systems in High-Rise Hospital Buildings and a Hybrid Proposal

Authors: Elif Bakkaloglu, Necdet Torunbalci

Abstract:

Earthquakes are an inevitable natural hazard in Turkiye, so buildings must be prepared for them. Earthquake resistance is essential in hospital buildings in particular, because hospitals are among the first places people go after an earthquake. Although hospital buildings are better suited to horizontal architecture, it is necessary to construct and expand multi-storey hospital buildings because excessive urbanization makes suitable sites difficult to find, land of an appropriate size hard to obtain, and land values increasingly high. In Turkiye, a general instruction makes the use of seismic isolators obligatory in public hospitals that are located in a first-degree earthquake zone and have more than 100 beds. As a result of this decision, it may sometimes be necessary to construct seismically isolated multi-storey hospital buildings in cities where these problems occur. Although seismic isolators are widely used in Japan, there are few multi-storey buildings in Turkiye in which they are used. As is known, base isolation systems are the most effective method of earthquake resistance, but as the number of floors increases, the centre of gravity moves away from the base, increasing the overturning effect and limiting the use of these systems. In this context, this study aims to investigate the structural systems of multi-storey buildings built around the world using seismic isolation methods. In addition, a working principle is suggested for disseminating seismic isolators in multi-storey hospital buildings. The results of the study will guide architects who design multi-storey hospital buildings in their architectural designs and engineers in their structural system designs.

Keywords: earthquake, energy absorbing systems, hospital, seismic isolation systems

Procedia PDF Downloads 151
12999 Homogenization of a Non-Linear Problem with a Thermal Barrier

Authors: Hassan Samadi, Mustapha El Jarroudi

Abstract:

In this work, we consider the homogenization of a non-linear problem in a periodic medium with two periodic connected media exchanging a heat flux through their common interface. The interfacial exchange coefficient λ is assumed to tend to zero or to infinity at a rate λ=λ(ε) as the size ε of the basic cell tends to zero. Three homogenized problems are determined according to a critical value depending on λ and ε. Our method is based on Γ-convergence techniques.
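A generic two-phase energy with an interfacial exchange term of the kind described above can be written as follows; this notation is our own illustrative assumption, not the paper's exact functional:

```latex
E_\varepsilon(u_1,u_2)
  = \sum_{i=1}^{2}\int_{\Omega_i^\varepsilon}\left|\nabla u_i\right|^{2}\,dx
  + \lambda(\varepsilon)\int_{\Gamma^\varepsilon}\left(u_1-u_2\right)^{2}\,d\sigma
```

Here Ω₁ᵉ and Ω₂ᵉ are the two periodic media and Γᵉ their common interface; the distinct homogenized limits then correspond to how λ(ε) scales against the growing interface area as ε tends to zero.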

Keywords: variational methods, epiconvergence, homogenization, convergence technique

Procedia PDF Downloads 525
12998 Contactless Electromagnetic Detection of Stress Fluctuations in Steel Elements

Authors: M. A. García, J. Vinolas, A. Hernando

Abstract:

Steel is nowadays one of the most important structural materials because of its outstanding mechanical properties. Therefore, in order to pursue a sustainable economic model and to optimize the use of extensive resources, new methods to monitor and prevent the failure of steel-based facilities are required. Classical mechanical tests, such as building testing, are invasive and destructive; moreover, where the steel element is embedded (as in reinforced concrete), these techniques are not directly applicable. Hence, non-invasive monitoring techniques are required that prevent failure without altering the structural properties of the elements. Among them, electromagnetic methods are particularly suitable for non-invasive inspection of the mechanical state of steel-based elements. Magnetoelastic coupling effects modify the electromagnetic properties of an element under applied stress, and since most steels are ferromagnetic because of their large Fe content, it is possible to inspect their structure and state in a non-invasive way. We present here a distinct electromagnetic method for the contactless evaluation of internal stress in steel-based elements. In particular, this method relies on measuring the magnetic induction between two coils with the steel specimen between them. We found that the stress-induced alteration of the specimen's electromagnetic properties changed the induction enough to detect stress well below half of the elastic limit of the material. Hence, it represents an outstanding non-invasive method to prevent failure in steel-based facilities. We describe the theoretical model, present experimental results to validate it, and finally show a practical application for the detection of stress and inhomogeneities in train railways.

Keywords: magnetoelastic, magnetic induction, mechanical stress, steel

Procedia PDF Downloads 50
12997 Suicide, Help-Seeking and LGBT Youth: A Mixed Methods Study

Authors: Elizabeth McDermott, Elizabeth Hughes, Victoria Rawlings

Abstract:

Globally, suicide is the second leading cause of death among 15–29 year-olds. Young people who identify as lesbian, gay, bisexual and transgender (LGBT) have elevated rates of suicide and self-harm, yet despite the increased risk, there is a paucity of research on LGBT help-seeking and suicidality. This is the first national study to investigate LGBT youth help-seeking for suicidal feelings and self-harm. We report on a UK sequential exploratory mixed-methods study that employed face-to-face and online methods in two stages. Stage one involved 29 semi-structured interviews, online (n=15) and face-to-face (n=14), with LGBT youth aged under 25 years old. Stage two utilized an online LGBT youth questionnaire employing a community-based sampling strategy (n=789). We found across the sample that LGBT youth who self-harmed or felt suicidal were reluctant to seek help. Results indicated that participants normalized their emotional distress and only asked for help when they reached crisis point and were no longer coping. Those who self-harmed (p<0.001, OR=2.82), had attempted or planned suicide (p<0.05, OR=1.48), or had experienced abuse related to their sexuality or gender (p<0.01, OR=1.80) were most likely to seek help. A number of interconnecting reasons contributed to participants' problems accessing help, the most prominent being: negotiating norms in relation to sexuality, gender, mental health and age; being unable to talk about emotions; and coping and self-reliance. It is crucial that policies and practices aiming to prevent LGBT youth suicide recognize that norms and normalizing processes connected to sexual orientation and gender identity are additional difficulties that LGBT youth face in accessing mental health support.
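The odds ratios (OR) reported above are the standard effect size for a 2x2 exposure-by-outcome table. A minimal sketch of the computation, with made-up counts rather than the study's data:

```python
# Odds ratio for help-seeking from a 2x2 table, the effect size the
# abstract reports (e.g. OR = 2.82 for self-harm).
# The counts below are illustrative, not the study's data.

def odds_ratio(exposed_yes, exposed_no, unexposed_yes, unexposed_no):
    """Odds of the outcome in the exposed group divided by the odds
    in the unexposed group."""
    return (exposed_yes / exposed_no) / (unexposed_yes / unexposed_no)

# e.g. 60 of 100 in one group sought help vs 40 of 100 in the other
print(round(odds_ratio(60, 40, 40, 60), 3))  # → 2.25
```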

Keywords: help-seeking, LGBT, suicide, youth

Procedia PDF Downloads 275
12996 Green Organic Chemistry, a New Paradigm in Pharmaceutical Sciences

Authors: Pesaru Vigneshwar Reddy, Parvathaneni Pavan

Abstract:

Green organic chemistry, one of the most researched topics nowadays, has been in demand since the 1990s. Organic chemicals are important starting materials for a great number of major chemical industries: the production of organic chemicals as raw materials or reagents is a major manufacturing sector for polymers, pharmaceuticals, pesticides, paints, artificial fibers, food additives, etc. Organic synthesis on a large scale, compared to the laboratory scale, involves the use of energy, basic chemical ingredients from the petrochemical sector, and catalysts, and, after the end of the reaction, separation, purification, storage, packing, distribution, etc. These processes pose many health and safety problems for workers, in addition to the environmental problems caused by the use and disposal of chemicals as waste. Green chemistry, with its 12 principles, would like to see changes in the conventional ways that were used for decades to make synthetic organic chemicals, and the use of less toxic starting materials. Green chemistry would like to increase the efficiency of synthetic methods, use less toxic solvents, reduce the stages of synthetic routes, and minimize waste as far as practically possible. In this way, organic synthesis will be part of the effort for sustainable development. Green chemistry is also interested in research and innovation on alternatives for many practical aspects of organic synthesis in university and institutional research laboratories. By changing the methodologies of organic synthesis, health and safety will be advanced at the small-scale laboratory level and also extended to large-scale industrial production through new techniques. Three key developments in green chemistry are the use of supercritical carbon dioxide as a green solvent, aqueous hydrogen peroxide as an oxidising agent, and the use of hydrogen in asymmetric synthesis. Green chemistry also focuses on replacing traditional heating with modern methods such as microwave heating, so that the carbon footprint is reduced as far as possible. A further benefit is that green chemistry reduces environmental pollution through the use of less toxic reagents, the minimization of waste, and more biodegradable by-products. This paper considers some of the basic principles, approaches, and early achievements of green chemistry as a branch of chemistry that studies how chemical reactions proceed, together with a summary of the green chemistry principles, and discusses the E-factor, the old and new syntheses of ibuprofen, microwave techniques, and some recent advancements.
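The E-factor mentioned above is Sheldon's measure of how wasteful a process is: the mass of waste generated per mass of product. A minimal sketch of the calculation, with illustrative figures rather than data from the paper:

```python
# Sheldon's E-factor, discussed in the abstract: kg of waste generated
# per kg of product. The figures below are illustrative only.

def e_factor(total_input_kg, product_kg):
    """Everything fed into the process that does not end up in the
    product is counted as waste."""
    return (total_input_kg - product_kg) / product_kg

# a process consuming 120 kg of inputs to yield 20 kg of product
print(e_factor(120, 20))  # → 5.0
```

Lower values are greener; bulk chemicals typically achieve E-factors near 1, while pharmaceutical syntheses are often far higher, which is why the ibuprofen case study is a classic example.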

Keywords: energy, e-factor, carbon foot print, micro-wave, sono-chemistry, advancement

Procedia PDF Downloads 306
12995 Current Epizootic Situation of Q Fever in Polish Cattle

Authors: Monika Szymańska-Czerwińska, Agnieszka Jodełko, Krzysztof Niemczuk

Abstract:

Q fever (coxiellosis) is an infectious disease of animals and humans caused by C. burnetii and widely distributed throughout the world. Cattle and small ruminants are commonly known as shedders of C. burnetii. The aims of this study were to evaluate the seroprevalence and shedding of C. burnetii in cattle and to identify the genotypes of the pathogen present in the tested specimens using the MLVA (multiple-locus variable-number tandem repeat analysis) and MST (multispacer sequence typing) methods. Sampling was conducted in different regions of Poland in 2018-2021. In total, 2180 bovine serum samples from 801 cattle herds were tested by ELISA (enzyme-linked immunosorbent assay), and 489 specimens from 157 cattle herds, comprising individual milk samples (n=407), bulk tank milk (n=58), vaginal swabs (n=20), placentas (n=3) and feces (n=1), were subjected to a C. burnetii-specific qPCR. The qPCR (IS1111 transposon-like repetitive region) was performed using the Adiavet COX RealTime PCR kit. Genotypic characterization of the strains was conducted using the MLVA and MST methods; MLVA was performed using 6 variable loci. The overall herd-level seroprevalence of C. burnetii infection was 36.74% (801/2180), and shedders were detected in 29.3% (46/157) of the cattle herds in all tested regions. Sequence type ST61 was identified in 10 of the 18 genotyped strains; interestingly, one strain represents a sequence type that has never been recorded previously. The MLVA method identified three previously known genotypes: the most common was J, but I and BE were also recognized, and one further genotype had never been described previously. In conclusion, seroprevalence and shedding of C. burnetii in cattle are common, and the strains are genetically diverse.
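MLVA, as used above, characterizes a strain by the repeat counts at a set of variable loci (6 in this study); two strains share an MLVA genotype only if every locus matches. A minimal sketch of that comparison, with entirely hypothetical repeat-count profiles (not the study's genotypes):

```python
# MLVA profile comparison: a strain's genotype is a tuple of repeat
# counts, one per variable locus (6 loci in the study). All profiles
# below are hypothetical, for illustration only.

def same_mlva_genotype(profile_a, profile_b):
    """Two isolates share a genotype only if all loci match."""
    return len(profile_a) == len(profile_b) and profile_a == profile_b

known_j = (6, 13, 2, 7, 4, 5)    # hypothetical reference profile
isolate = (6, 13, 2, 7, 4, 5)    # matches the reference at every locus
novel   = (6, 13, 3, 7, 4, 5)    # differs at one locus -> new genotype

print(same_mlva_genotype(isolate, known_j))  # → True
print(same_mlva_genotype(novel, known_j))    # → False
```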

Keywords: Coxiella burnetii, cattle, MST, MLVA, Q fever

Procedia PDF Downloads 86
12994 Relative Effectiveness of Inquiry Role Approach and Expository Instructional Methods in Fostering Students' Retention in Chemistry

Authors: Joy Johnbest Egbo

Abstract:

The study was designed to investigate the relative effectiveness of the inquiry role approach and the expository instructional method in fostering students' retention in chemistry. Two research questions were answered, and three null hypotheses were formulated and tested at the 0.05 level of significance. A quasi-experimental (non-equivalent pretest, posttest control group) design was adopted for the study. The population comprised all senior secondary school class two (SS II) students offering chemistry in single-sex schools in the Enugu Education Zone. The instrument for data collection was a self-developed Chemistry Retention Test (CRT). Relevant data were collected from a sample of one hundred and forty-one (141) students drawn from two secondary schools (1 male and 1 female school) using a simple random sampling technique. A reliability coefficient of 0.82 was obtained for the instrument using Kuder-Richardson formula 20 (KR-20). Mean and standard deviation scores were used to answer the research questions, while two-way analysis of covariance (ANCOVA) was used to test the hypotheses. The findings showed that students taught with the inquiry role approach retained the chemistry concepts significantly better than their counterparts taught with the expository method, and female students retained slightly more than their male counterparts. There is a significant interaction between the instructional packages and gender on chemistry students' retention. It was recommended, among others, that teachers be encouraged to employ the inquiry role approach more in the teaching of chemistry and other subjects; by so doing, students' retention could be increased.
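The KR-20 reliability coefficient reported above is computed from dichotomous (right/wrong) item scores: it scales up one minus the ratio of summed item variances to the variance of total scores. A minimal sketch with a tiny made-up score matrix (not the study's data):

```python
# Kuder-Richardson formula 20 (KR-20), the reliability coefficient the
# abstract reports (0.82) for the Chemistry Retention Test.
# The score matrix below is illustrative only.

def kr20(scores):
    """scores: list of per-student lists of 0/1 item scores."""
    k = len(scores[0])                                # number of items
    n = len(scores)                                   # number of students
    totals = [sum(row) for row in scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n    # variance of totals
    pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in scores) / n         # proportion correct
        pq += p * (1 - p)                             # item variance p*q
    return (k / (k - 1)) * (1 - pq / var)

scores = [[1, 1, 1, 0], [1, 0, 1, 1], [1, 1, 0, 0], [0, 0, 1, 0], [1, 1, 1, 1]]
print(round(kr20(scores), 3))  # → 0.308
```

A value of 0.82, as in the study, indicates much more consistent items than this toy matrix produces.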

Keywords: inquiry role approach, retention, exposition method, chemistry

Procedia PDF Downloads 513
12993 Antagonistic Potential of Epiphytic Bacteria Isolated in Kazakhstan against Erwinia amylovora, the Causal Agent of Fire Blight

Authors: Assel E. Molzhigitova, Amankeldi K. Sadanov, Elvira T. Ismailova, Kulyash A. Iskandarova, Olga N. Shemshura, Ainur I. Seitbattalova

Abstract:

Fire blight is a quarantine bacterial disease that is very harmful to commercial apple and pear production. To date, several different methods have been proposed for disease control, including the use of copper-based preparations and antibiotics, which are not always reliable or effective. The use of bacteria as biocontrol agents is one of the most promising and eco-friendly alternatives. Bacteria with protective activity against the causal agent of fire blight are often present among the epiphytic microorganisms of the phyllosphere of host plants. Therefore, the main objective of our study was the screening of local epiphytic bacteria as possible antagonists of Erwinia amylovora, the causal agent of fire blight. Samples of infected organs of apple and pear trees (shoots, leaves, fruits) were collected from industrial horticulture areas in various agro-ecological zones of Kazakhstan. Epiphytic microorganisms were isolated by standard and modified methods on specific nutrient media, and the primary screening of the selected microorganisms under laboratory conditions for the ability to suppress the growth of Erwinia amylovora was performed by an agar diffusion test. Among 142 bacteria isolated from the fire blight host plants, 5 isolates, belonging to the genera Bacillus, Lactobacillus, Pseudomonas, Paenibacillus and Pantoea, showed higher antagonistic activity against the pathogen. The diameters of the inhibition zones depended on the species and ranged from 10 mm to 48 mm; the maximum (48 mm) was exhibited by B. amyloliquefaciens, while a smaller inhibitory effect was shown by Pantoea agglomerans PA1 (19 mm). Among the 7 Lactobacillus isolates tested against E. amylovora, only one (Lactobacillus plantarum 17M) demonstrated an inhibition zone (30 mm). In summary, this study was devoted to detecting beneficial epiphytic bacteria from the plant organs of pear and apple trees for fire blight control in Kazakhstan. The in vitro experiments showed that the most efficient bacterial isolates are Lactobacillus plantarum 17M, Bacillus amyloliquefaciens MB40, and Pantoea agglomerans PA1. These antagonists are suitable for development as biocontrol agents for fire blight, and their efficacies will be evaluated further in biological tests under in vitro and field conditions in our future studies.

Keywords: antagonists, epiphytic bacteria, Erwinia amylovora, fire blight

Procedia PDF Downloads 167
12992 Determination of Klebsiella Pneumoniae Susceptibility to Antibiotics Using Infrared Spectroscopy and Machine Learning Algorithms

Authors: Manal Suleiman, George Abu-Aqil, Uraib Sharaha, Klaris Riesenberg, Itshak Lapidot, Ahmad Salman, Mahmoud Huleihel

Abstract:

Klebsiella pneumoniae is one of the most aggressive multidrug-resistant bacteria associated with human infections, resulting in high mortality and morbidity. Thus, for effective treatment, it is important to diagnose both the species of the infecting bacteria and their susceptibility to antibiotics. Currently used methods for diagnosing bacterial susceptibility to antibiotics are time-consuming (about 24 h following the first culture), so there is a clear need for rapid methods to determine bacterial susceptibility. Infrared spectroscopy is a well-known method, recognized as sensitive and simple, which can detect minor biomolecular changes in biological samples associated with developing abnormalities. The main goal of this study is to evaluate the potential of infrared spectroscopy, in tandem with the Random Forest and XGBoost machine learning algorithms, to diagnose the susceptibility of Klebsiella pneumoniae to antibiotics within approximately 20 minutes following the first culture. In this study, 1190 Klebsiella pneumoniae isolates were obtained from different patients with urinary tract infections. The isolates were measured by the infrared spectrometer, and the spectra were analyzed by the Random Forest and XGBoost algorithms to determine their susceptibility to nine specific antibiotics. Our results confirm that it was possible to classify the isolates into sensitive and resistant to specific antibiotics with success rates of 80%-85% for the different tested antibiotics. These results demonstrate the promising potential of infrared spectroscopy as a powerful diagnostic method for determining the susceptibility of Klebsiella pneumoniae to antibiotics.
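The core step above is classifying a measured spectrum into a sensitive or resistant class. The study uses Random Forest and XGBoost; as a library-free stand-in, this sketch uses a simple nearest-centroid rule on tiny made-up "spectra" purely to illustrate the classification step, not the authors' pipeline:

```python
# Classifying "spectra" into sensitive vs resistant with a nearest-
# centroid rule. This is a stand-in for the study's Random Forest /
# XGBoost models; all numbers below are made up for illustration.

def centroid(spectra):
    """Per-wavenumber mean of a list of equal-length spectra."""
    return [sum(values) / len(spectra) for values in zip(*spectra)]

def classify(spectrum, centroids):
    """Return the label whose centroid is closest (squared distance)."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(spectrum, centroids[label]))

train = {
    "sensitive": [[0.1, 0.8, 0.3], [0.2, 0.7, 0.4]],
    "resistant": [[0.6, 0.2, 0.9], [0.5, 0.3, 0.8]],
}
centroids = {label: centroid(s) for label, s in train.items()}

print(classify([0.15, 0.75, 0.35], centroids))  # → sensitive
print(classify([0.55, 0.25, 0.85], centroids))  # → resistant
```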

Keywords: urinary tract infection (UTI), Klebsiella pneumoniae, bacterial susceptibility, infrared spectroscopy, machine learning

Procedia PDF Downloads 168
12991 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structures in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely used in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means (FCM) clustering is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem that minimizes a given cost function; this minimization aims to decrease the dissimilarity inside clusters, where dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function taking values in the interval [0, 1]. In FCM clustering, the membership degrees are constrained so that the sum of a data object's memberships over all clusters equals one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has therefore been incorporated into the fuzzy c-means technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where the optimization problem aims to minimize the dissimilarity inside clusters. Our objective is to find an appropriate membership degree for each data object, because appropriate membership degrees lead to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that the proposed model achieves good accuracy.
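To illustrate the entropy-regularization idea, here is one common closed form for the membership update: adding a penalty of the shape λ·Σ u·ln u to the within-cluster dissimilarity, under the sum-to-one constraint, yields a softmax-like membership. This is a generic sketch of the technique under that assumed penalty, not the paper's exact model:

```python
import math

# Entropy-regularized fuzzy membership update: minimizing
#   sum_j u_j * d_j^2 + lam * sum_j u_j * ln(u_j)   subject to  sum_j u_j = 1
# gives the closed form u_j = exp(-d_j^2/lam) / sum_k exp(-d_k^2/lam).
# A generic sketch of the entropy-regularization idea, not the paper's model.

def memberships(dists2, lam):
    """dists2: squared distances from one point to each cluster center;
    lam: regularization weight (larger -> fuzzier memberships)."""
    weights = [math.exp(-d / lam) for d in dists2]
    total = sum(weights)
    return [w / total for w in weights]

u = memberships([0.5, 2.0, 4.0], lam=1.0)
print([round(x, 3) for x in u])   # memberships, largest for the nearest center
print(round(sum(u), 3))           # → 1.0
```

Note how the constraint is satisfied automatically, and a smaller distance to a center always yields a larger membership; as lam grows, the memberships flatten toward uniform.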

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 259
12990 Groupthink: The Dark Side of Team Cohesion

Authors: Farhad Eizakshiri

Abstract:

The potential of groupthink to explain the deterioration of decision-making ability within a unitary team, and thereby to cause poor outcomes, has attracted a great deal of attention from a variety of disciplines, including psychology, social and organizational studies, and political science. Yet what remains unclear is how and why team members' striving for unanimity and cohesion overrides their motivation to realistically appraise alternative courses of action. This paper reports the findings of a sequential explanatory mixed-methods study comprising an experiment with thirty groups of three persons each and interviews with all experimental groups. The experiment examined how individuals aggregate their views in order to reach a consensual group decision concerning the completion time of a task. The results indicated that groups made better estimates when there was no interaction between members than when the groups collectively agreed on time estimates. To understand the reasons, the qualitative data and informal observations collected during the task were analyzed through conversation analysis, leading to four reasons why teams neglected divergent viewpoints and reduced the number of ideas being considered: the concurrence-seeking tendency, pressure on dissenters, self-censorship, and the illusion of invulnerability. It is suggested that understanding the dynamics behind these reasons will help project teams avoid premature group decisions by enhancing careful evaluation of the available information and analysis of the available decision alternatives and choices.

Keywords: groupthink, group decision, cohesiveness, project teams, mixed-methods research

Procedia PDF Downloads 396
12989 The Sensitivity of Electrical Geophysical Methods for Mapping Salt Stores within the Soil Profile

Authors: Fathi Ali Swaid

Abstract:

Soil salinization is one of the most hazardous phenomena accelerating land degradation processes. It either occurs naturally or is human-induced. High levels of soil salinity negatively affect crop growth and productivity, ultimately leading to land degradation. Thus, it is important to monitor and map soil salinity at an early stage to enact an effective soil reclamation program that helps lessen or prevent future increases in soil salinity. Geophysical methods have outperformed traditional methods for assessing soil salinity, offering more informative and rapid assessment techniques for monitoring and mapping. Soil sampling, EM38 readings and 2D conductivity imaging were evaluated for their ability to delineate and map the level of salinity variation at Second Ponds Creek. The three methods showed that the subsoil in the study area is saline, and salt variations were successfully observed with each method. However, the EM38 readings and 2D inversion data show a clearer spatial structure than the EC1:5 of the soil samples, even though the soil samples, EM38 readings and 2D imaging were all collected at the same locations. Because EM38 readings and 2D imaging data are a weighted average of electrical soil conductance, they are more representative of soil properties than the soil sampling method. The mapping of the subsurface soil in the study area was successful, and resistivity imaging proved to be an advantage. The soil salinity analysis (EC1:5) corresponds well to the true resistivity, together yielding a good assessment of soil salinity. The soil salinity indicated by the previous EM38 investigation was confirmed by the interpretation of the true resistivity in the study area.
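The point that an EM38 reading is a depth-weighted average of layer conductivities can be made concrete. The sketch below assumes McNeill's (1980) cumulative response function for the vertical dipole, R(z) = (4z² + 1)^(-1/2) with z normalized by intercoil spacing; the two-layer profile values are hypothetical.

```python
import math

# Hedged sketch: apparent conductivity of a layered soil as the EM38 would
# "see" it, assuming McNeill's (1980) vertical-dipole cumulative response.
def cumulative_response(z):
    """Fraction of instrument response from material below normalized depth z."""
    return 1.0 / math.sqrt(4.0 * z * z + 1.0)

def apparent_conductivity(layers):
    """layers: list of (top_z, bottom_z, conductivity); use math.inf for the
    bottom of the deepest layer. Returns the depth-weighted average."""
    sigma_a = 0.0
    for top, bottom, sigma in layers:
        r_top = cumulative_response(top)
        r_bottom = 0.0 if bottom == math.inf else cumulative_response(bottom)
        sigma_a += sigma * (r_top - r_bottom)
    return sigma_a

# Hypothetical two-layer profile (mS/m): fresher topsoil over saline subsoil.
profile = [(0.0, 0.5, 50.0), (0.5, math.inf, 300.0)]
print(apparent_conductivity(profile))
```

The saline subsoil dominates the reading because most of the vertical-dipole response comes from depth, which is why instrument readings integrate soil properties rather than sampling a single point.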

Keywords: 2D conductivity imaging, EM38 readings, soil salinization, true resistivity, urban salinity

Procedia PDF Downloads 376
12988 Evaluation of Electro-Flocculation for Biomass Production of Marine Microalgae Phaeodactylum tricornutum

Authors: Luciana C. Ramos, Leandro J. Sousa, Antônio Ferreira da Silva, Valéria Gomes Oliveira Falcão, Suzana T. Cunha Lima

Abstract:

The commercial production of biodiesel using microalgae demands a high energy input for harvesting biomass, making production economically unfeasible. Methods currently used involve mechanical, chemical, and biological procedures. In this work, a flocculation system is presented as a cost- and energy-effective process to increase biomass recovery of Phaeodactylum tricornutum. This diatom is the only species of the genus, and its fast growth and lipid accumulation ability are of great interest for biofuel production. The algae, selected from the Bank of Microalgae, Institute of Biology, Federal University of Bahia (Brazil), were cultivated in a tubular reactor with a 12 h (light/dark) photoperiod, an irradiance of about 35 μmol photons m⁻²s⁻¹, and a temperature of 22 °C. The medium used for growing the cells was Conway medium with added silica. The growth curve was followed by cell counts in a Neubauer chamber and by optical density in a spectrophotometer at 680 nm. Harvesting took place at the end of the stationary phase of growth, 21 days after inoculation, using two methods: centrifugation at 5000 rpm for 5 min, and electro-flocculation at 19 EPD and 95 W. After precipitation, cells were frozen at -20 °C and subsequently lyophilized. The biomass obtained by electro-flocculation was approximately four times greater than that achieved by centrifugation. The benefits of this method are that no chemical flocculants need to be added and that similar cultivation conditions can be used for both biodiesel production and pharmacological purposes. The results may contribute to reducing biodiesel production costs using marine microalgae.
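Growth curves of the kind tracked here (Neubauer-chamber counts or OD680) are commonly summarized by the specific growth rate between two sampling points. A minimal sketch, with hypothetical cell densities rather than the study's measurements:

```python
import math

# Hedged sketch (hypothetical counts): exponential-phase growth metrics from
# two points on a microalgal growth curve.
def specific_growth_rate(n1, n2, t1, t2):
    """mu (per day) from cell densities n1 at day t1 and n2 at day t2."""
    return math.log(n2 / n1) / (t2 - t1)

def doubling_time(mu):
    return math.log(2) / mu

mu = specific_growth_rate(1e5, 8e5, 2.0, 8.0)  # cells/mL at day 2 and day 8
print(round(mu, 3))             # growth rate per day
print(round(doubling_time(mu), 2))  # days per doubling
```

An eightfold increase over six days corresponds to three doublings, i.e. a doubling time of two days.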

Keywords: biomass, diatom, flocculation, microalgae

Procedia PDF Downloads 330
12987 On the Existence of Homotopic Mapping Between Knowledge Graphs and Graph Embeddings

Authors: Jude K. Safo

Abstract:

Knowledge Graphs (KGs) and their relation to Graph Embeddings (GEs) represent a unique data structure in the landscape of machine learning (relative to image, text and acoustic data). Unlike the latter, GEs are the only data structure sufficient for representing the hierarchically dense, semantic information needed for use-cases like supply chain data and protein folding, where the search space exceeds the limits of traditional search methods (e.g. PageRank, Dijkstra, etc.). While GEs are effective for compressing low-rank tensor data, at scale they begin to introduce a new problem of 'data retrieval', which we also observe in Large Language Models. Notable attempts such as TransE, TransR and other prominent industry standards have shown peak performance just north of 57% on the WN18 and FB15K benchmarks, insufficient for practical industry applications. They are also limited, in scope, to next node/link prediction. Traditional linear methods like Tucker, CP, PARAFAC and CANDECOMP quickly hit memory limits on tensors exceeding 6.4 million nodes. This paper outlines a topological framework for linear mappings between concepts in KG space and GE space that preserve cardinality. Most importantly, we introduce a traceable framework for composing dense linguistic structures. We demonstrate the performance this model achieves on the WN18 benchmark. The model does not rely on Large Language Models (LLMs), though the applications are certainly relevant there as well.
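For readers unfamiliar with the baselines named above, TransE scores a triple (head, relation, tail) as plausible when head + relation ≈ tail in embedding space. The sketch below uses random stand-in vectors, not trained WN18/FB15K embeddings, and is not the paper's topological framework.

```python
import numpy as np

# Hedged sketch of TransE scoring: plausibility is the negative Euclidean
# distance between (head + relation) and tail. Embeddings are random
# placeholders for illustration only.
rng = np.random.default_rng(0)
dim = 8
entities = {name: rng.normal(size=dim) for name in ["paris", "france", "berlin"]}
relations = {"capital_of": rng.normal(size=dim)}

def transe_score(head, relation, tail):
    return -float(np.linalg.norm(entities[head] + relations[relation] - entities[tail]))

# Link prediction reduces to ranking candidate tails by score.
scores = {t: transe_score("paris", "capital_of", t) for t in ["france", "berlin"]}
print(max(scores, key=scores.get))
```

This ranking-over-candidates formulation is exactly why such models are limited to next node/link prediction, the scope limitation the abstract notes.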

Keywords: representation theory, large language models, graph embeddings, applied algebraic topology, applied knot theory, combinatorics

Procedia PDF Downloads 68
12986 Detecting Anomalous Matches: An Empirical Study from National Basketball Association

Authors: Jacky Liu, Dulani Jayasuriya, Ryan Elmore

Abstract:

Match fixing and anomalous sports events have increasingly threatened the integrity of professional sports, prompting concerns about existing detection methods. This study addresses the limitations of prior research on match fixing detection, improving the identification of potentially fraudulent matches by incorporating advanced anomaly detection techniques. We develop a novel method to identify anomalous matches and player performances by examining series of matches, such as playoffs. Additionally, we investigate bettors' potential profits when avoiding anomalous matches and explore the factors behind unusual player performances. Our literature review covers match fixing detection, match outcome forecasting models, and anomaly detection methods, underscoring current limitations and proposing a new sports anomaly detection method. Our findings reveal anomalous series in the 2022 NBA playoffs, with the Phoenix Suns vs. Dallas Mavericks series having the lowest natural occurrence probability. We identify abnormal player performances, and bettors' profits decrease significantly when post-season matches are included. This study contributes by developing a new approach to detecting anomalous matches and player performances, assisting investigators in identifying responsible parties. While we cannot conclusively establish the reasons behind unusual player performances, our findings suggest factors such as team financial difficulties, executive mismanagement, and individual player contract issues.
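One simple way to reason about the "natural occurrence probability" of a playoff series (not necessarily the paper's model) is the negative binomial: given a per-game win probability p for one team, the chance it clinches a best-of-7 follows directly.

```python
from math import comb

# Hedged sketch: probability one team wins a best-of-(2k-1) series given a
# fixed per-game win probability p, via the negative binomial distribution.
def series_win_probability(p, wins_needed=4):
    q = 1.0 - p
    return sum(comb(wins_needed - 1 + k, k) * p**wins_needed * q**k
               for k in range(wins_needed))

# Symmetric teams split series wins 50/50; a heavy per-game underdog winning
# the series is rare, so very low probabilities flag series worth scrutiny.
print(round(series_win_probability(0.5), 3))  # 0.5
print(round(series_win_probability(0.3), 3))  # 0.126
```

A real detector would estimate p from forecasting models and betting odds per game, but the low-probability flagging logic is the same.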

Keywords: anomaly match detection, match fixing, match outcome forecasting, problematic players identification

Procedia PDF Downloads 79
12985 Architecture for QoS Based Service Selection Using Local Approach

Authors: Gopinath Ganapathy, Chellammal Surianarayanan

Abstract:

Services are growing rapidly, and they are generally aggregated into composite services to accomplish complex business processes. There may be several services that offer the same required function for a particular task in a composite service; hence, a choice has to be made among alternative, functionally similar services. Quality of Service (QoS) serves as the discriminating factor in selecting which component services should be used to satisfy the quality requirements of a user during service composition. There are two categories of approaches for QoS-based service selection, namely global and local approaches. Global approaches are known to be NP-hard in time and offer poor scalability in large-scale composition. As an alternative to global methods, local selection methods, which reduce the search space by breaking the large, complex problem of selecting services for the workflow into independent sub-problems of selecting services for individual tasks, are emerging. In this paper, a distributed architecture for QoS-based service selection using the local approach is presented, with an overview of the local selection methodology. The architecture describes the core components needed to implement the local approach, namely the selection manager and the QoS manager, and their functions. The selection manager consists of two components: a constraint decomposer, which decomposes the given global or workflow-level constraints into local or task-level constraints, and a service selector, which selects the service with maximum utility for each task while satisfying the corresponding local constraints. The QoS manager manages the QoS information at two levels, namely the service class level and the individual service level. The architecture serves as an implementation model for local selection.
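The constraint decomposer / service selector flow described above can be sketched as follows. The services, QoS values, and the even split of the global budget are all hypothetical simplifications (real decomposers weight the split by task characteristics).

```python
# Hedged sketch of local selection: split a workflow-level response-time budget
# across tasks, then pick the highest-utility feasible service per task.
def decompose_constraint(global_budget, num_tasks):
    """Naive even split of the global constraint into task-level constraints."""
    return [global_budget / num_tasks] * num_tasks

def select_services(tasks, local_budgets):
    chosen = []
    for candidates, budget in zip(tasks, local_budgets):
        feasible = [s for s in candidates if s["response_time"] <= budget]
        chosen.append(max(feasible, key=lambda s: s["utility"])["name"])
    return chosen

tasks = [
    [{"name": "A1", "response_time": 40, "utility": 0.9},
     {"name": "A2", "response_time": 20, "utility": 0.7}],
    [{"name": "B1", "response_time": 60, "utility": 0.95},
     {"name": "B2", "response_time": 45, "utility": 0.8}],
]
budgets = decompose_constraint(100, 2)  # 50 ms per task
print(select_services(tasks, budgets))  # → ['A1', 'B2']
```

Each task is solved independently, which is what makes the local approach scale where global optimization becomes NP-hard; the price is that B1's higher utility is unreachable once the budget is split.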

Keywords: architecture of service selection, local method for service selection, QoS based service selection, approaches for QoS based service selection

Procedia PDF Downloads 426
12984 Ecosystem Model for Environmental Applications

Authors: Cristina Schreiner, Romeo Ciobanu, Marius Pislaru

Abstract:

This paper aims to build a system based on fuzzy models that can be implemented in the assessment of ecological systems, to determine appropriate methods of action for reducing adverse effects on environmental and implicit the population. The model proposed provides new perspective for environmental assessment, and it can be used as a practical instrument for decision-making.

Keywords: ecosystem model, environmental security, fuzzy logic, sustainability of habitable regions

Procedia PDF Downloads 420
12983 Neuroecological Approach for Anthropological Studies in Archaeology

Authors: Kalangi Rodrigo

Abstract:

The term neuroecology denotes the study of adaptive variation in cognition and the brain. The field emerged in the 1980s, when researchers began to apply methods of comparative evolutionary biology to cognitive processes and the underlying neural mechanisms of cognition. In archaeology and anthropology, we observe behaviors such as social learning skills, innovative feeding and foraging, tool use and social manipulation to infer the cognitive processes of ancient humankind. Brainstem size was used as a control variable, and phylogeny was controlled using independent contrasts. Both disciplines need to be enriched with comparative literature and with neurological, experimental and behavioral studies among tribal peoples as well as primate groups, which will carry the research toward its potential end. Neuroecology examines the relations between ecological selection pressures and species or sex differences in cognition and the brain. The goal of neuroecology is to understand how natural selection acts on perception and its neural apparatus. Furthermore, neuroecology will eventually lead both principal disciplines toward ethology, in which human behavior and social organization are studied from a biological perspective, whether ethnoarchaeological or prehistoric. Archaeology should adopt the general approach of neuroecology: phylogenetic comparative methods can be used in the field, alongside new findings on the cognitive mechanisms and brain structures involved in mating systems, social organization, communication and foraging. The contribution of neuroecology to archaeology and anthropology is the information it provides on the selective pressures that have influenced the evolution of human cognition and brain structure. It will shed new light on the path of evolutionary studies, including behavioral ecology, primate archaeology and cognitive archaeology.

Keywords: neuroecology, archaeology, brain evolution, cognitive archaeology

Procedia PDF Downloads 120
12982 Determination of Geotechnical Properties of Travertine Lithotypes in Van-Turkey

Authors: Ali Ozvan, Ismail Akkaya, Mucip Tapan

Abstract:

Travertine is generally a weak to medium-strong rock, and the physical, mechanical and structural properties of travertines have a direct impact on geotechnical studies. New settlement areas were designated on travertine units after the two destructive earthquakes of October 23rd, 2011 (M=7.1) and November 9th, 2011 (M=5.6), which occurred in the Tabanlı and Edremit districts of Van province, Turkey, respectively. In the study area, the travertines comprise different lithotypes with different engineering properties, such as strong crystalline crust, medium-strong shrub, and weak reed, which affect the mechanical and engineering properties of the travertine, and each level has its own drawbacks. Travertine has a higher strength than soil ground; however, it can present problems such as poor rock mass quality, karst caves and weathering alteration. The physico-mechanical properties of the travertine in the study area were determined by laboratory tests and field observations. Uniaxial compressive strength (UCS) values were obtained by indirect methods, and a strength map of the different lithotypes of the Edremit travertine was created in order to define suitable settlement areas. Rock mass properties and underground structure were also determined by boreholes, field studies, and geophysical methods. The aim of this study is to investigate the relationship between lithotype and the physico-mechanical properties of travertines. According to the results, lithotype affects the physical, mechanical and rock mass properties of the travertine levels. Several research methods show that various problems may arise in such areas when the active tectonic structure is evaluated together with the karstic cavities within the travertine and the differing lithotype qualities.

Keywords: travertine, lithotype, geotechnical parameters, Van earthquake

Procedia PDF Downloads 231
12981 Barnard Feature Point Detector for Low-Contrast Periapical Radiography Image

Authors: Chih-Yi Ho, Tzu-Fang Chang, Chih-Chia Huang, Chia-Yen Lee

Abstract:

In dental clinics, dentists use periapical radiography images to assess the effectiveness of endodontic treatment of teeth with chronic apical periodontitis. Periapical radiography images are taken at different times to assess alveolar bone variation before and after root canal treatment, and furthermore to judge whether the treatment was successful. Current clinical assessment of apical tissue recovery relies only on the dentist's personal experience; it is difficult to achieve standard, objective interpretations because they depend on the dentist's or radiologist's background and knowledge. If periapical radiography images taken at different times could be registered well, the endodontic treatment could be evaluated objectively. In image registration, it is necessary to assign representative control points to the transformation model for good registration performance. However, detecting representative control points (feature points) on periapical radiography images is generally very difficult: regardless of which traditional detection methods are used, sufficient feature points may not be detected due to the low-contrast characteristics of the x-ray image. The Barnard detector is an algorithm for feature point detection based on gray-scale value gradients, which can obtain sufficient feature points even when gray-scale contrast is not pronounced. However, the Barnard detector tends to detect too many feature points, and they tend to be too clustered. This study uses the local extrema of clustered feature points and a suppression radius to overcome this problem, and compares different feature point detection methods. In preliminary results, the feature points detected by the proposed method could serve as representative control points.
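The suppression-radius idea can be illustrated generically. The sketch below is not the Barnard detector itself; it scores pixels by gradient magnitude and greedily keeps local maxima while rejecting neighbors within a given radius, which is the thinning step that counters over-clustered detections.

```python
import numpy as np

# Hedged sketch: gradient-magnitude feature scoring with radius-based
# non-maximum suppression, on a synthetic low-contrast image.
def detect_features(image, num_points, radius):
    gy, gx = np.gradient(image.astype(float))
    score = np.hypot(gx, gy)
    points = []
    # Visit pixels in order of decreasing gradient magnitude.
    for idx in np.argsort(score, axis=None)[::-1]:
        y, x = divmod(int(idx), image.shape[1])
        # Keep only points farther than `radius` from all accepted points.
        if all((y - py) ** 2 + (x - px) ** 2 > radius ** 2 for py, px in points):
            points.append((y, x))
            if len(points) == num_points:
                break
    return points

# Synthetic 16x16 image with a single faint vertical step edge.
img = np.zeros((16, 16))
img[:, 8:] = 10.0
print(detect_features(img, 4, 3.0))
```

All detections land on the edge columns but are forced apart by the radius check, giving spatially spread control points rather than one dense cluster.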

Keywords: feature detection, Barnard detector, registration, periapical radiography image, endodontic treatment

Procedia PDF Downloads 442
12980 The Population Death Model and Influencing Factors from the Data of The "Sixth Census": Zhangwan District Case Study

Authors: Zhou Shangcheng, Yi Sicen

Abstract:

Objective: To understand the mortality patterns of Zhangwan District in 2010 and provide a basis for the development of scientific and rational health policy. Methods: Data were collected from the Sixth Census of Zhangwan District and the disease surveillance system. The statistical analysis included differences in mortality by age, gender, region and time, and the related factors. Methods developed for the Global Burden of Disease (GBD) Study by the World Bank and the World Health Organization (WHO) were adapted and applied to Zhangwan District population health data. The DALY rate per 1,000 was calculated for the various causes of death. SPSS 16 was used for the statistical analysis. Results: The crude mortality rate of Zhangwan District was 6.03‰, with a significant difference between the male and female populations (7.37‰ and 4.68‰, respectively). Life expectancy at birth in Zhangwan District in 2010 was 78.40 years (male 75.93, female 81.03). The five leading causes of YLL, in descending order, were: cardiovascular diseases (42.63 DALY/1,000), malignant neoplasms (23.73 DALY/1,000), unintentional injuries (5.84 DALY/1,000), respiratory diseases (5.43 DALY/1,000) and respiratory infections (2.44 DALY/1,000). In addition, marital status and educational level were related to mortality to a certain extent. Conclusion: Zhangwan District, at the city level, has a comparatively low mortality level. The mortality of the total population of Zhangwan District shows a downward trend, and life expectancy is rising.
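The rate definitions behind these figures are simple per-capita ratios. The population and death counts below are hypothetical placeholders (the census microdata is not reproduced here); only the formulas are the point.

```python
# Hedged sketch of the rate arithmetic, with made-up counts.
def crude_mortality_per_mille(deaths, population):
    """Crude mortality rate in per-mille (deaths per 1,000 population)."""
    return deaths / population * 1000.0

def yll_rate_per_1000(yll, population):
    """Years of Life Lost per 1,000 population, as in GBD-style rankings."""
    return yll / population * 1000.0

population = 350_000   # hypothetical district population
deaths = 2_111         # hypothetical annual deaths
print(round(crude_mortality_per_mille(deaths, population), 2))  # ≈ 6.03
print(round(yll_rate_per_1000(14_920, population), 2))  # hypothetical YLL input
```

A full DALY additionally adds Years Lived with Disability (YLD) to the YLL term; the abstract's cause-of-death ranking uses the mortality-driven YLL component.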

Keywords: sixth census, Zhangwan district, death level differences, influencing factors, cause of death

Procedia PDF Downloads 270
12979 Microstructure Evolution and Pre-transformation Microstructure Reconstruction in Ti-6Al-4V Alloy

Authors: Shreyash Hadke, Manendra Singh Parihar, Rajesh Khatirkar

Abstract:

In the present investigation, the variation in the microstructure with changes in the heat treatment conditions, i.e., temperature and time, was observed. Ti-6Al-4V alloy was subjected to solution annealing treatments in the β (1066 °C) and α+β (930 °C and 850 °C) phase fields, followed by quenching, air cooling and furnace cooling to room temperature, respectively. The effect of solution annealing and cooling on the microstructure was studied using optical microscopy (OM), scanning electron microscopy (SEM), electron backscattered diffraction (EBSD) and x-ray diffraction (XRD). The chemical composition of the β phase for the different conditions was determined with the help of an energy dispersive spectrometer (EDS) attached to the SEM. Furnace cooling resulted in the development of a coarser (α+β) structure, while air cooling resulted in a much finer structure with Widmanstätten morphology of α at the grain boundaries. Quenching from the solution annealing temperature formed α' martensite, its proportion depending on the temperature in the β phase field. It is well known that the β to α transformation follows the Burgers orientation relationship (OR). In order to reconstruct the microstructure of the parent β phase, a MATLAB code was written using the neighbor-to-neighbor, triplet and Tari methods. The code was tested on the annealed samples (1066 °C solution annealing temperature followed by furnace cooling to room temperature). The parent phase data thus generated was then plotted using the TSL-OIM software. The reconstruction results of the above methods were compared and analyzed. Tari's approach (a clustering approach) gave better results than the neighbor-to-neighbor and triplet methods, but the time taken by the triplet method was the least of the three.

Keywords: Ti-6Al-4V alloy, microstructure, electron backscattered diffraction, parent phase reconstruction

Procedia PDF Downloads 446
12978 Effects of Centrifugation, Encapsulation Method and Different Coating Materials on the Total Antioxidant Activity of the Microcapsules of Powdered Cherry Laurels

Authors: B. Cilek Tatar, G. Sumnu, M. Oztop, E. Ayaz

Abstract:

Encapsulation protects sensitive food ingredients against heat, oxygen, moisture and pH until they are released into the system. It can also mask the unwanted taste of nutrients added to foods for fortification purposes. Cherry laurels (Prunus laurocerasus) contain phenolic compounds which decrease the proneness to several chronic diseases, such as certain cancers and cardiovascular diseases. The objective of this research was to study the effects of centrifugation, different coating materials and homogenization methods on the microencapsulation of powders obtained from cherry laurels. In this study, maltodextrin and a maltodextrin:whey protein mixture with a ratio of 1:3 (w/w) were chosen as coating materials. The total solid content of the coating materials was kept constant at 10% (w/w). Capsules were obtained from powders of freeze-dried cherry laurels through an encapsulation process using a silent crusher homogenizer or microfluidization. The freeze-dried cherry laurels were the core material, and the core-to-coating ratio was chosen as 1:10 by weight. To homogenize the mixture, a high-speed homogenizer was used at 4000 rpm for 5 min. Then, the silent crusher or microfluidizer was used to complete the encapsulation process: the mixtures were treated either by silent crusher for 1 min at 75000 rpm or by microfluidizer at 50 MPa for 3 passes. Freeze drying for 48 hours was applied to the emulsions to obtain capsules in powder form. After these steps, the dry capsules were ground manually into a fine powder. The microcapsules were analyzed for total antioxidant activity with the DPPH (1,1-diphenyl-2-picrylhydrazyl) radical scavenging method. Prior to high-speed homogenization, the samples were centrifuged (4000 rpm, 1 min). Centrifugation was found to have a positive effect on the total antioxidant activity of the capsules. Microcapsules treated by microfluidizer were found to have higher total antioxidant activities than those treated by silent crusher.
Increasing the whey protein concentration in the coating material (using the maltodextrin:whey protein 1:3 mixture) had a positive effect on total antioxidant activity for both the silent crusher and microfluidization methods. Therefore, capsules prepared by microfluidization of centrifuged mixtures can be selected as the best conditions for the encapsulation of cherry laurel powder when judged by total antioxidant activity. This study shows that capsules prepared by these methods can be recommended for incorporation into foods to enhance their functionality by increasing antioxidant activity.

Keywords: antioxidant activity, cherry laurel, microencapsulation, microfluidization

Procedia PDF Downloads 294
12977 The Use of Information and Communication Technology within and between Emergency Medical Teams during a Disaster: A Qualitative Study

Authors: Badryah Alshehri, Kevin Gormley, Gillian Prue, Karen McCutcheon

Abstract:

In a disaster event, sharing patient information between pre-hospital Emergency Medical Services (EMS) and hospital Emergency Departments (ED) is a complex process during which important information may be altered or lost due to poor communication. The aim of this study was to critically discuss the current evidence base on communication between pre-hospital EMS and ED professionals through the use of Information and Communication Technology (ICT). The study followed a systematic approach: six electronic databases (CINAHL, Medline, Embase, PubMed, Web of Science, and IEEE Xplore Digital Library) were comprehensively searched in January 2018, and a second search was completed in April 2020 to capture more recent publications. The study selection process was undertaken independently by the study authors. Both qualitative and quantitative studies were chosen that focused on factors positively or negatively associated with coordinated communication between pre-hospital EMS and ED teams in a disaster event. These studies were assessed for quality, and the data were analyzed according to the key screening themes which emerged from the literature search. Twenty-two studies were included: eleven employed quantitative methods, seven used qualitative methods, and four used mixed methods. Four themes emerged on communication between EMTs (pre-hospital EMS and ED staff) in a disaster event using ICT. (1) Disaster preparedness plans and coordination. Disaster plans are in place in hospitals, and in some cases there are interagency agreements with pre-hospital services and relevant stakeholders; however, the plans highlighted in these studies lacked information regarding coordinated communications within and between the pre-hospital and hospital settings. (2) Communication systems used in disasters. Although various communication systems are used within and between hospitals and pre-hospital services, technical issues have hampered communication between teams during disasters. (3) Integrated information management systems. This theme suggested the need for an integrated health information system that can help pre-hospital and hospital staff record patient data and ensure the data are shared. (4) Disaster training and drills. While some studies analyzed disaster drills and training, the majority focused on hospital departments other than EMTs; these studies suggest the need for simulation-based disaster training and drills that include EMTs. This review demonstrates that considerable gaps remain in the understanding of communication between EMS and ED hospital staff during disaster response. Although different types of ICT are used, various issues remain which affect coordinated communication among the relevant professionals.

Keywords: emergency medical teams, communication, information and communication technologies, disaster

Procedia PDF Downloads 126
12976 Fractal Nature of Granular Mixtures of Different Concretes Formulated with Different Methods of Formulation

Authors: Fatima Achouri, Kaddour Chouicha, Abdelwahab Khatir

Abstract:

It is clear that concrete of quality must be made with selected materials chosen in optimum proportions so that, after placement, a minimum of voids remains in the material produced. The different formulation methods in use are based, for the most part, on a granular curve that describes an 'optimal granularity'. Many authors have engaged in fundamental research on granular arrangements. Comparison of mathematical models reproducing these granular arrangements with experimental measurements of compactness verifies that minimum porosity P over a given granular extent follows a power law. The best compactness in a finite medium is thus obtained with power laws, such as those of Furnas, Fuller or Talbot, each preferring a particular exponent between 0.20 and 0.50. These considerations converge on the assumption that Caquot's optimal granularity is approximated by a power law. By analogy, it can then be analyzed as a fractal-type granular structure, since the internal-similarity properties that characterize fractal objects are likewise expressed by a power law. Optimized mixtures may be described as a series of granular classes successively filling the container according to a regular hierarchical distribution, which would give, at different scales and through cascading effects, the same structure to the mix. This model is likely appropriate for the entire extent of the size distribution of the components, from correctly deflocculated cement particles (and silica fume) of micrometric dimensions to chippings of sometimes several tens of millimeters. As part of this research, the aim is to illustrate the application of fractal analysis to characterize optimized granular concrete mixtures through a so-called fractal dimension. Different concretes were studied, and we show that their granular mixtures have a fractal structure regardless of the method of formulation or the type of concrete.
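The power-law grading curves mentioned above have the form P(d) = 100 · (d/D)ⁿ, where P is the cumulative percent passing sieve size d, D is the maximum particle size, and the exponent n is the setting each method prefers between 0.20 and 0.50 (n = 0.5 gives the classic Fuller curve). A minimal sketch with an example aggregate size:

```python
# Sketch of a Fuller/Talbot-type grading curve: cumulative percent passing
# follows a power law of the sieve size, the self-similar (fractal-like)
# structure the abstract analyzes.
def percent_passing(d, d_max, n=0.5):
    """P(d) = 100 * (d / d_max)^n, valid for 0 < d <= d_max."""
    return 100.0 * (d / d_max) ** n

d_max = 20.0  # mm, example maximum aggregate size
for d in (0.08, 0.5, 5.0, 20.0):
    print(f"{d:6.2f} mm -> {percent_passing(d, d_max):5.1f} % passing")
```

Because the same power law relates quantity to size at every scale, the grading exponent n plays the role of a fractal dimension for the packing.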

Keywords: concrete formulation, fractal character, granular packing, method of formulation

Procedia PDF Downloads 259
12975 Text Mining of Veterinary Forums for Epidemiological Surveillance Supplementation

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Web scraping and text mining are popular computer science methods deployed by public health researchers to augment traditional epidemiological surveillance. However, within veterinary disease surveillance, such techniques are still in the early stages of development and have not yet been fully utilised. This study presents an exploration of the utility of incorporating internet-based data to better understand the smallholder farming communities within Scotland, using online text extraction and the subsequent mining of this data. Web scraping of livestock fora was conducted in conjunction with text mining of the data in search of common themes, words, and topics found within the text. Results from bi-grams and topic modelling uncover four main topics of interest within the data pertaining to aspects of livestock husbandry: feeding, breeding, slaughter, and disposal. These topics were found in both the poultry and pig sub-forums. Topic modelling appears to be a useful method of unsupervised classification for this form of data, as it has produced clusters that relate to biosecurity and animal welfare. Internet data can be a very effective tool in aiding traditional veterinary surveillance methods, but the requirement for human validation of the data is crucial. This opens avenues of research via the incorporation of other dynamic social media data, namely Twitter and Facebook/Meta, in addition to time series analysis to highlight temporal patterns.
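The bi-gram step mentioned above can be sketched on toy forum posts (the real pipeline would add tokenization, stop-word removal, and topic modelling, e.g. LDA via gensim or scikit-learn; the posts below are invented examples, not scraped data).

```python
from collections import Counter

# Hedged sketch: count adjacent word pairs (bi-grams) across scraped posts to
# surface common husbandry themes.
def bigram_counts(posts):
    counts = Counter()
    for post in posts:
        tokens = post.lower().split()
        counts.update(zip(tokens, tokens[1:]))  # adjacent word pairs
    return counts

posts = [
    "advice on feeding weaners before slaughter",
    "feeding weaners kitchen scraps is illegal",
]
print(bigram_counts(posts).most_common(1))  # [(('feeding', 'weaners'), 2)]
```

Frequent bi-grams like these seed the vocabulary that topic models then cluster into the feeding, breeding, slaughter, and disposal themes.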

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, smallholding, social media, web scraping, sentiment analysis, geolocation, text mining, NLP

Procedia PDF Downloads 99