Search results for: quantum chemical methods
16874 In the Primary Education, the Classroom Teacher's Procedure of Coping with Stress, the Health of Psyche and the Direction of Check Point
Authors: Caglayan Pinar Demirtas, Mustafa Koc
Abstract:
Objective: This study was carried out in order to find out the methods used by primary school teachers to cope with stress, their psychological health, and their direction of controlling focus. The study was carried out by using the ‘school survey’ and ‘society survey’ methods. Method: The study included primary school teachers. The study group was made up of 1066 people, 511 women and 555 men, who voluntarily agreed to complete the data collection instruments: ‘the Scale for Attitude of Overcoming Stress’ (SBTE/SAOS), ‘Rotter’s Scale for the Focus of Inner-Outer Control’ (RİDKOÖ/RSFIOC), and ‘the Symptom Checking List’ (SCL-90). The data was collected by using these three scales together with a personal information form developed by the researcher. The SPSS for Windows package program was used. Result: The age variable is a factor in interpersonal sensitivity, depression, anxiety, and hostility symptoms, but it is not a factor in the other symptoms. The gender variable is a factor in the emotional practical escaping overcoming method but not in the other overcoming methods; namely, it has been found that women use the emotional practical escaping overcoming method more than men. Marital status is a factor in methods of overcoming stress such as trusting in religion, emotional practical escaping and biochemical escaping, while it is not a factor in the other methods. Namely, it has been found that married teachers use the trusting in religion method and the emotional practical escaping method more than single ones, while single teachers generally use the biochemical escaping method. In primary school teachers’ direction of controlling focus, the gender variable is a factor: it has been found that women are more inner controlled while men are more outer controlled.
The variable, time of service, is a factor in the direction of controlling focus; that is, teachers with 1-5 years of service are more inner controlled compared with teachers with 16-20 years of service. The variable, age, is also a factor in the direction of controlling focus; that is, teachers in the 26-30 age group are more outer controlled compared with the other age groups, and again teachers in the 26-30 age group are more inner controlled when compared with the other age groups. The direction of controlling focus is a factor in primary school teachers’ psychological health: being outer controlled is a factor, but being inner controlled is not. Among the methods used by primary school teachers to cope with stress, trusting in religion, active planning and biochemical escaping act as factors in the direction of controlling focus, but the others do not. Namely, it has been found that outer controlled teachers prefer the methods of trusting in religion and active planning, while the inner controlled ones prefer biochemical escaping.
Keywords: coping with, controlling focus, psychological health, stress
Procedia PDF Downloads 351
16873 2D Convolutional Networks for Automatic Segmentation of Knee Cartilage in 3D MRI
Authors: Ananya Ananya, Karthik Rao
Abstract:
Accurate segmentation of knee cartilage in 3-D magnetic resonance (MR) images for quantitative assessment of volume is crucial for studying and diagnosing osteoarthritis (OA) of the knee, one of the major causes of disability in elderly people. Radiologists generally perform this task in a slice-by-slice manner, taking 15-20 minutes per 3D image, which leads to high inter- and intra-observer variability. Hence automatic methods for knee cartilage segmentation are desirable and are an active field of research. This paper presents the design and experimental evaluation of fully automated, 2D convolutional neural network based methods for knee cartilage segmentation in 3D MRI. The architectures are validated on 40 test images and 60 training images from the SKI10 dataset. The proposed methods segment 2D slices one by one, which are then combined to give the segmentation of the whole 3D image. The proposed methods are modified versions of U-net and dilated convolutions, consisting of a single step that segments the given image into 5 labels: background, femoral cartilage, tibial cartilage, femoral bone and tibial bone; the cartilages being the primary components of interest. U-net consists of a contracting path and an expanding path, which capture context and localization respectively. Dilated convolutions lead to an exponential expansion of the receptive field with only a linear increase in the number of parameters. A combination of modified U-net and dilated convolutions has also been explored. These architectures segment one 3D image in 8-10 seconds, giving average volumetric Dice Score Coefficients (DSC) of 0.950-0.962 for femoral cartilage and 0.951-0.966 for tibial cartilage, with manual segmentation as the reference.
Keywords: convolutional neural networks, dilated convolutions, 3 dimensional, fully automated, knee cartilage, MRI, segmentation, U-net
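The abstract's claim that dilated convolutions expand the receptive field exponentially while the parameter count grows only linearly can be checked with a short sketch (the 3x3 kernel and doubling dilation rates are illustrative assumptions, not the exact configuration used in the paper):

```python
def receptive_field(kernel_size, dilations):
    """Receptive field of a stack of dilated convolutions (stride 1)."""
    rf = 1
    for d in dilations:
        rf += (kernel_size - 1) * d  # each layer adds (k-1)*dilation pixels
    return rf

def parameter_count(kernel_size, channels, n_layers):
    """Conv parameters grow linearly with depth (fixed channel count, no bias)."""
    return n_layers * channels * channels * kernel_size * kernel_size

# Doubling dilations 1, 2, 4, 8: receptive field grows exponentially with depth
dilations = [1, 2, 4, 8]
print(receptive_field(3, dilations))   # 31
print(parameter_count(3, 32, 4))       # 36864
```

With dilations 1, 2, 4, 8 the receptive field reaches 31 pixels from four 3x3 layers, whereas a stack of undilated 3x3 layers would need fifteen layers for the same coverage.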
Procedia PDF Downloads 261
16872 Assessment of the Properties of Microcapsules with Different Polymeric Shells Containing a Reactive Agent for their Suitability in Thermoplastic Self-healing Materials
Authors: Małgorzata Golonka, Jadwiga Laska
Abstract:
Self-healing polymers are one of the most investigated groups of smart materials. As materials engineering has recently focused on the design, production and research of modern materials and future technologies, researchers are looking for innovations in structural, construction and coating materials. Based on the available scientific articles, it can be concluded that most of the research focuses on the self-healing of cement, concrete, asphalt and anticorrosion resin coatings. In our study, a method of obtaining and testing the properties of several types of microcapsules for use in self-healing polymer materials was developed. A method to obtain microcapsules exhibiting various mechanical properties, especially compressive strength, was developed. The effect was achieved by using various polymer materials to build the shell: urea-formaldehyde resin (UFR), melamine-formaldehyde resin (MFR), and melamine-urea-formaldehyde resin (MUFR). Dicyclopentadiene (DCPD) was used as the core material due to the possibility of its polymerization by the ring-opening olefin metathesis (ROMP) mechanism in the presence of a solid Grubbs catalyst, which shows relatively high chemical and thermal stability. The ROMP of dicyclopentadiene leads to a polymer with high impact strength, high thermal resistance, good adhesion to other materials and good chemical and environmental resistance, so it is potentially a very promising candidate for the self-healing of materials. The capsules were obtained by condensation polymerization of formaldehyde with urea, melamine, or urea and melamine together, in situ in water dispersion, with different molar ratios of formaldehyde, urea and melamine. The fineness of the organic phase dispersed in water, and consequently the size of the microcapsules, was regulated by the stirring speed. In all cases, the aim was to establish synthesis conditions that yield capsules with appropriate mechanical strength.
The microcapsules were characterized by determining the diameters and their distribution and measuring the shell thickness using digital optical microscopy and scanning electron microscopy, as well as confirming the presence of the active substance in the core by FTIR and SEM. Compression tests were performed to determine the mechanical strength of the microcapsules. The highest repeatability of microcapsule properties was obtained for the UFR resin, while the MFR resin had the best mechanical properties. The encapsulation efficiency of MFR was, however, much lower compared to UFR. Therefore, capsules with a MUFR shell may be the optimal solution. The chemical reaction between the active substance present in the capsule core and the catalyst placed outside the capsules was confirmed by FTIR spectroscopy. The obtained autonomous repair systems (microcapsules + catalyst) were introduced into polyethylene in the extrusion process and tested for the self-repair of the material.
Keywords: autonomic self-healing system, dicyclopentadiene, melamine-urea-formaldehyde resin, microcapsules, thermoplastic materials
Procedia PDF Downloads 45
16871 Solubility Measurements in the Context of Nanoregulation
Authors: Ratna Tantra
Abstract:
From a risk assessment point of view, solubility is a property that has been identified as important. If a nanomaterial is completely soluble, then its disposal can be treated much in the same way as ‘ordinary’ chemicals, which subsequently simplifies testing and characterization regimes. The measurement of solubility has been highlighted as important in a pan-European project, Framework Programme (FP) 7 NANoREG. Some of the project outputs surrounding this topic will be presented here, in two parts. First, a review of existing methods capable of measuring nanomaterial solubility will be discussed. Second, a case study will be presented based on using colorimetry methods to quantify dissolved zinc from ZnO nanomaterial upon exposure to digestive juices. The main findings are as follows: a) there is no universal method for nanomaterial solubility testing; the method chosen will depend on sample type and the nano-specific application/scenario. b) The colorimetry results show a positive correlation between particle concentration and the amount of [Zn2+] released, as expected. c) The results indicate complete dissolution of the ZnO nanomaterial as a result of the digestion protocol, but with only a fraction existing as free ions. Finally, what differentiates the FP7 NANoREG project from other projects is the need for participating research laboratories to follow a set of defined protocols, necessary to establish quality control and assurance. The methods and results associated with the mandatory testing that was carried out by all partners in NANoREG will be discussed.
Keywords: nanomaterials, nanotoxicology, solubility, zinc oxide
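The colorimetric quantification of dissolved zinc described in (b) rests on a linear calibration curve: absorbances of standards of known concentration are fitted, then the line is inverted for unknowns. A minimal sketch with hypothetical standards and absorbances (the actual NANoREG reagents and protocol are not reproduced here):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical Zn2+ standards (mg/L) and absorbances in the linear (Beer-Lambert) range
conc = [0.0, 0.5, 1.0, 2.0, 4.0]
absorbance = [0.01, 0.11, 0.21, 0.41, 0.81]
m, b = linear_fit(conc, absorbance)

def zn_from_absorbance(a):
    """Invert the calibration line to estimate dissolved Zn2+ (mg/L)."""
    return (a - b) / m

print(round(zn_from_absorbance(0.31), 2))  # 1.5
```

A sample absorbance of 0.31 maps back to about 1.5 mg/L of dissolved zinc under this hypothetical calibration.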
Procedia PDF Downloads 335
16870 Grid Computing for Multi-Objective Optimization Problems
Authors: Aouaouche Elmaouhab, Hassina Beggar
Abstract:
Solving multi-objective discrete optimization applications has always been limited by the resources of a single machine: by computing power or by memory, most often both. To speed up the calculations, grid computing represents a primary solution for the treatment of these applications through the parallelization of the resolution methods. In this work, we are interested in the study of some methods for solving the multi-objective integer linear programming problem based on Branch-and-Bound, and in the study of grid computing technology. This study allowed us to propose an implementation of the method of Abbas et al. on the grid, reducing the execution time. To support our contribution, the main results are presented.
Keywords: multi-objective optimization, integer linear programming, grid computing, parallel computing
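A core subroutine of any multi-objective Branch-and-Bound is deciding which objective vectors are non-dominated (Pareto-optimal). A minimal filter sketch, assuming minimization of all objectives (the sample points are illustrative, not the Abbas et al. algorithm itself):

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly better in one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Keep only the non-dominated objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

solutions = [(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]
print(pareto_front(solutions))  # [(1, 5), (2, 3), (4, 1)]
```

Here (3, 4) is dominated by (2, 3) and (5, 5) by (1, 5), so only three vectors survive; on a grid, each worker can apply this filter to its subtree and the fronts are merged afterwards.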
Procedia PDF Downloads 486
16869 Phytochemical Composition and Characterization of Bioactive Compounds of the Green Seaweed Ulva lactuca: A Phytotherapeutic Approach
Authors: Mariame Taibi, Marouane Aouiji, Rachid Bengueddour
Abstract:
The Moroccan coastline is particularly rich in algae and constitutes a reserve of species with considerable economic, social and ecological potential. This work focuses on the research and characterization of algal bioactive compounds that can be used in pharmacology or phytopathology. The biochemical composition of the green alga Ulva lactuca (Ulvophyceae) was studied by determining the content of moisture, ash, phenols, flavonoids, total tannins, and chlorophyll. Seven solvents, distilled water, methanol, ethyl acetate, chloroform, benzene, petroleum ether, and hexane, were tested for their effectiveness in recovering chemical compounds. The functional groups and bioactive chemical compounds were identified by FT-IR and GC-MS. The moisture content of the alga was 77%, while the ash content was 15%. Phenol content differed from one solvent to another, while chlorophyll a, chlorophyll b, and total chlorophyll were determined at 14%, 9.52%, and 25%, respectively. Carotenoids were present in a considerable amount (8.17%). The experimental results show that methanol is the most effective solvent for recovering bioactive compounds, followed by water. Moreover, the green alga Ulva lactuca is characterized by a high level of total polyphenols (45±3.24 mg GAE/gDM) and average levels of total tannins and flavonoids (22.52±8.23 mg CE/gDM and 15.49±0.064 mg QE/gDM, respectively). The results of Fourier transform infrared spectroscopy (FT-IR) confirmed the presence of alcohol/phenol and amide functions in Ulva lactuca. The GC-MS analysis gave precisely the compounds contained in the various extracts, such as phenolic compounds, fatty acids, terpenoids, alcohols, alkanes, hydrocarbons, and steroids. All these results represent only a first step in the search for biologically active natural substances from seaweed. Additional tests are envisaged to confirm the bioactivity of the seaweed.
Keywords: algae, Ulva lactuca, phenolic compounds, FTIR, GC-MS
Procedia PDF Downloads 108
16868 Hawking Radiation of the Grumiller Black Hole
Authors: Sherwan Kher Alden Yakub Alsofy
Abstract:
In this paper, we consider the relativistic Hamilton-Jacobi (HJ) equation and study the Hawking radiation (HR) of scalar particles from the uncharged Grumiller black hole (GBH), which is amenable to testing in astrophysics. The GBH is also known as the Rindler-modified Schwarzschild BH. Our aim is not only to investigate the effect of the Rindler parameter A on the Hawking temperature (TH), but also to examine whether there is any discrepancy between the computed horizon temperature and the standard TH. For this purpose, in addition to its naive coordinate system, we study three regular coordinate systems: Painlevé-Gullstrand (PG), ingoing Eddington-Finkelstein (IEF) and Kruskal-Szekeres (KS) coordinates. In all coordinate systems, we calculate the tunneling probabilities of incoming and outgoing scalar particles from the event horizon by using the HJ equation. It is shown in detail that the considered HJ method recovers the conventional TH in all these coordinate systems without giving rise to the famous factor-2 problem. Furthermore, in the PG coordinates, Parikh-Wilczek's tunneling (PWT) method is employed in order to show how one can incorporate quantum gravity (QG) corrections into the semiclassical tunneling rate by including the effects of self-gravitation and back reaction. We then show how these corrections yield a modification of TH.
Keywords: ingoing Eddington-Finkelstein coordinates, Parikh-Wilczek tunneling, Hamilton-Jacobi equation
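As a numerical reference point for the "standard TH" that the coordinate-system calculations must reproduce, the Schwarzschild limit (vanishing Rindler parameter) gives TH = ħc³/(8πGMkB). A quick sketch (CODATA constants; the solar-mass example is illustrative):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J s
C = 2.99792458e8        # speed of light, m/s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
K_B = 1.380649e-23      # Boltzmann constant, J/K

def hawking_temperature(mass_kg):
    """Standard Hawking temperature of a Schwarzschild black hole (Rindler parameter A = 0)."""
    return HBAR * C ** 3 / (8 * math.pi * G * mass_kg * K_B)

solar_mass = 1.98892e30
print(f"{hawking_temperature(solar_mass):.3e} K")  # ~6.2e-8 K
```

The inverse proportionality to mass is why astrophysical-mass black holes radiate at temperatures far below the cosmic microwave background.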
Procedia PDF Downloads 615
16867 Research on the Online Learning Activities Design and Students’ Experience Based on APT Model
Authors: Wang Yanli, Cheng Yun, Yang Jiarui
Abstract:
Due to the separation of teachers and students, online teaching during the COVID-19 epidemic faced many problems, such as low enthusiasm of students, distraction, a weak learning atmosphere, and insufficient interaction between teachers and students. This study designed elaborate online learning activities for the course 'Research Methods of Educational Science' based on the APT model, from the three aspects of multiple assessment methods, a variety of teaching methods, and the online learning environment and technology. Students' online learning experience was examined in terms of their perception of the online course, their perception of the online learning environment, and their satisfaction after the course's implementation. The results showed that students had a positive overall evaluation of the online course, a high degree of engagement in learning, positive acceptance of online learning, and high satisfaction with it, although they held a relatively neutral attitude toward online learning in general. Some dimensions of the online learning experience were found to have a positive influence on students' satisfaction with online learning. We suggest designing online courses well, selecting proper learning platforms, and conducting blended learning to improve students' learning experience. This study has both theoretical and practical significance for the design, implementation, effect feedback, and sustainable development of online teaching in the post-epidemic era.
Keywords: APT model, online learning, online learning activities, learning experience
Procedia PDF Downloads 136
16866 Voices and Pictures from an Online Course and a Face to Face Course
Authors: Eti Gilad, Shosh Millet
Abstract:
In light of technological development and its introduction into the field of education, an online course was designed in parallel to the 'conventional' course for teaching 'Qualitative Research Methods'. This course aimed to characterize learning-teaching processes in a 'Qualitative Research Methods' course studied in two different frameworks. Moreover, its objective was to explore the difference between the culture of a physical learning environment and that of online learning. The research monitored four learner groups, a total of 72 students, for two years, two groups from the two course frameworks each year. The courses were obligatory for M.Ed. students at an academic college of education and were given by one female lecturer. The research was conducted with the qualitative method as a case study in order to attain insights about occurrences in the actual contexts and sites in which they transpire. The research tools were an open-ended questionnaire and reflections in the form of vignettes (meaningful short pictures) from all students, as well as an interview with the lecturer. The tools facilitated not only triangulation but also the collection of data consisting of voices and pictures of teaching and learning. The most prominent findings are differences between the two courses in the features of the learning environment culture for the acquisition of contents and qualitative research tools, manifested in teaching methods, illustration aids, the lecturer's profile and the students' profile.
Keywords: face to face course, online course, qualitative research, vignettes
Procedia PDF Downloads 418
16865 A Review of Lortie’s Schoolteacher
Authors: Tsai-Hsiu Lin
Abstract:
Dan C. Lortie’s Schoolteacher: A Sociological Study is one of the best works on the sociology of teaching since W. Waller’s classic study, and it is a book worthy of review. Following the tradition of the symbolic interactionists, Lortie studied the occupation of teaching. Using several methods to gather effective data, Lortie portrayed the ethos of the teaching profession; the work is therefore an important book on the teaching profession and teacher culture. Though outstanding, Lortie’s work is also flawed in that his perspectives and methodology were adopted largely from symbolic interactionism. First, Lortie analyzed many points regarding teacher culture; for example, he was interested in exploring “sentiment,” “cathexis,” and “ethos.” Thus, he was more a psychologist than a sociologist. Second, symbolic interactionism led him to examine teacher culture from a micro view, thereby missing its structural aspects; for example, he did not fully discuss the issue of gender, and he ignored the issue of race. Finally, following the qualitative sociological tradition, Lortie employed many qualitative methods to gather data but focused only on obtaining and presenting interview data. Moreover, the measurement methods he used were too simplistic to analyze the quantitative data fully.
Keywords: education reform, teacher culture, teaching profession, Lortie’s Schoolteacher
Procedia PDF Downloads 229
16864 Comparison of Sensitivity and Specificity of Pap Smear and Polymerase Chain Reaction Methods for Detection of Human Papillomavirus: A Review of Literature
Authors: M. Malekian, M. E. Heydari, M. Irani Estyar
Abstract:
Human papillomavirus (HPV) is one of the most common sexually transmitted infections and may lead to cervical cancer, of which it is the main cause. With early diagnosis and treatment in health care services, cervical cancer and its complications are considered preventable. This study aimed to compare the efficiency, sensitivity, and specificity of the Pap smear and the polymerase chain reaction (PCR) in detecting HPV. A literature search was performed in the Google Scholar, PubMed and SID databases using the keywords 'human papillomavirus', 'pap smear' and 'polymerase chain reaction' to identify studies comparing Pap smear and PCR methods for detection. No restrictions were applied. Ten studies were included in this review. All samples that were positive by Pap smear were also positive by PCR. However, there were positive samples detected by PCR that were negative by Pap smear, and in all studies many positive samples were missed by the Pap smear technique. Although the Pap smear had high specificity, PCR-based HPV detection was the more sensitive method, with the highest sensitivity. In order to improve the quality of detection and achieve the best possible results, PCR diagnostic methods are needed in addition to the Pap smear; given the high error rate of the Pap smear in detection, it should be combined with PCR techniques.
Keywords: human papillomavirus, cervical cancer, pap smear, polymerase chain reaction
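Sensitivity and specificity compare a test against a reference standard, here PCR: PCR-positive samples missed by the smear count as false negatives. A minimal sketch with hypothetical counts (not data from the reviewed studies):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical 2x2 table for Pap smear with PCR as the reference standard
sens, spec = sensitivity_specificity(tp=60, fn=40, tn=95, fp=5)
print(sens, spec)  # 0.6 0.95
```

This reproduces the review's qualitative pattern: a test can have high specificity (few false positives) while still missing a large share of truly positive samples.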
Procedia PDF Downloads 131
16863 On Unification of the Electromagnetic, Strong and Weak Interactions
Authors: Hassan Youssef Mohamed
Abstract:
In this paper, we show new wave equations, and by using these equations we conclude that the strong force and the weak force are not fundamental but are quantum effects of electromagnetism. This result differs entirely from the current scientific understanding of the strong and weak interactions. We therefore introduce three pieces of evidence for our theory. First, we prove the asymptotic freedom phenomenon in the strong force by using our model. Second, we derive the nuclear shell model as an approximation of our model. Third, we prove that the leptons do not participate in the strong interactions, and we prove the short ranges of the weak and strong interactions. Thus, our model is consistent with the current understanding of physics. Finally, we introduce the electron-positron model as the basic ingredient of protons, neutrons, and all matter, so we can study all particle interactions and nuclear interactions as many-body problems of electrons and positrons. We also prove the violation of parity conservation in the weak interaction as evidence for our theory, and we calculate the average binding energy per nucleon.
Keywords: new wave equations, the strong force, the grand unification theory, hydrogen atom, weak force, the nuclear shell model, the asymptotic freedom, electron-positron model, the violation of parity conservation, the binding energy
Procedia PDF Downloads 185
16862 Pricing European Options under Jump Diffusion Models with Fast L-stable Padé Scheme
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. Modeling option pricing with Black-Scholes models with jumps guarantees that market movement is taken into account. However, only numerical methods can solve such models, and not all numerical methods are efficient for them, because the payoffs are nonsmooth or have discontinuous derivatives at the exercise price. In this paper, the exponential time differencing (ETD) method is applied to solve the partial integro-differential equations arising in pricing European options under Merton’s and Kou’s jump-diffusion models. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction form of the Padé schemes is used to overcome the complexity of inverting polynomials of matrices. These two tools guarantee efficient and accurate numerical solutions. We construct a parallel, easy-to-implement version of the numerical scheme. Numerical experiments are given to show how fast and accurate our scheme is.
Keywords: integro-differential equations, L-stable methods, pricing European options, jump-diffusion model
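The ETD/Padé scheme itself is not reproduced here, but numerical solutions under Merton's model are commonly validated against its closed-form series, a Poisson-weighted sum of Black-Scholes prices. A sketch (the parameter values are illustrative, not those of the paper's experiments):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(s, k, t, r, sigma):
    """European call under Black-Scholes."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def merton_call(s, k, t, r, sigma, lam, mu_j, delta_j, n_terms=50):
    """European call under Merton's jump-diffusion: Poisson mixture of BS prices
    conditioned on the number of jumps n."""
    kappa = math.exp(mu_j + 0.5 * delta_j ** 2) - 1.0  # expected relative jump size
    lam_p = lam * (1.0 + kappa)
    price = 0.0
    for n in range(n_terms):
        sigma_n = math.sqrt(sigma ** 2 + n * delta_j ** 2 / t)
        r_n = r - lam * kappa + n * math.log(1.0 + kappa) / t
        weight = math.exp(-lam_p * t) * (lam_p * t) ** n / math.factorial(n)
        price += weight * black_scholes_call(s, k, t, r_n, sigma_n)
    return price

print(round(merton_call(100, 100, 1.0, 0.05, 0.2, lam=0.5, mu_j=-0.1, delta_j=0.15), 4))
```

With jump intensity lam = 0 the series collapses to the plain Black-Scholes price, which makes a convenient sanity check for any PIDE solver.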
Procedia PDF Downloads 151
16861 Liquid-Liquid Plug Flow Characteristics in Microchannel with T-Junction
Authors: Anna Yagodnitsyna, Alexander Kovalev, Artur Bilsky
Abstract:
The efficiency of certain technological processes in two-phase microfluidics, such as emulsion production, nanomaterial synthesis, nitration, and extraction processes, depends on the two-phase flow regimes in microchannels. For practical applications in chemistry and biochemistry, it is very important to predict the expected flow pattern for a large variety of fluids and channel geometries. In the case of immiscible liquids, plug flow is a typical and optimal regime for chemical reactions and needs to be predicted by empirical data or correlations. In this work, flow patterns of immiscible liquid-liquid flow in a rectangular microchannel with a T-junction are investigated. Three liquid-liquid flow systems are considered, viz. kerosene-water, paraffin oil-water and castor oil-paraffin oil. Different flow patterns, such as parallel flow, slug flow, plug flow, dispersed (droplet) flow, and rivulet flow, are observed for different velocity ratios. A new pattern of parallel flow with a steady wavy interface (serpentine flow) has been found. It is shown that flow pattern maps based on Weber numbers for different liquid-liquid systems do not match well. The Weber number multiplied by the Ohnesorge number is proposed as a parameter to generalize the flow maps. Flow maps based on this parameter superpose well for all liquid-liquid systems of this work and other experiments. Plug length and velocity are measured for the plug flow regime. When the dispersed liquid wets the channel walls, plug length cannot be predicted by known empirical correlations. By means of the particle tracking velocimetry technique, instantaneous velocity fields in the plug flow regime were measured. Flow circulation inside the plug was calculated using the velocity data, which can be useful for mass flux prediction in chemical reactions.
Keywords: flow patterns, hydrodynamics, liquid-liquid flow, microchannel
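The proposed generalizing parameter is simply the product of the Weber and Ohnesorge numbers. A sketch with hypothetical kerosene-water values in a 200 µm channel (the paper's exact channel dimensions and fluid properties are not reproduced):

```python
import math

def weber(rho, u, length, sigma):
    """Weber number: inertia relative to interfacial tension."""
    return rho * u ** 2 * length / sigma

def ohnesorge(mu, rho, sigma, length):
    """Ohnesorge number: viscous forces relative to inertia and interfacial tension."""
    return mu / math.sqrt(rho * sigma * length)

# Hypothetical values: water phase in a 200 um channel against kerosene
rho = 998.0       # density, kg/m^3
mu = 1.0e-3       # dynamic viscosity, Pa s
sigma_if = 0.04   # interfacial tension, N/m
u = 0.05          # superficial velocity, m/s
d = 200e-6        # channel scale, m

we = weber(rho, u, d, sigma_if)
oh = ohnesorge(mu, rho, sigma_if, d)
print(we * oh)  # ~1.4e-4, the proposed We*Oh map coordinate
```

Because Oh folds viscosity into the interfacial-tension scaling, We*Oh lets maps from fluid pairs of very different viscosity ratios collapse onto common regime boundaries.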
Procedia PDF Downloads 394
16860 A Review on Bone Grafting, Artificial Bone Substitutes and Bone Tissue Engineering
Authors: Kasun Gayashan Samarawickrama
Abstract:
Bone diseases, defects, and fractures are commonly seen in modern life. Although bone is a dynamic, regenerating living tissue that undergoes a natural healing process, it cannot recover from major injuries, diseases, and defects on its own. In order to overcome this, the bone grafting technique was introduced. The gold standard was the best method of bone grafting over the past decades; due to its limitations, alternative methods have been implemented. Beyond these, artificial bone substitutes and bone tissue engineering have become the emerging technology-driven methods for bone grafting. Many bone diseases and defects may be healed permanently with these promising techniques in the future.
Keywords: bone grafting, gold standard, bone substitutes, bone tissue engineering
Procedia PDF Downloads 299
16859 Fuzzy Expert Approach for Risk Mitigation on Functional Urban Areas Affected by Anthropogenic Ground Movements
Authors: Agnieszka A. Malinowska, R. Hejmanowski
Abstract:
A number of European cities are strongly affected by ground movements caused by anthropogenic activities or post-anthropogenic metamorphosis, mainly water pumping, current mining operations, the collapse of post-mining underground voids, or mining-induced earthquakes. These activities lead to large- and small-scale ground displacements and ground ruptures. The ground movements occurring in urban areas can considerably affect the stability and safety of structures and infrastructures. The complexity of the ground deformation phenomenon, in relation to the vulnerability of structures and infrastructures, leads to considerable constraints in assessing the threat to those objects. However, increasing access to free software and satellite data could pave the way for developing new methods and strategies for environmental risk mitigation and management. Open source geographical information systems (OS GIS) may support data integration, management, and risk analysis. Recently developed methods based on fuzzy logic and expert methods for building and infrastructure damage risk assessment could be integrated into OS GIS. These methods were verified by back analysis, proving their accuracy, and they can be supported by ground displacement observation. Based on freely available data from the European Space Agency and free software, ground deformation can be estimated. The main innovation presented in the paper is the application of open source software (OS GIS) for integrating the developed models and assessing the threat to urban areas. This approach is reinforced by analysis of ground movement based on free satellite data, which supports the verification of ground movement prediction models and enables mapping of ground deformation in urbanized areas. The developed models and methods have been implemented in an urban area endangered by underground mining activity.
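A fuzzy expert system of the kind described evaluates rules over membership functions of ground-deformation indicators. A minimal one-rule Mamdani-style sketch, assuming tilt and horizontal strain as inputs (the thresholds are illustrative, not the calibrated values of the verified models):

```python
def ramp(x, low, high):
    """Membership rising linearly from 0 at `low` to 1 at `high` (saturating)."""
    if x <= low:
        return 0.0
    if x >= high:
        return 1.0
    return (x - low) / (high - low)

def high_risk_degree(tilt, strain):
    """One Mamdani-style rule: IF tilt is high AND strain is high THEN risk is high.
    The fuzzy AND is modelled by min; inputs in mm/m, thresholds illustrative."""
    return min(ramp(tilt, 2.5, 10.0), ramp(strain, 1.5, 6.0))

print(high_risk_degree(5.0, 3.0))  # ~0.33, partial membership in 'high risk'
```

In a full system, several such rules fire in parallel and their outputs are aggregated and defuzzified; the resulting degree can be written per building into an OS GIS layer to produce vulnerability maps.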
Vulnerability maps supported by satellite ground movement observation would mitigate the hazards of land displacements in urban areas close to mines.
Keywords: fuzzy logic, open source geographic information systems (OS GIS), risk assessment in urbanized areas, satellite interferometry (InSAR)
Procedia PDF Downloads 159
16858 Evaluation of the Gas Exchange Characteristics of Selected Plant Species of Universiti Tun Hussein Onn Malaysia, UTHM
Authors: Yunusa Audu, Alona Cuevas Linatoc, Aisha Idris
Abstract:
The maximum carboxylation rate of Rubisco (Vcmax), the maximum electron transport rate (Jmax), the light compensation point (LCP), the light saturation point (LSP), maximum photosynthesis (Amax), and the apparent quantum yield (Aqy) are gas exchange characteristics derived from the carbon dioxide (CO2) and light response curves. These characteristics can be affected by the levels of CO2 and light received by the plant, and together they determine the photosynthetic capacity of the plant. The objective of the study is to evaluate the gas exchange characteristics of selected plant species of UTHM. Photosynthetic carbon dioxide (A/Ci) and light (A/Q) response curves were measured using a portable photosynthesis system (LI-COR). The results show that both A/Ci and A/Q curves increase as CO2 and light increase, but reach a point at which the curves become saturated. Spathodea campanulata had the highest Vcmax (52.14±0.005 µmol CO2 m-2 s-1), Jmax (104.461±0.011 µmol CO2 m-2 s-1) and Aqy (0.072±0.001 mol CO2 mol-1 photons). The highest LCP was observed in Rhaphis excelsa (69.60±0.067 µmol photons m-2 s-1), while the highest LSP was recorded for Costus spicatus (1576.69±0.173 µmol photons m-2 s-1). It was concluded that the plants need high light intensity and CO2 for their maximum assimilation rate.
Keywords: gas exchange, CO2, plants
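Parameters such as LCP, Amax and Aqy are extracted by fitting a light response model to the measured A/Q curve. A sketch using a rectangular hyperbola, one common choice of model (the parameter values are hypothetical, merely in the general range of the results above):

```python
def assimilation(q, amax, phi, rd):
    """Rectangular-hyperbola light response: gross photosynthesis minus dark respiration.
    q: irradiance (umol photons m-2 s-1); amax: light-saturated gross rate;
    phi: apparent quantum yield; rd: dark respiration."""
    return amax * phi * q / (phi * q + amax) - rd

def light_compensation_point(amax, phi, rd):
    """Irradiance at which net assimilation is zero, solving A(Q) = 0 analytically."""
    return rd * amax / (phi * (amax - rd))

# Hypothetical fitted parameters
amax, phi, rd = 12.0, 0.07, 1.0
lcp = light_compensation_point(amax, phi, rd)
print(round(lcp, 1))  # ~15.6 umol photons m-2 s-1
```

At irradiances below the LCP, respiration exceeds gross photosynthesis and the net CO2 exchange is negative, which is why shade-adapted species tend to show lower LCP values.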
Procedia PDF Downloads 14
16857 The Impact of Dust Storm Events on the Chemical and Toxicological Characteristics of Ambient Particulate Matter in Riyadh, Saudi Arabia
Authors: Abdulmalik Altuwayjiri, Milad Pirhadi, Mohammed Kalafy, Badr Alharbi, Constantinos Sioutas
Abstract:
In this study, we investigated the chemical and toxicological characteristics of PM10 in the metropolitan area of Riyadh, Saudi Arabia. PM10 samples were collected on quartz and Teflon filters during cold (December 2019–April 2020) and warm (May 2020–August 2020) seasons, including dust and non-dust events. The PM10 constituents were chemically analyzed for their metal, inorganic ion, and elemental and organic carbon (EC/OC) contents. Additionally, the PM10 oxidative potential was measured by means of the dithiothreitol (DTT) assay. Our findings revealed that the oxidative potential of the collected ambient PM10 samples was significantly higher than values measured in many urban areas worldwide. It was also higher during dust episodes than during non-dust events, mainly due to higher concentrations of metals during these events. We performed Pearson correlation analysis, principal component analysis (PCA), and multi-linear regression (MLR) to identify the most significant sources contributing to the toxicity of PM10. The MLR analyses indicated that the major pollution sources contributing to the oxidative potential of ambient PM10 were soil and resuspended dust emissions (identified by Al, K, Fe, and Li) (31%), followed by secondary organic aerosol (SOA) formation (traced by SO₄²⁻ and NH₄⁺) (20%), industrial activities (identified by Se and La) (19%), and traffic emissions (characterized by EC, Zn, and Cu) (17%). Results from this study underscore the impact of transported dust emissions on the oxidative potential of ambient PM10 in Riyadh and can help inform public health policies regarding the detrimental outcomes of exposure to PM10.
Keywords: ambient PM10, oxidative potential, source apportionment, Riyadh, dust episodes
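The MLR step regresses the measured DTT activity on the source tracers, so that each source's coefficient reflects its contribution to the oxidative potential. A hedged sketch of ordinary least squares via the normal equations, with invented tracer data rather than the Riyadh measurements:

```python
def mlr_fit(X, y):
    """Ordinary least squares via the normal equations (X'X)b = X'y.

    X: rows of tracer concentrations; an intercept column is prepended.
    Returns the coefficient vector [b0, b1, ..., bk].
    """
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    # Build the normal-equation system
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * k
    for r in range(k - 1, -1, -1):
        s = sum(A[r][c] * coef[c] for c in range(r + 1, k))
        coef[r] = (b[r] - s) / A[r][r]
    return coef

# Hypothetical DTT activity driven by a dust tracer and an EC tracer
X = [(1, 2), (2, 1), (3, 4), (4, 3), (5, 5), (1, 5)]
y = [1 + 2 * a + 3 * c for a, c in X]
print([round(v, 6) for v in mlr_fit(X, y)])  # [1.0, 2.0, 3.0]
```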
Procedia PDF Downloads 172
16856 An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The question that motivates this work is how many devote themselves to discovering something in the world of science, where much is discerned and revealed but, at the same time, much remains unknown. Methods: The building blocks of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable so that it remains constant during the run of the algorithm. According to the given key, the string is divided into several substrings, each of length k characters. The next step encodes each substring in the resulting list. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters; however, k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b+1, it returns to its initial value. The algorithm proceeds in the same way until the last substring in the list is traversed. Results: Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character isn't used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it remains an open question whether it outperforms the other methods in execution time and storage space.
Keywords: ciphering, deciphering, authentic, algorithm, polyalphabetic cipher, random key, methods comparison
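The procedure described above can be sketched in Python. This is our reading of the abstract, not the author's reference implementation: the x-padding rule is omitted, and the randomly drawn key is fixed afterwards so it stays constant during the run:

```python
import random
import string

ALPHABET = string.ascii_lowercase

def _shift(text, amount):
    """Caesar-shift lowercase letters; other characters pass through."""
    return "".join(
        ALPHABET[(ALPHABET.index(ch) + amount) % 26] if ch in ALPHABET else ch
        for ch in text
    )

def latin_djokovic(text, k, a, encode=True):
    """Split text into k-character substrings; Caesar-shift substring i
    by a key that starts at k, increments per substring, and resets to
    its initial value once it exceeds b+1 (with b = a+3)."""
    b = a + 3
    chunks = [text[i:i + k] for i in range(0, len(text), k)]
    out, shift = [], k
    for chunk in chunks:
        out.append(_shift(chunk, shift if encode else -shift))
        shift += 1
        if shift > b + 1:
            shift = k  # reset to the initial key value
    return "".join(out)

# The key is drawn at random from [a, b], then held constant
a = 3
k = random.randint(a, a + 3)
cipher = latin_djokovic("attackatdawn", k, a, encode=True)
plain = latin_djokovic(cipher, k, a, encode=False)
print(plain)  # attackatdawn
```

Because the ciphertext has the same length as the plaintext, the chunk boundaries and shift schedule line up exactly on deciphering, giving a clean round trip.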
Procedia PDF Downloads 103
16855 An Advanced Match-Up Scheduling Under Single Machine Breakdown
Abstract:
When a machine breakdown forces a Modified Flow Shop (MFS) out of the prescribed state, the proposed strategy reschedules part of the initial schedule to match up with the preschedule at some point. The objective is to create a new schedule that is consistent with the other production planning decisions, such as material flow, tooling, and purchasing, by utilizing the time-critical decision making concept. We propose a new rescheduling strategy and a match-up point determination procedure through a feedback mechanism to increase both schedule quality and stability. The proposed approach is compared with alternative reactive scheduling methods under different experimental settings.
Keywords: advanced critical task methods, modified flow shop (MFS), manufacturing, experiment, determination
Procedia PDF Downloads 405
16854 Heuristic Methods for the Capacitated Location-Allocation Problem with Stochastic Demand
Authors: Salinee Thumronglaohapun
Abstract:
The proper number and appropriate locations of service centers can save cost, raise revenue, and gain more satisfaction from customers. Establishing service centers is expensive, and centers are difficult to relocate. Over long-term planning periods, several factors may affect the service; one of the most critical is the uncertain demand of customers. The opened service centers need to be capable of serving customers and making a profit even though the demand changes from period to period. In this work, the capacitated location-allocation problem with stochastic demand is considered. A mathematical model is formulated to determine suitable locations of service centers and their allocation so as to maximize total profit over multiple planning periods. Two heuristic methods, a local search and a genetic algorithm, are used to solve this problem. For the local search, five different probabilities of choosing each type of move are applied. For the genetic algorithm, three different replacement strategies are considered. The results of applying each method to numerical examples are compared. Both methods reach the same best-found solution in most examples, but the genetic algorithm provides better solutions in some cases.
Keywords: location-allocation problem, stochastic demand, local search, genetic algorithm
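To illustrate the local-search idea on a toy, uncapacitated, single-period instance (invented costs and revenues, far simpler than the paper's stochastic model), a flip-move search over which centers to open might look like:

```python
def total_profit(open_sites, fixed_cost, revenue):
    """Profit = best achievable revenue per customer minus fixed costs.

    revenue[c][s]: profit from serving customer c out of site s.
    Each customer is served from its most profitable open site.
    """
    if not open_sites:
        return float("-inf")
    served = sum(max(row[s] for s in open_sites) for row in revenue)
    return served - sum(fixed_cost[s] for s in open_sites)

def local_search(fixed_cost, revenue):
    """Flip-move local search: open or close one site while it improves."""
    n = len(fixed_cost)
    current = {0}
    best = total_profit(current, fixed_cost, revenue)
    improved = True
    while improved:
        improved = False
        for s in range(n):
            cand = current ^ {s}  # flip site s open/closed
            p = total_profit(cand, fixed_cost, revenue)
            if p > best:
                current, best, improved = cand, p, True
    return current, best

# Invented instance: 3 candidate sites, 4 customers
fixed_cost = [10, 12, 30]
revenue = [[8, 2, 5], [7, 3, 5], [1, 9, 5], [2, 8, 5]]
sites, profit = local_search(fixed_cost, revenue)
print(sorted(sites), profit)  # [0, 1] 10
```

The paper's method adds capacity constraints, multiple periods, stochastic demand, and several move types chosen with different probabilities; this sketch only shows the neighborhood-improvement loop they share.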
Procedia PDF Downloads 124
16853 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms
Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau
Abstract:
Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and through this inconsistency many synergistic effects are lost. Theories and models become more understandable and reusable when a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept. The comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The community's acceptance of the variables and notation forms used is quantified by means of a compliance quotient, demonstrated through the evaluation of 240 scientific publications on planning methods.
Keywords: job-shop scheduling, terminology, notation, standardization
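Read this way, the compliance quotient for a concept is simply the share of surveyed publications that use the community's most common notation for it. A minimal sketch of that tally (the symbols below are invented examples, not the paper's survey data):

```python
from collections import Counter

def compliance_quotient(notations):
    """Fraction of publications using the most common notation for a
    concept, e.g. the symbol chosen for 'processing time'."""
    counts = Counter(notations)
    return counts.most_common(1)[0][1] / len(notations)

# Invented survey: symbols used for processing time in 8 papers
symbols = ["p_ij", "p_ij", "t_ij", "p_ij", "p_i", "p_ij", "t_ij", "p_ij"]
print(compliance_quotient(symbols))  # 0.625
```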
Procedia PDF Downloads 109
16852 Drippers Scaling Inhibition of the Localized Irrigation System by Green Inhibitors Based on Plant Extracts
Authors: Driouiche Ali, Karmal Ilham
Abstract:
The Agadir region is characterized by a dry climate, ranging from arid, attenuated by oceanic influences, to hyper-arid. Of the water mobilized in the agricultural sector of greater Agadir, 95% is of underground origin and comes from the Chtouka water table; the rest comes from the surface waters of the Youssef Ben Tachfine dam. These waters are intended for the irrigation of 26,880 hectares of modern agriculture. More than 120 boreholes and wells are currently exploited. Their depth varies between 10 m and 200 m, and the unit flow rates of the boreholes are 5 to 50 l/s. A drop in the level of the water table of about 1.5 m/year, on average, has been observed during the last five years. Farmers are thus called upon to improve irrigation methods, and localized (drip) irrigation is adopted to allow rational use of water. The importance of this irrigation system lies in the fact that water is applied directly to the root zone and that it is compatible with fertilization. However, this irrigation system faces a thorny problem: the clogging of pipes and drippers, which leads to a lack of uniformity of irrigation over time. This so-called scaling phenomenon, whose consequences are harmful (cleaning or replacement of pipes), leads to considerable unproductive expenditure. The objective of this work is the search for green inhibitors likely to prevent this scaling phenomenon. The study requires a better knowledge of these waters, their physico-chemical characteristics, and their scaling power. Thus, using the LCGE controlled degassing technique, we first evaluated, on pure calco-carbonic water at 30 °F, the scale-inhibiting power of some plant extracts available in our region of Souss-Massa. We then carried out a comparative study of the efficacy of these green inhibitors. Finally, the action of the most effective green inhibitor on real agricultural waters was studied.
Keywords: green inhibitors, localized irrigation, plant extracts, scaling inhibition
Procedia PDF Downloads 82
16851 FT-NIR Method to Determine Moisture in Gluten-Free Rice-Based Pasta during Drying
Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra
Abstract:
Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control during large-scale production. A rapid Fourier transform near-infrared (FT-NIR) method was developed for determining moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples, and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured by an FT-NIR analyzer in the 4,000-12,000 cm⁻¹ spectral range. The calibration and validation sets were designed for the development and evaluation of the method over a moisture content range of 10 to 15 percent (w.b.) of the pasta. Prediction models based on partial least squares (PLS) regression were developed in the near-infrared. Conventional criteria such as R², the root mean square error of cross validation (RMSECV), the root mean square error of estimation (RMSEE), and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization, and multiplicative scatter correction). Spectra of pasta samples were treated with these mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method correlated very well with the values determined via traditional methods (R² = 0.983), which clearly indicates that FT-NIR methods can be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing, for which the maximum coefficient of determination (R² = 0.9875) was obtained.
Keywords: FT-NIR, pasta, moisture determination, food engineering
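Min-max normalization (MMN) rescales each spectrum to the [0, 1] range before PLS calibration, removing baseline offset and scale effects between samples. A minimal sketch (the absorbance values are invented):

```python
def min_max_normalize(spectrum):
    """Min-max normalization (MMN): rescale one spectrum so its
    absorbance values span [0, 1]."""
    lo, hi = min(spectrum), max(spectrum)
    return [(v - lo) / (hi - lo) for v in spectrum]

# Hypothetical raw spectrum (arbitrary absorbance units)
raw = [0.2, 0.5, 0.8, 1.0, 0.6]
print(min_max_normalize(raw))  # approx. [0.0, 0.375, 0.75, 1.0, 0.5]
```

Each spectrum is normalized independently, so the transform must be applied identically to calibration, validation, and prediction sets.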
Procedia PDF Downloads 258
16850 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods
Authors: Ali Berkan Ural
Abstract:
This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast diagnostic problems (breast cancer, nodules, or lumps) by a Computer Aided Diagnosis (CAD) system using mammogram radiological images. According to the statistics, the time factor is crucial for discovering the disease in the patient (especially in women) as early and as quickly as possible. In this study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with greater accuracy. The system first applies image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes the segmented areas for cancer detection/lumps in order to diagnose the disease. After segmentation, using spectrogram images, five different deep learning based methods (the specified Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast problems.
Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning
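Of the listed pre-processing steps, region growing segmentation is the most algorithmically distinctive: starting from a seed pixel, the region absorbs neighbouring pixels whose intensity is close to the seed's. A toy sketch on a tiny intensity grid (an illustration of the technique, not the paper's pipeline):

```python
from collections import deque

def region_grow(image, seed, tol):
    """Grow a region from a seed pixel, absorbing 4-connected
    neighbours whose intensity is within tol of the seed value."""
    h, w = len(image), len(image[0])
    sr, sc = seed
    base = image[sr][sc]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

# Toy intensity patch: a bright 2x2 blob on a dark background
patch = [
    [10, 10, 10, 10],
    [10, 90, 95, 10],
    [10, 92, 91, 10],
    [10, 10, 10, 10],
]
print(sorted(region_grow(patch, (1, 1), tol=10)))
# [(1, 1), (1, 2), (2, 1), (2, 2)]
```

On a mammogram the seed would sit inside a suspected ROI and the tolerance would be tuned to tissue contrast; the breadth-first growth itself is unchanged.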
Procedia PDF Downloads 95
16849 Understanding the Utilization of Luffa Cylindrica in the Adsorption of Heavy Metals to Clean Up Wastewater
Authors: Akanimo Emene, Robert Edyvean
Abstract:
In developing countries, low-cost methods of wastewater treatment are highly desirable, and adsorption is an efficient and economically viable treatment process. The utilization of this process is based on understanding the relationship between the growth environment and the metal uptake capacity of the biomaterial. Luffa cylindrica (LC), a plant material, was used as an adsorbent in an adsorption system for heavy metals. Chemically modified LC was used to adsorb heavy metal ions, lead and cadmium, from aqueous environmental solution under varying experimental conditions. The experimental factors studied were adsorption time, initial metal ion concentration, ionic strength, and pH of the solution. The chemical nature and surface area of the tissues adsorbing heavy metals in LC biosorption systems were characterized using electron microscopy and infrared spectroscopy, which showed an increase in surface area and improved adhesion capacity after chemical treatment. Metal speciation showed a binary interaction between the ions and the LC surface as the pH increases. Maximum adsorption occurred between pH 5 and pH 6. The ionic strength of the metal ion solution affects the adsorption capacity through the surface charge and the availability of adsorption sites on the LC. The nature of the metal-surface complexes formed was analyzed by fitting kinetic and isotherm models to the experimental data; the pseudo-second-order kinetic model and the two-site Langmuir isotherm model showed the best fit. Understanding this process offers an alternative method of water purification, providing an option where expensive water treatment technologies are not viable in developing countries.
Keywords: adsorption, Luffa cylindrica, metal-surface complexes, pH
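The pseudo-second-order kinetic model is commonly fitted in its linearized form, t/qt = 1/(k2·qe²) + t/qe, so a straight-line regression of t/qt against t yields the equilibrium uptake qe from the slope and the rate constant k2 from the intercept. A sketch with synthetic uptake data (not the study's measurements):

```python
def pseudo_second_order_fit(t, qt):
    """Linearized pseudo-second-order fit: regress t/qt on t.
    slope = 1/qe, intercept = 1/(k2 * qe**2)."""
    y = [ti / qi for ti, qi in zip(t, qt)]
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
             / sum((ti - mt) ** 2 for ti in t))
    intercept = my - slope * mt
    qe = 1.0 / slope
    k2 = 1.0 / (intercept * qe ** 2)
    return qe, k2

# Synthetic uptake curve generated from qe = 25 mg/g, k2 = 0.01 g/(mg*min)
qe_true, k2_true = 25.0, 0.01
t = [5, 10, 20, 40, 60, 120]
qt = [qe_true ** 2 * k2_true * ti / (1 + qe_true * k2_true * ti) for ti in t]
qe, k2 = pseudo_second_order_fit(t, qt)
print(round(qe, 2), round(k2, 4))  # 25.0 0.01
```

With real data the points scatter around the line, and the fit quality (R² of t/qt vs. t) is what justifies choosing this model over alternatives.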
Procedia PDF Downloads 89
16848 Advances in Artificial Intelligence Using Speech Recognition
Authors: Khaled M. Alhawiti
Abstract:
This research study aims to present a retrospective review of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. It facilitates its users' daily routine tasks in a more convenient and effective manner. This research presents an overview of recent technological advancements associated with artificial intelligence. Recent research has revealed that decoding is the central issue in speech recognition. To overcome it, researchers have developed different statistical models, most prominently the acoustic model (AM), the language model (LM), the lexicon model, and hidden Markov models (HMMs). This review helps in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, utilized for realistic decoding tasks and constrained artificial languages; these include pattern recognition, acoustic-phonetic, and artificial intelligence approaches. Artificial intelligence has been recognized as the most efficient and reliable of the methods used in speech recognition.
Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance
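The HMM-based decoding mentioned above boils down to finding the most likely hidden state sequence for a sequence of acoustic observations, classically via the Viterbi algorithm. A minimal sketch with an invented two-state model (toy symbols, not a real acoustic model):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Viterbi decoding for an HMM: the most likely hidden state
    sequence for an observation sequence."""
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        nxt = {}
        for s in states:
            p, prev = max(
                (best[r][0] * trans_p[r][s] * emit_p[s][o], r)
                for r in states
            )
            nxt[s] = (p, best[prev][1] + [s])
        best = nxt
    return max(best.values())[1]

# Tiny invented model: two states emitting two acoustic symbols
states = ["vowel", "consonant"]
start_p = {"vowel": 0.5, "consonant": 0.5}
trans_p = {"vowel": {"vowel": 0.3, "consonant": 0.7},
           "consonant": {"vowel": 0.7, "consonant": 0.3}}
emit_p = {"vowel": {"lowF": 0.9, "highF": 0.1},
          "consonant": {"lowF": 0.2, "highF": 0.8}}
print(viterbi(["lowF", "highF", "lowF"], states, start_p, trans_p, emit_p))
# ['vowel', 'consonant', 'vowel']
```

Real recognizers work in log-probability space and combine the acoustic, lexicon, and language models into one search graph, but the dynamic-programming core is the same.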
Procedia PDF Downloads 478
16847 Diagnosis of Avian Pathology in the East of Algeria
Authors: Khenenou Tarek, Benzaoui Hassina, Melizi Mohamed
Abstract:
Diagnosis requires a background of current knowledge in the field as well as complementary means, among which the laboratory occupies the central place for better investigation. A correct diagnosis makes it possible to establish the most appropriate treatment as soon as possible and avoids both the economic losses associated with mortality and the growth retardation often observed in poultry; furthermore, it may reduce the high cost of treatment. Epidemiological surveys, hematological studies, and histopathological studies are three aspects of diagnosis heavily used in both human and veterinary pathology, and advanced research in human medicine could be exploited and applied in veterinary medicine with appropriate modification. However, the diagnostic methods in the east of Algeria are limited to clinical signs and necropsy findings; diagnosis is therefore often based simply on the success or failure of therapeutic methods (therapeutic diagnosis).
Keywords: chicken, diagnosis, hematology, histopathology
Procedia PDF Downloads 630
16846 Study Properties of Bamboo Composite after Treatment Surface by Chemical Method
Authors: Kiatnarong Supapanmanee, Ekkarin Phongphinittana, Pongsak Nimdum
Abstract:
Natural fibers are readily available raw materials that are widely used in composite materials. The most common problem facing researchers working with composites made from these fibers is the adhesion between the natural fiber surface and the matrix material, due in part to the hydrophilic nature of natural fibers and the hydrophobic nature of the matrix. Based on these problems, this research selected bamboo fiber, a strong natural fiber, for study. The first step was to study the mechanical properties of the pure bamboo strip by testing its tensile strength at different gauge lengths. The bamboo strip surface was then modified with sodium hydroxide (NaOH) at a 6 wt% concentration for different soaking periods. After surface modification, the physical and mechanical properties of the pure bamboo strips were studied. The modified and unmodified bamboo strips were molded into composite materials using epoxy as the matrix to compare mechanical properties and fiber-matrix adhesion through tensile and bending tests. In addition, the results of these tests were compared with the finite element method (FEM). The results showed that the length of the bamboo strip affects fiber strength, with shorter fibers exhibiting higher tensile stress. Concerning the surface modification with NaOH, this chemical removes lignin and hemicellulose, resulting in smaller bamboo strip dimensions and increased density. From the treatment results above, it was found that the treated bamboo strips and composites had higher ultimate tensile stress and Young's modulus, as well as better adhesion between the bamboo fiber and the matrix material.
Keywords: bamboo fiber, bamboo strip, composite material, bamboo composite, pure bamboo, surface modification, mechanical properties of bamboo, bamboo finite element method
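The tensile quantities compared above follow from the standard engineering definitions: stress is force over cross-sectional area, and Young's modulus is stress over strain in the elastic region. A minimal sketch with invented specimen values (not the study's measurements):

```python
def tensile_properties(force_n, area_mm2, strain):
    """Engineering stress (MPa) and Young's modulus (MPa) from one
    point in the elastic region: sigma = F/A, E = sigma/strain."""
    stress_mpa = force_n / area_mm2  # N/mm^2 is numerically MPa
    modulus_mpa = stress_mpa / strain
    return stress_mpa, modulus_mpa

# Invented elastic-region reading for a bamboo strip specimen
stress, modulus = tensile_properties(force_n=500.0, area_mm2=4.0,
                                     strain=0.005)
print(round(stress, 3), round(modulus, 3))  # 125.0 25000.0
```

Since NaOH treatment shrinks the strip cross-section, the same breaking force over a smaller area yields a higher computed ultimate tensile stress, consistent with the trend reported above.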
Procedia PDF Downloads 92
16845 Polymersomes in Drug Delivery: A Comparative Review with Liposomes and Micelles
Authors: Salma E. Ahmed
Abstract:
Since the mid-50s, enormous attention has been paid to nanocarriers and their applications in drug and gene delivery. Among these vesicles, liposomes and micelles have been heavily investigated due to their many advantages over other types. Liposomes, for instance, are distinguished chiefly by their ability to encapsulate hydrophobic, hydrophilic, and amphiphilic drugs. Micelles, on the other hand, are self-assembled shells of lipids, amphiphilic polymers, or oppositely charged block copolymers that, once exposed to aqueous media, can entrap hydrophobic agents and possess prolonged circulation in the bloodstream. Both carriers are considered biocompatible and biodegradable. Nevertheless, they have limited stability, chemical versatility, and drug encapsulation efficiency. To overcome these downsides, strategies have evolved for designing a novel drug delivery system that combines the architecture of liposomes with the polymeric characteristics of micelles. Polymersomes are vehicles with fluidic cores and hydrophobic shells that are protected and isolated from the aqueous media by hydrated hydrophilic brushes, which give the carrier its distinctive polymeric bilayer shape. As with liposomes, this merit enables the carrier to encapsulate a wide range of agents, regardless of their affinities and solubilities in water. In addition, the high molecular weight of the amphiphiles that build the body of the polymersome increases its colloidal and chemical stability and reduces the permeability of the polymeric membrane, making the vesicle more protective of the encapsulated drug. These carriers can also be modified in ways that make them responsive when targeted or triggered, by manipulating their composition and attaching moieties and conjugates to the body of the carriers. These appealing characteristics, in addition to the ease of synthesis, give polymersomes great potential in the area of drug delivery. Thus, their design and characterization, in comparison with liposomes and micelles, are briefly reviewed in this work.
Keywords: controlled release, liposomes, micelles, polymersomes, targeting
Procedia PDF Downloads 195