Search results for: analytical tools
5389 Comparing Energy Labelling of Buildings in Spain
Authors: Carolina Aparicio-Fernández, Alejandro Vilar Abad, Mar Cañada Soriano, Jose-Luis Vivancos
Abstract:
The building sector is responsible for 40% of the total energy consumption in the European Union (EU). Thus, the implementation of strategies for quantifying and reducing buildings' energy consumption is indispensable for reaching the EU's carbon neutrality and energy efficiency goals. Each Member State has transposed the European Directives according to its own peculiarities: existing technical legislation, constructive solutions, climatic zones, etc. Therefore, in accordance with the Energy Performance of Buildings Directive, Member States have developed different Energy Performance Certificate schemes, each using the energy simulation software tool proposed for its national or regional area. Energy Performance Certificates provide powerful and comprehensive information to predict, analyze and improve the energy demand of new and existing buildings. Energy simulation software and databases allow a better understanding of the current constructive reality of the European building stock. However, Energy Performance Certificates still face several issues before they can be considered a reliable and global source of information, since the different calculation tools in use do not allow interconnection between them. In this document, TRNSYS (TRaNsient System Simulation program) software is used to calculate the energy demand of a building, and the result is compared with the energy labeling obtained with the Spanish official software tools. We demonstrate the possibility of using non-official software tools to calculate the Energy Performance Certificate. Thus, this approach could be used throughout the EU to compare the results in all possible cases proposed by the EU Member States. To implement the simulations, an isolated single-family house with different construction solutions is considered. The results are obtained for every climatic zone of the Spanish Technical Building Code.
Keywords: energy demand, energy performance certificate EPBD, trnsys, buildings
Procedia PDF Downloads 127
5388 Carbide Structure and Fracture Toughness of High Speed Tool Steels
Authors: Jung-Ho Moon, Tae Kwon Ha
Abstract:
M2 steels, the typical Co-free high speed steels (HSS) possessing a hardness level of 63~65 HRc, are most widely used for cutting tools. On the other hand, Co-containing HSSs, such as M35 and M42, show a higher hardness level of 65~67 HRc and are used for high-quality cutting tools. In the fabrication of HSSs, it is very important to control the cleanliness and eutectic carbide structure of the ingot, and it is required to increase productivity at the same time. Production of HSS ingots includes a variety of processes such as casting, electro-slag remelting (ESR), forging, blooming, and wire rod rolling. In the present study, the electro-slag rapid remelting (ESRR) process, an advanced ESR process combined with continuous casting, was successfully employed to fabricate HSS billets of M2, M35, and M42 steels. The distribution and structure of eutectic carbides in the billets were analysed, and the cleanliness, hardness, and composition profiles of the billets were also evaluated.
Keywords: high speed tool steel, eutectic carbide, microstructure, hardness, fracture toughness
Procedia PDF Downloads 445
5387 Planning and Implementing Large-Scale Ecological Connectivity: A Review of Past and Ongoing Practices in Turkey
Authors: Tutku Ak, A. Esra Cengiz, Çiğdem Ayhan Kaptan
Abstract:
The conservation community has been increasingly promoting the concept of ecological connectivity for the prevention and mitigation of landscape fragmentation. Many tools have been proposed for this purpose, not only in Europe but also around the world. Spatial planning for building connectivity, however, has many problems associated with the complexity of ecological processes at spatial and temporal scales. Furthermore, on-the-ground implementation can be very difficult, potentially leading to ecologically disastrous results and a waste of resources. These problems, on the other hand, can be avoided or rectified as more experience is gained with implementation. Therefore, it is the objective of this study to document the experience gained with connectivity planning in Turkish landscapes. This paper is a preliminary review of the conservation initiatives and projects aimed at protecting and building ecological connectivity in and around Turkey. The objective is to scope existing conservation plans, tools and implementation approaches in Turkey, and the ultimate goal is to understand to what degree they have been implemented and what constraints and opportunities are being faced.
Keywords: ecological connectivity, large-scale landscapes, planning and implementation, Turkey
Procedia PDF Downloads 501
5386 Portfolio Management for Construction Company during Covid-19 Using AHP Technique
Authors: Sareh Rajabi, Salwa Bheiry
Abstract:
In general, Covid-19 created many financial and non-financial damages to the economy and community. The level and severity of Covid-19 as a pandemic vary across regions and types of projects. The Covid-19 virus has recently emerged as one of the most imperative risk management factors worldwide. Therefore, as part of portfolio management assessment, it is essential to evaluate the severity of such a risk on the projects and programs at the portfolio management level in order to avoid a risky portfolio. Covid-19 hit South America, parts of Europe, and the Middle East particularly hard, and the pandemic affected the whole world through lockdowns, interruptions in supply chain management, health and safety requirements, and transportation and commercial impacts. Therefore, this research proposes the Analytical Hierarchy Process (AHP) to analyze and assess a pandemic case like Covid-19 and its impacts on construction projects. The AHP technique uses four sub-criteria: health and safety, commercial risk, completion risk and contractual risk to evaluate the project and program. The result will provide decision makers with information on which projects have higher or lower risk in a Covid-19 or pandemic scenario. Therefore, the decision makers can select the most feasible solution based on effectively weighted criteria for project selection within their portfolio, in line with the organization's strategies.
Keywords: portfolio management, risk management, COVID-19, analytical hierarchy process technique
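As an illustration of the AHP step the abstract describes, the sketch below builds a pairwise comparison matrix for the four sub-criteria and derives priority weights by the row geometric mean method; the matrix entries are hypothetical and not taken from the study.

```python
import math

# Hypothetical pairwise comparison matrix for the four sub-criteria
# (order: health & safety, commercial, completion, contractual risk).
# A[i][j] states how much more important criterion i is than criterion j
# on Saaty's 1-9 scale; the values here are illustrative only.
A = [
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
]

def ahp_weights(matrix):
    """Approximate the AHP priority vector via the row geometric mean method."""
    gmeans = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

weights = ahp_weights(A)  # normalized priorities, one per sub-criterion
```

With these (invented) judgments, health and safety dominates the ranking; in practice the matrix would be elicited from the decision makers.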
Procedia PDF Downloads 109
5385 White Wine Discrimination Based on Deconvoluted Surface Enhanced Raman Spectroscopy Signals
Authors: Dana Alina Magdas, Nicoleta Simona Vedeanu, Ioana Feher, Rares Stiufiuc
Abstract:
Food and beverage authentication using rapid and inexpensive analytical tools represents an important challenge nowadays. In this regard, the potential of vibrational techniques in food authentication has gained increased attention during the last years. For wine discrimination, Raman spectroscopy appears more feasible than IR (infrared) spectroscopy because of the relatively weak water bending mode in the vibrational spectroscopy fingerprint range. Despite this, the use of the Raman technique in wine discrimination is at an early stage. Taking this into consideration, the wine discrimination potential of the surface-enhanced Raman scattering (SERS) technique is reported in the present work. The novelty of this study, compared with previously reported applications of vibrational techniques in wine discrimination, consists in the fact that the present work differentiates wines based on the individual signals obtained from deconvoluted spectra. In order to achieve wine classification with respect to variety, geographical origin and vintage, the peak intensities obtained after spectra deconvolution were compared using supervised chemometric methods like Linear Discriminant Analysis (LDA). For this purpose, a set of 20 white Romanian wines of four varieties from different Romanian viticultural regions was considered. Chemometric methods applied directly to raw SERS experimental spectra proved their efficiency, but the identification of discrimination markers was found to be very difficult due to overlapped signals as well as band shifts. By using this approach, a better general view of the compositional differences that appear among the wines could be reached.
Keywords: chemometry, SERS, variety, wines discrimination
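To illustrate the kind of supervised discrimination the abstract applies, the sketch below runs a minimal two-class Fisher LDA on hypothetical deconvoluted peak intensities (two bands per wine). The data values and variety labels are invented for illustration; the study itself applies multi-class LDA to real deconvoluted SERS spectra.

```python
# Minimal two-class Fisher LDA on hypothetical deconvoluted peak
# intensities (band A, band B) for two grape varieties.

def fisher_lda_2d(class0, class1):
    """Return the Fisher discriminant direction w and the class means."""
    def mean(pts):
        n = len(pts)
        return [sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n]

    def scatter(pts, m):
        sxx = sum((p[0] - m[0]) ** 2 for p in pts)
        syy = sum((p[1] - m[1]) ** 2 for p in pts)
        sxy = sum((p[0] - m[0]) * (p[1] - m[1]) for p in pts)
        return [[sxx, sxy], [sxy, syy]]

    m0, m1 = mean(class0), mean(class1)
    s0, s1 = scatter(class0, m0), scatter(class1, m1)
    # Pooled within-class scatter matrix Sw = S0 + S1
    sw = [[s0[0][0] + s1[0][0], s0[0][1] + s1[0][1]],
          [s0[1][0] + s1[1][0], s0[1][1] + s1[1][1]]]
    det = sw[0][0] * sw[1][1] - sw[0][1] * sw[1][0]
    diff = [m1[0] - m0[0], m1[1] - m0[1]]
    # w = Sw^-1 (m1 - m0), with a hand-rolled 2x2 inverse
    w = [( sw[1][1] * diff[0] - sw[0][1] * diff[1]) / det,
         (-sw[1][0] * diff[0] + sw[0][0] * diff[1]) / det]
    return w, m0, m1

variety1 = [(0.8, 0.2), (0.9, 0.3), (0.7, 0.25)]   # illustrative intensities
variety2 = [(0.3, 0.7), (0.2, 0.8), (0.35, 0.65)]
w, m0, m1 = fisher_lda_2d(variety1, variety2)

def project(p):
    return w[0] * p[0] + w[1] * p[1]

# Midpoint of the projected class means serves as a decision threshold
threshold = (project(m0) + project(m1)) / 2
```

New samples are assigned to whichever side of the threshold their projection falls on.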
Procedia PDF Downloads 160
5384 The Capabilities of New Communication Devices in Development of Informing: Case Study Mobile Functions in Iran
Authors: Mohsen Shakerinejad
Abstract:
Due to the growing momentum of technology, the present age is called the age of communication and information. With the astounding progress of communication and information tools, the current world is likened to a "global village" in which a message can be sent from one point of the world to another in less than a minute. However, one of the newer sociologists, Alain Touraine, in describing the destructive effects of the changes arising from the development of information appliances, refers to "new fields for undemocratic social control and the incidence of acute social and political tensions and unrest". Yet in this era, in which the life of people has become industrial along with the advancement of industry, quick and accurate data transfer breathes new life into the body of society, and various tools should be used according to the features of each society and the progress of science and technology. One of these communication tools is the mobile phone. The cellular phone, as the communication and telecommunication revolution of recent years, has had a great influence on the individual and collective life of societies. This powerful communication tool has had an undeniable effect on all aspects of life, including social, economic, cultural, and scientific, so that ignoring it in the design, implementation and enforcement of any system is not wise. Nowadays, knowledge and information are among the most important aspects of human life. Therefore, this article tries to introduce the potential of the mobile phone for receiving and transmitting news and information. Among the numerous capabilities of current mobile phones, features such as sending text, photography, sound recording, filming, and Internet connectivity indicate the potential of this medium in the process of sending and receiving information. So much so that nowadays, mobile journalism, as an important component of citizen journalism, has a unique role in information dissemination.
Keywords: mobile, informing, receiving information, mobile journalism, citizen journalism
Procedia PDF Downloads 410
5383 A Single Cell Omics Experiments as Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools
Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri
Abstract:
The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation. Therefore, the establishment of benchmarking datasets is necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Benchmarking bioinformatics tools for single-cell experiments can be costly due to the high expense involved; therefore, datasets used for benchmarking are typically sourced from publicly available experiments, which often lack comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools that depict heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer tumor cell lines that display resistant clones upon treatment of EGFR-mutated tumors and are characterized by the driver genes ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using the 10XGenomics platform with CellPlex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the BioLegend TotalSeq™-B Human Universal Cocktail (CITEseq).
This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from the aforementioned experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as an output. The tool provides the cell line derivation for each cell and cell annotations for the pseudo-microenvironment based on CITEseq data by an experienced immunologist. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in matrigel. These tissues were analyzed using 10XGenomics (FFPE samples) and Curio Bioscience (fresh frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.
Keywords: single cell omics, benchmark, spatial transcriptomics, CITEseq
Procedia PDF Downloads 117
5382 Pro Life-Pro Choice Debate: Looking through the Prism of Abortion Right in the Indian Context
Authors: Satabdi Das
Abstract:
Background: The abortion debate has polarized women, pitting them against each other in the binary of pro-choice and pro-life. While the followers of pro-choice view the right to an abortion as inherent to a woman's right to sovereignty, the latter believe that it is unethical to kill an unborn baby, as it is in a way denying the foetus' right to life. There are innumerable arguments and counter-arguments, and the dilemma remains as to which is more significant: the mother's right to terminate a pregnancy or the foetus' right to life. This pro-life and pro-choice debate has a Western root and is mostly about reproductive freedom, but the Western way of looking at the abortion debate is not fully relevant in the Indian context, where the situation is entirely different. Sex-selective foeticide is a social ill in India which cannot be explained through the prism of the abortion debate alone; it must take into account the problem of forced female foeticide. Objectives: Against this backdrop, the study sheds light on the following issues: -How has the reproductive debate evolved? -How is it relevant in the Indian context, where female foeticide is a harsh reality? -How should one address the dilemma between life and death in the context of the pro-life/pro-choice debate? Methodology: The study employs historical-analytical and descriptive-analytical methods and uses primary documents, such as governmental documents, and secondary sources, such as analytical articles in books, journals, and relevant websites. Findings: -Fertility control is not a modern-day phenomenon; it has its roots in the ancient, medieval and present epochs. However, there have long been debates over the rights of the foetus and the ethics of the act of abortion. -Pre-natal sex determination for sex-selective abortion is a common phenomenon in India because of the wish for male heirs. The cultural preference for male children over female ones has resulted in the disappearance of girl children. -When life begins has not been defined by any law. Considering the Indian case, it can be said that the pro-life/pro-choice debate is not as relevant as it is in the US. Here, women are often denied basic human rights; in many places they are killed in the womb, and their right to life is jeopardised in that way. In the liberal abortion regime of India, a woman's choice to end a pregnancy is limited to very few enlightened families. In many cases, it is the family's decision to end a pregnancy out of boy preference, for which pre-natal sex determination plays a crucial role. Conclusion: In India, we can be pro-life only when the right to life of the unborn can be secured irrespective of its sex. Similarly, we belong to the pro-choice group only when the choice to terminate a pregnancy is entirely decided by the mother for her own reasons.
Keywords: female foeticide, India, prolife/pro choice, right to abortion
Procedia PDF Downloads 192
5381 Decision Making in Medicine and Treatment Strategies
Authors: Kamran Yazdanbakhsh, Somayeh Mahmoudi
Abstract:
Three reasons argue for the use of decision theory in medicine: 1. Increased medical knowledge and its complexity make it difficult to process treatment information effectively without resorting to sophisticated analytical methods, especially when it comes to detecting errors and identifying opportunities for treatment from databases of large size. 2. There is wide geographic variability in medical practice. In a context where medical costs are borne, at least in part, by the patient, these variations raise doubts about the relevance of the choices made by physicians. The differences are generally attributed to differing estimates of the probabilities of success of the treatments involved and differing assessments of the results of success or failure. Without explicit decision criteria, it is difficult to identify precisely the sources of these variations in treatment. 3. Beyond the principle of informed consent, patients need to be involved in decision-making. For this, the decision process should be explained and broken down. A decision problem is to select the best option among a set of choices. The problem is what is meant by the "best option", or knowing what criteria guide the choice. The purpose of decision theory is to answer this question. The systematic use of decision models allows us to better understand the differences in medical practices and facilitates the search for consensus. In this regard, there are three types of situations: certain situations, risky situations, and uncertain situations: 1. In certain situations, the consequences of each decision are certain. 2. In risky situations, every decision can have several consequences, and the probability of each of these consequences is known. 3. In uncertain situations, each decision can have several consequences, and the probabilities are not known. Our aim in this article is to show how decision theory can usefully be mobilized to meet the needs of physicians. Decision theory can make decisions more transparent: first, by systematically clarifying the data of the problem, and secondly by asking which basic principles should guide the choice. Once the problem is clarified, decision theory provides operational tools to represent the available information and determine patient preferences, and thus assists the patient and doctor in their choices.
Keywords: decision making, medicine, treatment strategies, patient
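The three decision situations can be made concrete with a small sketch: under risk, an expected-utility rule ranks treatments; under uncertainty, a cautious maximin rule can be used instead. The options, probabilities, and utilities below are illustrative, not clinical data.

```python
# Treatment choice under risk: each option lists (probability, utility)
# outcome pairs; all numbers are invented for illustration.
options = {
    "surgery":    [(0.9, 10), (0.1, 0)],  # high payoff, small chance of failure
    "medication": [(0.7, 8), (0.3, 4)],   # lower payoff, milder downside
}

def expected_utility(outcomes):
    """Risky situation: probabilities known, so rank by expected utility."""
    return sum(p * u for p, u in outcomes)

best_under_risk = max(options, key=lambda o: expected_utility(options[o]))

# Uncertain situation: probabilities unknown, so a cautious maximin rule
# picks the option whose worst outcome is best.
best_under_uncertainty = max(options, key=lambda o: min(u for _, u in options[o]))
```

Note that the two rules can disagree: here surgery wins on expected utility, while the maximin rule prefers medication because its worst case is better.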
Procedia PDF Downloads 579
5380 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers, by means of which one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance, and statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems assume that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skewed data is present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skewed) have been considered to model a hypothetical slope stability problem. The data modeled is the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Based on this analysis, it is possible to explicitly derive the failure probability when the friction angle is considered a random variable. Furthermore, it is possible to compare the stability analyses obtained when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables
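As a rough numerical counterpart to the analytical derivation described, the sketch below estimates the failure probability of a dry infinite slope, FS = tan(φ)/tan(β), by Monte Carlo, treating the friction angle φ as a normal random variable. The mean, standard deviation, and slope angle are assumed values; the study itself derives the PDF analytically and also fits skewed distributions such as Dagum.

```python
import math
import random

random.seed(42)  # reproducible run

# Assumed statistics for the friction angle and an assumed slope angle;
# these are illustrative, not the Brasilia test data from the study.
PHI_MEAN_DEG, PHI_STD_DEG = 30.0, 3.0
BETA_DEG = 25.0
N = 100_000

failures = 0
for _ in range(N):
    phi = math.radians(random.gauss(PHI_MEAN_DEG, PHI_STD_DEG))
    # Dry infinite-slope factor of safety under Mohr-Coulomb (c = 0)
    fs = math.tan(phi) / math.tan(math.radians(BETA_DEG))
    if fs < 1.0:
        failures += 1

p_failure = failures / N
```

With these assumptions, failure occurs exactly when φ < β, so the estimate should approach the normal tail probability P(φ < 25°) ≈ 0.048; swapping in a skewed distribution for φ would shift this figure, which is the comparison the study makes.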
Procedia PDF Downloads 338
5379 Integrated Environmental Management System and Environmental Impact Assessment in Evaluation of Environmental Protective Action
Authors: Moustafa Osman
Abstract:
The paper describes and analyses different good-practice examples of protective levels and initiative actions ("framework conditions") and encourages the uptake of environmental management systems (EMSs) by small and medium-sized enterprises (SMEs). Most industries tend to take EMSs as tools leading towards sustainability planning. The application of these tools carries numerous environmental obligations that neither suggest decisions nor recommend what a company should ultimately achieve; rather, they set up clearly defined criteria to evaluate environmental protective action (EEPA) through sustainability indicators. The physical integration will evaluate how to incorporate traditional knowledge into baseline information, prepare impact predictions, and plan mitigation measures under monitoring conditions. Thereby, efforts between government, industry and community lead protective action that reconciles present needs with those of future generations, meeting the goal of sustainable development. The paper discusses how to set out distinct aspects of sustainable indicators and how they reflect inputs, outputs, and modes of impact on the environment.
Keywords: environmental management, sustainability, indicators, protective action
Procedia PDF Downloads 443
5378 Analysis of Critical Success Factors for Implementing Industry 4.0 and Circular Economy to Enhance Food Traceability
Authors: Mahsa Pishdar
Abstract:
Food traceability through the supply chain is facing increased demand. The IoT and blockchain are among the tools under consideration in the Industry 4.0 era that could be integrated to help implement Circular Economy (CE) principles while enhancing food traceability solutions. However, such tools need an intellectual system and infrastructure to be settled as guidance along the way, helping to overcome obstacles. That is why the critical success factors for implementing Industry 4.0 and circular economy principles in the food traceability concept are analyzed in this paper by a combination of the interval type-2 fuzzy Best Worst Method and Measurement of Alternatives and Ranking according to Compromise Solution (interval type-2 fuzzy BWM-MARCOS). Results indicate that "knowledge of Industry 4.0 obligations and CE principles" is the most important factor and the basis of success, followed by "management commitment and support". This will assist decision makers in seizing a competitive advantage while reducing costs through the supply chain.
Keywords: food traceability, industry 4.0, internet of things, block chain, best worst method, marcos
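As a sketch of the ranking machinery, the snippet below runs the crisp core of MARCOS (normalization against ideal and anti-ideal solutions, utility degrees, and the final utility function) on hypothetical success-factor scores. The paper's interval type-2 fuzzy extension, real criteria, and elicited weights are not reproduced here, so all names and values are illustrative.

```python
# Crisp MARCOS sketch: hypothetical success factors scored against three
# benefit criteria on a 1-9 scale; scores and weights are invented.
alternatives = {
    "knowledge_of_I40_and_CE": [9, 8, 9],
    "management_commitment":   [8, 9, 7],
    "infrastructure":          [6, 7, 6],
}
weights = [0.5, 0.3, 0.2]  # assumed criteria weights, summing to 1

names = list(alternatives)
cols = list(zip(*alternatives.values()))
ai = [max(c) for c in cols]    # ideal solution (benefit criteria: larger is better)
aai = [min(c) for c in cols]   # anti-ideal solution

def weighted_sum(row):
    """Normalize against the ideal, weight, and sum."""
    return sum(w * x / best for w, x, best in zip(weights, row, ai))

s = {a: weighted_sum(alternatives[a]) for a in names}
s_ai, s_aai = weighted_sum(ai), weighted_sum(aai)

def utility(a):
    """MARCOS utility from the degrees relative to the (anti-)ideal."""
    k_minus, k_plus = s[a] / s_aai, s[a] / s_ai
    f_plus = k_minus / (k_plus + k_minus)
    f_minus = k_plus / (k_plus + k_minus)
    return (k_plus + k_minus) / (1 + (1 - f_plus) / f_plus + (1 - f_minus) / f_minus)

ranking = sorted(names, key=utility, reverse=True)
```

With these invented scores the "knowledge" factor comes out first, mirroring the ordering the abstract reports; the fuzzy variant would replace each crisp score with an interval type-2 fuzzy number.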
Procedia PDF Downloads 205
5377 Mineralogical and Geochemical Constraints on the Origin and Environment of Numidian Siliceous Sedimentary Rocks of the Extreme Northwest Tunisia
Authors: Ben Yahia Nouha, Harris Chris, Sebei Abdelaziz, Boussen Slim, Chaabani Fredj
Abstract:
The present work sets itself the objective of studying the non-detrital siliceous rocks of extreme northwest Tunisia. It aims to examine their origin and sedimentary depositional environment based on mineralogical and geochemical characteristics. The different sections were located in the areas of Babouch and Tabarka. The collected samples were subjected to mineralogical and geochemical characterization using different analytical methods: X-ray diffraction (XRD), geochemical analysis (ICP-AES), and isotope geochemistry (δ18O), to assess their suitability for industrial use. X-ray powder diffraction of the pure siliceous rock indicates quartz as the major mineral, with a total lack of amorphous silica. Trace impurities, such as carbonates and clay minerals, are concealed in the analytical results. The petrographic examination allowed us to deduce that this rock derives from the tests of siliceous organisms (radiolarians). The chemical composition shows that SiO2, Al2O3, and Fe2O3 are the most abundant oxides; the other oxides are present in negligible quantities. The geochemical data support a biogenic and non-hydrothermal origin of the Babouchite silica. Oxygen isotopes show that the babouchites were formed in an environment with a high temperature, ranging from 56 °C to 73 °C.
Keywords: biogenic silica, babouchite formation, XRD, chemical analysis, oxygen isotopic, northwest tunisia
Procedia PDF Downloads 145
5376 MFCA: An Environmental Management Accounting Technique for Optimal Resource Efficiency in Production Processes
Authors: Omolola A. Tajelawi, Hari L. Garbharran
Abstract:
Revenue leakages are one of the major challenges manufacturers face in production processes, as much of the input material that should emerge from the lines as product is lost as waste. Rather than generating income from material input that is meant to end up in products, further losses are incurred as costs in order to manage the waste generated. In addition, due to the lack of a clear view of the flow of resources on the lines from the input to the output stage, acquiring information on the true cost of the waste generated has become a challenge. This has given birth to the conceptualization and implementation of waste minimization strategies by several manufacturing industries. This paper reviews the principles and applications of three environmental management accounting tools, namely Activity-Based Costing (ABC), Life-Cycle Assessment (LCA) and Material Flow Cost Accounting (MFCA), in the manufacturing industry and their effectiveness in curbing revenue leakages. The paper unveils the strengths and limitations of each of the tools, beaming a searchlight on the tool that could allow for optimal resource utilization, transparency in the production process, and improved cost efficiency. Findings from this review reveal that MFCA may offer superior advantages with regard to the provision of more detailed information (in both physical and monetary terms) on the flow of material inputs throughout the production process compared to the other environmental accounting tools. This paper therefore makes a case for the adoption of MFCA as a viable technique for the identification and reduction of waste in production processes, and also for effective decision making by production managers, financial advisors and other relevant stakeholders.
Keywords: MFCA, environmental management accounting, resource efficiency, waste reduction, revenue losses
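The core MFCA idea, allocating the costs of a process step between the product and the material loss in proportion to mass flows, can be sketched as follows; all masses and cost figures are illustrative.

```python
# Minimal MFCA sketch: the costs of one "quantity center" are split
# between the product and the material loss in proportion to the mass
# that ends up in each. All figures below are invented for illustration.
input_mass_kg = 1000.0
product_mass_kg = 800.0
loss_mass_kg = input_mass_kg - product_mass_kg  # 200 kg of waste

costs = {"material": 50_000.0, "energy": 8_000.0, "system": 12_000.0}

product_share = product_mass_kg / input_mass_kg  # 0.8
loss_share = loss_mass_kg / input_mass_kg        # 0.2

product_cost = {k: v * product_share for k, v in costs.items()}
loss_cost = {k: v * loss_share for k, v in costs.items()}

# Money "hidden" in the waste stream, which conventional costing would
# bury inside the product cost
total_loss_cost = sum(loss_cost.values())
```

Here 20% of every cost category is attributed to the waste stream, making the monetary value of the leakage explicit instead of leaving it folded into the unit product cost.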
Procedia PDF Downloads 336
5375 The Analysis of Priority Flood Control Management Using Analysis Hierarchy Process
Authors: Pravira Rizki Suwarno, Fanny Aliza Savitri, Priseyola Ayunda Prima, Pipin Surahman, Mahelga Levina Amran, Khoirunisa Ulya Nur Utari, Nora Permatasari
Abstract:
The Bogowonto River, commonly called the Bhagawanta River, is one of the rivers on Java Island, located in Central Java, Indonesia. Its watershed area is 35 km², and the river is 57 km long. It covers three regencies, namely Wonosobo Regency and Magelang Regency upstream and Purworejo Regency in the south and downstream. The Bogowonto River experiences channel narrowing and silting, caused by garbage along the river coming from livestock and household waste. The narrowing channel and siltation reduce the capacity of the river to drain flood discharge. Comprehensive and sustainable actions are needed in dealing with current and future floods, and based on these current conditions, a priority scale is required. Therefore, this study aims to determine the priority scale of flood management in Purworejo Regency using the Analytical Hierarchy Process (AHP) method. This method will determine the appropriate actions based on their rating. In addition, there will be field observations through the distribution of questionnaires to several parties, including stakeholders and the community. The results of this study will take two forms of action: structural, covering water structures, and non-structural, including social, environmental, and law enforcement measures.
Keywords: analytical hierarchy process, bogowonto, flood control, management
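A sketch of the AHP machinery behind such a priority scale, including the consistency check that should accompany any comparison matrix built from questionnaire responses, might look like this. The 3x3 matrix of flood-control alternatives is hypothetical (and deliberately consistent, so the consistency ratio comes out near zero), not data from the study.

```python
import math

# Hypothetical 3x3 pairwise comparison of flood-control alternatives
# (e.g. dredging, embankments, waste management); AHP rankings are only
# trusted when the consistency ratio CR stays below about 0.1.
A = [
    [1,   2,   4],
    [1/2, 1,   2],
    [1/4, 1/2, 1],
]

# Priority weights via the row geometric mean method
gmeans = [math.prod(row) ** (1 / len(row)) for row in A]
w = [g / sum(gmeans) for g in gmeans]

# lambda_max estimated by averaging (A w)_i / w_i over the rows
aw = [sum(a * wj for a, wj in zip(row, w)) for row in A]
lambda_max = sum(x / wi for x, wi in zip(aw, w)) / len(w)

n = len(A)
ci = (lambda_max - n) / (n - 1)              # consistency index
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's random indices
cr = ci / RI[n]                              # consistency ratio
```

Because the example matrix satisfies a13 = a12 * a23 exactly, lambda_max equals n and CR is essentially zero; real questionnaire-derived matrices rarely are, which is why the check matters.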
Procedia PDF Downloads 208
5374 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes
Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet
Abstract:
Software and time optimization are very important factors in financial markets, which are competitive fields, and the emergence of new computer tools further stresses the challenge. In this context, any improvement of the technical indicators which generate a buy or sell signal is a major issue, and many tools have been created to make them more effective. This concern for efficiency leads the present paper to seek the best (and most innovative) way of obtaining the largest improvement in these indicators. The approach consists in attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is the most appropriate here for optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures using a back-testing procedure applied to technical indicators in order to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform this indicator alone. To do this, the FP-Tree algorithm has been preferred, as it appears to be the most efficient algorithm for performing this task.
Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree
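The signature idea can be sketched with a brute-force frequent-pattern count over toy "market configurations". A real implementation would use an FP-Tree as the paper does, but on a handful of items exhaustive enumeration yields the same frequent itemsets; the signal names and support threshold below are invented.

```python
from itertools import combinations
from collections import Counter

# Each transaction is the set of technical signals observed on one day.
transactions = [
    {"ma_cross_up", "rsi_oversold", "volume_spike"},
    {"ma_cross_up", "rsi_oversold"},
    {"ma_cross_up", "volume_spike"},
    {"rsi_oversold", "volume_spike"},
    {"ma_cross_up", "rsi_oversold", "volume_spike"},
]
MIN_SUPPORT = 3  # a pattern must appear on at least 3 of the 5 days

def frequent_patterns(txns, min_support, max_size=3):
    """Count every itemset up to max_size and keep those meeting support."""
    counts = Counter()
    for t in txns:
        for k in range(1, max_size + 1):
            for combo in combinations(sorted(t), k):
                counts[combo] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

patterns = frequent_patterns(transactions, MIN_SUPPORT)
```

Each surviving itemset is a candidate "signature" of a market configuration; the back-testing step would then check whether an indicator restricted to days matching a signature outperforms the indicator alone.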
Procedia PDF Downloads 361
5373 Congruency of English Teachers’ Assessments Vis-à-Vis 21st Century Skills Assessment Standards
Authors: Mary Jane Suarez
Abstract:
A massive educational overhaul has taken place at the onset of the 21st century, addressing the mismatch between employability skills and the scholastic skills taught in schools. For a community to thrive in an ever-developing economy, every educational institution should teach the skills necessary for job competencies. However, in harnessing 21st-century skills amongst learners, teachers, who often lack familiarity and thorough insight into the emerging 21st-century skills, are constrained both by the need to comprehend the characteristics of 21st-century skills learning and by the requisite to implement the tenets of 21st-century skills teaching. In the endeavor to espouse 21st-century skills learning and teaching, a United States-based national coalition, the Partnership for 21st Century Skills (P21), has identified the four most important skills in 21st-century learning: critical thinking, communication, collaboration, and creativity and innovation, with an established framework for 21st-century skills standards. Assessment of skills is the lifeblood of every teaching and learning encounter. It is correspondingly crucial to look at the 21st-century standards and assessment guides recognized by P21 to ensure that learners are 21st-century ready. This mixed-method study sought to discover and describe the classroom assessments used by English teachers in a public secondary school in the Philippines with course offerings in science, technology, engineering, and mathematics (STEM). The research evaluated the assessment tools implemented by English teachers and how congruent these tools were with the 21st-century assessment standards of P21. A convergent parallel design was used to analyze assessment tools and practices in four phases.
In the data-gathering phase, survey questionnaires, document reviews, interviews, and classroom observations were used to gather quantitative and qualitative data simultaneously, examining how assessment tools and practices were consistent with the P21 framework with the four Cs as its foci. In the analysis phase, the data were treated using mean, frequency, and percentage. In the merging and interpretation phases, a side-by-side comparison was used to identify convergent and divergent aspects of the results. In conclusion, the results revealed assessment tools and practices that teachers used inconsistently, if at all. Findings showed inconsistencies in implementing authentic assessments, a scarcity of rubrics for critically assessing 21st-century skills in both language and literature subjects, incongruencies in using portfolio and self-reflective assessments, the exclusion of intercultural aspects in assessing the four Cs, and a lack of integration of collaboration in formative and summative assessments. As a recommendation, a harmonized assessment scheme of P21 skills was fashioned for teachers to plan, implement, and monitor classroom assessments of 21st-century skills, ensuring the alignment of such assessments with P21 standards in furtherance of the institution's thrust to effectively integrate 21st-century skills assessment standards into its curricula.
Keywords: 21st-century skills, 21st-century skills assessments, assessment standards, congruency, four Cs
Procedia PDF Downloads 193
5372 UV-Vis Spectroscopy as a Tool for Online Tar Measurements in Wood Gasification Processes
Authors: Philip Edinger, Christian Ludwig
Abstract:
The formation and control of tars remain among the major challenges in the implementation of biomass gasification technologies. Robust, on-line analytical methods are needed to investigate the fate of tar compounds when different measures for their reduction are applied. This work establishes an on-line UV-Vis method, based on a liquid quench sampling system, to monitor tar compounds in biomass gasification processes. Spectra recorded from the liquid phase were analyzed for their tar composition by means of classical least squares (CLS) and partial least squares (PLS) approaches. This allowed the detection of UV-Vis-active tar compounds with detection limits in the low parts-per-million-by-volume (ppmV) region. The developed method was then applied to two case studies. The first involved a lab-scale reactor, intended to investigate the decomposition of a limited number of tar compounds across a catalyst. The second involved a gas scrubber that is part of a pilot-scale wood gasification plant. Tar compound quantification results showed good agreement with off-line reference methods (GC-FID) when the complexity of the tar composition was limited. The two case studies show that the developed method can provide rapid, qualitative information on tar composition for the purpose of process monitoring. In cases with a limited number of tar species, quantitative information about individual tar compound concentrations is an additional benefit of the analytical method.
Keywords: biomass gasification, on-line, tar, UV-Vis
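The CLS step can be illustrated with a minimal two-component example: a measured spectrum is modeled as a linear combination of pure-component reference spectra, and the concentrations follow from the normal equations. The reference vectors below are made up for the sketch, not measured tar spectra.

```python
# Two made-up pure-component reference spectra, sampled at four wavelengths.
ref_a = [0.10, 0.40, 0.80, 0.30]   # "tar compound A"
ref_b = [0.50, 0.20, 0.10, 0.60]   # "tar compound B"

# Synthesize a noise-free measurement: 2.0 units of A plus 1.5 units of B.
measured = [2.0 * a + 1.5 * b for a, b in zip(ref_a, ref_b)]

def cls_two_components(s1, s2, y):
    """Solve the 2x2 normal equations (S^T S) c = S^T y in closed form."""
    a11 = sum(x * x for x in s1)
    a22 = sum(x * x for x in s2)
    a12 = sum(x * z for x, z in zip(s1, s2))
    b1 = sum(x * z for x, z in zip(s1, y))
    b2 = sum(x * z for x, z in zip(s2, y))
    det = a11 * a22 - a12 * a12
    return (a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det

c_a, c_b = cls_two_components(ref_a, ref_b, measured)
print(c_a, c_b)
```

With real spectra the fit is over many wavelengths and the residual indicates unmodeled absorbers, which is where the PLS variant becomes useful.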
Procedia PDF Downloads 259
5371 European Food Safety Authority (EFSA) Safety Assessment of Food Additives: Data and Methodology Used for the Assessment of Dietary Exposure for Different European Countries and Population Groups
Authors: Petra Gergelova, Sofia Ioannidou, Davide Arcella, Alexandra Tard, Polly E. Boon, Oliver Lindtner, Christina Tlustos, Jean-Charles Leblanc
Abstract:
Objectives: To assess chronic dietary exposure to food additives in different European countries and population groups. Method and Design: The European Food Safety Authority’s (EFSA) Panel on Food Additives and Nutrient Sources added to Food (ANS) estimates chronic dietary exposure to food additives with the purpose of re-evaluating food additives that were previously authorized in Europe. For this, EFSA uses concentration values (usage and/or analytical occurrence data) reported through regular public calls for data by food industry and European countries. These are combined, at individual level, with national food consumption data from the EFSA Comprehensive European Food Consumption Database including data from 33 dietary surveys from 19 European countries and considering six different population groups (infants, toddlers, children, adolescents, adults and the elderly). EFSA ANS Panel estimates dietary exposure for each individual in the EFSA Comprehensive Database by combining the occurrence levels per food group with their corresponding consumption amount per kg body weight. An individual average exposure per day is calculated, resulting in distributions of individual exposures per survey and population group. Based on these distributions, the average and 95th percentile of exposure is calculated per survey and per population group. Dietary exposure is assessed based on two different sets of data: (a) Maximum permitted levels (MPLs) of use set down in the EU legislation (defined as regulatory maximum level exposure assessment scenario) and (b) usage levels and/or analytical occurrence data (defined as refined exposure assessment scenario). The refined exposure assessment scenario is sub-divided into the brand-loyal consumer scenario and the non-brand-loyal consumer scenario. 
For the brand-loyal consumer scenario, the consumer is assumed to be exposed on a long-term basis to the highest reported usage/analytical level for one food group, and to the mean level for the remaining food groups. For the non-brand-loyal consumer scenario, the consumer is assumed to be exposed on a long-term basis to the mean reported usage/analytical level for all food groups. Additional exposure from sources other than the direct addition of food additives (i.e., natural presence, contaminants, and carriers of food additives) is also estimated, as appropriate. Results: Since 2014, this methodology has been applied in about 30 food additive exposure assessments conducted as part of scientific opinions of the EFSA ANS Panel. For example, under the non-brand-loyal scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers, at up to 5.9 and 8.7 mg/kg body weight/day, respectively. The same estimates under the brand-loyal scenario in toddlers resulted in exposures of 8.1 and 20.7 mg/kg body weight/day, respectively. For the regulatory maximum level exposure assessment scenario, the highest 95th percentile of exposure to α-tocopherol (E 307) and ammonium phosphatides (E 442) was estimated in toddlers, at up to 11.9 and 30.3 mg/kg body weight/day, respectively. Conclusions: Detailed and up-to-date information on food additive concentration values (usage and/or analytical occurrence data) and food consumption data enables the assessment of chronic dietary exposure to food additives at more realistic levels.
Keywords: α-tocopherol, ammonium phosphatides, dietary exposure assessment, European Food Safety Authority, food additives, food consumption data
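The per-individual calculation behind the two refined scenarios can be sketched as follows; the food groups, occurrence levels, and consumption amounts are hypothetical illustrations, not EFSA data.

```python
# Hypothetical occurrence levels (mg additive per kg food) and one
# individual's consumption (g food per kg body weight per day).
occurrence = {"beverages":     {"mean": 40.0, "max": 120.0},
              "confectionery": {"mean": 15.0, "max": 60.0}}
consumption = {"beverages": 20.0, "confectionery": 5.0}

def exposure(levels):
    """Chronic exposure in mg additive / kg body weight / day
    (g food converted to kg via the factor 1000)."""
    return sum(levels[g] * consumption[g] / 1000.0 for g in consumption)

# Non-brand-loyal: mean reported level for every food group.
non_brand_loyal = exposure({g: v["mean"] for g, v in occurrence.items()})

# Brand-loyal: the highest reported level for the single food group that
# maximizes exposure, mean levels for the remaining groups.
brand_loyal = max(
    exposure({g: (v["max"] if g == loyal else v["mean"])
              for g, v in occurrence.items()})
    for loyal in occurrence)

print(non_brand_loyal, brand_loyal)
```

Repeating this for every individual in a survey yields the exposure distribution from which the mean and 95th percentile per population group are taken.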
Procedia PDF Downloads 325
5370 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods
Authors: A. Senthil Kumar, V. Murali Bhaskaran
Abstract:
In the information technology arena, people use various tools and software for official and personal purposes. Nowadays, choosing data access and extraction tools when buying and selling products is a concern, along with quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved existing problems. The proposed Multidirectional Rank Prediction (MDRP) decision-making algorithm is designed to support effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of the existing rank prediction methods.
Keywords: Knowledge Discovery in Databases (KDD), Multidirectional Rank Prediction (MDRP), Pearson's Correlation Coefficient (PCC), Vector Space Similarity (VSS)
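Of the reference measures listed, Pearson's Correlation Coefficient (PCC) is the easiest to show concretely: it scores the similarity between two users' ratings of common products, the building block of user-based collaborative filtering. The ratings below are toy values, not the textile dataset.

```python
from math import sqrt

def pearson(x, y):
    """Pearson's correlation coefficient between two equal-length
    rating vectors over co-rated items."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy 1-5 ratings of four co-rated products by two users.
user_a = [5, 3, 4, 4]
user_b = [4, 2, 4, 3]
similarity = pearson(user_a, user_b)
print(similarity)
```

A collaborative filter would compute this similarity against every neighbor and predict unseen ratings as a similarity-weighted average.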
Procedia PDF Downloads 286
5369 Facilitated Massive Open Online Course (MOOC) Based Teacher Professional Development in Kazakhstan: Connectivism-Oriented Practices
Authors: A. Kalizhanova, T. Shelestova
Abstract:
Teacher professional development (TPD) in Kazakhstan has followed a fairly standard format for decades, with teachers learning new information from a lecturer and being tested using multiple-choice questions. In the online world, self-access courses have become increasingly popular. Due to their extensive multimedia content, peer-reviewed assignments, adaptable class times, and instruction from top university faculty from across the world, massive open online courses (MOOCs) have found a home in Kazakhstan's system for lifelong learning. Recent studies indicate the limited use of connectivism-based tools, such as discussion forums, by Kazakhstani pre-service and in-service English teachers, whose professional interests are limited to obtaining certificates rather than enhancing their teaching abilities and exchanging knowledge with colleagues. This paper highlights the significance of connectivism-based tools and instruments, such as MOOCs, for the continuous professional development of pre- and in-service English teachers, facilitators' roles, and their strategies for enhancing trainees' conceptual knowledge within the MOOCs' curriculum and online learning skills. A code-extraction method was applied to a review of the most pertinent papers on connectivism theory, facilitators' roles in TPD, and connectivism-based tools such as MOOCs. Three experts, former active participants in a series of projects initiated across Kazakhstan to improve the efficacy of MOOCs, evaluated the excerpts and selected the most appropriate ones to propose a matrix of teacher professional competencies that can be acquired through MOOCs. This paper examines some of the strategies employed by course instructors to boost their students' English skills and knowledge of course material, both inside and outside the MOOC platform.
Small-group interactive learning contributed to participants' language and subject-matter conceptual knowledge and prepared them for peer-reviewed assignments in the MOOCs. Both formal and informal continuing-education institutions can use the findings of this study to support teachers in gaining experience with MOOCs and creating their own online courses.
Keywords: connectivism-based tools, teacher professional development, massive open online courses, facilitators, Kazakhstani context
Procedia PDF Downloads 80
5368 A Multicriteria Framework for Assessing Energy Audit Software for Low-Income Households
Authors: Charles Amoo, Joshua New, Bill Eckman
Abstract:
Buildings in the United States account for a significant proportion of energy consumption and greenhouse gas (GHG) emissions, and this trend is expected to continue in the near future. Low-income households, in particular, bear a disproportionate burden of high building energy consumption and spending due to high energy costs. Energy efficiency improvements need to reach an average of 4% per year in this decade in order to meet the global net-zero emissions target by 2050, but less than 1% of U.S. buildings are improved each year. The government has recognized the importance of technology in addressing this issue, and energy efficiency programs have been developed to tackle the problem. The Weatherization Assistance Program (WAP), the largest residential whole-house energy efficiency program in the U.S., is specifically designed to reduce energy costs for low-income households. Under the WAP, energy auditors must follow specific audit procedures and use Department of Energy (DOE)-approved energy audit tools or software. This article proposes an expanded framework of factors that should be considered in energy audit software approved for use in energy efficiency programs, particularly for low-income households. The framework includes more than 50 factors organized under 14 assessment criteria and can be used to qualitatively and quantitatively score different energy audit software packages to determine their suitability for specific energy efficiency programs. While the tool can be useful for developers building new tools and improving existing software, and for energy efficiency program administrators approving or certifying tools for use, the model has limitations, such as a lack of flexibility in continuous scoring to accommodate variability and subjectivity.
These limitations can be addressed by using the aggregate scores of each criterion as weights, combined with value-function and direct-rating scores in a multicriteria decision analysis, for more flexible scoring.
Keywords: buildings, energy efficiency, energy audit, software
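The weighted aggregation suggested above can be sketched as a simple weighted-sum model; the criterion names, weights, and direct ratings below are illustrative stand-ins, not the framework's actual 14 criteria.

```python
# Hypothetical criterion weights (summing to 1) and 0-10 direct ratings
# of two energy audit software packages; all values are illustrative.
weights = {"usability": 0.30, "accuracy": 0.45, "interoperability": 0.25}
ratings = {
    "Tool A": {"usability": 8, "accuracy": 6, "interoperability": 9},
    "Tool B": {"usability": 5, "accuracy": 9, "interoperability": 6},
}

def score(tool_ratings):
    """Weighted-sum score: sum of (criterion weight x direct rating)."""
    return sum(weights[c] * r for c, r in tool_ratings.items())

ranked = sorted(ratings, key=lambda t: score(ratings[t]), reverse=True)
for tool in ranked:
    print(tool, round(score(ratings[tool]), 2))
```

A fuller multicriteria decision analysis would replace the raw ratings with value functions so that differently scaled criteria remain comparable.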
Procedia PDF Downloads 77
5367 Future Design and Innovative Economic Models for Futuristic Markets in Developing Countries
Authors: Nessreen Y. Ibrahim
Abstract:
Designing the future according to a realistic analytical study of future market needs can be a milestone strategy for major improvement in the economies of developing countries. In developing countries, access to high technology and the latest scientific approaches is very limited. Financial problems in low- and middle-income countries have negative effects on the kind and quality of new technologies imported and applied in their markets. Thus, there is a strong need for a paradigm shift in the design process to improve and evolve their development strategy. This paper discusses future possibilities in developing countries, and how they can design their own future according to specific Future Design Models (FDM), established to solve particular economic problems as well as political and cultural conflicts. FDM is a strategic-thinking framework that provides improvement in both content and process. The content includes beliefs, values, mission, purpose, conceptual frameworks, research, and practice, while the process includes design methodology, design systems, and design management tools. The main objective of this paper was to build an innovative economic model to design a chosen possible futuristic scenario: understanding the market's future needs, analyzing the real-world setting, solving the model questions by future-driven design, and finally interpreting the results to discuss to what extent they can be transferred to the real world. The paper discusses Egypt as a potential case study. Since Egypt has highly complex economic problems, extremely dynamic political factors, and very rich cultural aspects, it is a very challenging example for applying FDM. The paper's results recommend using FDM numerical modeling as a starting point for designing the future.
Keywords: developing countries, economic models, future design, possible futures
Procedia PDF Downloads 267
5366 Characterization of (GRAS37) Gibberellin Acid Insensitive (GAI), Repressor (RGA), and Scarecrow (SCR) Gene by Using Bioinformatics Tools
Authors: Yusra Tariq
Abstract:
The GRAS37 gene is presently known in tomatoes, which are a source of healthy substances such as ascorbic acid, polyphenols, carotenoids, and other nutrients and have a significant impact on human growth and development. GRAS37 belongs to a family of plant transcription factors that play significant roles in responses to different abiotic stresses (drought, salinity, thermal stress, and light), which can strongly affect growth. Tomatoes are very sensitive to temperature, and their growth and production are optimal in a temperature range of 21 °C to 29.5 °C during the daytime and 18.5 °C to 21 °C at night. This protein acts as a positive regulator of the salt stress response and abscisic acid signaling. This study characterizes the structure of the gene, including its molecular formula and protein-binding domains, using bioinformatics tools such as the ExPASy Translate tool, ExPASy ProtParam, Swiss-Prot, InterProScan, and Clustal W, along with the regulatory behavior of GRAS gene components and their responses to both biotic and abiotic stresses.
Keywords: GRAS37, gene, bioinformatics, tool
Procedia PDF Downloads 53
5365 Cyber-Softbook: A Platform for Collaborative Content Development and Delivery for Cybersecurity Education
Authors: Eniye Tebekaemi, Martin Zhao
Abstract:
The gap between the skill set of newly minted college graduates and the skills required by cybersecurity employers is widening. Colleges are struggling to cope with the rapid pace of technology evolution using outdated tools and practices, while industries grow frustrated at having to retrain fresh college graduates in skills they should already have acquired. There is a dire need for academic institutions to develop new tools and systems for delivering cybersecurity education that meet the ever-evolving technology demands of the industry. The Cyber-Softbook project's goal is to bridge the gap between the tech industry and tech education by providing educators a framework to collaboratively design, manage, and deliver cybersecurity academic courses that meet the needs of the tech industry. The Cyber-Softbook framework, when developed, will provide a platform for academic institutions and tech industries to collaborate on tech education, and for students to learn about cybersecurity with all the resources they need to understand concepts and gain valuable skills available on a single platform.
Keywords: cybersecurity, education, skills, labs, curriculum
Procedia PDF Downloads 92
5364 Social Studies Teachers Experiences in Teaching Spatial Thinking in Social Studies Classrooms in Kuwait: Exploratory Study
Authors: Huda Alazmi
Abstract:
Social studies educational research has, so far, devoted very little attention to spatial thinking in classroom teaching. To help address this paucity, this study explores the spatial thinking instructional experiences of middle school social studies teachers in Kuwait. The goal is to learn about their teaching practices and assess teachers' understanding of the spatial thinking concept to enable future improvements. Using a qualitative approach, the researcher conducted semi-structured interviews to examine the relevant experiences of 14 social studies teachers. The findings revealed three major themes: (1) concepts of space, (2) tools of representation, and (3) spatial reasoning. These themes illustrated how social studies teachers focus predominantly on simple concepts of space and use multiple tools of representation but avoid addressing critical spatial reasoning. The findings help explain the current situation while identifying weaker areas for further analysis and improvement.
Keywords: spatial thinking, concepts of space, spatial representation, spatial reasoning
Procedia PDF Downloads 79
5363 A Unified Approach for Digital Forensics Analysis
Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles
Abstract:
Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints it leaves, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT) devices, databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and a good number of data sources have little to no specialist forensic tooling. It is also increasingly essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely fashion without having to trawl through it and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to work cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool
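One simple form of the automated cross-source correlation envisioned here is grouping events from heterogeneous sources into common time windows; the sources and events below are entirely hypothetical, not drawn from a real case.

```python
from datetime import datetime, timedelta

# Hypothetical events from heterogeneous sources, as (source, time, note).
events = [
    ("cctv",   datetime(2023, 5, 1, 14, 2),  "vehicle leaves car park"),
    ("mobile", datetime(2023, 5, 1, 14, 3),  "phone connects to cell tower 17"),
    ("isp",    datetime(2023, 5, 1, 18, 45), "home router goes offline"),
]

def correlate(events, window=timedelta(minutes=5)):
    """Cluster time-sorted events whose successive gaps fit in the window,
    so evidence from different sources lands in the same cluster."""
    ordered = sorted(events, key=lambda e: e[1])
    clusters, current = [], [ordered[0]]
    for ev in ordered[1:]:
        if ev[1] - current[-1][1] <= window:
            current.append(ev)
        else:
            clusters.append(current)
            current = [ev]
    clusters.append(current)
    return clusters

clusters = correlate(events)
print(clusters)
```

A production system would correlate on identities and locations as well as time, which is exactly where a common language for electronic information pays off.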
Procedia PDF Downloads 196
5362 Identify Users Behavior from Mobile Web Access Logs Using Automated Log Analyzer
Authors: Bharat P. Modi, Jayesh M. Patel
Abstract:
The mobile Internet is a major source of data. As the number of web pages continues to grow, the mobile web provides data miners with just the right ingredients for extracting information. To cater to this growing need, a special term, mobile web mining, was coined. Mobile web mining makes use of data mining techniques to decipher potentially useful information from web data. Web usage mining deals with understanding the behavior of users through mobile web access logs generated on the server while the user is accessing the website. A web access log comprises various entries, such as the name of the user, the IP address, the number of bytes transferred, a timestamp, etc. A variety of log analyzer tools exist that help analyze users' navigational patterns, the parts of the website users are most interested in, and more. The present paper uses one such log analyzer tool, Mobile Web Log Expert, to ascertain the behavior of users who access an astrology website. It also provides a comparative study of a few available log analyzer tools.
Keywords: mobile web access logs, web usage mining, web server, log analyzer
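The per-entry parsing a log analyzer performs can be sketched for the Common Log Format; the sample lines below are fabricated, and tools such as Mobile Web Log Expert add far richer reporting on top of this kind of extraction.

```python
import re
from collections import Counter

# Common Log Format: ip ident user [timestamp] "method path proto" status bytes
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\d+)'
)

# Fabricated sample entries from a hypothetical astrology site.
lines = [
    '10.0.0.1 - alice [01/May/2023:10:00:00 +0000] "GET /horoscope/aries HTTP/1.1" 200 5120',
    '10.0.0.2 - - [01/May/2023:10:00:05 +0000] "GET /horoscope/leo HTTP/1.1" 200 4096',
    '10.0.0.1 - alice [01/May/2023:10:01:00 +0000] "GET /horoscope/aries HTTP/1.1" 200 5120',
]

hits = Counter()
for line in lines:
    m = LOG_RE.match(line)
    if m:
        hits[m.group("path")] += 1   # tally page popularity

most_viewed, count = hits.most_common(1)[0]
print(most_viewed, count)
```

From the same parsed fields one can also reconstruct per-user navigational paths by grouping entries on IP or user and sorting by timestamp.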
Procedia PDF Downloads 361
5361 Machine Learning for Feature Selection and Classification of Systemic Lupus Erythematosus
Authors: H. Zidoum, A. AlShareedah, S. Al Sawafi, A. Al-Ansari, B. Al Lawati
Abstract:
Systemic lupus erythematosus (SLE) is an autoimmune disease with genetic and environmental components. SLE is characterized by a wide variability of clinical manifestations and a course frequently subject to unpredictable flares. Despite recent progress in classification tools, early diagnosis of SLE is still an unmet need for many patients. This study proposes an interpretable disease classification model that combines the efficient, high predictive performance of CatBoost with the model-agnostic interpretation tools of SHapley Additive exPlanations (SHAP). The CatBoost model was trained on a local cohort of 219 Omani patients with SLE as well as other control diseases. The SHAP library was then used to generate individual explanations of the model's decisions and to rank clinical features by contribution. Overall, we achieved an AUC score of 0.945 and an F1-score of 0.92, and identified four clinical features (alopecia, renal disorders, cutaneous lupus, and hemolytic anemia), along with the patient's age, that were shown to contribute most to the prediction.
Keywords: feature selection, classification, systemic lupus erythematosus, model interpretation, SHAP, CatBoost
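The two reported metrics, AUC and F1, can be computed from first principles; the labels and scores below form a toy prediction set, not the study's cohort results, and the AUC uses the Mann-Whitney formulation.

```python
# Toy prediction set: 1 = SLE, 0 = control; scores are model probabilities.
labels = [1, 1, 1, 0, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.35, 0.3, 0.1]

def auc(labels, scores):
    """ROC AUC as the probability that a random positive outranks a
    random negative (ties count one half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def f1(labels, scores, threshold=0.5):
    """F1 = 2TP / (2TP + FP + FN) after thresholding the scores."""
    preds = [1 if s >= threshold else 0 for s in scores]
    tp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 1)
    fp = sum(1 for p, y in zip(preds, labels) if p == 1 and y == 0)
    fn = sum(1 for p, y in zip(preds, labels) if p == 0 and y == 1)
    return 2 * tp / (2 * tp + fp + fn)

print(auc(labels, scores), f1(labels, scores))
```

On the study's real data these values would come from the CatBoost model's held-out predictions rather than hand-written lists.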
Procedia PDF Downloads 84
5360 Seismic Assessment of a Pre-Cast Recycled Concrete Block Arch System
Authors: Amaia Martinez Martinez, Martin Turek, Carlos Ventura, Jay Drew
Abstract:
This study aims to assess the seismic performance of arch and dome structural systems made from easy-to-assemble precast blocks of recycled concrete. These systems have been developed by the Lock Block Ltd. company of Vancouver, Canada, as an extension of their currently used retaining wall system. The seismic behavior of these structures is characterized by a combination of experimental static and dynamic testing and analytical modeling. For the experimental testing, several tilt tests as well as a program of shake-table testing were undertaken using small-scale arch models. A suite of earthquakes with different characteristics from important past events was chosen and scaled appropriately for the dynamic testing. Shake-table tests applying the ground motions in one direction only (the weak direction of the arch) and in all three directions were conducted and compared. The models were tested with increasing intensity until collapse occurred, which determines the failure level for each earthquake. Since the failure intensity varied with the type of earthquake, a sensitivity analysis of the different parameters was performed, with impulses found to be the dominant factor. In all cases, the arches exhibited the typical four-hinge failure mechanism, which was also reproduced by the analytical model. Experimental testing was also performed with the arches reinforced by a steel band placed over the structure and anchored at both ends of the arch. The models were tested at different pretension levels. The bands were instrumented with strain gauges to measure the force produced by the shaking; these forces were used to develop engineering guidelines for the design of the reinforcement needed for these systems. In addition, an analytical discrete element model was created using the 3DEC software. The blocks were modeled as rigid blocks, with all properties assigned to the joints, including the contribution of the interlocking shear key between blocks.
The model is calibrated to the experimental static tests and validated against the results obtained from the dynamic tests. The model can then be used to scale the results up to the full-scale structure and to extend them to different configurations and boundary conditions.
Keywords: arch, discrete element model, seismic assessment, shake-table testing
Procedia PDF Downloads 206