Search results for: optimizing tools
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4767

4317 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process

Authors: Sonia Anand Knowlton

Abstract:

There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being.

Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm

Procedia PDF Downloads 81
4316 Optimizing Protection of Medieval Glass Mosaic

Authors: J. Valach, S. Pospisil, S. Kuznecov

Abstract:

The paper deals with the experimental estimation of the future environmental load on the medieval mosaic of the Last Judgement at the entrance to St. Vitus Cathedral at Prague Castle. The mosaic suffers from seasonal changes in weather patterns, as well as from rain and its acidity, the deposition of dust and soot particles from polluted air, and freeze-thaw cycles. These phenomena influence the state of the mosaic. The mosaic elements, tesserae, are mostly made of glass prone to weathering. To determine the best future maintenance procedure, the relation between various weather scenarios and their effects on the mosaic was investigated. At the same time, a local method for evaluating the protective coating was developed. Together, both methods will contribute to better care for the mosaic and to visitors' aesthetic experience.

Keywords: environmental load, cultural heritage, glass mosaic, protection

Procedia PDF Downloads 280
4315 The Urban Project: Metropolization Tool and Sustainability Vector - Case of Constantine

Authors: Mouhoubi Nedjima, Sassi Boudemagh Souad, Chouabbia Khedidja

Abstract:

Cities, whether large or small, grow and seek to gain a place in market competition, in which the product being sold is the city itself. Metropolises, large cities enjoying a legal status and assets that extend their influence over a territory larger than their own boundaries, do not escape this situation. Thus, the search for a tool that promises metropolises better development and durability while meeting economic, social, and environmental challenges is timely. The urban project is a new way of building the city; it intervenes in metropolises in two ways: either to manage crises and meet the internal needs of the metropolis, or to create regional attractiveness from its potential. This communication addresses the urban project as a tool that has, and should find, a place in the panoply of existing institutional tools. Based on the example of the modernization project of the metropolis of eastern Algeria, Constantine, we examine what the urban project can bring to a city and the extent of its impact, as well as the relationship between the actors' visions needed to make metropolization a success.

Keywords: urban project, metropolis, institutional tools, Constantine

Procedia PDF Downloads 403
4314 Comparing Energy Labelling of Buildings in Spain

Authors: Carolina Aparicio-Fernández, Alejandro Vilar Abad, Mar Cañada Soriano, Jose-Luis Vivancos

Abstract:

The building sector is responsible for 40% of the total energy consumption in the European Union (EU). Thus, implementing strategies for quantifying and reducing buildings' energy consumption is indispensable for reaching the EU's carbon neutrality and energy efficiency goals. Each Member State has transposed the European Directives according to its own peculiarities: existing technical legislation, constructive solutions, climatic zones, etc. Therefore, in accordance with the Energy Performance of Buildings Directive, Member States have developed different Energy Performance Certificate schemes, using the energy simulation software tools proposed for each national or regional area. Energy Performance Certificates provide powerful and comprehensive information to predict, analyze, and improve the energy demand of new and existing buildings. Energy simulation software and databases allow a better understanding of the current constructive reality of the European building stock. However, Energy Performance Certificates still face several issues before they can be considered a reliable and global source of information, since different calculation tools are used that do not allow connection between them. In this work, TRNSYS (TRaNsient SYstem Simulation program) software is used to calculate the energy demand of a building, and the result is compared with the energy labelling obtained with the official Spanish software tools. We demonstrate the possibility of using non-official software tools to calculate the Energy Performance Certificate. Thus, this approach could be used throughout the EU and the results compared across all the cases proposed by the EU Member States. To implement the simulations, an isolated single-family house with different construction solutions is considered. Results are obtained for every climatic zone of the Spanish Technical Building Code.

Keywords: energy demand, energy performance certificate EPBD, trnsys, buildings

Procedia PDF Downloads 127
4313 Optimizing Machine Vision System Setup Accuracy by Six-Sigma DMAIC Approach

Authors: Joseph C. Chen

Abstract:

Machine vision systems provide automatic inspection that can reduce manufacturing costs considerably. However, only a few principles have been established to optimize a machine vision system and help it function more accurately in industrial practice. Most existing design techniques for improving the accuracy of machine vision systems are complicated and impractical. This paper discusses implementing the Six Sigma Define, Measure, Analyze, Improve, and Control (DMAIC) approach to optimize the setup parameters of a machine vision system when it is used as a direct measurement technique. The research follows a case study showing how the Six Sigma DMAIC methodology has been put into use.
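
As an illustration of the Measure/Analyze steps, the sketch below computes process capability indices from repeated vision-system measurements; the sample readings and specification limits are hypothetical, and this is not the code used in the case study.

```python
import numpy as np

# Hypothetical repeated measurements (mm) of one feature taken by the vision system
measurements = np.array([25.02, 24.98, 25.01, 24.99, 25.03, 25.00, 24.97, 25.02])

# Hypothetical specification limits for that feature (mm)
LSL, USL = 24.90, 25.10

mu = measurements.mean()
sigma = measurements.std(ddof=1)  # sample standard deviation

cp = (USL - LSL) / (6 * sigma)                   # potential capability
cpk = min(USL - mu, mu - LSL) / (3 * sigma)      # actual capability (accounts for centering)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```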

Keywords: DMAIC, machine vision system, process capability, Taguchi Parameter Design

Procedia PDF Downloads 437
4312 Carbide Structure and Fracture Toughness of High Speed Tool Steels

Authors: Jung-Ho Moon, Tae Kwon Ha

Abstract:

M2 steel, the typical Co-free high-speed steel (HSS) with a hardness of 63–65 HRC, is the most widely used steel for cutting tools. On the other hand, Co-containing HSSs, such as M35 and M42, show a higher hardness of 65–67 HRC and are used for high-quality cutting tools. In the fabrication of HSSs, it is very important to control the cleanliness and eutectic carbide structure of the ingot while increasing productivity at the same time. Production of HSS ingots includes a variety of processes such as casting, electro-slag remelting (ESR), forging, blooming, and wire rod rolling. In the present study, the electro-slag rapid remelting (ESRR) process, an advanced ESR process combined with continuous casting, was successfully employed to fabricate HSS billets of M2, M35, and M42 steels. The distribution and structure of the eutectic carbides in the billets were analysed, and the cleanliness, hardness, and composition profiles of the billets were also evaluated.

Keywords: high speed tool steel, eutectic carbide, microstructure, hardness, fracture toughness

Procedia PDF Downloads 445
4311 Planning and Implementing Large-Scale Ecological Connectivity: A Review of Past and Ongoing Practices in Turkey

Authors: Tutku Ak, A. Esra Cengiz, Çiğdem Ayhan Kaptan

Abstract:

The conservation community has been increasingly promoting the concept of ecological connectivity for the prevention and mitigation of landscape fragmentation. Many tools have been proposed for this purpose, not only in Europe but also around the world. Spatial planning for building connectivity, however, faces many problems associated with the complexity of ecological processes at spatial and temporal scales. Furthermore, on-the-ground implementation can be very difficult, potentially leading to ecologically disastrous results and a waste of resources. These problems, on the other hand, can be avoided or rectified as more experience is gained with implementation. Therefore, the objective of this study is to document the experience gained with connectivity planning in Turkish landscapes. This paper is a preliminary review of the conservation initiatives and projects aimed at protecting and building ecological connectivity in and around Turkey. The objective is to scope existing conservation plans, tools, and implementation approaches in Turkey, and the ultimate goal is to understand to what degree they have been implemented and what constraints and opportunities are being faced.

Keywords: ecological connectivity, large-scale landscapes, planning and implementation, Turkey

Procedia PDF Downloads 502
4310 The Capabilities of New Communication Devices in Development of Informing: Case Study Mobile Functions in Iran

Authors: Mohsen Shakerinejad

Abstract:

Due to the growing momentum of technology, the present age is called the age of communication and information. With the astounding progress of communication and information tools, the current world is likened to a "global village" in which a message can be sent from one point of the world to another in less than a minute. However, one of the newer sociologists, Alain Touraine, in describing the destructive effects of the changes arising from the development of information appliances, refers to "new fields for undemocratic social control and the incidence of acute social and political tensions and unrest." Yet in this era, in which the advancement of industry has made people's lives industrial as well, transferring data quickly and accurately breathes new life into the body of society, and various tools should be used according to the features of each society and the progress of science and technology. One of these communication tools is the mobile phone. The cellular phone, as the communication and telecommunication revolution of recent years, has had a great influence on the individual and collective life of societies. This powerful communication tool has had an undeniable effect on all aspects of life, including the social, economic, cultural, and scientific, so that ignoring it in the design, implementation, and enforcement of any system is unwise. Nowadays, knowledge and information are among the most important aspects of human life. Therefore, this article introduces the potential of the mobile phone for receiving and transmitting news and information. Among the numerous capabilities of current mobile phones, features such as sending text, photography, sound recording, filming, and Internet connectivity indicate the potential of this medium in the process of sending and receiving information, so much so that mobile journalism, as an important component of citizen journalism, now has a unique role in information dissemination.

Keywords: mobile, informing, receiving information, mobile journalism, citizen journalism

Procedia PDF Downloads 410
4309 A Single Cell Omics Experiments as Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools

Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri

Abstract:

The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation. Therefore, the establishment of benchmarking datasets is necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Benchmarking bioinformatics tools for single-cell experiments can be costly due to the high expense involved. Therefore, datasets used for benchmarking are typically sourced from publicly available experiments, which often lack a comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools to depict the heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer tumor cell lines that display resistant clones upon treatment of EGFR mutated tumors and are characterized by driver genes, namely ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using 10XGenomics platform with cellplex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the Biolegend TotalSeq™-B Human Universal Cocktail (CITEseq). This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from the aforementioned experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as an output. The tool provides the cell line derivation for each cell and cell annotations for the pseudo-microenvironment based on CITEseq data by an experienced immunologist. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in matrigel. These tissues were analyzed using 10XGenomics (FFPE samples) and Curio Bioscience (fresh frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.

Keywords: single cell omics, benchmark, spatial transcriptomics, CITEseq

Procedia PDF Downloads 117
4308 Integrated Environmental Management System and Environmental Impact Assessment in Evaluation of Environmental Protective Action

Authors: Moustafa Osman

Abstract:

The paper describes and analyses different good-practice examples of protection levels and initiative actions ("framework conditions") and encourages the uptake of environmental management systems (EMSs) by small and medium-sized enterprises (SMEs). Most industries tend to adopt EMSs as tools leading towards sustainability planning. The application of these tools entails numerous environmental obligations, but it neither suggests decisions nor recommends what a company should ultimately achieve. Instead, it sets up clearly defined criteria for evaluating environmental protective action (EEPA) through sustainability indicators. The physical integration evaluates how to incorporate traditional knowledge into baseline information, prepare impact predictions, and plan mitigation measures under monitoring conditions. Thereby, joint efforts between government, industry, and community direct protective action towards reconciling present needs with those of future generations, meeting the goal of sustainable development. The paper discusses how to set out distinct aspects of sustainability indicators and how they reflect inputs, outputs, and modes of impact on the environment.

Keywords: environmental management, sustainability, indicators, protective action

Procedia PDF Downloads 443
4307 Analysis of Critical Success Factors for Implementing Industry 4.0 and Circular Economy to Enhance Food Traceability

Authors: Mahsa Pishdar

Abstract:

Food traceability through the supply chain is facing increased demand. The Internet of Things (IoT) and blockchain are among the tools under consideration in the Industry 4.0 era that could be integrated to help implement Circular Economy (CE) principles while enhancing food traceability solutions. However, such tools need an intellectual system and an infrastructure to be settled as guidance along the way, helping to overcome obstacles. That is why the critical success factors for implementing Industry 4.0 and circular economy principles in the food traceability context are analyzed in this paper by a combination of the interval type-2 fuzzy Best Worst Method and Measurement of Alternatives and Ranking according to Compromise Solution (interval type-2 fuzzy BWM-MARCOS). Results indicate that "Knowledge of Industry 4.0 obligations and CE principles" is the most important factor and the basis of success, followed by "Management commitment and support". This will assist decision-makers in achieving success and gaining a competitive advantage while reducing costs throughout the supply chain.
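
For illustration, the sketch below applies a crisp (non-fuzzy) MARCOS ranking to a small hypothetical factor/criteria matrix; the interval type-2 fuzzy extension and the BWM-derived weights used in the paper are not reproduced.

```python
import numpy as np

# Rows: hypothetical critical success factors; columns: evaluation criteria
X = np.array([[8.0, 7.0, 9.0],
              [6.0, 8.0, 7.0],
              [7.0, 6.0, 8.0]])
weights = np.array([0.5, 0.3, 0.2])      # hypothetical criterion weights (sum to 1)
benefit = np.array([True, True, True])   # all criteria treated as benefit-type here

# Extend the matrix with ideal (AI) and anti-ideal (AAI) solutions
ai = np.where(benefit, X.max(axis=0), X.min(axis=0))
aai = np.where(benefit, X.min(axis=0), X.max(axis=0))

def normalize(row):
    # Normalize against the ideal solution (benefit vs cost criteria)
    return np.where(benefit, row / ai, ai / row)

S = np.array([(normalize(row) * weights).sum() for row in X])
S_ai = (normalize(ai) * weights).sum()
S_aai = (normalize(aai) * weights).sum()

# Utility degrees and the final utility function of each alternative
K_plus, K_minus = S / S_ai, S / S_aai
f_plus = K_minus / (K_plus + K_minus)
f_minus = K_plus / (K_plus + K_minus)
f_K = (K_plus + K_minus) / (1 + (1 - f_plus) / f_plus + (1 - f_minus) / f_minus)

print("Ranking (best first):", np.argsort(-f_K))
```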

Keywords: food traceability, industry 4.0, internet of things, block chain, best worst method, marcos

Procedia PDF Downloads 207
4306 MFCA: An Environmental Management Accounting Technique for Optimal Resource Efficiency in Production Processes

Authors: Omolola A. Tajelawi, Hari L. Garbharran

Abstract:

Revenue leakages are one of the major challenges manufacturers face in production processes, as most of the input materials that should emerge from the lines as products are lost as waste. Rather than generating income from material input that is meant to end up as products, further losses are incurred as costs in order to manage the waste generated. In addition, due to the lack of a clear view of the flow of resources on the lines from the input to the output stage, acquiring information on the true cost of the waste generated has become a challenge. This has given rise to the conceptualization and implementation of waste minimization strategies by several manufacturing industries. This paper reviews the principles and applications of three environmental management accounting tools, namely Activity-Based Costing (ABC), Life-Cycle Assessment (LCA), and Material Flow Cost Accounting (MFCA), in the manufacturing industry and their effectiveness in curbing revenue leakages. The paper unveils the strengths and limitations of each of the tools, beaming a searchlight on the tool that could allow for optimal resource utilization, transparency in the production process, and improved cost efficiency. Findings from this review reveal that MFCA may offer superior advantages with regard to providing more detailed information (in both physical and monetary terms) on the flow of material inputs throughout the production process compared to the other environmental accounting tools. This paper therefore makes a case for the adoption of MFCA as a viable technique for the identification and reduction of waste in production processes, and also for effective decision-making by production managers, financial advisors, and other relevant stakeholders.
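
The core MFCA idea of allocating material and system costs between products and material losses in proportion to the physical flow can be sketched in a few lines; all quantities and costs below are purely illustrative.

```python
# Hypothetical quantity centre: 1,000 kg of material enters, 850 kg leaves as product
material_in_kg = 1000.0
product_out_kg = 850.0
loss_kg = material_in_kg - product_out_kg

# Hypothetical costs for the quantity centre
material_cost = 5000.0   # cost of material input
system_cost = 2000.0     # labour, energy, depreciation, etc.
waste_mgmt_cost = 300.0  # cost of handling the waste stream

# MFCA allocates material and system costs by the physical flow ratio;
# waste management cost is attributed entirely to the material losses
product_ratio = product_out_kg / material_in_kg
loss_ratio = loss_kg / material_in_kg

product_cost = (material_cost + system_cost) * product_ratio
loss_cost = (material_cost + system_cost) * loss_ratio + waste_mgmt_cost

print(f"Cost carried by products:        {product_cost:,.2f}")
print(f"Cost of material losses (waste): {loss_cost:,.2f}")
```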

Keywords: MFCA, environmental management accounting, resource efficiency, waste reduction, revenue losses

Procedia PDF Downloads 336
4305 Frequent-Pattern Tree Algorithm Application to S&P and Equity Indexes

Authors: E. Younsi, H. Andriamboavonjy, A. David, S. Dokou, B. Lemrabet

Abstract:

Software and time optimization are very important factors in financial markets, which are highly competitive fields, and the emergence of new computer tools further intensifies the challenge. In this context, any improvement of the technical indicators that generate buy or sell signals is a major issue. Thus, many tools have been created to make them more effective. This concern for efficiency leads, in the present paper, to a search for the best (and most innovative) way of obtaining the largest improvement in these indicators. The approach consists of attaching a signature to frequent market configurations by applying a frequent-pattern extraction method, which is well suited here to optimizing investment strategies. The goal of the proposed trading algorithm is to find the most accurate signatures using a back-testing procedure applied to technical indicators in order to improve their performance. The problem is then to determine the signatures which, combined with an indicator, outperform this indicator alone. To do this, the FP-Tree algorithm has been preferred, as it appears to be the most efficient algorithm for performing this task.
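
A minimal sketch of frequent-pattern extraction over symbolic market configurations is shown below, using the FP-growth implementation from mlxtend; the daily indicator states are made up and the back-testing layer is not reproduced.

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import fpgrowth

# Hypothetical daily market configurations encoded as sets of indicator states
transactions = [
    ["RSI_oversold", "MACD_bull_cross", "close_above_SMA50"],
    ["RSI_oversold", "close_above_SMA50"],
    ["MACD_bull_cross", "close_above_SMA50", "volume_spike"],
    ["RSI_oversold", "MACD_bull_cross", "close_above_SMA50", "volume_spike"],
]

te = TransactionEncoder()
onehot = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

# Frequent itemsets ("signatures") appearing on at least 50% of the days
signatures = fpgrowth(onehot, min_support=0.5, use_colnames=True)
print(signatures.sort_values("support", ascending=False))
```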

Keywords: quantitative analysis, back-testing, computational models, apriori algorithm, pattern recognition, data mining, FP-tree

Procedia PDF Downloads 361
4304 Congruency of English Teachers’ Assessments Vis-à-Vis 21st Century Skills Assessment Standards

Authors: Mary Jane Suarez

Abstract:

A massive educational overhaul has taken place at the onset of the 21st century, addressing the mismatches between employability skills and the scholastic skills taught in schools. For a community to thrive in an ever-developing economy, the teaching of the skills necessary for job competencies should be realized by every educational institution. However, in harnessing 21st-century skills amongst learners, teachers, who often lack familiarity with and thorough insight into the emerging 21st-century skills, are constrained both by the need to comprehend the characteristics of 21st-century skills learning and by the requirement to implement the tenets of 21st-century skills teaching. To espouse 21st-century skills learning and teaching, a United States-based national coalition called the Partnership for 21st Century Skills (P21) has identified the four most important skills in 21st-century learning: critical thinking, communication, collaboration, and creativity and innovation, together with an established framework for 21st-century skills standards. Assessment of skills is the lifeblood of every teaching and learning encounter. It is correspondingly crucial to look at the 21st-century standards and the assessment guides recognized by P21 to ensure that learners are 21st-century ready. This mixed-method study sought to discover and describe the classroom assessments used by English teachers in a public secondary school in the Philippines with course offerings in science, technology, engineering, and mathematics (STEM). The research evaluated the assessment tools implemented by English teachers and how these assessment tools were congruent with the 21st-century assessment standards of P21. A convergent parallel design was used to analyze assessment tools and practices in four phases. In the data-gathering phase, survey questionnaires, document reviews, interviews, and classroom observations were used to gather quantitative and qualitative data simultaneously, examining how assessment tools and practices were consistent with the P21 framework with the four Cs as its foci. In the analysis phase, the data were treated using mean, frequency, and percentage. In the merging and interpretation phases, a side-by-side comparison was used to identify convergent and divergent aspects of the results. In conclusion, the results revealed assessment tools and practices that were inconsistently used, if used at all, by teachers. Findings showed inconsistencies in implementing authentic assessments, a scarcity of rubrics for critically assessing 21st-century skills in both language and literature subjects, incongruencies in using portfolio and self-reflective assessments, the exclusion of intercultural aspects in assessing the four Cs, and a lack of integration of collaboration in formative and summative assessments. As a recommendation, a harmonized assessment scheme for P21 skills was fashioned for teachers to plan, implement, and monitor classroom assessments of 21st-century skills, ensuring the alignment of such assessments with P21 standards for the furtherance of the institution's thrust to effectively integrate 21st-century skills assessment standards into its curricula.

Keywords: 21st-century skills, 21st-century skills assessments, assessment standards, congruency, four Cs

Procedia PDF Downloads 193
4303 Data Clustering Algorithm Based on Multi-Objective Periodic Bacterial Foraging Optimization with Two Learning Archives

Authors: Chen Guo, Heng Tang, Ben Niu

Abstract:

Clustering splits objects into different groups based on similarity, so that objects have higher similarity within the same group and lower similarity across different groups. Thus, clustering can be treated as an optimization problem that maximizes intra-cluster similarity or inter-cluster dissimilarity. In real-world applications, datasets often have complex characteristics: sparsity, overlap, high dimensionality, etc. When facing such datasets, simultaneously optimizing two or more objectives can obtain better clustering results than optimizing a single objective. However, apart from objective-weighting methods, traditional clustering approaches have difficulty solving multi-objective data clustering problems. For this reason, evolutionary multi-objective optimization algorithms have been investigated by researchers to optimize multiple clustering objectives. In this paper, the Data Clustering algorithm based on Multi-objective Periodic Bacterial Foraging Optimization with two Learning Archives (DC-MPBFOLA) is proposed. Specifically, first, to reduce the high computational complexity of the original BFO, periodic BFO is employed as the basic algorithmic framework and is then extended to a multi-objective form. Second, two learning strategies are proposed based on the two learning archives to guide the bacterial swarm to move in a better direction. On the one hand, the global best is selected from the global learning archive according to the convergence index and diversity index. On the other hand, the personal best is selected from the personal learning archive according to the sum of weighted objectives. Based on these learning strategies, a chemotaxis operation is designed. Third, an elite learning strategy is designed to provide fresh impetus to the objects in the two learning archives: when the objects in these two archives do not change for two consecutive iterations, randomly reinitializing one dimension of the objects prevents the proposed algorithm from falling into local optima. Fourth, to validate the performance of the proposed algorithm, DC-MPBFOLA is compared with four state-of-the-art evolutionary multi-objective optimization algorithms and one classical clustering algorithm on several evaluation indexes and datasets. To further verify the effectiveness and feasibility of the strategies designed in DC-MPBFOLA, variants of DC-MPBFOLA are also proposed. Experimental results demonstrate that DC-MPBFOLA outperforms its competitors in all evaluation indexes and clustering partitions. These results also indicate that the designed strategies positively influence the performance improvement of the original BFO.
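
The two competing clustering objectives mentioned above (intra-cluster compactness and inter-cluster separation) can be illustrated with a short sketch; the data and partition are synthetic, and the bacterial foraging optimizer itself is not reproduced.

```python
import numpy as np

def clustering_objectives(X, labels):
    """Return (intra-cluster compactness, inter-cluster separation) for one partition."""
    centroids = np.array([X[labels == k].mean(axis=0) for k in np.unique(labels)])
    # Objective 1: mean distance of points to their own centroid (to be minimized)
    compactness = np.mean([np.linalg.norm(x - centroids[k]) for x, k in zip(X, labels)])
    # Objective 2: minimum distance between distinct centroids (to be maximized)
    dists = [np.linalg.norm(a - b) for i, a in enumerate(centroids)
             for j, b in enumerate(centroids) if i < j]
    return compactness, min(dists)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
labels = np.array([0] * 50 + [1] * 50)
print(clustering_objectives(X, labels))
```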

Keywords: data clustering, multi-objective optimization, bacterial foraging optimization, learning archives

Procedia PDF Downloads 139
4302 Multidirectional Product Support System for Decision Making in Textile Industry Using Collaborative Filtering Methods

Authors: A. Senthil Kumar, V. Murali Bhaskaran

Abstract:

In the field of information technology, people use various tools and software for official and personal purposes. Nowadays, people worry about choosing data access and extraction tools when buying and selling their products, and also about various quality factors such as price, durability, color, size, and availability of the product. The main purpose of this research study is to find solutions to these unsolved, existing problems. The proposed algorithm is a Multidirectional Rank Prediction (MDRP) decision-making algorithm for taking effective strategic decisions at all levels of data extraction; it uses a real-time textile dataset and analyzes the results. Finally, the results are obtained and compared with existing measurement methods such as PCC, SLCF, and VSS. The resulting accuracy is higher than that of existing rank prediction methods.
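
A minimal user-based collaborative filtering sketch using the Pearson Correlation Coefficient, one of the baseline measures the study compares against, is shown below; the rating matrix is hypothetical and the MDRP algorithm itself is not reproduced.

```python
import numpy as np

# Hypothetical user x product rating matrix (0 = not rated)
R = np.array([
    [5, 3, 0, 4],
    [4, 2, 4, 3],
    [1, 1, 0, 2],
], dtype=float)

def pearson(u, v):
    """Pearson correlation over items that both users have rated."""
    mask = (u > 0) & (v > 0)
    if mask.sum() < 2:
        return 0.0
    return np.corrcoef(u[mask], v[mask])[0, 1]

def predict(R, user, item, k=2):
    """Predict a missing rating as a similarity-weighted average of other users' ratings."""
    sims = np.array([pearson(R[user], R[other]) if other != user else -np.inf
                     for other in range(R.shape[0])])
    neighbours = np.argsort(-sims)[:k]
    num = sum(sims[n] * R[n, item] for n in neighbours if R[n, item] > 0)
    den = sum(abs(sims[n]) for n in neighbours if R[n, item] > 0)
    return num / den if den else R[user][R[user] > 0].mean()

print(predict(R, user=0, item=2))
```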

Keywords: Knowledge Discovery in Database (KDD), Multidirectional Rank Prediction (MDRP), Pearson’s Correlation Coefficient (PCC), VSS (Vector Space Similarity)

Procedia PDF Downloads 286
4301 Facilitated Massive Open Online Course (MOOC) Based Teacher Professional Development in Kazakhstan: Connectivism-Oriented Practices

Authors: A. Kalizhanova, T. Shelestova

Abstract:

Teacher professional development (TPD) in Kazakhstan has followed a fairly standard format for centuries, with teachers learning new information from a lecturer and being tested using multiple-choice questions. In the online world, self-access courses have become increasingly popular. Due to their extensive multimedia content, peer-reviewed assignments, adaptable class times, and instruction from top university faculty from across the world, massive open online courses (MOOCs) have found a home in Kazakhstan's system for lifelong learning. Recent studies indicate the limited use of connectivism-based tools such as discussion forums by Kazakhstani pre-service and in-service English teachers, whose professional interests are limited to obtaining certificates rather than enhancing their teaching abilities and exchanging knowledge with colleagues. This paper highlights the significance of connectivism-based tools and instruments, such as MOOCs, for the continuous professional development of pre- and in-service English teachers, facilitators' roles, and their strategies for enhancing trainees' conceptual knowledge within the MOOCs' curriculum and online learning skills. Reviewing the most pertinent papers on Connectivism Theory, facilitators' function in TPD, and connectivism-based tools, such as MOOCs, a code extraction method was utilized. Three experts, former active participants in a series of projects initiated across Kazakhstan to improve the efficacy of MOOCs, evaluated the excerpts and selected the most appropriate ones to propose the matrix of teacher professional competencies that can be acquired through MOOCs. In this paper, we'll look at some of the strategies employed by course instructors to boost their students' English skills and knowledge of course material, both inside and outside of the MOOC platform. Participants' interactive learning contributed to their language and subject conceptual knowledge and prepared them for peer-reviewed assignments in the MOOCs, and this approach of small group interaction was given to highlight the outcomes of participants' interactive learning. Both formal and informal continuing education institutions can use the findings of this study to support teachers in gaining experience with MOOCs and creating their own online courses.

Keywords: connectivism-based tools, teacher professional development, massive open online courses, facilitators, Kazakhstani context

Procedia PDF Downloads 80
4300 A Multicriteria Framework for Assessing Energy Audit Software for Low-Income Households

Authors: Charles Amoo, Joshua New, Bill Eckman

Abstract:

Buildings in the United States account for a significant proportion of energy consumption and greenhouse gas (GHG) emissions, and this trend is expected to continue as well as rise in the near future. Low-income households, in particular, bear a disproportionate burden of high building energy consumption and spending due to high energy costs. Energy efficiency improvements need to reach an average of 4% per year in this decade in order to meet global net zero emissions target by 2050, but less than 1 % of U.S. buildings are improved each year. The government has recognized the importance of technology in addressing this issue, and energy efficiency programs have been developed to tackle the problem. The Weatherization Assistance Program (WAP), the largest residential whole-house energy efficiency program in the U.S., is specifically designed to reduce energy costs for low-income households. Under the WAP, energy auditors must follow specific audit procedures and use Department of Energy (DOE) approved energy audit tools or software. This article proposes an expanded framework of factors that should be considered in energy audit software that is approved for use in energy efficiency programs, particularly for low-income households. The framework includes more than 50 factors organized under 14 assessment criteria and can be used to qualitatively and quantitatively score different energy audit software to determine their suitability for specific energy efficiency programs. While the tool can be useful for developers to build new tools and improve existing software, as well as for energy efficiency program administrators to approve or certify tools for use, there are limitations to the model, such as the lack of flexibility that allows continuous scoring to accommodate variability and subjectivity. These limitations can be addressed by using aggregate scores of each criterion as weights that could be combined with value function and direct rating scores in a multicriteria decision analysis for a more flexible scoring.
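
A minimal weighted-sum sketch of the kind of multicriteria aggregation described above is given below; the criteria, weights, and scores are hypothetical placeholders rather than the framework's actual 14 assessment criteria.

```python
# Hypothetical per-criterion scores (0-10) for two energy audit tools
scores = {
    "ToolA": {"data_inputs": 8, "usability": 6, "reporting": 7, "interoperability": 5},
    "ToolB": {"data_inputs": 6, "usability": 9, "reporting": 8, "interoperability": 7},
}

# Hypothetical criterion weights reflecting program priorities (sum to 1.0)
weights = {"data_inputs": 0.4, "usability": 0.2, "reporting": 0.2, "interoperability": 0.2}

def weighted_score(tool_scores, weights):
    """Aggregate criterion scores into a single suitability score."""
    return sum(weights[c] * s for c, s in tool_scores.items())

for tool, s in scores.items():
    print(tool, round(weighted_score(s, weights), 2))
```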

Keywords: buildings, energy efficiency, energy audit, software

Procedia PDF Downloads 77
4299 Optimization for Autonomous Robotic Construction by Visual Guidance through Machine Learning

Authors: Yangzhi Li

Abstract:

Network transfer of information and performance customization is now a viable method of digital industrial production in the era of Industry 4.0. Robot platforms and network platforms have grown more important in digital design and construction. The pressing need for novel building techniques is driven by the growing labor-scarcity problem and by increased awareness of construction safety. Robotic approaches in construction research are regarded as an extension of operational and production tools. Several technological theories related to robot autonomous recognition, including high-performance computing, physical system modeling, extensive sensor coordination, and dataset deep learning, have not yet been fully explored for intelligent construction, and relevant transdisciplinary theory and practice research still has specific gaps. Optimizing high-performance computing and autonomous visual guidance technologies improves the robot's grasp of the scene and its capacity for autonomous operation. Intelligent vision guidance for industrial robots faces a serious camera-calibration issue, and the use of intelligent visual guidance and identification technologies in industrial production carries strict accuracy requirements; visual recognition systems therefore face precision challenges that directly impact the effectiveness and standard of industrial production, necessitating stronger research on positioning precision in recognition technology. To best facilitate the handling of complicated components, an approach for the visual recognition of parts utilizing machine learning algorithms is proposed. This study identifies the position of target components by detecting information at the boundaries and corners of a dense point cloud and determining the aspect ratio in accordance with the guidelines for the modularization of building components. To collect and use components, the operational processing system assigns them to the same coordinate system based on their locations and postures. Inclination detection on the RGB image and verification against the depth image are used to determine a component's current posture. Finally, a virtual environment model for the robot's obstacle-avoidance route is constructed using the point cloud information.
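
A minimal sketch of the aspect-ratio step, estimating a component's extents from a dense point cloud, is shown below on synthetic data; the full recognition pipeline (boundary/corner detection, RGB-D posture verification) is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic dense point cloud of a box-like building component (2.0 x 0.4 x 0.2 m)
points = rng.uniform(low=[0.0, 0.0, 0.0], high=[2.0, 0.4, 0.2], size=(5000, 3))

# Axis-aligned bounding box from the point extents
mins, maxs = points.min(axis=0), points.max(axis=0)
extents = maxs - mins                      # length, width, height
aspect_ratio = extents.max() / extents.min()

print("extents (m):", np.round(extents, 3))
print("aspect ratio:", round(float(aspect_ratio), 2))
```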

Keywords: robotic construction, robotic assembly, visual guidance, machine learning

Procedia PDF Downloads 86
4298 Application of Neural Network on the Loading of Copper onto Clinoptilolite

Authors: John Kabuba

Abstract:

The study investigated the implementation of Neural Network (NN) techniques for predicting the loading of Cu ions onto clinoptilolite. An experimental design using analysis of variance (ANOVA) was chosen for testing the adequacy of the Neural Network and for optimizing the effective input parameters (pH, temperature, and initial concentration). A feed-forward, multi-layer perceptron (MLP) NN successfully tracked the non-linear behavior of the adsorption process versus the input parameters, with a mean squared error (MSE), correlation coefficient (R), and mean squared relative error (MSRE) of 0.102, 0.998, and 0.004, respectively. The results showed that NN modeling techniques can effectively predict and simulate highly complex, non-linear processes such as ion exchange.
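
A minimal scikit-learn sketch of a feed-forward MLP mapping (pH, temperature, initial concentration) to Cu loading is shown below; the synthetic data only stand in for the experimental ion-exchange measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
# Synthetic stand-in data: columns are pH, temperature (deg C), initial Cu concentration (mg/L)
X = np.column_stack([rng.uniform(2, 8, 200),
                     rng.uniform(20, 60, 200),
                     rng.uniform(50, 500, 200)])
# Hypothetical non-linear response standing in for the measured Cu loading
y = 0.02 * X[:, 2] * (1 - np.exp(-0.5 * X[:, 0])) + 0.01 * X[:, 1] + rng.normal(0, 0.5, 200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
print("MSE:", round(mean_squared_error(y_te, pred), 3))
print("R  :", round(float(np.corrcoef(y_te, pred)[0, 1]), 3))
```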

Keywords: clinoptilolite, loading, modeling, neural network

Procedia PDF Downloads 415
4297 Characterization of (GRAS37) Gibberellin Acid Insensitive (GAI), Repressor (RGA), and Scarecrow (SCR) Gene by Using Bioinformatics Tools

Authors: Yusra Tariq

Abstract:

The GRAS37 gene is presently known in tomatoes, which are a source of health-promoting substances such as ascorbic acid, polyphenols, carotenoids, and other nutrients, and which therefore have a significant impact on human growth and development. GRAS37 belongs to a family of plant transcription factors that play significant roles in the responses to various abiotic stresses (such as drought, salinity, thermal stress, and light), which can strongly affect growth. Tomatoes are very sensitive to temperature, and their growth and production are optimal in a temperature range from 21 °C to 29.5 °C during the daytime and from 18.5 °C to 21 °C during the night. The encoded protein acts as a positive regulator of the salt stress response and of abscisic acid signaling. This study summarizes the structure of the protein, characterized by its molecular formula and protein-binding domains, using different bioinformatics tools such as the Expasy Translate tool, Expasy ProtParam, Swiss-Prot, InterProScan, and Clustal W, as well as the regulatory behaviour of GRAS gene components and their responses to both biotic and abiotic stresses.
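
The kind of physicochemical characterization performed with ProtParam can also be scripted; the Biopython sketch below uses a short made-up peptide sequence in place of the actual GRAS37 protein.

```python
from Bio.SeqUtils.ProtParam import ProteinAnalysis

# Placeholder peptide sequence -- substitute the translated GRAS37 protein sequence
seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"

pa = ProteinAnalysis(seq)
print("Length:               ", len(seq))
print("Molecular weight (Da):", round(pa.molecular_weight(), 1))
print("Isoelectric point:    ", round(pa.isoelectric_point(), 2))
print("Instability index:    ", round(pa.instability_index(), 2))
print("GRAVY:                ", round(pa.gravy(), 3))
```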

Keywords: GRAS37, gene, bioinformatics, tool

Procedia PDF Downloads 54
4296 Investigation of Amorphous Silicon A-Si Thin Films Deposited on Silicon Substrate by Raman Spectroscopy

Authors: Amirouche Hammouda, Nacer Boucherou, Aicha Ziouche, Hayet Boudjellal

Abstract:

Silicon has excellent physical and electrical properties for the optoelectronics industry; it is a promising material with many advantages. In Raman characterization of thin films deposited on a crystalline silicon substrate, the Raman signal of the amorphous silicon is often disturbed by that of the crystalline silicon substrate. In this paper, we characterize thin layers of amorphous silicon deposited on crystalline silicon substrates. The results obtained show that it is possible to bring out the Raman spectrum of the deposited layers by optimizing the experimental parameters.
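
A common way to separate the two contributions is to fit the spectrum with a broad amorphous band near 480 cm⁻¹ and a sharp crystalline peak near 520 cm⁻¹; the sketch below does this on synthetic data and is not the authors' processing chain.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, centre, width):
    return amp * np.exp(-((x - centre) ** 2) / (2 * width ** 2))

def two_peaks(x, a1, c1, w1, a2, c2, w2):
    # Broad amorphous-Si band (~480 cm^-1) + narrow crystalline-Si peak (~520 cm^-1)
    return gaussian(x, a1, c1, w1) + gaussian(x, a2, c2, w2)

# Synthetic spectrum standing in for a measured a-Si/c-Si Raman signal
x = np.linspace(400, 560, 400)
rng = np.random.default_rng(0)
y = two_peaks(x, 40, 480, 30, 100, 520, 4) + rng.normal(0, 1.5, x.size)

p0 = [30, 480, 25, 80, 520, 5]  # initial guesses for the fit
popt, _ = curve_fit(two_peaks, x, y, p0=p0)
a_si_area = popt[0] * popt[2] * np.sqrt(2 * np.pi)
c_si_area = popt[3] * popt[5] * np.sqrt(2 * np.pi)
print("Amorphous-band area fraction (proxy):", round(a_si_area / (a_si_area + c_si_area), 3))
```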

Keywords: raman scattering, amorphous silicon, crystalline silicon, thin films

Procedia PDF Downloads 73
4295 Cyber-Softbook: A Platform for Collaborative Content Development and Delivery for Cybersecurity Education

Authors: Eniye Tebekaemi, Martin Zhao

Abstract:

The gap between the skill set of newly minted college graduates and the skills required by cybersecurity employers is widening. Colleges are struggling to cope with the rapid pace of technology evolution using outdated tools and practices. Industry is getting frustrated by the need to retrain fresh college graduates in skills they should already have acquired. There is a dire need for academic institutions to develop new tools and systems to deliver cybersecurity education that meets the ever-evolving technology demands of the industry. The Cyber-Softbook project's goal is to bridge the gap between the tech industry and tech education by providing educators a framework to collaboratively design, manage, and deliver cybersecurity academic courses that meet the needs of the tech industry. The Cyber-Softbook framework, when developed, will provide a platform for academic institutions and tech industries to collaborate on tech education and for students to learn about cybersecurity with all the resources they need to understand concepts and gain valuable skills available on a single platform.

Keywords: cybersecurity, education, skills, labs, curriculum

Procedia PDF Downloads 92
4294 Social Studies Teachers Experiences in Teaching Spatial Thinking in Social Studies Classrooms in Kuwait: Exploratory Study

Authors: Huda Alazmi

Abstract:

Social studies educational research has, so far, devoted very little attention towards spatial thinking in classroom teaching. To help address such paucity, this study explores the spatial thinking instructional experiences of middle school social studies teachers in Kuwait. The goal is to learn their teaching practices and assess teacher understanding for the spatial thinking concept to enable future improvements. Using a qualitative study approach, the researcher conducted semi-structured interviews to examine the relevant experiences of 14 social studies teachers. The findings revealed three major themes: (1) concepts of space, (2) tools of representation, and (3) spatial reasoning. These themes illustrated how social studies teachers focus predominantly upon simple concepts of space, using multiple tools of representation, but avoid addressing critical spatial reasoning. The findings help explain the current situation while identifying weaker areas for further analysis and improvement.

Keywords: spatial thinking, concepts of space, spatial representation, spatial reasoning

Procedia PDF Downloads 79
4293 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints that consequently exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT) devices, databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and a good deal of data sources have little to no specialist forensic tooling. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely manner without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to work through cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used. For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially a large number of cameras), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
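
A minimal sketch of one such automated correlation, aligning events from two heterogeneous sources onto a common timeline with pandas, is shown below; the event data and field names are hypothetical, and U-FAT itself is only proposed here.

```python
import pandas as pd

# Hypothetical extracts from two evidence sources, already parsed to a common schema
phone = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-03-01 10:02:10", "2024-03-01 10:15:42"]),
    "event": ["call to +447700900123", "photo taken"],
    "source": "mobile phone",
})
cctv = pd.DataFrame({
    "timestamp": pd.to_datetime(["2024-03-01 10:03:05", "2024-03-01 10:16:20"]),
    "event": ["subject enters car park", "subject exits on foot"],
    "source": "CCTV",
})

# Merge both sources into a single ordered timeline
timeline = pd.concat([phone, cctv]).sort_values("timestamp").reset_index(drop=True)

# Correlate: pair each phone event with the nearest CCTV event within a 2-minute window
paired = pd.merge_asof(phone.sort_values("timestamp"), cctv.sort_values("timestamp"),
                       on="timestamp", direction="nearest",
                       tolerance=pd.Timedelta("2min"), suffixes=("_phone", "_cctv"))
print(timeline)
print(paired[["timestamp", "event_phone", "event_cctv"]])
```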

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 196
4292 Identify Users Behavior from Mobile Web Access Logs Using Automated Log Analyzer

Authors: Bharat P. Modi, Jayesh M. Patel

Abstract:

The mobile Internet is acting as a major source of data. As the number of web pages continues to grow, the mobile web provides data miners with just the right ingredients for extracting information. In order to cater to this growing need, a special term, Mobile Web mining, was coined. Mobile Web mining makes use of data mining techniques and deciphers potentially useful information from web data. Web usage mining deals with understanding the behavior of users by making use of Mobile Web access logs that are generated on the server while the user is accessing the website. A web access log comprises various entries such as the name of the user, the IP address, the number of bytes transferred, a timestamp, etc. A variety of log analyzer tools exist that help in analyzing users' navigational patterns, the parts of the website users are most interested in, etc. The present paper makes use of one such log analyzer tool, Mobile Web Log Expert, for ascertaining the behavior of users who access an astrology website. It also provides a comparative study of a few available log analyzer tools.
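
A minimal sketch of the kind of aggregation such log analyzers perform, parsing Common Log Format entries and summarizing hits per IP, is shown below; the log lines are made up.

```python
import re
from collections import Counter

# Made-up access log lines in Common Log Format
log_lines = [
    '203.0.113.7 - alice [10/Oct/2024:13:55:36 +0000] "GET /horoscope/aries HTTP/1.1" 200 2326',
    '203.0.113.7 - alice [10/Oct/2024:13:56:01 +0000] "GET /horoscope/daily HTTP/1.1" 200 1843',
    '198.51.100.4 - - [10/Oct/2024:13:57:12 +0000] "GET /horoscope/leo HTTP/1.1" 404 512',
]

pattern = re.compile(
    r'(?P<ip>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] "(?P<method>\S+) (?P<path>\S+) [^"]+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+)'
)

hits_per_ip, bytes_per_path = Counter(), Counter()
for line in log_lines:
    m = pattern.match(line)
    if m:
        hits_per_ip[m["ip"]] += 1
        bytes_per_path[m["path"]] += int(m["bytes"])

print("Hits per IP:", dict(hits_per_ip))
print("Bytes per path:", dict(bytes_per_path))
```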

Keywords: mobile web access logs, web usage mining, web server, log analyzer

Procedia PDF Downloads 362
4291 Machine Learning for Feature Selection and Classification of Systemic Lupus Erythematosus

Authors: H. Zidoum, A. AlShareedah, S. Al Sawafi, A. Al-Ansari, B. Al Lawati

Abstract:

Systemic lupus erythematosus (SLE) is an autoimmune disease with genetic and environmental components. SLE is characterized by a wide variability of clinical manifestations and a course frequently subject to unpredictable flares. Despite recent progress in classification tools, the early diagnosis of SLE is still an unmet need for many patients. This study proposes an interpretable disease classification model that combines the high and efficient predictive performance of CatBoost and the model-agnostic interpretation tools of Shapley Additive exPlanations (SHAP). The CatBoost model was trained on a local cohort of 219 Omani patients with SLE as well as other control diseases. Furthermore, the SHAP library was used to generate individual explanations of the model's decisions as well as rank clinical features by contribution. Overall, we achieved an AUC score of 0.945, F1-score of 0.92 and identified four clinical features (alopecia, renal disorders, cutaneous lupus, and hemolytic anemia) along with the patient's age that was shown to have the greatest contribution on the prediction.
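
A minimal sketch of the CatBoost + SHAP combination on synthetic stand-in data is shown below; the Omani cohort data are not available here, so the features and labels are simulated.

```python
import numpy as np
import pandas as pd
import shap
from catboost import CatBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, f1_score

rng = np.random.default_rng(0)
n = 400
# Synthetic clinical features loosely mirroring those named in the abstract
X = pd.DataFrame({
    "alopecia": rng.integers(0, 2, n),
    "renal_disorder": rng.integers(0, 2, n),
    "cutaneous_lupus": rng.integers(0, 2, n),
    "hemolytic_anemia": rng.integers(0, 2, n),
    "age": rng.integers(18, 70, n),
})
# Hypothetical label driven by a few of the features
logits = 1.5 * X["renal_disorder"] + 1.2 * X["cutaneous_lupus"] + 0.8 * X["alopecia"] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = CatBoostClassifier(iterations=300, depth=4, verbose=0, random_seed=0)
model.fit(X_tr, y_tr)

proba = model.predict_proba(X_te)[:, 1]
print("AUC:", round(roc_auc_score(y_te, proba), 3))
print("F1 :", round(f1_score(y_te, model.predict(X_te)), 3))

# SHAP explanations: rank features by mean absolute contribution
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_te)
importance = pd.Series(np.abs(shap_values).mean(axis=0), index=X.columns).sort_values(ascending=False)
print(importance)
```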

Keywords: feature selection, classification, systemic lupus erythematosus, model interpretation, SHAP, Catboost

Procedia PDF Downloads 84
4290 Enhance the Power of Sentiment Analysis

Authors: Yu Zhang, Pedro Desouza

Abstract:

Since big data has become substantially more accessible and manageable due to the development of powerful tools for dealing with unstructured data, people are eager to mine information from social media resources that could not be handled in the past. Sentiment analysis, as a novel branch of text mining, has in the last decade become increasingly important in marketing analysis, customer risk prediction and other fields. Scientists and researchers have undertaken significant work in creating and improving their sentiment models. In this paper, we present a concept of selecting appropriate classifiers based on the features and qualities of data sources by comparing the performances of five classifiers with three popular social media data sources: Twitter, Amazon Customer Reviews, and Movie Reviews. We introduced a couple of innovative models that outperform traditional sentiment classifiers for these data sources, and provide insights on how to further improve the predictive power of sentiment analysis. The modelling and testing work was done in R and Greenplum in-database analytic tools.
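
A minimal sketch comparing a few standard sentiment classifiers on a tiny made-up review set is shown below (in Python, whereas the original work was done in R and Greenplum); a real evaluation would use the Twitter, Amazon, and movie-review corpora mentioned above.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Tiny made-up labelled reviews (1 = positive, 0 = negative)
texts = [
    "great product, works perfectly", "terrible quality, broke in a week",
    "absolutely love it", "waste of money", "fast shipping and excellent value",
    "awful customer service", "best purchase this year", "do not recommend",
    "exceeded my expectations", "disappointing and overpriced",
    "solid build and easy to use", "stopped working after two days",
]
labels = [1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0]

classifiers = {
    "NaiveBayes": MultinomialNB(),
    "LogisticRegression": LogisticRegression(max_iter=1000),
    "LinearSVM": LinearSVC(),
}

for name, clf in classifiers.items():
    pipe = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    scores = cross_val_score(pipe, texts, labels, cv=3)
    print(f"{name:18s} mean accuracy = {scores.mean():.2f}")
```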

Keywords: sentiment analysis, social media, Twitter, Amazon, data mining, machine learning, text mining

Procedia PDF Downloads 353
4289 Engineering Optimization of Flexible Energy Absorbers

Authors: Reza Hedayati, Meysam Jahanbakhshi

Abstract:

Elastic energy absorbers, which consist of a ring-like plate and springs, can be a good choice for increasing the impact duration during an accident. In the current project, an energy absorber system is optimized using four optimization methods: Kuhn-Tucker, Sequential Linear Programming (SLP), Concurrent Subspace Design (CSD), and Pshenichny-Lim-Belegundu-Arora (PLBA). Solution time, convergence, programming length, and accuracy of the results were considered in order to find the best solution algorithm. Results showed the superiority of PLBA over the other algorithms.
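
As an illustration of the kind of constrained formulation these algorithms solve, the SciPy sketch below uses SLSQP (a sequential quadratic programming routine, not one of the four methods compared) to minimize absorber mass subject to a hypothetical minimum energy-absorption constraint; the objective, constraint, and bounds are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize

# Design variables: x = [plate thickness (mm), spring stiffness (kN/m)]
def mass(x):
    # Hypothetical mass model: a heavier plate and stiffer springs add mass
    return 2.0 * x[0] + 0.05 * x[1]

def absorbed_energy(x):
    # Hypothetical absorbed-energy model (J) as a function of the design variables
    return 15.0 * x[0] ** 0.8 + 4.0 * np.sqrt(x[1])

constraints = [{"type": "ineq", "fun": lambda x: absorbed_energy(x) - 120.0}]  # E >= 120 J
bounds = [(1.0, 10.0), (10.0, 400.0)]

result = minimize(mass, x0=[5.0, 100.0], method="SLSQP",
                  bounds=bounds, constraints=constraints)
print("optimal design:", np.round(result.x, 2), "mass:", round(result.fun, 2))
```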

Keywords: Concurrent Subspace Design (CSD), Kuhn-Tucker, Pshenichny-Lim-Belegundu-Arora (PLBA), Sequential Linear Programming (SLP)

Procedia PDF Downloads 399
4288 Stakeholder Management for Successful Software Projects

Authors: Kassem Saleh

Abstract:

An alarming number of software projects fail to deliver the required functionalities within the provided budget and timeframe and with the required quality. Some of the main reasons for this problem include bad stakeholder management, poor communications, and informal change management, which in turn stem from informal processes for identifying, engaging, and controlling stakeholders. Recently, to emphasize its importance, the Project Management Institute (PMI) updated the Project Management Body of Knowledge (PMBoK) to explicitly include the stakeholder management knowledge area. This knowledge area consists of four processes: identify stakeholders, plan stakeholder management, manage stakeholder engagement, and control stakeholder engagement. The use of appropriate techniques for stakeholder management in software projects will definitely lead to higher-quality and more successful software. In this paper, we describe some of the proven techniques that can be used during the execution of the four processes for stakeholder management. The development of collaboration tools for automating these processes is recommended; such tools need to be integrated into available software project management tools.

Keywords: project management, stakeholder management, software development, project management body of knowledge

Procedia PDF Downloads 311