Search results for: WEKA data mining tool
26249 The Effect of Computer-Based Formative Assessment on Learning Outcome
Authors: Van Thien NGO
Abstract:
The purpose of the study is to examine the effect of student response systems in computer-based formative assessment on learning outcomes. A backward-design course is used as the tool for collecting the necessary assessment evidence. The quasi-experimental research design involves collecting pretest and posttest data on students assigned to the control group and the experimental group. The sample consists of 150 college students randomly selected from two of the eight classes of electrical and electronics students at Cao Thang Technical College in Ho Chi Minh City, Vietnam. Findings from this research revealed that the experimental group, in which student response systems were applied, achieved better results than the control group, which did not use them. Results show that using student response systems for technology-based formative assessment is vital and meaningful not only for teachers but also for students in the teaching and learning process.
Keywords: student response system, computer-based formative assessment, learning outcome, backward design course
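The abstract does not name the statistical test used to compare the two groups; as a hedged illustration only, the sketch below assumes an independent-samples t-test on posttest scores, and the group sizes, score scale and effect-size measure are placeholders rather than details taken from the study.

```python
# Hypothetical sketch: comparing posttest scores of control vs. experimental groups.
# The test choice (Welch's t-test) and all numbers here are assumptions, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Placeholder data standing in for the 150 students' posttest scores (75 per group).
control = rng.normal(loc=65, scale=10, size=75)
experimental = rng.normal(loc=72, scale=10, size=75)

t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")

# Effect size (Cohen's d) for the group difference.
pooled_sd = np.sqrt((control.var(ddof=1) + experimental.var(ddof=1)) / 2)
d = (experimental.mean() - control.mean()) / pooled_sd
print(f"Cohen's d = {d:.2f}")
```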
Procedia PDF Downloads 133
26248 The Use of Technology in Theatrical Performances as a Tool of Audience's Engagement
Authors: Chrysoula Bousiouta
Abstract:
Throughout the history of theatre, technology has played an important role, both in influencing the relationship between performance and audience and in offering different kinds of experiences. The use of technology dates back to ancient times, when artifacts such as the "deus ex machina" were introduced in ancient Greek theatre. Taking into account the key techniques and experiences used throughout history, this paper investigates how technology, through new media, influences contemporary theatre. In the context of this research, technology is defined as projections, audio environments, video-projections, sensors and tele-connections, all alongside the performance, challenging audience participation. The theoretical framework of the research covers, besides the history of theatre, the theory of the "experience economy" that superseded the economy of services and goods. The research is based on the qualitative and comparative analysis of two case studies, Contact Theatre in Manchester (United Kingdom) and Bios in Athens (Greece). The data collection includes desk research and is complemented with semi-structured interviews. Building on the results of the research, one could claim that the intended experience of modern/contemporary theatre is that of engagement. In this context, technology, as defined above, plays a leading role in creating it. This experience passes through and exists in the middle of the realms of entertainment, education, estheticism and escapism. Furthermore, it is observed that nowadays theatre is not only about acting but also about performing; it is one where performances are unfinished without the participation of the audience. Both case studies try to achieve the experience of engagement through practices that promote the attraction of attention, the increase of imagination, interaction, intimacy and true activity. These practices are achieved through the script, the scenery, the language and the environment of a performance. Contact and Bios consider technology an intimate tool for accomplishing the above, and they make extended use of it. The research compiles a notable record of the technological techniques that modern theatres use. The use of technology, inside or outside the limits of film techniques, helps to rivet the attention of the audience, to make performances enjoyable, to give the sense of the "unfinished" or to address things that take place around the spectators and force them to take action, becoming spect-actors. The advantage of technology is that it can be used as a hook for interaction at all stages of a performance. Further research in the field could involve exploring alternative ways of binding technology and theatre or analyzing how the performance is perceived through the use of technological artifacts.
Keywords: experience of engagement, interactive theatre, modern theatre, performance, technology
Procedia PDF Downloads 250
26247 Academic Staff Perspective of Adoption of Augmented Reality in Teaching Practice to Support Students Learning Remotely in a Crisis Time in Higher Education
Authors: Ebtisam Alqahtani
Abstract:
The purpose of this study is to investigate academic staff perspectives on using Augmented Reality (AR) in teaching practice to support students learning remotely during the COVID-19 pandemic. The study adopted the DTPB theoretical model to guide the identification of key potential factors that could motivate academic staff to use or not use AR in teaching practices. A mixed-methods design was adopted for a better understanding of the study problem. A survey was completed by 851 academic staff, and this was followed by interviews with 20 academic staff. Statistical analyses were used to assess the survey data, and thematic analysis was used to assess the interview data. The study findings indicate that 75% of academic staff were aware of AR as a pedagogical tool and agreed on its potential benefits in teaching and learning practices. However, only 36% of academic staff use it in teaching and learning practice, and most of them agree with most of the potential barriers to adopting AR in educational environments. In addition, the study results indicate that 91% of them are planning to use it in the future. The most important factors motivating future use are the COVID-19 pandemic, hedonic motivation, and academic staff attitude. The perceptions of academic staff differed according to the universities they worked at, the faculties they worked in, and their gender. This study offers further empirical support for the DTPB model, as well as recommendations, based on the findings, to help higher education institutions implement the technology in their educational environments. The study of the necessity of AR technologies in the time of COVID-19 is unprecedented; the contribution is therefore both theoretical and practical.
Keywords: higher education, academic staff, AR technology as pedagogical tools, teaching and learning practice, benefits of AR, barriers to adopting AR, motivating factors to adopt AR
Procedia PDF Downloads 128
26246 Designing an Online Case-Based Library for Technology Integration in Teacher Education
Authors: Mustafa Tevfik Hebebci, Sirin Kucuk, Ismail Celik, A. Oguz Akturk, Ismail Sahin, Fetah Eren
Abstract:
The purpose of this paper is to introduce an interactive online case-study library website developed within a national project. The design goal of the website is to provide an interactive, enhanced, case-based and online educational resource for educators within the scope of the project. The ADDIE instructional design model was used in the development of the website for the interactive case-based library. The library is developed on a web-based platform, which is important in terms of the manageability, accessibility and updateability of data. Users are able to sort the displayed case studies by their titles, dates, ratings, view counts, etc. A usability test was applied and expert opinion was obtained for the evaluation of the website. This website is a tool to integrate technology into education. It is believed that this website will be beneficial for pre-service and in-service teachers in terms of their professional development.
Keywords: ADDIE, case-based library, design, technology integration
Procedia PDF Downloads 445
26245 Optimum Design of Steel Space Frames by Hybrid Teaching-Learning Based Optimization and Harmony Search Algorithms
Authors: Alper Akin, Ibrahim Aydogdu
Abstract:
This study presents a hybrid metaheuristic algorithm to obtain optimum designs for steel space buildings. The optimum design problem of three-dimensional steel frames is mathematically formulated according to the provisions of LRFD-AISC (Load and Resistance Factor Design of the American Institute of Steel Construction). Design constraints such as the strength requirements of structural members, the displacement limitations, the inter-story drift and the other structural constraints are derived from the LRFD-AISC specification. In this study, a hybrid algorithm combining teaching-learning based optimization (TLBO) and harmony search (HS) is employed to solve the stated optimum design problem. These algorithms are two of the recent additions to the metaheuristic techniques of numerical optimization and have proven to be efficient tools for solving discrete programming problems. Using the two algorithms in collaboration creates a more powerful tool and mitigates each other's weaknesses. To demonstrate the powerful performance of the presented hybrid algorithm, the optimum design of a large-scale steel building is presented and the results are compared to previously obtained results available in the literature.
Keywords: optimum structural design, hybrid techniques, teaching-learning based optimization, harmony search algorithm, minimum weight, steel space frame
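The abstract does not spell out the hybridization scheme, so the sketch below is only a minimal, hypothetical illustration of how a TLBO teacher/learner phase and an HS improvisation step can be alternated. The toy objective stands in for the real weight-minimization problem (the LRFD-AISC constraints and discrete section lists are not reproduced here), and all parameter values are assumptions.

```python
# Hypothetical sketch of a hybrid TLBO + HS loop on a toy continuous objective.
# The real problem (member sizing under LRFD-AISC constraints) is far more involved;
# every parameter value here is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(1)
dim, pop_size, iterations = 10, 30, 200
lower, upper = -10.0, 10.0

def objective(x):            # stand-in for structural weight + constraint penalties
    return np.sum(x ** 2)

pop = rng.uniform(lower, upper, size=(pop_size, dim))
fitness = np.apply_along_axis(objective, 1, pop)

for _ in range(iterations):
    # --- TLBO teacher phase: pull the class toward the best design ---
    teacher = pop[fitness.argmin()]
    mean = pop.mean(axis=0)
    tf = rng.integers(1, 3)                       # teaching factor (1 or 2)
    candidates = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean),
                         lower, upper)
    cand_fit = np.apply_along_axis(objective, 1, candidates)
    better = cand_fit < fitness
    pop[better], fitness[better] = candidates[better], cand_fit[better]

    # --- TLBO learner phase: learn from a random partner ---
    partners = rng.permutation(pop_size)
    step = np.where((fitness < fitness[partners])[:, None],
                    pop - pop[partners], pop[partners] - pop)
    candidates = np.clip(pop + rng.random((pop_size, dim)) * step, lower, upper)
    cand_fit = np.apply_along_axis(objective, 1, candidates)
    better = cand_fit < fitness
    pop[better], fitness[better] = candidates[better], cand_fit[better]

    # --- HS improvisation: treat the population as the harmony memory ---
    hmcr, par, bw = 0.9, 0.3, 0.05 * (upper - lower)
    new = np.where(rng.random(dim) < hmcr,
                   pop[rng.integers(pop_size, size=dim), np.arange(dim)],
                   rng.uniform(lower, upper, dim))
    pitch = rng.random(dim) < par
    new[pitch] += rng.uniform(-bw, bw, pitch.sum())
    new = np.clip(new, lower, upper)
    worst = fitness.argmax()
    if objective(new) < fitness[worst]:
        pop[worst], fitness[worst] = new, objective(new)

print("best objective:", fitness.min())
```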
Procedia PDF Downloads 545
26244 Exploring the Impact of ChatGPT on the English Writing Skills of a Group of International EFL Uzbek Students: A Qualitative Case Study Conducted at a Private University College in Malaysia
Authors: Uranus Saadat
Abstract:
ChatGPT, as one of the well-known artificial intelligence (AI) tools, has recently been integrated into English language education and has had several impacts on learners. Accordingly, concerns are rising regarding the overuse of this tool among EFL/ESL learners, which could lead to several disadvantages in the development of their writing skills. The use of ChatGPT in facilitating writing skills is a novel concept that demands further studies in different contexts and with different learners. In this study, a qualitative case study design is applied to investigate the impact of ChatGPT on the writing skills of a group of EFL bachelor's students from Uzbekistan studying Teaching English as a Second Language (TESL) at a private university in Malaysia. The data were collected through the triangulation of document analysis, semi-structured interviews, classroom observations, and focus group discussions. Subsequently, the data were analyzed using thematic analysis. Some of the emerging themes indicated that ChatGPT is helpful in engaging students by reducing their anxiety in class and providing them with constructive feedback and support. Conversely, certain emerging themes revealed excessive reliance on ChatGPT, resulting in a decrease in students' creativity and critical thinking skills, memory span, and tolerance for ambiguity. The study suggests a number of strategies to alleviate its negative impacts, such as peer review activities, workshops for familiarizing students with AI, and gradual withdrawal of AI support activities. This study emphasizes the need for cautious AI integration into English language education to cultivate independent learners with higher-order thinking skills.
Keywords: ChatGPT, EFL/ESL learners, English writing skills, artificial intelligence tools, critical thinking skills
Procedia PDF Downloads 22
26243 Regulation on the Protection of Personal Data Versus Quality Data Assurance in the Healthcare System Case Report
Authors: Elizabeta Krstić Vukelja
Abstract:
Digitization of personal data is a consequence of the development of information and communication technologies, which create a new work environment with many advantages and challenges, but also potential threats to privacy and personal data protection. Regulation (EU) 2016/679 of the European Parliament and of the Council has become a legal obligation that should address the issues of personal data protection and information security. The existence of the Regulation leads to the conclusion that national legislation in the field of the virtual environment, the protection of the rights of EU citizens and the processing of their personal data is insufficiently effective. In the health system, special emphasis is placed on the processing of special categories of personal data, such as health data. The healthcare industry is recognized as a particularly sensitive area in which a large amount of medical data is processed, the digitization of which enables quick access and quick identification of the insured person. The protection of the individual requires quality IT solutions that guarantee the technical protection of these special categories of data. However, the real problems are of a technical and human nature, together with the spatial limitations of the application of the Regulation. Some conclusions will be drawn by analyzing the implementation of the basic principles of the Regulation in the example of the Croatian healthcare system and comparing it with similar activities in other EU member states.
Keywords: regulation, healthcare system, personal data protection, quality data assurance
Procedia PDF Downloads 39
26242 V0 Physics at LHCb. RIVET Analysis Module for Z Boson Decay to Di-Electron
Authors: A. E. Dumitriu
Abstract:
The LHCb experiment is situated at one of the four collision points around CERN's Large Hadron Collider. It is a single-arm forward spectrometer covering 10 mrad to 300 (250) mrad in the bending (non-bending) plane, designed primarily to study particles containing b and c quarks. Each of LHCb's sub-detectors specializes in measuring a different characteristic of the particles produced by colliding protons; its significant detection characteristics include a high-precision tracking system and two ring-imaging Cherenkov detectors for particle identification. The two major topics that I am currently concerned with are: the RIVET project (Robust Independent Validation of Experiment and Theory), an efficient and portable toolkit of C++ class libraries useful for the validation and tuning of Monte Carlo (MC) event generator models, providing a large collection of standard experimental analyses useful for High Energy Physics MC generator development, validation, tuning and regression testing; and V0 analysis for 2013 LHCb NoBias-type data (trigger on bunch + bunch crossing) at √s = 2.76 TeV.
Keywords: LHCb physics, RIVET plug-in, RIVET, CERN
Procedia PDF Downloads 428
26241 APP-Based Language Teaching Using Mobile Response System in the Classroom
Authors: Martha Wilson
Abstract:
With the peak of Computer-Assisted Language Learning slowly coming to pass, and Mobile-Assisted Language Learning at times a bit lacking in the communicative department, we are now faced with a challenging question: how can we engage the interest of our digital-native students and, most importantly, sustain it? As previously mentioned, our classrooms are now experiencing an influx of "digital natives", people who have grown up using and having unlimited access to technology. While modernizing our curriculum and digitalizing our classrooms are necessary in order to accommodate this new learning style, doing so is a huge financial burden and a massive undertaking for language institutes. Instead, opting for a more compact, simple, yet multidimensional pedagogical tool may be the solution to the issue at hand. This paper aims to give a brief overview of an existing device referred to as the Student Response System (SRS) and to expand on this notion to include a new prototype of response system designed as a mobile application, eliminating the need for costly hardware and software. Additionally, an analysis of recent attempts by other institutes to develop Mobile Response Systems (MRS) and customer reviews of the existing MRSs will be provided, as well as the lessons learned from those projects. Finally, while the new model of MRS is still in its infancy stage, this paper will discuss the implications of incorporating such an application as a tool to support and enrich traditional techniques and also offer practical classroom applications with the existing response systems that are immediately available on the market.
Keywords: app, clickers, mobile app, mobile response system, student response system
Procedia PDF Downloads 371
26240 Parallel Vector Processing Using Multi Level Orbital DATA
Authors: Nagi Mekhiel
Abstract:
Many applications use vector operations by applying a single instruction to multiple data that map to different locations in conventional memory. Transferring data from memory is limited by access latency and bandwidth, affecting the performance gain of vector processing. We present a memory system that makes all of its content available to processors in time, so that processors need not access the memory: we force each location to be available to all processors at a specific time. The data move in different orbits to become available to other processors in higher orbits at different times. We use this memory to apply parallel vector operations to data streams at the first orbit level. Data processed in the first level move to the upper orbit one data element at a time, allowing a processor in that orbit to apply another vector operation to deal with the serial-code limitations inherent in all parallel applications and to interleave it with lower-level vector operations.
Keywords: memory organization, parallel processors, serial code, vector processing
Procedia PDF Downloads 270
26239 Reconstructability Analysis for Landslide Prediction
Authors: David Percy
Abstract:
Landslides are a geologic phenomenon that affects a large number of inhabited places and are constantly being monitored and studied for the prediction of future occurrences. Reconstructability analysis (RA) is a methodology for extracting informative models from large volumes of data; it works exclusively with discrete data. While RA has been used extensively in medical applications and social science, we are introducing it to the spatial sciences through applications such as landslide prediction. Since RA works exclusively with discrete data, such as soil classification or bedrock type, working with continuous data, such as porosity, requires that these data be binned for inclusion in the model. RA constructs models of the data which pick out the most informative elements, independent variables (IVs), from each layer that predict the dependent variable (DV), landslide occurrence. Each layer included in the model retains its classification data as a primary encoding of the data. Unlike other machine learning algorithms that force the data into one-hot encoding schemes, RA works directly with the data as it is encoded, with the exception of continuous data, which must be binned. The usual physical and derived layers are included in the model, and testing our results against other published methodologies, such as neural networks, yields accuracy that is similar but with the advantage of a completely transparent model. The results of an RA session with a data set are a report on every combination of variables and their probability of landslide events occurring. In this way, every informative combination of variable states can be examined.
Keywords: reconstructability analysis, machine learning, landslides, raster analysis
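As a hedged illustration of the binning step described above (the actual RA software, layers and bin edges are not given in the abstract, so the column names, thresholds and toy data below are assumptions), a continuous layer such as porosity can be discretized and then cross-tabulated against landslide occurrence:

```python
# Hypothetical sketch: binning a continuous layer (porosity) and inspecting how each
# discrete state combination relates to landslide occurrence. Names and bins are assumptions.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
df = pd.DataFrame({
    "porosity": rng.uniform(0.05, 0.45, 500),                     # continuous IV
    "bedrock": rng.choice(["basalt", "shale", "granite"], 500),   # already discrete IV
    "landslide": rng.integers(0, 2, 500),                         # DV: occurrence (0/1)
})

# Bin the continuous variable so every layer in the model is discrete.
df["porosity_bin"] = pd.cut(df["porosity"], bins=[0.0, 0.15, 0.30, 0.50],
                            labels=["low", "mid", "high"])

# Probability of landslide occurrence for every combination of variable states,
# analogous to the per-state report an RA session produces.
report = (df.groupby(["porosity_bin", "bedrock"], observed=True)["landslide"]
            .agg(n="size", p_landslide="mean")
            .reset_index())
print(report)
```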
Procedia PDF Downloads 66
26238 Evaluating the Possibility of Expanding National Health Insurance Funding From Zakat, Sudan
Authors: Fawzia Mohammed Idris
Abstract:
Zakat is an Islamic procedure for wealth distribution that serves as a social protection mechanism for needy people. This study aimed to assess the possibility of expanding the share of funding for the national health insurance fund from zakat funds allocated to poor people, by measuring the reduction in poverty that results from investing in direct payments to the needy versus covering them through social health insurance. This study used Stata regression as a statistical analysis tool, and the findings clarified that there is no significant relationship between the poverty rate, as the main indicator, and the number of poor people covered by national health insurance on the one hand, or the number of poor people benefiting from the distribution of zakat funds on the other. This study experienced many difficulties regarding the quality and consistency of the data. The study suggests a joint mission between the national health insurance fund and the zakat chamber to conduct a study assessing the efficient use of zakat funds allocated to poor people.
Keywords: health finance, poverty, social health insurance, zakat
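The abstract names regression (run in Stata) as the analysis tool but does not specify the model; the sketch below is a hypothetical re-expression of such an analysis in Python with statsmodels, and all variable names and data are placeholders rather than the study's figures.

```python
# Hypothetical sketch of the kind of regression described above, rewritten with statsmodels.
# The variable names and simulated data are placeholders, not the study's actual data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 60  # e.g., state-year observations (assumed unit of analysis)
data = pd.DataFrame({
    "poverty_rate": rng.uniform(20, 60, n),
    "nhi_covered_poor": rng.integers(1_000, 50_000, n),
    "zakat_beneficiaries": rng.integers(1_000, 50_000, n),
})

model = smf.ols("poverty_rate ~ nhi_covered_poor + zakat_beneficiaries", data=data).fit()
print(model.summary())   # inspect coefficients and p-values for significance
```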
Procedia PDF Downloads 146
26237 NGOs from the Promotion of Civic Participation to Public Problems Solving: Case Study Urmia, Iran
Authors: Amin Banae Babazadeh
Abstract:
In the contemporary world, NGOs are considered an important tool for motivating the community, committed to their true mission of promoting civic participation and strengthening social identities. The functional characteristics of non-governmental organizations make them a lever for the centres of political and social development of powerful governments, since they are grounded in, and familiar with, the problems of society and the operational strategies that facilitate the process of building mutual trust between people and organizations. NGOs, on the one hand, offer reasonable solutions, in line with approved organizations, as agents matching the facts and realities of society and, on the other hand, become a tool for genuine political, social and economic behaviour. NGOs are active in the formulation of national relations and policy in an organized and disciplined manner based on three main factors, i.e., resources, policies, and institutions. Organizations are not restricted to state administrative bodies in a centralized system, and this process, in a democratic system, limits the accumulation of desires and expectations and, in the end, reaches the desired place. Hence, this research attempts to emphasize field research (a questionnaire) and, in view of the development, evolution and role of NGOs, to analyze the effects of these centres on youth. The hypothesis is therefore that there is a direct relationship between enlightenment and the effectiveness of policy towards NGOs and the resolution of social damage.
Keywords: civic participation, community vulnerability, insightful, NGO, Urmia
Procedia PDF Downloads 241
26236 The Relationship between Transcendence and Psychological Well-Being: A Systematic Scientific Literature Review
Authors: Monir Ahmed
Abstract:
The main purpose of this literature review was to investigate the existing quantitative clinical studies on the relationship between transcendence and psychological well-being. The primary objective of the review is to determine whether the existing studies adequately demonstrate the relationship between transcendence and psychological well-being, including spiritual well-being. A further objective is to see whether the 'creatio ex nihilo' doctrine is necessary for understanding transcendence and its relationship with psychological well-being. Systematic literature review methods were used, including identifying studies from search engines, extracting data from the studies and assessing their quality for the planned review. The outcome of this literature review indicates that self-transcendence (STa) and spiritual transcendence (STb) are positively related to psychological well-being. However, such positive relationships offer limited scope for understanding transcendence and its relationship with well-being. The findings of this review support the need for further research in the area of transcendence and well-being. This literature review reveals the importance of developing a new transcendence tool for determining an individual's ability to transcend and the relationship between his/her ability for transcendence and psychological well-being. The author of this paper proposes that the inclusion of the theological doctrine ('creatio ex nihilo') in understanding transcendence and psychological well-being is crucial, necessary and unavoidable.
Keywords: transcendence, psychological well-being, self-transcendence, spiritual transcendence, 'creatio ex nihilo'
Procedia PDF Downloads 135
26235 Data Analytics in Hospitality Industry
Authors: Tammy Wee, Detlev Remy, Arif Perdana
Abstract:
In recent years, data analytics has become the buzzword in the hospitality industry. The hospitality industry is another example of a data-rich industry that has yet to fully benefit from the insights of data analytics. Effective use of data analytics can change how hotels operate, market and position themselves competitively in the hospitality industry. However, at the moment, the data obtained by individual hotels remain under-utilized. This is preliminary research on data analytics in the hospitality industry, using an in-depth face-to-face interview at one hotel as the start of multi-level research. The main case study of this research, hotel A, is an international chain hotel brand that has been systematically gathering and collecting data on its own customers for the past five years. The data collection points begin from the moment a guest books a room until the guest leaves the hotel premises, which includes room reservation, spa booking, and catering. Although hotel A has been gathering data intelligence on its customers for some time, it has yet to utilize the data to their fullest potential, and it is aware of this limitation as well as of the potential of data analytics. Currently, the utilization of data analytics in hotel A is limited to the area of customer service improvement, namely to enhance the personalization of service for each individual customer. Hotel A is able to utilize the data to improve and enhance its service, which in turn encourages repeat customers. According to hotel A, 50% of its guests returned to the hotel, and 70% extended their nights because of the personalized service. Apart from using data analytics for enhancing customer service, hotel A also uses the data in marketing. Hotel A uses data analytics to predict or forecast changes in consumer behavior and demand by tracking its guests' booking preferences, payment preferences and demand shifts between properties. However, hotel A admitted that the data it has been collecting are not fully utilized due to two challenges. The first challenge of using data analytics in hotel A is that the data are not clean. At the moment, the data collected for one guest profile are meaningful only for one department in the hotel but meaningless for another department. Cleaning up the data and getting standards correct for usage by different departments are some of the main concerns of hotel A. The second challenge of using data analytics in hotel A is the non-integrated internal systems. At the moment, the internal systems used by hotel A do not integrate with each other well, limiting the ability to collect data systematically. Hotel A is considering another system to replace the current one for more comprehensive data collection. Hotel proprietors recognized the potential of data analytics, as reported in this research; however, the current challenges of implementing a system to collect data come with a cost. This research has identified the current utilization of data analytics and the challenges faced when it comes to implementing data analytics.
Keywords: data analytics, hospitality industry, customer relationship management, hotel marketing
Procedia PDF Downloads 180
26234 Evaluation of Ensemble Classifiers for Intrusion Detection
Authors: M. Govindarajan
Abstract:
One of the major developments in machine learning in the past decade is the ensemble method, which finds highly accurate classifiers by combining many moderately accurate component classifiers. In this research work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performances are analyzed in terms of accuracy. A classifier ensemble is designed using a Radial Basis Function (RBF) network and a Support Vector Machine (SVM) as base classifiers. The feasibility and the benefits of the proposed approaches are demonstrated by means of standard intrusion detection datasets. The main originality of the proposed approach is based on three main parts: a preprocessing phase, a classification phase, and a combining phase. A wide range of comparative experiments is conducted on standard intrusion detection datasets. The performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared to the performance of other standard homogeneous and heterogeneous ensemble methods. The standard homogeneous ensemble methods include error-correcting output codes and Dagging, and the heterogeneous ensemble methods include majority voting and stacking. The proposed ensemble methods provide a significant improvement in accuracy compared to individual classifiers; the proposed bagged RBF and SVM perform significantly better than ECOC and Dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Also, heterogeneous models exhibit better results than homogeneous models on standard intrusion detection datasets.
Keywords: data mining, ensemble, radial basis function, support vector machine, accuracy
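A hedged sketch of the two ensemble styles described above, using scikit-learn: since scikit-learn ships no RBF-network classifier, an SVC with an RBF kernel stands in for the RBF base learner, and a synthetic dataset replaces the intrusion-detection data, so this illustrates the technique rather than the authors' exact pipeline.

```python
# Hypothetical sketch: homogeneous bagging of one base learner and a heterogeneous
# (soft-voting) ensemble of two learners. SVC(kernel="rbf") stands in for the RBF network,
# and a toy dataset stands in for the intrusion-detection data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

rbf_like = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
linear_svm = make_pipeline(StandardScaler(), SVC(kernel="linear", probability=True))

# Homogeneous ensemble: bagging many copies of one moderately accurate base classifier.
# (scikit-learn >= 1.2 names this argument `estimator`.)
bagged = BaggingClassifier(estimator=rbf_like, n_estimators=25, random_state=0)
bagged.fit(X_tr, y_tr)
print("bagged accuracy:", accuracy_score(y_te, bagged.predict(X_te)))

# Heterogeneous ensemble: combining different base classifiers by soft voting.
hybrid = VotingClassifier(estimators=[("rbf", rbf_like), ("svm", linear_svm)], voting="soft")
hybrid.fit(X_tr, y_tr)
print("hybrid accuracy:", accuracy_score(y_te, hybrid.predict(X_te)))
```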
Procedia PDF Downloads 248
26233 Qualitative Profiling in Practice: The Italian Public Employment Services Experience
Authors: L. Agneni, F. Carta, C. Micheletta, V. Tersigni
Abstract:
The development of a qualitative method to profile jobseekers is needed to improve the quality of the Public Employment Services (PES) in Italy. This is why the National Agency for Active Labour Market Policies (ANPAL) decided to introduce a Qualitative Profiling Service in the context of the activities carried out by local employment offices' operators. The qualitative profiling service provides information and data regarding the jobseeker's personal transition status through a semi-structured questionnaire administered to PES clients during the guidance interview. The questionnaire responses allow PES staff to identify, for each client, the proper activities and policy measures to support jobseekers in their reintegration into the labour market. The data and information gathered by the qualitative profiling tool are the following: the frequency, modalities and motivations for clients to apply to local employment offices; clients' expectations and skills; difficulties that they have faced during previous working experiences; and strategies, actions undertaken and channels activated for the job search. These data are used to assess jobseekers' personal and career characteristics and to measure their employability level (qualitative profiling index), in order to develop and deliver tailor-made action programmes for each client. This paper illustrates the use of the above-mentioned qualitative profiling service across the national territory and provides an overview of the main findings of the survey concerning the difficulties that unemployed people face in finding a job and their perception of different aspects related to the transition in the labour market. The survey involved over 10,000 jobseekers registered with the PES. Most of them are beneficiaries of the "citizens' income", a specific active labour policy and social inclusion measure. Furthermore, the data analysis allows classifying jobseekers into specific groups of clients with similar features and behaviours, on the basis of socio-demographic variables, customers' expectations, needs and the skills required for the profession for which they seek employment. Finally, the survey collects PES staff opinions and comments concerning clients' difficulties in finding a new job and also their strengths. This is a starting point for PES operators to define adequate strategies to facilitate jobseekers' access or reintegration into the labour market.
Keywords: labour market transition, public employment services, qualitative profiling, vocational guidance
Procedia PDF Downloads 141
26232 Realization of a (GIS) for Drilling (DWS) through the Adrar Region
Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz
Abstract:
Geographic Information Systems (GIS) encompass various methods and computer techniques to model, digitally capture, store, manage, view and analyze spatial data. Geographic information systems have the characteristic of appealing to many scientific and technical fields and to many methods. In this article we present a complete and operational geographic information system, following the theoretical principles of data management and adapted to spatial data, especially data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The expected results of this system are, on the one hand, standard features for consulting, updating and editing beneficiary and geographical data and, on the other hand, specific functionality for contractors: data entry, parameterized calculations and statistics.
Keywords: GIS, DWS, drilling, Adrar
Procedia PDF Downloads 309
26231 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction
Authors: Radul Shishkov, Orlin Davchev
Abstract:
The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in the time and resources required for BIM model generation while maintaining high levels of accuracy and detail. This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction
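The abstract does not name the software stack used for feature extraction; as a hedged illustration of one step in such a pipeline, the sketch below uses the open-source Open3D library to down-sample a scan and segment planar patches (candidate walls or floors) with RANSAC. The input file name, thresholds and loop limits are assumptions, not the authors' settings.

```python
# Hypothetical sketch of one stage of a point-cloud-to-BIM pipeline: extracting planar
# segments (candidate walls/floors) with Open3D. File name and thresholds are assumptions.
import open3d as o3d

pcd = o3d.io.read_point_cloud("scan.ply")          # hypothetical input scan
pcd = pcd.voxel_down_sample(voxel_size=0.02)       # clean/decimate the raw cloud

planes = []
remaining = pcd
for _ in range(5):                                  # peel off up to five dominant planes
    plane_model, inliers = remaining.segment_plane(distance_threshold=0.01,
                                                   ransac_n=3,
                                                   num_iterations=1000)
    if len(inliers) < 500:                          # stop when planes become insignificant
        break
    planes.append((plane_model, remaining.select_by_index(inliers)))
    remaining = remaining.select_by_index(inliers, invert=True)

for a, b, c, d in (p[0] for p in planes):
    print(f"plane: {a:.2f}x + {b:.2f}y + {c:.2f}z + {d:.2f} = 0")
# Each planar segment would then be classified (wall, floor, ceiling) and mapped to a
# parametric BIM element in a downstream step.
```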
Procedia PDF Downloads 63
26230 Sequential Data Assimilation with High-Frequency (HF) Radar Surface Current
Authors: Lei Ren, Michael Hartnett, Stephen Nash
Abstract:
Abundant surface current measurements from an HF radar system in a coastal area are assimilated into a model to improve its forecasting ability. A simple sequential data assimilation scheme, Direct Insertion (DI), is applied to update the model forecast states. The influence of Direct Insertion data assimilation over time is analyzed at one reference point. Vector maps of surface current from the models are compared with HF radar measurements. The Root-Mean-Square Error (RMSE) between modeling results and HF radar measurements is calculated for the last four days with no data assimilation.
Keywords: data assimilation, CODAR, HF radar, surface current, direct insertion
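A minimal, hypothetical sketch of the Direct Insertion idea and the RMSE verification described above: the real model grid, radar footprint and time stepping are not given in the abstract, so the toy 1-D state, update interval and noise levels below are assumptions.

```python
# Hypothetical sketch: Direct Insertion overwrites the model state with observations
# wherever (and whenever) radar data exist; RMSE then measures forecast skill.
import numpy as np

rng = np.random.default_rng(3)
n_points, n_steps = 100, 48                 # toy 1-D coastal grid, hourly steps (assumed)
truth = np.sin(np.linspace(0, 4 * np.pi, n_points))

state = np.zeros(n_points)                  # model forecast of surface current
obs_mask = np.zeros(n_points, dtype=bool)
obs_mask[20:60] = True                      # points inside the (assumed) radar footprint

for step in range(n_steps):
    state = 0.95 * state + 0.05 * truth + rng.normal(0, 0.05, n_points)  # stand-in model
    if step % 6 == 0:                       # assimilation every 6 steps (assumption)
        observations = truth[obs_mask] + rng.normal(0, 0.02, obs_mask.sum())
        state[obs_mask] = observations      # Direct Insertion: replace forecast with obs

rmse = np.sqrt(np.mean((state - truth) ** 2))
print(f"RMSE against radar-like 'truth': {rmse:.3f}")
```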
Procedia PDF Downloads 575
26229 Measured versus Default Interstate Traffic Data in New Mexico, USA
Authors: M. A. Hasan, M. R. Islam, R. A. Tarefder
Abstract:
This study investigates how site-specific traffic data differ from the default values in the Mechanistic-Empirical pavement design software. Two Weigh-in-Motion (WIM) stations were installed on Interstate-40 (I-40) and Interstate-25 (I-25) to develop site-specific data. A computer program named WIM Data Analysis Software (WIMDAS) was developed using Microsoft C# (.NET) for quality checking and processing of raw WIM data. A complete year of data, from November 2013 to October 2014, was analyzed using the developed WIM Data Analysis Program. After that, the vehicle class distribution, directional distribution, lane distribution, monthly adjustment factors, hourly distribution, axle load spectra, average number of axles per vehicle, axle spacing, lateral wander distribution, and wheelbase distribution were calculated. Then a comparative study was carried out between the measured data and the AASHTOWare default values. It was found that the measured general traffic inputs for I-40 and I-25 differ significantly from the default values.
Keywords: AASHTOWare, traffic, weigh-in-motion, axle load distribution
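As a hedged illustration of two of the traffic inputs named above, the sketch below computes a vehicle class distribution and monthly adjustment factors from per-vehicle records with pandas. The column names, the synthetic records and the MAF convention used here (monthly average daily truck traffic normalized so the factors average to 1.0) are assumptions, not the WIMDAS implementation.

```python
# Hypothetical sketch of vehicle class distribution and monthly adjustment factors
# computed from per-vehicle WIM records. Columns, classes and data are placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(11)
records = pd.DataFrame({
    "timestamp": pd.to_datetime("2013-11-01")
                 + pd.to_timedelta(rng.integers(0, 365 * 24 * 3600, 50_000), unit="s"),
    "fhwa_class": rng.choice(range(4, 14), size=50_000),   # truck classes 4-13 (assumed)
})

# Vehicle class distribution (percent of trucks in each FHWA class).
class_dist = records["fhwa_class"].value_counts(normalize=True).sort_index() * 100
print(class_dist.round(2))

# Monthly adjustment factors: monthly average daily truck traffic, normalized to mean 1.0.
records["date"] = records["timestamp"].dt.date
daily = records.groupby("date").size()
monthly_adt = daily.groupby(pd.to_datetime(daily.index).month).mean()
maf = monthly_adt / monthly_adt.mean()
print(maf.round(3))
```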
Procedia PDF Downloads 343
26228 Design of Knowledge Management System with Geographic Information System
Authors: Angga Hidayah Ramadhan, Luciana Andrawina, M. Azani Hasibuan
Abstract:
Data becomes the core of a decision only if it is properly treated or processed: data is processed into information, and information into knowledge, to produce wisdom or a decision. Today, many organizations have not realized this, including the XYZ University Admission Directorate, the executor of the national admission process called Seleksi Masuk Bersama (SMB), where workers have so far relied only on their feelings to make decisions. If the data were analyzed, the organization could make the right decisions and obtain as many PIN sales as possible from student candidates or registrants who follow SMB. Therefore, a Knowledge Management System (KMS) with a Geographic Information System (GIS) using 5C4C is needed, which can process the organization's data into something more useful and help in making decisions. This information system processes data into information based on the PIN sales data with 5C (Contextualized, Categorize, Calculation, Correction, Condensed) and converts information into knowledge with 4C (Comparing, Consequence, Connection, Conversation), through several steps until the data become useful for making decisions or gaining wisdom, resolving problems, communicating, and quicker learning for employees without experience, and also for ease of viewing/visualization of spatial data, equipped with GIS functionality that can be used to indicate events in each province with indicators that facilitate use of the system. The system also has a function to store tacit knowledge, which is then processed into explicit knowledge in an expert system, based on the problems found from the consequences of the information. With the system, each team can make decisions in the same structured way and, most importantly, based on actual events/data.
Keywords: 5C4C, data, information, knowledge
Procedia PDF Downloads 463
26227 An Ideational Grammatical Metaphor of Narrative History in Chinua Achebe's 'There Was a Country'
Authors: Muhammed-Badar Salihu Jibrin, Chibabi Makedono Darlington
Abstract:
This paper studied the Ideational Grammatical Metaphor (IGM) of narrative history in Chinua Achebe's There Was a Country. It starts from the narrative historical style as a recent genre emerging out of conventional historical writing. In order to explore this linguistic phenomenon using a particular lexico-grammatical tool, IGM, the theoretical background was examined on the basis of Hallidayan Systemic Functional Linguistics. Furthermore, the study considered the possibility of applying IGM to Part 4 of Achebe's historical text, with recourse to the concept of congruence in IGM and to the research questions, before formulating a working methodology. The analysis of Achebe's memoir is thus presented in tabular form to account for the quantitative content analysis, combined with a qualitative research technique, as well as for the metaphorical and congruent wording realized through nominalization and process types, with samples. The frequencies and percentages are given for each subheading of the text. To this end, the findings showed that the material and relational process types were dominant. The discussion and implications were that the findings confirm the earlier suggestion by M. A. K. Halliday and C. M. I. M. Matthiessen that IGM should show dominance of the material process type. The implication is that IGM can be an effective tool for the analysis of a narrative historical text. In conclusion, it was observed that IGM carries not only a grammatical function but also an ideological role in shaping the historical discourse within the narrative mode between writers and readers.
Keywords: ideational grammatical metaphor, nominalization, narrative history, memoir, dominance
Procedia PDF Downloads 220
26226 In situ Stabilization of Arsenic in Soils with Birnessite and Goethite
Authors: Saeed Bagherifam, Trevor Brown, Chris Fellows, Ravi Naidu
Abstract:
Over the last century, rapid urbanization, industrial emissions, and mining activities have resulted in widespread contamination of the environment by heavy metal(loid)s. Arsenic (As) is a toxic metalloid belonging to group 15 of the periodic table, which occurs naturally at low concentrations in soils and the earth's crust, although concentrations can be significantly elevated in natural systems as a result of dispersion from anthropogenic sources, e.g., mining activities. Bioavailability is the fraction of a contaminant in soils that is available for uptake by plants, food chains, and humans and therefore presents the greatest risk to terrestrial ecosystems. Numerous attempts have been made to establish in situ and ex situ technologies for the remediation of arsenic-contaminated soils. In situ stabilization techniques are based on the deactivation or chemical immobilization of metalloid(s) in soil by means of soil amendments, which consequently reduce the bioavailability (for biota) and bioaccessibility (for humans) of metalloids due to the formation of low-solubility products or precipitates. This study investigated the effectiveness of two different synthetic manganese and iron oxides (birnessite and goethite) for the stabilization of As in a soil spiked with 1000 mg kg⁻¹ of As and treated with 10% dosages of soil amendments. Birnessite was made using HCl and KMnO₄, and goethite was synthesized by the dropwise addition of KOH into Fe(NO₃)₃ solution. The resulting contaminated soils were subjected to a series of chemical extraction studies, including sequential extraction (BCR method), single-step extractions with distilled (DI) water and 2M HNO₃, and simplified bioaccessibility extraction tests (SBET), for the estimation of bioaccessible fractions of As in two different soil fractions (< 250 µm and < 2 mm). Concentrations of As in samples were measured using inductively coupled plasma mass spectrometry (ICP-MS). The results showed that soil with birnessite reduced the bioaccessibility of As by up to 92% in both soil fractions. Furthermore, the results of the single-step extractions revealed that the application of birnessite and goethite reduced the DI water and HNO₃ extractable amounts of arsenic by 75, 75, 91, and 57%, respectively. Moreover, the results of the sequential extraction studies showed that both birnessite and goethite dramatically reduced the exchangeable fraction of As in soils. However, the amounts of recalcitrant fractions were higher in birnessite- and goethite-amended soils. The results revealed that the application of both birnessite and goethite significantly reduced the bioavailability and the exchangeable fraction of As in contaminated soils, and therefore birnessite and goethite amendments might be considered promising adsorbents for the stabilization and remediation of As-contaminated soils.
Keywords: arsenic, bioavailability, in situ stabilisation, metalloid(s) contaminated soils
Procedia PDF Downloads 135
26225 Frequency of Alloimmunization in Sickle Cell Disease Patients in Africa: A Systematic Review with Meta-analysis
Authors: Theresa Ukamaka Nwagha, Angela Ogechukwu Ugwu, Martins Nweke
Abstract:
Background and Objectives: Blood transfusion is an effective and proven treatment for some severe complications of sickle cell disease. Recurrent transfusions have put patients with sickle cell disease at risk of developing antibodies against the various antigens they are exposed to. This study aims to investigate the frequency of red blood cell alloimmunization in patients with sickle cell disease in Africa. Materials and Methods: This is a systematic review of peer-reviewed literature published in English. The review was conducted consistent with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) checklist. Data sources for the review include MEDLINE, PubMed, CINAHL, and Academic Search Complete. Included in this review are articles that reported the frequency/prevalence of red blood cell alloimmunization in sickle cell disease patients in Africa. Eligible studies were subjected to independent full-text screening and data extraction. Risk-of-bias assessment was conducted with the aid of the mixed methods appraisal tool. We employed a random-effects model of meta-analysis to estimate the pooled prevalence. We computed Cochran's Q statistic, I², and the prediction interval to quantify heterogeneity in effect size. Results: The prevalence estimates range from 2.6% to 29%. Pooled prevalence was estimated to be 10.4% (CI 7.7–13.8; PI = 3.0–34.0%), with significant heterogeneity (I² = 84.62; PI = 2.0–32.0%) and publication bias (Egger's t-test = 1.744, p = 0.0965). Conclusion: The frequency of red cell alloantibodies varies considerably in Africa. The alloantibodies appeared frequently in this order: Rhesus, Kell, Lewis, Duffy, MNS, and Lutheran.
Keywords: frequency, red blood cell, alloimmunization, sickle cell disease, Africa
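As a hedged illustration of the random-effects pooling and the Q and I² computations named in the methods (the review's extracted study data are not shown, so the per-study counts below are invented placeholders), a DerSimonian-Laird estimate on logit-transformed proportions might look like this:

```python
# Hypothetical sketch of a DerSimonian-Laird random-effects meta-analysis of prevalences
# on the logit scale. The per-study numbers are placeholders, not the review's data.
import numpy as np

events = np.array([12, 30, 8, 25, 14])        # alloimmunized patients per study (assumed)
totals = np.array([150, 300, 120, 260, 100])  # study sample sizes (assumed)

p = events / totals
yi = np.log(p / (1 - p))                      # logit-transformed prevalence
vi = 1 / events + 1 / (totals - events)       # approximate variance of the logit

# Fixed-effect quantities needed for Q and the DL tau^2 estimate.
w = 1 / vi
y_fixed = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - y_fixed) ** 2)
df = len(yi) - 1
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - df) / c)                 # DerSimonian-Laird between-study variance
I2 = max(0.0, (Q - df) / Q) * 100             # heterogeneity as a percentage

# Random-effects pooled estimate, back-transformed to a prevalence.
w_re = 1 / (vi + tau2)
y_re = np.sum(w_re * yi) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))
pooled = 1 / (1 + np.exp(-y_re))
ci = 1 / (1 + np.exp(-(y_re + np.array([-1.96, 1.96]) * se_re)))
print(f"pooled prevalence = {pooled:.3f}, 95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"Q = {Q:.2f}, I^2 = {I2:.1f}%, tau^2 = {tau2:.3f}")
```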
Procedia PDF Downloads 100
26224 Application of RayMan Model in Quantifying the Impacts of the Built Environment and Surface Properties on Surrounding Temperature
Authors: Maryam Karimi, Rouzbeh Nazari
Abstract:
Introduction: Understanding the thermal distribution in the micro-urban climate has become necessary for urban planners and designers due to the impact of complex micro-scale features of the Urban Heat Island (UHI) on the built environment and public health. Hence, understanding the interrelation between urban components and the thermal pattern can assist planners in the proper addition of vegetation to the built environment, which can minimize the UHI impact. To characterize the need for urban green infrastructure (UGI) through better urban planning, this study proposes the use of the RayMan model to measure the impact of air quality and increased temperature based on urban morphology in selected metropolitan cities. This project will measure the impact of the built environment for urban and regional planning using human-biometeorological evaluations (mean radiant temperature, Tmrt). Methods: We utilized the RayMan model to estimate Tmrt in an urban environment, incorporating the location and height of buildings and trees, as a supplemental tool in urban planning and street design. The estimated Tmrt value will be compared with existing surface and air temperature data to find the actual temperature felt by pedestrians. Results: Our current results suggest a strong relationship between the sky-view factor (SVF) and increased surface temperature in megacities based on current urban morphology. Conclusion: This study will help with quantifying the impacts of the built environment and surface properties on the surrounding temperature, identifying priority urban neighborhoods by analyzing Tmrt and air quality data at the pedestrian level, and characterizing the cooling potential of urban green infrastructure.
Keywords: built environment, urban planning, urban cooling, extreme heat
Procedia PDF Downloads 123
26223 Development of Optimized Eye Mascara Packages with Bioinspired Spiral Methodology
Authors: Daniela Brioschi, Rovilson Mafalda, Silvia Titotto
Abstract:
Nowadays, packaging is considered a fundamental element in the commercialization of products and services. A good package is capable of helping to attract new customers and also of increasing a product's purchase intent. In this scenario, packaging design emerges as an important tool, since products and the design of their packaging are so interconnected that they are no longer seen as separate elements. Packaging design is, in fact, capable of generating desire for a product. The packaging market for cosmetics, especially the makeup market, has also been experiencing an increasing level of sophistication and requirements. Considering that packaging represents an important link of communication with the final user and plays a significant role in the sales process, it is of great importance that packages comply not only with functional requirements but also with visual appeal. One of the possibilities for the design of packages, and in this context packages for makeup, is bioinspired design, or biomimicry. Bioinspired design presents a promising paradigm for innovation in both design and sustainable design, by using analogies with biological systems to develop solutions. It has gained importance as a widely diffused movement in design for environmentally conscious development and is also responsible for several useful and innovative designs. As eye mascara packages are also part of the constant evolution of design for cosmetics, and traditional packages present the disadvantage of the product drying over time, this project aims to develop a new and innovative package for this product by using a selected bioinspired design methodology during the development process, together with suitable computational tools. In order to guide the development process of the package, the spiral methodology, conceived by the Biomimicry Institute, was chosen; it constitutes a reliable tool, since it is based on traditional design methodologies. The design spiral comprises identification, translation, discovery, abstraction, emulation and evaluation steps that can work iteratively as the process develops, like a spiral. As a support tool for packaging, 3D modelling is being carried out with Autodesk Inventor 2018. Although this is ongoing research, first results showed that the spiral design methodology, together with Autodesk Inventor, provides suitable instruments for the bioinspired design process, and nature proved itself to be an amazing and inexhaustible source of inspiration.
Keywords: bio-inspired design, design methodology, packaging, cosmetics
Procedia PDF Downloads 188
26222 A Policy Strategy for Building Energy Data Management in India
Authors: Shravani Itkelwar, Deepak Tewari, Bhaskar Natarajan
Abstract:
Energy consumption data play a vital role in energy efficiency policy design, implementation, and impact assessment. The success of any demand-side energy management intervention relies on the availability of accurate, comprehensive, granular, and up-to-date data on energy consumption. The building sector, including residential and commercial buildings, is one of the largest consumers of energy in India after the industrial sector. With economic growth and increasing urbanization, the building sector is projected to grow at an unprecedented rate, resulting in a 5.6-fold escalation in energy consumption by 2047 compared to 2017. Therefore, energy efficiency interventions will play a vital role in decoupling floor area growth from the associated energy demand, thereby increasing the need for robust data. In India, multiple institutions are involved in the collection and dissemination of data. This paper focuses on energy consumption data management in the building sector in India for both the residential and commercial segments. It evaluates the robustness of data available through administrative and survey routes to estimate key performance indicators and identifies critical data gaps that hinder informed decisions. The paper explores several issues in the data, such as lack of comprehensiveness, non-availability of disaggregated data, discrepancies between different data sources, inconsistent building categorization, and others. The identified data gaps are justified with appropriate examples. Moreover, the paper prioritizes the required data in order of relevance to policymaking and groups them into "available," "easy to get," and "hard to get" categories. The paper concludes with recommendations to address the data gaps by leveraging digital initiatives, strengthening institutional capacity, institutionalizing exclusive building energy surveys, and standardizing building categorization, among others, to strengthen the management of building sector energy consumption data.
Keywords: energy data, energy policy, energy efficiency, buildings
Procedia PDF Downloads 185
26221 Neural Network based Risk Detection for Dyslexia and Dysgraphia in Sinhala Language Speaking Children
Authors: Budhvin T. Withana, Sulochana Rupasinghe
Abstract:
The educational system faces a significant concern with regard to dyslexia and dysgraphia, which are learning disabilities impacting reading and writing abilities. This is particularly challenging for children who speak the Sinhala language, due to its complexity and uniqueness. Commonly used methods to detect the risk of dyslexia and dysgraphia rely on subjective assessments, leading to limited coverage and time-consuming processes. Consequently, delays in diagnoses and missed opportunities for early intervention can occur. To address this issue, the project developed a hybrid model that incorporates various deep learning techniques to detect the risk of dyslexia and dysgraphia. Specifically, ResNet50, VGG16, and YOLOv8 models were integrated to identify handwriting issues. The outputs of these models were then combined with other input data and fed into an MLP model. Hyperparameters of the MLP model were fine-tuned using grid search cross-validation (GridSearchCV), enabling the identification of optimal values for the model. This approach proved to be highly effective in accurately predicting the risk of dyslexia and dysgraphia, providing a valuable tool for early detection and intervention. The ResNet50 model exhibited a training accuracy of 0.9804 and a validation accuracy of 0.9653. The VGG16 model achieved a training accuracy of 0.9991 and a validation accuracy of 0.9891. The MLP model demonstrated impressive results with a training accuracy of 0.99918, a testing accuracy of 0.99223, and a loss of 0.01371. These outcomes showcase the high accuracy achieved by the proposed hybrid model in predicting the risk of dyslexia and dysgraphia.
Keywords: neural networks, risk detection system, dyslexia, dysgraphia, deep learning, learning disabilities, data science
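A hedged sketch of the final fusion step described above, where CNN-derived outputs and other inputs feed an MLP tuned with grid search cross-validation: the feature arrays, label vector and hyperparameter grid below are placeholders, not the study's actual configuration or results.

```python
# Hypothetical sketch: tuning an MLP over fused features with grid search cross-validation.
# Feature matrices and the hyperparameter grid are placeholders, not the study's settings.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
cnn_features = rng.normal(size=(600, 128))   # stand-in for ResNet50/VGG16/YOLOv8 outputs
other_inputs = rng.normal(size=(600, 10))    # stand-in for other assessment inputs
X = np.hstack([cnn_features, other_inputs])
y = rng.integers(0, 2, 600)                  # risk / no-risk labels (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

pipe = Pipeline([("scale", StandardScaler()),
                 ("mlp", MLPClassifier(max_iter=2000, random_state=0))])
param_grid = {
    "mlp__hidden_layer_sizes": [(64,), (128,), (128, 64)],
    "mlp__alpha": [1e-4, 1e-3, 1e-2],
    "mlp__learning_rate_init": [1e-3, 1e-2],
}
search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_tr, y_tr)

print("best hyperparameters:", search.best_params_)
print("cross-validated accuracy:", round(search.best_score_, 4))
print("held-out test accuracy:", round(search.score(X_te, y_te), 4))
```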
Procedia PDF Downloads 64
26220 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures
Authors: Silvina Caíno-Lores, Jesús Carretero
Abstract:
Large-scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large-scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. There are several techniques that attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large-scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies that have been reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, which are grouped into four main categories: application development, task scheduling, in-memory computing and storage platforms. Finally, the authors include a discussion of future research lines and synergies among the aforementioned techniques.
Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing
Procedia PDF Downloads 259