Search results for: LCA tools and data

26181 Solar Photovoltaic Pumping and Water Treatment Tools: A Case Study in an Ethiopian Village

Authors: Corinna Barraco, Ornella Salimbene

Abstract:

This research involves the Ethiopian locality of Jeldi (East Africa), an area particularly affected by water shortage, in which the pumping and treatment of drinking water are extremely sensitive issues. The study aims to develop and apply low-cost tools for the design of solar water pumping and water purification systems in a developing country. Consequently, two technical tools have been implemented in Excel: i) Solar photovoltaic Pumping (Spv-P) and ii) Water treatment (Wt). The Spv-P tool was applied to the existing well (depth 110 [m], dynamic water level 90 [m], static water level 53 [m], well yield 0.1728 [m³h⁻¹]) in the Jeldi area, where the estimated water demand is about 50 [m³d⁻¹]. Through the application of the tool, the water extraction system of the well was designed, obtaining the number of pumps and solar panels necessary for pumping water from the well of Jeldi. The second tool, Wt, was applied in the subsequent phase of treating the extracted water. According to the chemical-physical parameters of the water, Wt returns as output the type of purification treatment(s) necessary to make the extracted water potable. In the case of the well of Jeldi, the tool identified a high criticality regarding the turbidity parameter (12 [NTU] vs 5 [NTU]), and a medium criticality regarding the exceeded limits for sodium concentration (234 [mg/L Na⁺] vs 200 [mg/L Na⁺]) and ammonia (0.64 [mg/L NH₃-N] vs 0.5 [mg/L NH₃-N]). To complement these tools, two specific user manuals are provided. The joint use of the two tools would help reduce problems related to access to water resources compared to the current situation and represents a simplified solution for the design of pumping systems and the analysis of purification treatments to be performed in developing countries.
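
The screening step of the Wt tool reduces to a limit check on each measured parameter. A minimal Python sketch: the limits match the values quoted above, while the two-level criticality rule (ratio to the limit) is our assumption, since the abstract does not state how the levels are derived:

```python
LIMITS = {"turbidity_NTU": 5.0, "sodium_mg_L": 200.0, "ammonia_N_mg_L": 0.5}

def screen_sample(sample: dict) -> dict:
    """Flag each parameter whose measured value exceeds its limit."""
    flags = {}
    for param, limit in LIMITS.items():
        if param in sample:
            ratio = sample[param] / limit
            if ratio > 2.0:
                flags[param] = "high criticality"
            elif ratio > 1.0:
                flags[param] = "medium criticality"
    return flags

# Values reported for the Jeldi well:
print(screen_sample({"turbidity_NTU": 12, "sodium_mg_L": 234,
                     "ammonia_N_mg_L": 0.64}))
# {'turbidity_NTU': 'high criticality', 'sodium_mg_L': 'medium criticality',
#  'ammonia_N_mg_L': 'medium criticality'}
```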

Keywords: drinking water, Ethiopia, treatments, water pumping

Procedia PDF Downloads 127
26180 Improvement of Students’ Active Experience through the Provision of Foundational Architecture Pedagogy by Virtual Reality Tools

Authors: Mehdi Khakzand, Flora Fakourian

Abstract:

It has been seen in recent years that architects are using virtual modeling to help them visualize their projects. Research has indicated that virtual media, particularly virtual reality, enhance architects' comprehension of design and spatial perception. Creating a communal experience for active learning is an essential component of the design process in architecture pedagogy. Replicating design principles as a critical teaching function has been particularly challenging, and this is a complex issue that demands comprehension; nonetheless, the usage of simulation should be studied and limited as appropriate. In conjunction with extensive technology, 3D geometric illustration can bridge the gap between the real and virtual worlds. This research intends to deliver a pedagogical experience in a foundational architecture course that improves the architectural design process by utilizing virtual reality tools. The tool seeks to tackle challenges in current modes of architectural illustration by offering building geometry illustration, building information (data from the building information model), and simulation results. These tools were tested over three days in a design workshop with 12 architecture students. This article presents an architectural VR-based course and explores its application in boosting students' active experiences. According to the research, this technology can improve students' cognitive skills in challenging simulations by boosting visual understanding.

Keywords: active experience, architecture pedagogy, virtual reality, spatial perception

Procedia PDF Downloads 45
26179 Geographic Information System for District Level Energy Performance Simulations

Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck

Abstract:

The utilization of semantic, cadastral and topological data from geographic information systems (GIS) has increased exponentially for building and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analysis, algorithms and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigation, CityGML data models are considered for the simulations. Though geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) along with its significance is given. Consequently, addressing specific simulation input data, a workflow using Modelica is presented, underlining the usage of GIS information and quantifying its significance for annual heating energy demand.
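
As an illustration of the kind of GIS input such a workflow consumes, the sketch below pulls per-building attributes from a CityGML 2.0 file with lxml; the file name and the attributes queried are illustrative assumptions:

```python
from lxml import etree

NS = {
    "bldg": "http://www.opengis.net/citygml/building/2.0",
    "gml": "http://www.opengis.net/gml",
}

tree = etree.parse("district.gml")  # hypothetical CityGML 2.0 input
for b in tree.iterfind(".//bldg:Building", NS):
    gml_id = b.get("{http://www.opengis.net/gml}id")
    height = b.findtext("bldg:measuredHeight", namespaces=NS)
    storeys = b.findtext("bldg:storeysAboveGround", namespaces=NS)
    print(gml_id, height, storeys)  # inputs for a heat-demand estimate
```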

Keywords: CityGML, EnergyADE, energy performance simulation, GIS

Procedia PDF Downloads 141
26178 Performance Analysis of Proprietary and Non-Proprietary Tools for Regression Testing Using Genetic Algorithm

Authors: K. Hema Shankari, R. Thirumalaiselvi, N. V. Balasubramanian

Abstract:

The present paper addresses research in the area of regression testing, with emphasis on automated tools as well as prioritization of test cases. The uniqueness of regression testing and its cyclic nature are pointed out. The difference in approach between industry, with a business model as its basis, and academia, with its focus on data mining, is highlighted. Test metrics are discussed as a prelude to our formula for prioritization; a case study is then discussed to illustrate this methodology. An industrial case study is also described in the paper, where the number of test cases is so large that they have to be grouped into test suites. In such situations, the genetic algorithm we propose can be used to reconfigure these test suites in each cycle of regression testing. A comparison is made between a proprietary tool and an open source tool using the above-mentioned metrics. Our approach is clarified through several tables.
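
The APFD (Average Percentage of Faults Detected) metric listed in the keywords is the standard yardstick for comparing test-case prioritizations. A minimal sketch, with an illustrative fault matrix rather than data from the paper:

```python
# APFD = 1 - (TF1 + ... + TFm) / (n * m) + 1 / (2n), where TFi is the
# position of the first test that reveals fault i, n is the number of
# tests and m the number of faults. Higher is better.

def apfd(order, fault_matrix):
    """order: test ids in execution order; fault_matrix: test id -> faults found."""
    faults = set().union(*fault_matrix.values())
    n, m = len(order), len(faults)
    first_positions = [
        next(i + 1 for i, t in enumerate(order) if f in fault_matrix[t])
        for f in faults
    ]
    return 1 - sum(first_positions) / (n * m) + 1 / (2 * n)

faults_found = {"t1": {"f1"}, "t2": {"f1", "f2"}, "t3": {"f3"}, "t4": set()}
print(apfd(["t2", "t3", "t1", "t4"], faults_found))  # prioritized: ~0.79
print(apfd(["t4", "t1", "t2", "t3"], faults_found))  # poor order:  ~0.38
```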

Keywords: APFD metric, genetic algorithm, regression testing, RFT tool, test case prioritization, selenium tool

Procedia PDF Downloads 401
26177 Analyzing Information Management in Science and Technology Institute Libraries in India

Authors: P. M. Naushad Ali

Abstract:

India’s strength in basic research is recognized internationally. Science and technology research in India is performed by six distinct types of bodies: cooperative research associations, autonomous research councils, institutes under ministries, industrial R&D establishments, universities, and private institutions. Almost all of these institutions have a well-established library/information centre to cater to the information needs of their users, such as scientists and technologists. Information Management (IM) comprises the disciplines concerned with the study and the effective and efficient management of information, resources, products and services, as well as an understanding of the technologies involved and the people engaged in this activity. It is also observed that libraries and information centres in India are using modern technologies for the management of various activities and services in order to serve their users better. Science and technology libraries in the country are usually better equipped because the investment in science and technology in the country is much larger than in other fields; thus, most science and technology libraries are equipped with modern IT-based tools for the handling and management of library services. Despite having the characteristics of a model organization in which computer applications tend to succeed, the adoption of IT-based management tools is not uniform across these libraries. The present study assesses the level of use of IT-based management tools for information management in science and technology libraries in India. Questionnaire, interview, observation and document review techniques have been used for data collection. Finally, the author discusses the findings of the study and puts forward some suggestions to improve the quality of science and technology institute library services in India.

Keywords: information management, science and technology libraries, India, IT-based tools

Procedia PDF Downloads 366
26176 Institutional Engineering and Party Politics in Nigeria’s Fourth Republic

Authors: Emmanuel Ayobami Adesiyan

Abstract:

Political theorists have identified ethnicity as an obstacle to democratic stability in deeply divided societies. Nigeria belongs to the category of problematic states labeled divided or deeply divided societies; as such, its post-independence politics has been characterized by ethnicity, with ruinous effects on democratic governance and development. Institutional engineering, the purposive manipulation of electoral rules relating to party organization and the electoral formula, has been established in comparative political studies as a policy measure for managing ethnicity in order to stabilize politics in divided societies. This paper examines the use of electoral engineering tools in managing ethnic politics in Nigeria's Fourth Republic. The study is guided by rational institutional theory. Secondary data on electoral rules and disaggregated results of presidential elections were collected from archival documents. Data were subjected to content analysis. Institutional changes in electoral rules have promoted the development of inter-ethnic bargaining and compromises within the party system. The presidential electoral formula aided the emergence of national rather than parochial parties. Electoral engineering tools moved Nigerian politics from ethnic parochialism to inclusion and accommodation. These innovations should be strengthened to enhance democratic stability.

Keywords: Nigeria, presidential-elections, ethnic politics, institutional engineering

Procedia PDF Downloads 194
26175 Sustainability of High-Rise Affordable Housing: Critical Issues in Applying Green Building Rating Tools

Authors: Poh Im. Lim, Hillary Yee Qin. Tan

Abstract:

Nowadays, going green has become a trend and is being emphasized in the construction industry. In Malaysia, there are several green rating tools available in the industry; among these, GBI and GreenRE are considered the most common tools adopted for residential buildings. However, being green does not equal being sustainable. Being sustainable means taking economic, environmental and social aspects into consideration. This is particularly essential in the affordable housing sector, as the end-users belong to lower-income groups and place importance on many socio-economic needs beyond the environmental criteria. This paper discusses the arguments for proposing a sustainability framework that is tailor-made for high-rise affordable housing. In-depth interviews and observation mapping methods were used to gather inputs from the end-users, non-governmental organisations (NGOs) and professionals. A 'bottom-up' approach was applied in this research to show the significance of participation by the local community in the decision-making process. The proposed sustainability framework illustrates the discrepancies between user priorities and what the industry is providing. The outcome of this research suggests that integrating sustainability into high-rise affordable housing is achievable and beneficial to the industry, society, and the environment.

Keywords: green building rating tools, high-rise affordable housing, sustainability framework, sustainable development

Procedia PDF Downloads 108
26174 Big Data in Construction Project Management: The Colombian Northeast Case

Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez

Abstract:

In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made collection, analysis, traceability, and dissemination essential for project managers. In this sense, there are current trends toward facilitating efficient decision-making with emerging technologies such as Machine Learning, Data Analytics, Data Mining, and Big Data; the latter is the focus of this project. This research is part of the thematic line of construction methods and project management. Many authors note the relevance that emerging technologies, such as Big Data, have gained in recent years in project management in the construction sector, the main focus being the optimization of time, scope, and budget, and, in general, the mitigation of risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low, and also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will allow the generation of guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.
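
The association reported between interest in new data technologies and BIM incorporation can be checked with a standard contingency-table test; a minimal sketch with hypothetical counts, not the survey's actual responses:

```python
# Chi-square test of independence on illustrative survey counts.
from scipy.stats import chi2_contingency

#                 uses BIM   no BIM
table = [[28, 12],   # interested in Big Data tools
         [15, 36]]   # not interested
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")  # small p suggests an association
```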

Keywords: big data, building information modeling, technology, project management

Procedia PDF Downloads 100
26173 On the Development of Medical Additive Manufacturing in Egypt

Authors: Khalid Abdelghany

Abstract:

Additive Manufacturing (AM) is the manufacturing technology used to fabricate products directly from CAD models in a very short time and with minimal operation steps. Together with advances in medical computer modeling, AM has proved to be a very efficient tool to help physicians, orthopedic surgeons and dentists design and fabricate patient-tailored surgical guides, templates and customized implants from the patient's CT / MRI images. AM, jointly with computer-assisted design/computer-assisted manufacturing (CAD/CAM) technology, has enabled medical practitioners to tailor physical models in a patient- and purpose-specific fashion and has helped in the design and manufacture of templates, appliances and devices with a high degree of accuracy using biocompatible materials. In developing countries, there are technical and financial limitations to implementing such advanced tools as an essential part of medical practice. The CMRDI institute in Egypt has been working in the field of medical additive manufacturing since 2003 and has assisted in the recovery of hundreds of poor patients using these advanced tools. This paper focuses on the surgical and dental use of 3D printing technology in Egypt as a developing country. The presented case studies have been designed and processed using the software tools and additive manufacturing machines at CMRDI through cooperative engineering and medical work. Results showed that the implementation of additive manufacturing tools in developing countries can be successful and economical compared to long treatment plans.
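
The image-to-model step at the core of such workflows can be sketched in a few lines: threshold a CT volume at a bone-like intensity and export a printable surface mesh. A minimal sketch; the input file, threshold and voxel spacing are illustrative assumptions, not CMRDI's actual toolchain:

```python
import numpy as np
from skimage import measure
from stl import mesh  # numpy-stl

volume = np.load("ct_volume.npy")  # hypothetical 3D array of CT intensities
verts, faces, _, _ = measure.marching_cubes(volume, level=300.0,
                                            spacing=(1.0, 0.5, 0.5))
surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
surface.vectors[:] = verts[faces]   # one (3, 3) triangle per face
surface.save("surgical_guide.stl")  # ready for slicing and printing
```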

Keywords: additive manufacturing, dental and orthopaedic stents, patient-specific surgical tools, titanium implants

Procedia PDF Downloads 282
26172 Early Childhood Education: Teachers Ability to Assess

Authors: Ade Dwi Utami

Abstract:

Pedagogic competence is the basic competence teachers need to perform their tasks as educators. The ability to assess has become one of the demands of teachers' pedagogic competence, and it is related to curriculum instructions and applications. This research is aimed at obtaining data concerning teachers' ability to assess, comprising understanding assessment; determining assessment types, tools and procedures; conducting the assessment process; and using assessment result information. It uses an explanatory mixed-methods design in which qualitative data are used to verify the quantitative data obtained through a survey. Quantitative data were collected by test, whereas qualitative data were collected by observation, interview and documentation. The analyzed data were then processed through a proportion study technique and categorized into high, medium and low. The results show that teachers' ability to assess falls into three groups: high (2%), medium (4%) and low (94%). The data show that teachers' ability to assess is still relatively low: teachers lack knowledge and comprehension of assessment application. This is verified by the qualitative data, which show that teachers did not state which aspect was assessed in learning, did not record children's behavior, and did not use the resulting data when designing a program. Teachers have assessment documents, yet these only serve to complete teachers' administrative requirements for the certification program; the assessment documents were thus not used on the basis of acquired knowledge. This condition should be a consideration for teacher-education institutions and the government in improving teachers' pedagogic competence, including the ability to assess.

Keywords: assessment, early childhood education, pedagogic competence, teachers

Procedia PDF Downloads 219
26171 Comparison of Quality Indices for Sediment Assessment in Ireland

Authors: Tayyaba Bibi, Jenny Ronan, Robert Hernan, Kathleen O’Rourke, Brendan McHugh, Evin McGovern, Michelle Giltrap, Gordon Chambers, James Wilson

Abstract:

Sediment contamination is a major source of ecosystem stress and has received significant attention from the scientific community. Both the Water Framework Directive (WFD) and the Marine Strategy Framework Directive (MSFD) require a robust set of tools for biological and chemical monitoring. For the MSFD in particular, causal links between contaminants and effects need to be assessed, and appropriate assessment tools are required in order to make an accurate evaluation. In this study, a range of recommended sediment bioassays and chemical measurements are assessed at a number of potentially impacted and minimally impacted locations around Ireland. Previously, assessment indices have been developed for individual compartments, i.e. contaminant levels or biomarker/bioassay responses. A number of assessment indices are applied to chemical and ecotoxicological data from the Seachange project and compared, including the metal pollution index (MPI), the pollution load index (PLI) and the Chapman index for chemistry, as well as the integrated biomarker response (IBR). The benefits and drawbacks of the use of indices and aggregation techniques are discussed. In addition, modelling of the raw data is investigated to analyse links between contaminants and effects.
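
Two of the chemistry indices have simple closed forms: the pollution load index is the geometric mean of per-metal contamination factors, and the metal pollution index is the geometric mean of the concentrations themselves. A minimal sketch with illustrative values, not the study's data:

```python
from math import prod

def pli(conc, background):
    """Pollution Load Index: geometric mean of contamination factors C/C_bg."""
    cf = [conc[m] / background[m] for m in conc]
    return prod(cf) ** (1 / len(cf))

def mpi(conc):
    """Metal Pollution Index: geometric mean of metal concentrations."""
    return prod(conc.values()) ** (1 / len(conc))

sediment = {"Cd": 0.4, "Pb": 35.0, "Zn": 180.0}   # mg/kg dry weight
background = {"Cd": 0.2, "Pb": 20.0, "Zn": 95.0}
print(f"PLI = {pli(sediment, background):.2f}, MPI = {mpi(sediment):.1f}")
```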

Keywords: bioassays, contamination indices, ecotoxicity, marine environment, sediments

Procedia PDF Downloads 194
26170 Forensic Analysis of Thumbnail Images in Windows 10

Authors: George Kurian, Hongmei Chi

Abstract:

Digital evidence plays a critical role in most legal investigations, and in many cases thumbnail databases reveal important information for an investigation. The probability of retrieving digital evidence from a computer or smart device has increased, even when a previous user has removed data and deleted apps on those devices. With advances in digital forensics, the ability to recover residual information from various thumbnail applications has improved. This paper focuses on investigating thumbnail information in Windows 10. Thumbnail images of interest in forensic investigations may be intact even when the original pictures have been deleted; it is our research goal to recover useful information from these thumbnails. In this research project, we use various forensics tools to collect leftover thumbnail information from deleted videos or pictures. We examine and describe the various thumbnail sources in Windows and propose a methodology for thumbnail collection and analysis from laptops or desktops. A machine learning algorithm is adopted to help speed up the identification of content in thumbnail pictures.
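
Once thumbnails have been carved from the Windows thumbcache databases, the text-recovery pass can be as simple as the sketch below. The directory layout is an illustrative assumption, and extraction from the thumbcache_*.db files is assumed to have been done beforehand with a dedicated tool:

```python
# OCR over a folder of recovered thumbnails to triage which ones merit
# manual review. Requires the Tesseract binary to be installed.
from pathlib import Path
from PIL import Image
import pytesseract

for thumb in Path("recovered_thumbs").glob("*.png"):
    text = pytesseract.image_to_string(Image.open(thumb)).strip()
    if text:
        print(f"{thumb.name}: {text[:80]}")
```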

Keywords: digital forensic, forensic tools, soundness, thumbnail, machine learning, OCR

Procedia PDF Downloads 88
26169 Neuropalliative Care in Patients with Progressive Neurological Disease in Czech Republic: Study Protocol

Authors: R. Bužgová, R. Kozáková, M. Škutová, M. Bar, P. Ressner, P. Bártová

Abstract:

Introduction: There has been increasing concern about the provision of palliative care to non-oncological patients in both the professional literature and clinical practice. However, there is not much scientific information on how to provide neurological and palliative care together. The main objective of the project is to create and verify a concept of neuro-palliative and rehabilitative care for patients with selected neurological diseases in an advanced stage, and to evaluate the bio-psychosocial and spiritual needs of these patients and their caregivers in relation to quality of life, using newly created standardized tools. Methodology: Triangulation of research methods (qualitative and quantitative) will be used. A concept of care and assessment tools will be developed by analyzing interviews and focus groups. Qualitative data will be analyzed using grounded theory. The concept of care will be tested in an intervention study. Using quantitative analysis, we will assess the effect of the intervention provided on the saturation of needs, quality of life, and quality of care. The research sample will be made up of patients with selected neurological diseases (Parkinson's syndrome, motor neuron disease, multiple sclerosis, Huntington's disease), together with patients' family members. Findings: Based on the qualitative data analysis, we will propose an integrated care model combining neurological, rehabilitative and specialist palliative care for patients with selected neurological diseases in different settings of care and services. Patients' needs related to quality of life will be described with the newly created and validated measuring tools before the start of the intervention (application of the neuro-palliative and palliative approach) and then at set time intervals. Conclusion: Based on the results, educational materials and a certified course for doctors and other health care professionals will be created.

Keywords: multidisciplinary approach, neuropalliative care, research, quality of life

Procedia PDF Downloads 261
26168 A Simulated Scenario of WikiGIS to Support the Iteration and Traceability Management of the Geodesign Process

Authors: Wided Batita, Stéphane Roche, Claude Caron

Abstract:

Geodesign is an emergent term related to a new and complex process; hence, tools, technologies and platforms need to be rethought in order to achieve its goals efficiently. A few tools have emerged since 2010, such as CommunityViz and GeoPlanner. In the era of Web 2.0 and collaboration, WikiGIS has been proposed as a new category of tool. In this paper, we present WikiGIS functionalities dealing mainly with iteration and traceability management to support the collaborative geodesign process. WikiGIS is built on GeoWeb 2.0 technologies, primarily on wiki, and aims at managing the tracking of participants' editing. This paper focuses on a simplified simulation to illustrate the strength of WikiGIS in the management of traceability and in access to history in a geodesign process. A cartographic user interface has been implemented, and a hypothetical use case has been imagined as a proof of concept.
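
Traceability of this kind reduces to keeping an append-only revision history per map feature, so that any iteration of the design can be replayed. A minimal sketch of such a structure, with illustrative field names rather than WikiGIS's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Revision:
    author: str
    timestamp: datetime
    geometry_wkt: str        # snapshot of the feature geometry, e.g. WKT
    comment: str = ""

@dataclass
class TrackedFeature:
    feature_id: str
    history: list = field(default_factory=list)

    def edit(self, author, geometry_wkt, comment=""):
        # append-only: nothing is overwritten, so the full history is kept
        self.history.append(Revision(author, datetime.utcnow(),
                                     geometry_wkt, comment))

    def state_at(self, iteration):
        """Replay the feature as it stood at a given design iteration."""
        return self.history[iteration].geometry_wkt

park = TrackedFeature("park-01")
park.edit("alice", "POLYGON((0 0, 1 0, 1 1, 0 1, 0 0))", "initial proposal")
park.edit("bob", "POLYGON((0 0, 2 0, 2 1, 0 1, 0 0))", "enlarged after meeting")
print(len(park.history), park.state_at(0))
```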

Keywords: geodesign, history, traceability, tracking of participants’ editing, WikiGIS

Procedia PDF Downloads 217
26167 Materialized View Effect on Query Performance

Authors: Yusuf Ziya Ayık, Ferhat Kahveci

Abstract:

Database management systems currently provide various tools, such as backup and maintenance, as well as statistical information such as resource usage and security. In terms of query performance, this paper covers query optimization, views, indexed tables, pre-computed materialized views, and query performance analysis, in which alternative query plans can be created and the least costly one selected to optimize a query. Indexes and views can be created on related table columns. The literature review for this study showed that, over time, and despite the growing capabilities of database management systems, only database administrators are aware of the need to deal with archival and transactional data types differently. Transactional data may be constantly changing data used in everyday life, while archival data may come from a completed questionnaire whose data input has finished. For both types of data, the database applies the same capabilities; but, as shown in the findings section, instead of repeating similar heavy calculations that produce the same results for the same query over survey results, a materialized view can serve those results far more simply. In this study, this performance difference was observed quantitatively in terms of query cost.
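
The pattern can be sketched in a few lines against PostgreSQL, one of the systems that supports materialized views natively; the table and column names are illustrative assumptions:

```python
import psycopg2

conn = psycopg2.connect("dbname=survey")  # hypothetical database
cur = conn.cursor()

# The heavy aggregation every report repeats...
cur.execute("""
    CREATE MATERIALIZED VIEW IF NOT EXISTS answer_stats AS
    SELECT question_id, AVG(score) AS avg_score, COUNT(*) AS n
    FROM answers
    GROUP BY question_id
""")

# ...is afterwards served from the pre-computed result:
cur.execute("SELECT * FROM answer_stats WHERE question_id = %s", (7,))
print(cur.fetchone())

# Archival survey data rarely changes, so an occasional refresh suffices:
cur.execute("REFRESH MATERIALIZED VIEW answer_stats")
conn.commit()
```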

Keywords: cost of query, database management systems, materialized view, query performance

Procedia PDF Downloads 250
26166 The Development of an Automated Computational Workflow to Prioritize Potential Resistance Variants in HIV Integrase Subtype C

Authors: Keaghan Brown

Abstract:

The prioritization of drug resistance mutations impacting protein folding or protein-drug and protein-DNA interactions within macromolecular systems is critical to the success of treatment regimens. With a continual increase in computational tools to assess these impacts, scalability and reproducibility have become essential components of computational analysis and experimental research. Here we introduce a bioinformatics pipeline that combines several structural analysis tools in a simplified workflow, optimizing the available computational hardware and software to automatically ease the flow of data transformations. Utilizing pre-established software tools, it was possible to develop a pipeline with a set of pre-defined functions that automate mutation introduction into the HIV-1 integrase protein structure, calculate the gain and loss of polar interactions, and calculate the change in protein folding energy. Additionally, an automated molecular dynamics analysis was implemented, which reduces the constant need for user input and output management. The resulting pipeline, Automated Mutation Introduction and Analysis (AMIA), is an open source set of scripts designed to introduce mutations and analyse their effects on the static protein structure as well as on the multi-conformational states sampled by molecular dynamics simulations. The workflow allows the user to visualize all outputs in a user-friendly manner, thereby enabling the prioritization of variant systems for experimental validation.
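
The overall shape of such a pipeline can be sketched as below. The step functions are placeholders with hypothetical names; the actual AMIA scripts wrap pre-established structural-analysis and molecular dynamics tools that are not reproduced here:

```python
from pathlib import Path

VARIANTS = ["G140S", "Q148H", "N155H"]  # example integrase substitutions

def introduce_mutation(pdb: Path, variant: str) -> Path:
    out = pdb.with_name(f"{pdb.stem}_{variant}.pdb")
    # placeholder: call a structure-editing tool here
    return out

def polar_interaction_delta(pdb: Path) -> float:
    return 0.0  # placeholder: polar contacts gained minus lost

def fold_energy_delta(pdb: Path) -> float:
    return 0.0  # placeholder: folding-energy change from an energy function

def run_md_analysis(pdb: Path) -> None:
    pass        # placeholder: launch and post-process an MD simulation

def prioritize(wild_type: Path) -> list:
    scores = {}
    for v in VARIANTS:
        mutant = introduce_mutation(wild_type, v)
        scores[v] = (fold_energy_delta(mutant), polar_interaction_delta(mutant))
        run_md_analysis(mutant)
    # rank variants by predicted destabilization for experimental follow-up
    return sorted(scores, key=lambda v: scores[v][0], reverse=True)

print(prioritize(Path("integrase_wt.pdb")))
```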

Keywords: automated workflow, variant prioritization, drug resistance, HIV Integrase

Procedia PDF Downloads 36
26165 Functional and Efficient Query Interpreters: Principle, Application and Performances’ Comparison

Authors: Laurent Thiry, Michel Hassenforder

Abstract:

This paper presents a general approach to implementing efficient query interpreters in a functional programming language. Indeed, most of the standard tools currently available use an imperative and/or object-oriented language for the implementation (e.g. Java for Jena-Fuseki), but other paradigms are possible, potentially with better performance. To proceed, the paper first explains how to model data structures and queries from a functional point of view. It then proposes a general methodology for measuring performance (i.e. the number of computation steps needed to answer a query) and explains how to integrate some optimization techniques (short-cut fusion and, more importantly, data transformations). It then compares the proposed functional server to a standard tool (Fuseki), demonstrating that the former can be two to ten times faster at answering queries.
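
The effect of short-cut fusion can be illustrated even in Python, using generators as a stand-in for the paper's functional setting: composing stages lazily removes the intermediate structures a naive pipeline would build. The triple data is illustrative:

```python
def eager(triples):
    subjects = [s for (s, p, o) in triples]         # intermediate list
    long_ids = [s for s in subjects if len(s) > 3]  # second pass, second list
    return len(long_ids)

def fused(triples):
    # one pass, no intermediate structures, same answer
    return sum(1 for (s, p, o) in triples if len(s) > 3)

data = [("alice", "knows", "bob"),
        ("charlie", "knows", "dana"),
        ("eve", "likes", "bob")]
assert eager(data) == fused(data) == 2
```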

Keywords: data transformation, functional programming, information server, optimization

Procedia PDF Downloads 126
26164 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins

Authors: Manju Kanu, Subrata Sinha, Surabhi Johari

Abstract:

The Ebola viruses are among the best-studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for the prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for the selection process, with Ebola virus as a model system; a great challenge in the field of Ebola virus research is to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools is used to screen and select antigen sequences as potential T-cell epitopes for supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins. Immunoinformatics tools were used to predict immunogenic peptides of the viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools, NetCTL 1.2, BIMAS and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB-SMM-align and NetMHCII 2.2, were used to predict the Class II putative epitopes. B-cell epitopes were predicted with BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools for each MHC class. Finally, the predicted peptide sequences for both MHC classes were examined for common regions, which were selected as common immunogenic peptides. Two immunogenic peptides were found for the viral proteins of Ebola virus: the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidate targets for vaccine design.
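
The consensus step amounts to intersecting the per-tool prediction sets; a minimal sketch in which the peptide lists are illustrative, apart from the two epitopes reported above:

```python
# Hypothetical per-tool Class I predictions; only peptides predicted by
# all three tools are kept as consensus epitopes.
netctl = {"FLESGAVKY", "SSLAKHGEY", "LTAVQMAVF"}
bimas = {"FLESGAVKY", "SSLAKHGEY", "YFGPAAEGI"}
syfpeithi = {"FLESGAVKY", "SSLAKHGEY", "MTNPARGQG"}

class_i_consensus = netctl & bimas & syfpeithi
print(sorted(class_i_consensus))  # ['FLESGAVKY', 'SSLAKHGEY']
```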

Keywords: epitope, B cell, immunogenicity, Ebola

Procedia PDF Downloads 284
26163 Hybrid Quasi-Steady Thermal Lattice Boltzmann Model for Studying the Behavior of Oil in Water Emulsions Used in Machining Tool Cooling and Lubrication

Authors: W. Hasan, H. Farhat, A. Alhilo, L. Tamimi

Abstract:

Oil in water (O/W) emulsions are utilized extensively for cooling and lubricating cutting tools during parts machining. A robust lattice Boltzmann (LBM) thermal-surfactant model, which provides a useful platform for exploring the characteristics of complex emulsions under a variety of flow conditions, is used here to study fluid behavior during conventional tool cooling. The transient thermal capabilities of the model are employed to simulate the effects of the flow conditions of O/W emulsions on the cooling of cutting tools. The model results show that the temperature outcome is only slightly affected by reversing the direction of the upper plate (workpiece). On the other hand, an important increase in effective viscosity is seen, which supports better lubrication during the work.
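
For orientation, the single-phase skeleton underlying any such model fits in a few dozen lines. The sketch below is a plain D2Q9 BGK simulation of the sheared film between a stationary and a moving plate; it omits the paper's thermal, surfactant and two-phase components, uses crude equilibrium boundary conditions, and all parameters are illustrative assumptions:

```python
import numpy as np

c = np.array([(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
              (1, 1), (-1, 1), (-1, -1), (1, -1)])   # D2Q9 velocities
w = np.array([4/9] + [1/9]*4 + [1/36]*4)             # lattice weights
nx, ny, tau, u_lid = 32, 17, 0.8, 0.05               # tau sets viscosity

def feq(rho, ux, uy):
    cu = c[:, 0, None, None] * ux + c[:, 1, None, None] * uy
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2
                                     - 1.5*(ux**2 + uy**2))

rho = np.ones((ny, nx))
ux, uy = np.zeros((ny, nx)), np.zeros((ny, nx))
f = feq(rho, ux, uy)
ones = np.ones((1, nx))
bottom = feq(ones, 0*ones, 0*ones)   # stationary plate, bottom row
lid = feq(ones, u_lid*ones, 0*ones)  # moving plate, top row

for _ in range(6000):
    f -= (f - feq(rho, ux, uy)) / tau  # BGK collision
    for i in range(9):                 # streaming (periodic in x)
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=1), c[i, 1], axis=0)
    f[:, :1, :], f[:, -1:, :] = bottom, lid  # equilibrium wall conditions
    rho = f.sum(axis=0)                # macroscopic fields
    ux = (f * c[:, 0, None, None]).sum(axis=0) / rho
    uy = (f * c[:, 1, None, None]).sum(axis=0) / rho

print(ux[:, nx // 2])  # close to the linear Couette profile across the gap
```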

Keywords: hybrid lattice Boltzmann method, Gunstensen model, thermal, surfactant-covered droplet, Marangoni stress

Procedia PDF Downloads 277
26162 AI-Based Technologies in International Arbitration: An Exploratory Study on the Practicability of Applying AI Tools in International Arbitration

Authors: Annabelle Onyefulu-Kingston

Abstract:

One of the major purposes of AI today is to evaluate and analyze millions of micro and macro data points in order to determine what is relevant in a particular case and present it in an adequate manner. Micro data, as far as it relates to AI in international arbitration, comprises the millions of key issues specifically mentioned by one or both parties, or by their counsels, arbitrators, or arbitral tribunals in arbitral proceedings; these can include the qualifications of expert witnesses and the admissibility of evidence, amongst others. Macro data, on the other hand, refers to data derived from the resolution of the dispute and, consequently, the final and binding award; notable examples include the rationale of the award and the specific and general damages awarded. This paper aims to critically evaluate and analyze the possibility of technological inclusion in international arbitration. The research employs a qualitative method, evaluating existing literature on the consequences of applying AI to both micro and macro data in international arbitration, and how this can assist parties, counsels, and arbitrators.

Keywords: AI-based technologies, algorithms, arbitrators, international arbitration

Procedia PDF Downloads 33
26161 Exploring Gaming-Learning Interaction in MMOG Using Data Mining Methods

Authors: Meng-Tzu Cheng, Louisa Rosenheck, Chen-Yen Lin, Eric Klopfer

Abstract:

The purpose of this research is to explore some of the ways in which gameplay data can be analyzed to yield results that feed back into the learning ecosystem. Back-end data was collected for all users as they played an MMOG, The Radix Endeavor, and this study reports analyses of a specific genetics quest using data mining techniques, including decision trees. The study revealed different reasons for quest failure between participants who eventually succeeded and those who never succeeded. Regarding in-game tool use, the trait examiner was a key tool in the quest completion process. The decision tree results further showed that a lack of trait examiner usage can be made up for with additional Punnett square uses, displaying multiple pathways to success in this quest. The methods of analysis used in this study and the resulting usage patterns indicate some useful ways that gameplay data can provide insights in two main areas. The first is for game designers to know how players are interacting with and learning from their game. The second is for players themselves, as well as their teachers, to get information on how they are progressing through the game, and to provide the help they may need based on strategies and misconceptions identified in the data.
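
The flavor of the analysis can be reproduced with scikit-learn on synthetic data; the feature names and the success rule below are hypothetical stand-ins for the actual back-end logs:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 200
trait_examiner_uses = rng.integers(0, 6, n)
punnett_square_uses = rng.integers(0, 6, n)
# Success is likelier with more tool use; low trait-examiner counts can be
# offset by extra Punnett square use, mimicking the pattern reported above.
success = (trait_examiner_uses + 0.8 * punnett_square_uses
           + rng.normal(0, 1, n)) > 3

X = np.column_stack([trait_examiner_uses, punnett_square_uses])
tree = DecisionTreeClassifier(max_depth=3).fit(X, success)
print(export_text(tree, feature_names=["trait_examiner", "punnett_square"]))
```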

Keywords: MMOG, decision tree, genetics, gaming-learning interaction

Procedia PDF Downloads 334
26160 Streamlining the Fuzzy Front-End and Improving the Usability of the Tools Involved

Authors: Michael N. O'Sullivan, Con Sheahan

Abstract:

Researchers have spent decades developing tools and techniques to aid teams in the new product development (NPD) process. Despite this, there is a huge gap between their academic prevalence and their industry adoption. For the fuzzy front-end in particular, there is a wide range of tools to choose from, including the Kano model, the House of Quality, and many others. In fact, there are so many tools that it can be difficult for teams to know which ones to use and how they interact with one another. Moreover, while the benefits of using these tools are obvious to industrialists, they are rarely used, as they carry a learning curve that is too steep and become too complex to manage over time. In essence, it is commonly believed that they are simply not worth the effort required to learn and use them. This research explores a streamlined process for the fuzzy front-end, assembling the most effective tools and making them accessible to everyone. The process was developed iteratively over the course of three years, following over 80 final-year NPD teams from engineering, design, technology, and construction as they carried a product from concept through to production specification. Questionnaires, focus groups, and observations were used to understand the usability issues with the tools involved, and a human-centred design approach was adopted to produce a solution to these issues. The solution takes the form of a physical toolkit, similar to a board game, which allows the team to play through an example of a new product development in order to understand the process and the tools before using it for their own product development efforts. A complementary website enhances the physical toolkit, providing more examples of the tools being used as well as deeper discussions of each topic, allowing teams to adapt the process to their skills, preferences and product type. Teams found the solution very useful and intuitive and experienced significantly less confusion and fewer mistakes with the process than teams who did not use it. Those with a design background found it especially useful for engineering principles like Quality Function Deployment, while those with an engineering or technology background found it especially useful for design and customer-requirements acquisition principles, like Voice of the Customer. Products developed using the toolkit are added to the website as further examples of how it can be used, creating a loop which helps future teams understand how the toolkit can be adapted to their project, whether it be a small consumer product or a large B2B service. The toolkit unlocks the potential of these beneficial tools for those in industry, both for large, experienced teams and for inexperienced start-ups. It allows users to assess the market potential of their product concept faster and more effectively, arriving at the product design stage with technical requirements prioritized according to their customers' needs and wants.
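
As one example of the tools involved, the Kano questionnaire classifies each candidate feature from a pair of answers (one to a functional question, one to a dysfunctional question) via the standard Kano evaluation table; a minimal sketch:

```python
ANSWERS = ["like", "must-be", "neutral", "live-with", "dislike"]
# A = attractive, O = one-dimensional, M = must-be, I = indifferent,
# R = reverse, Q = questionable
TABLE = [
    ["Q", "A", "A", "A", "O"],   # functional answer: like
    ["R", "I", "I", "I", "M"],   # functional answer: must-be
    ["R", "I", "I", "I", "M"],   # functional answer: neutral
    ["R", "I", "I", "I", "M"],   # functional answer: live-with
    ["R", "R", "R", "R", "Q"],   # functional answer: dislike
]

def kano_category(functional: str, dysfunctional: str) -> str:
    return TABLE[ANSWERS.index(functional)][ANSWERS.index(dysfunctional)]

# "I would like auto shut-off" / "I would dislike its absence"
print(kano_category("like", "dislike"))  # O: one-dimensional (performance)
```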

Keywords: new product development, fuzzy front-end, usability, Kano model, quality function deployment, voice of customer

Procedia PDF Downloads 88
26159 Implementation of Lean Manufacturing in Some Companies in Colombia: A Case Study

Authors: Natalia Marulanda, Henry González, Gonzalo León, Alejandro Hincapié

Abstract:

Continuous improvement tools are the result of a series of studies that developed theories and methodologies. These methodologies enable organizations to increase their levels of efficiency, effectiveness, and productivity. Based on them, the lean manufacturing philosophy was developed, resting on the optimization of resources, the elimination of waste, and the generation of value in products and services. Lean has been applied massively around the world, but Colombian companies have adopted it only incipiently. Therefore, the purpose of this article is to identify the impacts generated by the implementation of lean manufacturing tools in five companies located in Colombia's Medellín metropolitan area. It also seeks to compare the results obtained from the implementation of the lean philosophy with those of the Theory of Constraints. The methodology is both qualitative and quantitative and is based on case-study interviews with the leaders of the processes that used lean tools. The tools most used by the studied companies are 5S (100%) and TPM (80%); the least used tool is synchronous production (20%). The main reason for implementing lean was supply chain management (83.3%). For the application of lean and TOC, we did not find significant differences in impact in terms of methodology, areas of application, staff initiatives, supply chain management, planning, and training.

Keywords: business strategy, lean manufacturing, theory of constraints, supply chain

Procedia PDF Downloads 319
26158 Exploring Affordable Care Practs in Nigeria’s Health Insurance Discourse

Authors: Emmanuel Chinaguh, Kehinde Adeosun

Abstract:

Nigerians die prematurely, with a life expectancy of 55.75 years, 17.45 years below the world average of 73.2 (Worldometer, 2020). This is due, among other factors, to the country's limited access to high-quality healthcare. To increase access to good and affordable healthcare services, the National Health Insurance Authority (NHIA) Bill 2022 – which repealed the National Health Insurance Scheme Act 2004 – was passed into law. Applying Jacob Mey's (2001) pragmatic act (pract) theory, this study explores how the NHIA seeks to actualise these healthcare goals by characterising the general situational prototypes, or pragmemes, and pragmatic acts in institutional communications. Data was sourced from the NHIA operational guidelines, which run to 147 pages in four sections, and from posts shared on the NHIA Nigeria Twitter handle, which has 14,200 followers. Digital humanities tools like AntConc and Voyant were engaged for the data analysis, for text encoding and data visualisation. This study identifies the following discourse tokens in the data: advertisements and programmes, standards and accreditation, records and information, and offences and penalties. Advertisements and programmes pract facilitating, propagating, prospecting, advising and informing; standards and accreditation, and records and information, pract stating, informing and instructing; and offences and penalties pract stating and sanctioning. These practs combine to advance the goals of affordable care and universal access to quality healthcare services. The pragmatic acts were marked by these pragmatic tools: shared situational knowledge (SSK), relevance (REL), reference (REF) and inference (INF). This paper adds to the understanding of health insurance discourse in Nigeria as a mediated social practice that promotes the health of Nigerians.

Keywords: affordable care, NHIA, Nigeria’s health insurance discourse, pragmatic acts

Procedia PDF Downloads 50
26157 Modified InVEST for Whatsapp Messages Forensic Triage and Search through Visualization

Authors: Agria Rhamdhan

Abstract:

WhatsApp, the most popular mobile messaging app, has been used as evidence in many criminal cases. As the use of mobile messaging generates large amounts of data, forensic investigation faces the challenge of large data volumes. The hardest part of finding important evidence is that current practice relies on tools and techniques that require manual analysis to check all messages; analyzing large sets of mobile messaging data this way takes a lot of time and effort. Our work offers a methodology based on forensic triage that reduces large data sets to manageable ones that are easier to review in detail, and then presents the results through interactive visualization, showing important terms, entities and relationships through intelligent ranking using Term Frequency-Inverse Document Frequency (TF-IDF) and the Latent Dirichlet Allocation (LDA) model. By implementing this methodology, investigators can improve investigation processing time and the accuracy of results.
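
Both ranking steps are available off the shelf; a minimal sketch with a few hypothetical messages, using scikit-learn in place of whatever stack the tool actually builds on:

```python
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

messages = [
    "meet me at the warehouse tonight",
    "transfer the money to the usual account",
    "warehouse delivery confirmed for tonight",
    "send the account number again",
]

# TF-IDF: rank the terms that best characterize each message
tfidf = TfidfVectorizer(stop_words="english")
scores = tfidf.fit_transform(messages)
terms = tfidf.get_feature_names_out()
top = scores[0].toarray().ravel().argsort()[::-1][:3]
print("top terms, msg 0:", [terms[i] for i in top])

# LDA: group messages into latent topics for triage
counts = CountVectorizer(stop_words="english").fit_transform(messages)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
print("topic mixture, msg 0:", lda.transform(counts)[0].round(2))
```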

Keywords: forensics, triage, visualization, WhatsApp

Procedia PDF Downloads 130
26156 Improving Overall Equipment Effectiveness of CNC-VMC by Implementing Kobetsu Kaizen

Authors: Nakul Agrawal, Y. M. Puri

Abstract:

TPM methodology is a proven approach to increasing the Overall Equipment Effectiveness (OEE) of machines. OEE is an established method to monitor and improve the effectiveness of a manufacturing process; it is the product of equipment availability, performance efficiency and quality rate of manufacturing operations. The paper presents a project for improving the OEE of a CNC-VMC in a manufacturing industry with the help of the TPM tools Kaizen and Autonomous Maintenance. The aim is to enhance OEE by minimizing breakdowns and rework and by increasing availability, performance and quality. The OEE of the bottleneck machine, calculated over 4 months, was a low 53.3%. Root Cause Analysis (RCA) tools like the fishbone diagram and the Pareto chart are used to determine the reasons behind the low OEE, while Why-Why analysis is used to determine its underlying causes. The tools Kaizen and Autonomous Maintenance were effectively implemented on the CNC-VMC, eliminating the causes of breakdowns and preventing their recurrence. The results obtained from this approach show that the OEE of the CNC-VMC improved from 53.3% to 73.7%, which saves an average sum of Rs. 3,19,000.
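
The OEE arithmetic itself is a three-factor product; a minimal sketch with illustrative shift figures, not the machine data from the study:

```python
planned_time = 480       # minutes in a shift
downtime = 95            # breakdowns, setups
ideal_cycle_time = 1.5   # minutes per part
parts_produced = 200
defective = 12

availability = (planned_time - downtime) / planned_time
performance = (ideal_cycle_time * parts_produced) / (planned_time - downtime)
quality = (parts_produced - defective) / parts_produced

oee = availability * performance * quality
print(f"OEE = {oee:.1%}")  # 80.2% x 77.9% x 94.0% ≈ 58.7%
```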

Keywords: OEE, TPM, Kaizen, CNC-VMC, why-why analysis, RCA

Procedia PDF Downloads 353
26155 Usage of Visual Tools for Light Exploring with Children in the Geographical Istria Region Kindergartens in Republic of Croatia and Republic of Slovenia

Authors: Urianni Merlin, Đeni Zuliani Blašković

Abstract:

Inspired by the Reggio pedagogy approach, which explores light from physical, mathematical, artistic, and natural perspectives, this study emphasizes the value of visual tools in exploring light, a practice that opens up a wide area of experiential discovery and knowledge, especially when used with children in kindergartens. While there is some evidence in the literature of visual tools being used for light exploration in kindergartens in the Republic of Slovenia, in the Republic of Croatia there are few studies, and those published focus on exploring shadows, exploring physical characteristics, and theatrical play with light and shadow. The objectives of this research are to assess how much preschool teachers from kindergartens in the geographical Istria region use visual tools for light exploration as part of the activities offered to children, and whether this usage differs by work environment (Slovenian vs. Croatian Istria kindergartens; city vs. village kindergartens; preschool teachers' age and length of service). One hundred and one preschool teachers from the Croatian Istria region and 70 preschool teachers from the Slovenian Istria region responded to a self-made questionnaire on visual tool usage habits in their work. As predicted, the results show significant differences in visual tool usage by work environment, length of service, and age: preschool teachers from Slovenian Istria who work in city kindergartens, have 15 to 19 years of service, and are over 30 years of age use significantly more visual tools for light exploration. The results highlight differences in visual tool usage for light exploration within the small Istria peninsula that can be attributed to the different university art curricula in Slovenia and Croatia, and to the lifelong education offered in Slovenia, which is more open to the influence of Italian Reggio pedagogy and is further taken up by older preschool teachers with more service experience. Considering the small number of existing studies, this research contributes significantly to the field and motivates preschool teachers and scientists to implement the use of light tools in preschool and university curricula, especially in Croatia.

Keywords: activities with light, light exploring, preschool children, visual tools

Procedia PDF Downloads 53
26154 Nursing Preceptors' Perspectives of Assessment Competency

Authors: Watin Alkhelaiwi, Iseult Wilson, Marian Traynor, Katherine Rogers

Abstract:

Clinical nursing education allows nursing students to gain essential knowledge from practice experience and develop nursing skills in a variety of clinical environments. Integrating theoretical knowledge and practical skills is made easier for nursing students by providing opportunities for practice in a clinical environment. Nursing competency is an essential capability required to fulfill nursing responsibilities, and effective mentoring in clinical settings helps nursing students develop the necessary competence and promotes the integration of theory and practice. Preceptors play a considerable role in clinical nursing education, including the supervision of nursing students undergoing a rigorous clinical practicum; they are also involved in the clinical assessment of nursing students' competency. The assessment of nursing students' competence by professional practitioners is essential to investigate whether nurses have developed an adequate level of competence to deliver safe nursing care. Competency assessment remains challenging for nursing educators and preceptors, particularly owing to the complexity of the process: consistent assessment methods, and valid and reliable tools for measuring competence in clinical practice, are lacking. Nurse preceptors must assess students' competencies to prepare them for future professional responsibilities, but they encounter difficulties in doing so owing to the nature of the assessment process, the lack of standardised assessment tools, and a demanding clinical environment. The purpose of this study is to examine nursing preceptors' experiences of assessing nursing interns' competency in Saudi Arabia. The study has three objectives: the first is to examine the preceptors' view of the Saudi assessment tool in relation to preceptorship, assessment, the assessment tool, the nursing curriculum, and the grading system; the second and third are to examine preceptors' view of "competency" in nursing and their interpretations of the concept, and to assess the implications of the research in relation to the Saudi 2030 vision. The study uses an exploratory sequential mixed-methods design involving a two-phase project: a qualitative focus group study in phase 1, and a quantitative study, with a descriptive cross-sectional design (online survey), in phase 2. The results will inform the preceptors' view of the Saudi assessment tool in specific areas, including preceptorship and how preceptors are prepared to be assessors, and assessment and assessment tools, by identifying the appropriateness of the instrument for clinical practice. The results will also identify the challenges and difficulties that preceptors face. The focus group interview data will be analysed thematically, and SPSS software will be used for the analysis of the online survey data.

Keywords: clinical assessment tools, clinical competence, competency assessment, mentor, nursing, nurses, preceptor

Procedia PDF Downloads 31
26153 Knowledge Sharing Practices in the Healthcare Sector: Evidences from Primary Health Care Organizations in Indonesia

Authors: Galih Imaduddin

Abstract:

Knowledge has been viewed as one of the most important resources in organizations, including those that operate in the healthcare sector. On that basis, Knowledge Management (KM) is crucial for healthcare organizations to improve their productivity and ensure effective utilization of their resources. Despite the growing interest in understanding how KM might work for healthcare organizations, only a modest number of empirical inquiries have specifically focused on the tools and initiatives used to share knowledge. Hence, the main purpose of this paper is to investigate the way healthcare organizations, particularly public sector ones, utilize knowledge-sharing tools and initiatives for the benefit of patient care. Employing a qualitative method, thirteen Community Health Centers (CHCs) from a high-performing district health setting in Indonesia were observed. Data collection and analysis involved repeated document retrievals and interviews (n=41) with the multidisciplinary health professionals who work in these CHCs. A single case study was developed, reflecting on the means used to share knowledge, along with the factors that inhibited the exchange of knowledge among those health professionals. The study finds that all thirteen CHCs exhibited and applied knowledge-sharing means, including knowledge documents, virtual communication channels (i.e. emails and chat applications), and social learning forums such as staff meetings, morning briefings, and communities of practice. However, the intensity of use differed among the CHCs; organizational culture, leadership, professional boundaries, and employees' technological aptitude were presumed to be the factors inhibiting knowledge-sharing processes. Departing from the KM literature of other sectors, this study challenges the primacy of technology-based tools, suggesting that socially based initiatives can be more reliable for sharing knowledge, largely due to the nature of healthcare work, which is still predominantly based on tacit knowledge.

Keywords: knowledge management, knowledge sharing, knowledge sharing tools and initiatives, knowledge sharing inhibitors, primary health care organizations

Procedia PDF Downloads 220
26152 Semi-Automatic Segmentation of Mitochondria on Transmission Electron Microscopy Images Using Live-Wire and Surface Dragging Methods

Authors: Mahdieh Farzin Asanjan, Erkan Unal Mumcuoglu

Abstract:

Mitochondria are cytoplasmic organelles of the cell which have a significant role in a variety of cellular metabolic functions; they act as the power plants of the cell and are surrounded by two membranes. Significant morphological alterations are often due to changes in mitochondrial functions. A powerful technique for studying the three-dimensional (3D) structure of mitochondria and its alterations in disease states is electron microscope tomography. Detection of mitochondria in electron microscopy images is a challenging problem due to the presence of various subcellular structures and imaging artifacts, and each image typically contains more than one mitochondrion. Hand segmentation of mitochondria is tedious and time-consuming and requires special knowledge about mitochondria, while fully automatic segmentation methods lead to over-segmentation, so that mitochondria are not segmented properly. Therefore, semi-automatic segmentation methods requiring minimal manual effort are needed to edit the results of fully automatic methods. Here, two editing tools were implemented, applying spline-based surface dragging and interactive live-wire segmentation; they were applied separately to the results of fully automatic segmentation, and their 3D extensions were also studied and tested. The Dice coefficients in 2D and 3D were 0.93 and 0.92 for surface dragging using splines, and 0.94 and 0.91, respectively, for the live-wire method. The root mean square symmetric surface distance values in 2D and 3D were 0.69 and 0.93 for surface dragging, and 0.60 and 2.11 for the live-wire tool. Comparing the results of these editing tools with those of the automatic segmentation method shows that the editing tools led to better results, more similar to the ground truth image, although the required time was higher than the hand-segmentation time.
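
The Dice coefficient used above has a one-line definition; a minimal sketch on toy binary masks standing in for the 2D/3D segmentations:

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    a, b = a.astype(bool), b.astype(bool)
    return 2 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

seg = np.zeros((64, 64), dtype=bool); seg[10:40, 10:40] = True
gt = np.zeros((64, 64), dtype=bool); gt[12:42, 12:42] = True
print(f"Dice = {dice(seg, gt):.3f}")  # ~0.87 for this toy overlap
```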

Keywords: medical image segmentation, semi-automatic methods, transmission electron microscopy, surface dragging using splines, live-wire

Procedia PDF Downloads 138