Search results for: Agile SDLC Tools
1016 Enhancing Operational Efficiency and Patient Care at Johns Hopkins Aramco Healthcare through a Business Intelligence Framework
Authors: Muneera Mohammed Al-Dossary, Fatimah Mohammed Al-Dossary, Mashael Al-Shahrani, Amal Al-Tammemi
Abstract:
Johns Hopkins Aramco Healthcare (JAHA), a joint venture between Saudi Aramco and Johns Hopkins Medicine, delivers comprehensive healthcare services to a diverse patient population. Despite achieving high patient satisfaction rates and surpassing several operational targets, JAHA faces challenges such as appointment delays and resource inefficiencies. These issues highlight the need for an advanced, integrated approach to operational management. This paper proposes a Business Intelligence (BI) framework to address these challenges, leveraging tools such as Epic electronic health records and Tableau dashboards. The framework focuses on data integration, real-time monitoring, and predictive analytics to streamline operations and enhance decision-making. Key outcomes include reduced wait times (e.g., a 23% reduction in specialty clinic wait times) and improved operating room efficiency (from 95.83% to 98% completion rates). These advancements align with JAHA’s strategic objectives of optimizing resource utilization and delivering superior patient care. The findings underscore the transformative potential of BI in healthcare, enabling a shift from reactive to proactive operations management. The success of this implementation lays the foundation for future innovations, including machine learning models for more precise demand forecasting and resource allocation.
Keywords: business intelligence, operational efficiency, healthcare management, predictive analytics, patient care improvement, data integration, real-time monitoring, resource optimization, Johns Hopkins Aramco Healthcare, electronic health records, Tableau dashboards, predictive modeling, efficiency metrics, resource utilization, patient satisfaction
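The wait-time KPI reported above can be illustrated with a minimal sketch of the kind of computation a BI dashboard performs on appointment records. The field names and data below are illustrative assumptions, not JAHA's actual schema or figures.

```python
# Hypothetical sketch: computing a mean-wait KPI from appointment records,
# as a BI dashboard fed from EHR extracts might. Data is invented.
from datetime import datetime

appointments = [
    # (scheduled time, actual start time)
    (datetime(2024, 1, 8, 9, 0),  datetime(2024, 1, 8, 9, 40)),
    (datetime(2024, 1, 8, 10, 0), datetime(2024, 1, 8, 10, 25)),
    (datetime(2024, 1, 8, 11, 0), datetime(2024, 1, 8, 11, 5)),
]

def mean_wait_minutes(records):
    """Average patient wait in minutes across appointment records."""
    waits = [(start - sched).total_seconds() / 60 for sched, start in records]
    return sum(waits) / len(waits)

baseline = 30.0  # assumed pre-intervention mean wait, minutes
current = mean_wait_minutes(appointments)
reduction_pct = 100 * (baseline - current) / baseline
print(f"mean wait: {current:.1f} min, reduction vs baseline: {reduction_pct:.1f}%")
```

In practice such a metric would be recomputed continuously per clinic and surfaced on a Tableau dashboard rather than printed.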
Procedia PDF Downloads 2
1015 Energy Trading for Cooperative Microgrids with Renewable Energy Resources
Authors: Ziaullah, Shah Wahab Ali
Abstract:
Micro-grids equipped with heterogeneous energy resources embody the idea of small-scale distributed energy management (DEM). DEM helps minimize transmission and operation costs, and supports power management and peak-load demand reduction. Micro-grids are collections of small, independently controllable power-generating units and renewable energy resources. They also enable active customer participation by giving customers access to real-time information and control. The capability of fast restoration after faults, the integration of renewable energy resources, and Information and Communication Technologies (ICT) make the micro-grid an ideal architecture for distributed power systems. Micro-grids can include a bank of energy storage devices. The energy management system of a micro-grid can perform real-time forecasting of renewable resources, energy storage elements and controllable loads to produce short-term schedules that minimize total operating cost. We present a review of existing micro-grid optimization objectives, constraints, solution approaches and tools used for energy management. Cost-benefit analysis reveals that cooperation among different micro-grids can play a vital role in reducing energy import costs and improving system stability. Cooperative energy trading gives local demand more control over the optimization of power resources and their use, but interconnecting micro-grids also raises interconnectivity and power-trading issues. The literature shows that cooperative micro-grid energy trading remains an open area of research. In this paper, we propose and formulate an efficient energy management and trading module for interconnected micro-grids.
It is believed that this research will open new directions for future energy trading in cooperative and interconnected micro-grids.
Keywords: distributed energy management, information and communication technologies, microgrid, energy management
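The cost-reduction argument for cooperation can be sketched with a toy pooling model: micro-grids with surplus sell to those in deficit at a peer price below the utility import price, and only the residual deficit is imported. This is an illustrative sketch under assumed prices, not the paper's formulation.

```python
# Toy cooperative-trading sketch. Prices and balances are assumptions.
GRID_PRICE = 0.20  # $/kWh to import from the utility (assumed)
PEER_PRICE = 0.12  # $/kWh traded between cooperating micro-grids (assumed)

def trade(balances):
    """balances: net kWh per micro-grid (+ surplus, - deficit).
    Returns (peer_traded_kWh, grid_import_kWh)."""
    surplus = sum(b for b in balances if b > 0)
    deficit = -sum(b for b in balances if b < 0)
    traded = min(surplus, deficit)       # surplus absorbed by peers first
    return traded, deficit - traded      # remainder imported from the grid

traded, imported = trade([5.0, -3.0, -4.0, 1.0])
cost_coop = traded * PEER_PRICE + imported * GRID_PRICE
cost_solo = 7.0 * GRID_PRICE  # without cooperation, the full deficit is imported
```

A real energy-trading module would add network constraints, storage dynamics and forecast uncertainty on top of this balance.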
Procedia PDF Downloads 375
1014 A Constructivist and Strategic Approach to School Learning: A Study in a Tunisian Primary School
Authors: Slah Eddine Ben Fadhel
Abstract:
Despite the development of new pedagogic methods, current teaching practices put more emphasis on learning products than on the processes learners deploy. In school syllabi, for instance, very little time is devoted to explaining and analyzing problem-solving strategies that target students’ metacognitive procedures. Within a cognitive framework, teaching/learning contexts are conceived of in terms of cognitive, metacognitive and affective activities for the treatment of information. During these activities, learners develop an array of knowledge and strategies that can be subsumed within an active and constructive process. By investigating the concepts of strategies and metacognition, our purpose is to reflect upon the modalities at the heart of the learning process and thereby demonstrate the significance of a cognitive approach to learning. The paper is based on a study of 76 primary school pupils who experienced difficulty learning French. The population was divided into two groups: the first group underwent three months of strategy-based training in French. Throughout this phase, the teachers centred class activities on making learners aware of the strategies they deployed and guided them in appraising the steps they had taken, by means of a variety of tools, most prominently the logbook. The second group followed the usual learning context, with no recourse whatsoever to strategy-oriented tasks. The results show an improvement in French linguistic competences among the pupils trained with strategic procedures.
Furthermore, this improvement was also noted in the native language (Arabic), a fact that highlights the importance of an interdisciplinary investigation of (meta-)cognitive strategies. These results show that strategic learning promotes in pupils a better awareness of their own processes, which contributes to improving their general linguistic competences.
Keywords: constructive approach, cognitive strategies, metacognition, learning
Procedia PDF Downloads 212
1013 Transforming Data Science Curriculum Through Design Thinking
Authors: Samar Swaid
Abstract:
Today, corporations are moving toward the adoption of Design-Thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in Design-Thinking, IDEO (Innovation, Design, Engineering Organization), defines Design-Thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has been witnessed to be the road to developing innovative applications, interactive systems, scientific software and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on the Data Science program states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection and model interpretation.
Thus, the Data Science program includes design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing new ways of framing computational thinking. Here, we describe the fundamentals of Design-Thinking and teaching modules for data science programs.
Keywords: data science, design thinking, AI, curriculum, transformation
Procedia PDF Downloads 82
1012 Maturity Classification of Oil Palm Fresh Fruit Bunches Using Thermal Imaging Technique
Authors: Shahrzad Zolfagharnassab, Abdul Rashid Mohamed Shariff, Reza Ehsani, Hawa Ze Jaffar, Ishak Aris
Abstract:
Ripeness estimation of oil palm fresh fruit is an important process that affects the profitability and salability of oil palm fruits. The maturity or ripeness of the fruits influences the quality of the palm oil. The conventional procedure involves physical grading of Fresh Fruit Bunch (FFB) maturity by counting the number of loose fruits per bunch. This physical classification of oil palm FFB is costly and time-consuming, and the results are subject to human error. Hence, many researchers have tried to develop methods for ascertaining the maturity of oil palm fruits, and thereby indirectly the oil content of individual palm fruits, without the need for exhaustive oil extraction and analysis. This research investigates the potential of infrared (thermal) images as a predictor for classifying oil palm FFB ripeness. A total of 270 oil palm fresh fruit bunches of the most common cultivar, nigrescens, were collected across three maturity categories: under ripe, ripe and over ripe. Each sample was scanned with FLIR E60 and FLIR T440 thermal imaging cameras. The average temperature of each bunch was calculated by image processing in the FLIR Tools and FLIR ThermaCAM Researcher Pro 2.10 software environments. The results show that temperature decreased from immature to over-mature oil palm FFBs. An overall analysis-of-variance (ANOVA) test proved that this predictor gave significant differences between the under-ripe, ripe and over-ripe maturity categories, showing that temperature can be a good indicator for classifying oil palm FFB. Classification analysis was performed using the temperature of the FFB as the predictor with Linear Discriminant Analysis (LDA), Mahalanobis Discriminant Analysis (MDA), Artificial Neural Network (ANN) and K-Nearest Neighbor (KNN) methods. The highest overall classification accuracy, 88.2%, was obtained with the Artificial Neural Network.
This research proves that thermal imaging combined with a neural network can be used to classify oil palm maturity.
Keywords: artificial neural network, maturity classification, oil palm FFB, thermal imaging
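The single-feature classification idea behind the abstract — mean bunch temperature as predictor, with temperature falling as maturity rises — can be sketched with a tiny k-nearest-neighbour classifier, one of the methods the study compares. The temperatures below are synthetic; the study's data and its best ANN result (88.2%) are not reproduced here.

```python
# Minimal k-NN sketch: mean bunch temperature (deg C, synthetic) -> maturity
# class. Consistent with the abstract's finding that temperature decreases
# from under ripe to over ripe; values themselves are invented.
from collections import Counter

train = [(35.2, "under ripe"), (35.0, "under ripe"),
         (33.1, "ripe"),       (33.4, "ripe"),
         (31.0, "over ripe"),  (31.3, "over ripe")]

def knn_predict(temp, k=3):
    """Classify a bunch by the majority label of its k nearest temperatures."""
    nearest = sorted(train, key=lambda s: abs(s[0] - temp))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

print(knn_predict(33.0))  # -> ripe
```

The study's stronger ANN and discriminant-analysis classifiers replace this distance rule with learned decision boundaries over the same temperature feature.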
Procedia PDF Downloads 363
1011 The Role of Acoustical Design within Architectural Design in the Early Design Phase
Authors: O. Wright, N. Perkins, M. Donn, M. Halstead
Abstract:
This research responded to anecdotal evidence suggesting that inefficiencies in the architect-acoustician relationship may lead to ineffective acoustic design decisions. The acoustician spoken to believed that he was approached too late in the design phase; the architect approached valued acoustical qualities, yet struggled to interpret common measurement parameters. A preliminary investigation of these opinions indicated a gap in current New Zealand architectural discourse and now informs a 2016 Master of Architecture (Prof) thesis. Little meaningful information about acoustic intervention in the early design phase could be found in past literature. In the information that was sourced, authors focus on software as an incorporation tool without investigating why the flaws in the relationship exist in the first place. To explore this relationship further, a survey was designed. It underwent three phases to ensure its consistency and was delivered to a group of 51 acousticians from one international acoustics company. The results were then separated between New Zealand and off-shore respondents to identify trends. The survey results suggest that 75% of acousticians meet the architect fewer than five times per project. Instead of regular contact, a mediated method is adopted through a mix of telecommunication and written reports. Acousticians tend to be introduced later into New Zealand building projects than into corresponding off-shore projects. This delay corresponds to an increase in remedial action for each of the building types in the survey except auditoria and office buildings. Thirty-one participants have had their specifications challenged by an architect, and 71% of the acousticians believe that architects lack the knowledge to understand why the acoustic specifications are in place. The issues raised in this investigation align with the anecdotal evidence expressed by the two consultants.
It identifies a larger gap in the industry, where acoustics is treated remedially rather than identified as a possible design driver. Further research through design is suggested to understand the role of acoustics within architectural design and potential tools for its inclusion during, not after, the design process.
Keywords: architectural acoustics, early-design, interdisciplinary communication, remedial response
Procedia PDF Downloads 253
1010 Sustainability through Resilience: How Emergency Responders Cope with Stressors
Authors: Sophie Kroeling, Agnetha Schuchardt
Abstract:
Striving for sustainability brings many challenges for different fields of interest, e.g. security or health concerns. In Germany, civil protection is predominantly carried out by emergency responders, who perform essential tasks of civil protection. Based on the theoretical concepts of different psychological stress theories, this contribution focuses on the question of how the resilience of emergency responders can be improved. The goal is to identify resources and successful coping strategies that help to prevent and reduce negative outcomes during or after stressful events. The paper presents results from a qualitative analysis of semi-structured interviews with 20 emergency responders. These results provide insights into the complexity of coping processes (e.g. controlling the situation, or downplaying perceived personal threats through humor) and show the diversity of stressors (such as the complexity of the disastrous situation, intrusive press and media, or lack of social support within the organization). Self-efficacy expectation was a very important resource for coping with stressful situations. The results served as a starting point for a quantitative survey (conducted in March 2017), for the development of education and training tools for emergency responders, and for the improvement of critical incident stress management processes. First results from the quantitative study, with more than 700 participants, show that emergency responders use social coping both within their private social networks and within their aid organizations, and that both are correlated with resilience. Moreover, missing information, bureaucratic problems and social conflicts within the organization are events that the majority of participants considered very onerous. Further results from regression analysis will be presented.
The proposed paper will combine findings from the qualitative study with the quantitative results, illustrating figures and correlations with respective statements from the interviews. Finally, suggestions for improving emergency responders’ resilience are given, and it is discussed how this can contribute to civil security and, furthermore, to sustainable development.
Keywords: civil security, emergency responders, stress, resilience, resources
Procedia PDF Downloads 145
1009 Developing a Framework for Open Source Software Adoption in a Higher Education Institution in Uganda: A Case of Kyambogo University
Authors: Kafeero Frank
Abstract:
This study aimed at developing a framework for open source software adoption in an institution of higher learning in Uganda, with KIU as the study area. There were four main research questions, based on individual staff interaction with the open source software forum, perceived FOSS characteristics, organizational characteristics and external characteristics as factors that affect open source software adoption. The researcher used a causal-correlational research design to study the effects of these variables on open source software adoption. A quantitative approach was used, with a self-administered questionnaire given to a purposively and randomly sampled group of university ICT staff. The resulting data were analyzed using means, correlation coefficients and multivariate multiple regression analysis as statistical tools. The study reveals that individual staff interaction with the open source software forum and perceived FOSS characteristics were the primary factors that significantly affect FOSS adoption, while organizational and external factors were secondary, with no significant effect but a significant correlation with open source software adoption. It was concluded that for effective open source software adoption to occur, more effort must go into the primary factors, with subsequent reinforcement of the secondary factors. Lastly, recommendations were made, in line with the conclusions, for a Kyambogo University framework for open source software adoption in institutions of higher learning. Recommended areas of further research include: a stakeholder analysis of open source software adoption in Uganda, its challenges and the way forward; an evaluation of the Kyambogo University framework for open source software adoption in institutions of higher learning; and framework development for cloud computing adoption in Ugandan universities.
A framework for FOSS development in the Ugandan IT industry is also proposed for further study.
Keywords: open source software, organisational characteristics, external characteristics, cloud computing adoption
Procedia PDF Downloads 72
1008 Sizing of Drying Processes to Optimize Conservation of the Nuclear Power Plants on Stationary
Authors: Assabo Mohamed, Bile Mohamed, Ali Farah, Isman Souleiman, Olga Alos Ramos, Marie Cadet
Abstract:
The life of a nuclear power plant is regularly punctuated by short or long outages to carry out maintenance operations and/or nuclear fuel reloading. During these stop periods, it is essential to conserve all the secondary-circuit equipment to avoid the onset of corrosion; this circuit is one of the main components of a nuclear reactor. Indeed, conserving materials during the shutdown of a nuclear unit improves circuit performance and considerably reduces maintenance cost. This study is part of the optimization of the dry preservation of equipment in the water station of the nuclear reactor. The main objective is to provide tools to guide the Electricity Production Nuclear Centre (EPNC) towards the criteria required by the chemical specifications for the conservation of materials. A theoretical model of the drying of the water station exchangers is developed in the Engineering Equation Solver (EES) software and used to size the air-flow and air-quality requirements for dry conservation of the equipment. The model is based on the heat and mass transfer governing the drying operation. A parametric study is conducted to determine the influence of the aerothermal factors involved in drying. The results show that successful dry conservation of the secondary-circuit equipment of a nuclear reactor depends strongly on the draining, the quality of the drying air, and the air flow injected into the secondary circuit. Finally, a theoretical case study performed in EES highlights the importance of mastering the entire system in order to balance the air distribution and provide each exchanger with the optimum flow for its characteristics. From these results, recommendations can be formulated for nuclear power plants to optimize drying practices and achieve good conservation performance while the plant is at a standstill.
Keywords: dry conservation, optimization, sizing, water station
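The sizing balance at the core of such a drying model can be sketched back-of-the-envelope: the dry-air flow needed to carry away residual water follows from the humidity-ratio difference between inlet air and the moister outlet air. The numbers below are illustrative assumptions, not the plant's data or the EES model itself.

```python
# Mass-balance sketch for drying-air sizing. All values are assumed.
def air_mass_flow(water_kg, hours, w_in, w_out):
    """kg dry air per second needed to evaporate water_kg in `hours`,
    given inlet/outlet humidity ratios w_in, w_out (kg water / kg dry air)."""
    if w_out <= w_in:
        raise ValueError("outlet air must be moister than inlet air")
    water_rate = water_kg / (hours * 3600)  # kg water removed per second
    return water_rate / (w_out - w_in)      # kg dry air per second

# e.g. 500 kg residual water removed in 48 h with dry inlet air (assumed)
flow = air_mass_flow(500, 48, w_in=0.002, w_out=0.010)
print(f"required dry-air flow: {flow:.3f} kg/s")
```

A full model like the one described would couple this balance with heat transfer in each exchanger and with psychrometric limits on the outlet humidity ratio.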
Procedia PDF Downloads 263
1007 The Significance of Computer Assisted Language Learning in Teaching English Grammar in Tribal Zone of Chhattisgarh
Authors: Yogesh Kumar Tiwari
Abstract:
Chhattisgarh has realized the fundamental role of information and communication technology in a globalized world where knowledge is paramount for growth and intellectual development. These technologies are spreading so widely that one feels left behind when not using them. Their influence has encompassed all aspects of the educational, business, and economic sectors of our world. Undeniably, the computer has not only established itself globally in all walks of life but has also acquired a fundamental role of paramount importance in the educational process. This role is becoming all-pervading and more powerful as computers are manufactured to be cheaper, smaller, more adaptable and easier to handle. Computers are becoming indispensable to teachers because of their enormous capabilities and extensive competence. This study aims at observing the effect of using a computer-based English language software program on the achievement of undergraduate students studying in a tribal area, the Sarguja Division, Chhattisgarh, India. To test the effect of innovative teaching in graduate classrooms in the tribal area, 50 students were randomly selected and separated into two groups. The first group of 25 students was taught English grammar, i.e., passive voice/narration, through the traditional method, using chalk and blackboard and asking some formal questions. The second, experimental, group was taught the same grammar using a computer and projector with a PowerPoint presentation of the grammatical items. Statistical analysis was performed on the students’ learning capacities and achievement. The result was striking for teacher and taught alike: recapitulation demonstrated that the students of the experimental group answered the questions enthusiastically, with an innovative sense of learning.
In light of the findings of the study, it is recommended that teachers and professors of English use self-made instructional programs in their teaching, particularly in tribal areas.
Keywords: achievement, computer assisted language learning, use of instructional program
Procedia PDF Downloads 150
1006 A Review of Critical Framework Assessment Matrices for Data Analysis on Overheating in Buildings Impact
Authors: Martin Adlington, Boris Ceranic, Sally Shazhad
Abstract:
In an effort to reduce carbon emissions, changes in UK regulations, such as Part L (Conservation of Fuel and Power), dictate improved thermal insulation and enhanced air tightness. These changes were a direct response to the UK Government's commitment to achieving its carbon targets under the Climate Change Act 2008, whose goal is to reduce emissions by at least 80% by 2050. Factors such as climate change are likely to exacerbate the problem of overheating, as this phenomenon is expected to increase the frequency of extreme heat events, exemplified by stagnant air masses and successive high minimum overnight temperatures. However, climate change is not the only concern relevant to overheating: as research signifies, location, design, occupation, construction type and layout can also play a part. Because of this growing problem, research points to possible health effects on the occupants of buildings. Increases in temperature can have a direct impact on the human body's ability to maintain thermoregulation, and the effects of heat-related illnesses such as heat stroke, heat exhaustion and heat syncope, and even death, can follow. This review paper presents a comprehensive evaluation of the current literature on the causes and health effects of overheating in buildings and examines the differing assessment approaches applied to measure the concept. An overview of the topic is presented first, followed by an examination of overheating research from the last decade. These papers form the body of the article and are grouped into a framework matrix summarizing the source material and identifying the differing methods of analysis of overheating. Cross-case evaluation has identified systematic relationships between different variables within the matrix.
Key areas of focus include building types and countries, occupant behavior, health effects, simulation tools, and computational methods.
Keywords: overheating, climate change, thermal comfort, health
Procedia PDF Downloads 351
1005 A Readiness Framework for Digital Innovation in Education: The Context of Academics and Policymakers in Higher Institutions of Learning to Assess the Preparedness of Their Institutions to Adopt and Incorporate Digital Innovation
Authors: Lufungula Osembe
Abstract:
The field of education has witnessed advances in technology and digital transformation. Methods of teaching have undergone significant changes in recent years, with effects on areas such as pedagogy, curriculum design, personalized teaching, gamification, data analytics, cloud-based learning applications, artificial intelligence tools, advanced plug-ins in LMSs, and the emergence of multimedia creation and design. Like engineering, health, science, and technology, education has not been immune to the changes brought about by digital innovation. There is a need to examine the variables that digital innovation brings to education and to develop a framework with which higher institutions of learning can assess their readiness to create a viable environment for digital innovation to be adopted successfully. Given the potential benefits of digital innovation in education, such a framework can assist academics and policymakers in higher institutions of learning in evaluating the effectiveness of adopting and adapting to the evolving landscape of digital innovation. The primary research question addressed in this study is therefore to establish the preparedness of higher institutions of learning to adopt and adapt to that evolving landscape. The study follows the Design Science Research (DSR) methodology, comprising problem awareness, suggestion, development, evaluation, and conclusion, to develop the readiness framework.
One of the major contributions of this study will be the development of the framework for digital innovation in education. Given the various opportunities offered by digital innovation in recent years, a readiness framework will play a crucial role in guiding academics and policymakers in their quest to align with the emerging technologies facilitated by digital innovation in education.
Keywords: digital innovation, DSR, education, opportunities, research
Procedia PDF Downloads 70
1004 Two-Level Graph Causality to Detect and Predict Random Cyber-Attacks
Authors: Van Trieu, Shouhuai Xu, Yusheng Feng
Abstract:
Tracking attack trajectories can be difficult when information about the nature of the attack is limited. It is even more difficult when attack information is collected by Intrusion Detection Systems (IDSs), because current IDSs have limitations in identifying malicious and anomalous traffic. Moreover, IDSs only point out suspicious events; they do not show how the events relate to each other or which event possibly caused another to happen. It is therefore important to investigate new methods capable of tracking attack trajectories quickly, with less attack information and less dependency on IDSs, in order to prioritize actions during incident response. This paper proposes a two-level graph causality framework for tracking attack trajectories in internet networks by leveraging observable malicious behaviors to detect the attack events most likely to cause another event to occur in the system. Technically, given a time series of malicious events, the framework extracts events with useful features, such as attack time and port number, and applies conditional independence tests to detect the relationships between attack events. Using academic datasets collected by IDSs, experimental results show that the framework can quickly detect causal pairs that offer meaningful insights into the nature of the internet network, given only reasonable restrictions on network size and structure. Without the framework's guidance, these insights could not be discovered by existing tools such as IDSs, and would cost expert human analysts significant time, if they could be obtained at all. The computational results from the proposed two-level graph network model reveal clear patterns and trends. In fact, for more than 85% of causal pairs, the average time difference between the causal and effect events, in both computed and observed data, is within 5 minutes. This result can be used as a preventive measure against future attacks.
Although the forecast horizon may be short, from 0.24 seconds to 5 minutes, it is long enough to be used to design a prevention protocol to block those attacks.
Keywords: causality, multilevel graph, cyber-attacks, prediction
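The first step of the idea above — flagging an ordered pair of event types (A, B) as a causal candidate when B repeatedly follows A within a short window (the abstract reports most causal pairs within 5 minutes) — can be sketched as follows. This is a toy illustration with invented event names and timestamps; the actual framework refines such candidates with conditional-independence tests.

```python
# Toy causal-candidate extraction from a time-sorted malicious-event log.
# Event types and timestamps are invented for illustration.
WINDOW = 300  # seconds; mirrors the 5-minute window reported in the abstract

def causal_candidates(events, min_hits=2):
    """events: list of (timestamp_s, event_type), assumed sorted by time.
    Returns {(cause, effect): count} for ordered pairs of distinct event
    types co-occurring within WINDOW at least min_hits times."""
    hits = {}
    for i, (t1, a) in enumerate(events):
        for t2, b in events[i + 1:]:
            if t2 - t1 > WINDOW:
                break  # later events are even further away; stop scanning
            if a != b:
                hits[(a, b)] = hits.get((a, b), 0) + 1
    return {pair: n for pair, n in hits.items() if n >= min_hits}

events = [(0, "scan"), (60, "login_fail"), (400, "scan"), (430, "login_fail"),
          (900, "exfil")]
print(causal_candidates(events))  # -> {('scan', 'login_fail'): 2}
```

In the framework proper, each surviving pair would then be tested for conditional independence given other events before being admitted as an edge in the causality graph.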
Procedia PDF Downloads 157
1003 Learning, Teaching and Assessing Students’ ESP Skills via Exe and Hot Potatoes Software Programs
Authors: Naira Poghosyan
Abstract:
In the knowledge society, the content of studies, the methods used and the requirements for an educator's professionalism regularly undergo change. It follows that the aim of education is not only to educate professionals for a certain field but also to help students become aware of cultural values, form mutual human relationships, collaborate, be open, adapt to new situations, creatively express their ideas, and accept responsibility and challenge. From this viewpoint, the development of communicative language competence requires a thoroughly coordinated approach to ensure proper comprehension and memorization of subject-specific words, starting from high school level. On the other hand, ESP (English for Specific Purposes) teachers and practitioners are increasingly faced with the task of developing and exploiting new ways of assessing their learners' literacy while learning and teaching ESP. The presentation will highlight the latest achievements in this field. The author will present some practical methodological issues and principles associated with learning, teaching and assessing learners' ESP skills using the two software programs eXe 2.0 and Hot Potatoes 6. On the one hand, the author will display the advantages of the two programs as self-learning and self-assessment interactive tools in the course of academic study and professional development of CLIL learners; on the other hand, she will comprehensively shed light upon some methodological aspects of working out appropriate ways of selecting, introducing and consolidating subject-specific materials via eXe 2.0 and Hot Potatoes 6. The author will then go further to distinguish ESP courses by the general nature of the learners' specialty, identifying three large categories: EST (English for Science and Technology), EBE (English for Business and Economics) and ESS (English for the Social Sciences).
The cornerstone of the presentation will be the introduction of the subject titled “The methodology of teaching ESP in non-linguistic institutions”, where a unique case of teaching ESP on Architecture and Construction via EXE 2.0 and Hot Potatoes 6 will be introduced, exemplifying how the introduction, consolidation and assessment can be used as a basis for feedback to the ESP learners in a particular professional field.Keywords: ESP competences, ESP skill assessment/ self-assessment tool, eXe 2.0 / HotPotatoes software program, ESP teaching strategies and techniques
Procedia PDF Downloads 378
1002 Surface Modification of Co-Based Nanostructures to Develop Intrinsic Fluorescence and Catalytic Activity
Authors: Monalisa Pal, Kalyan Mandal
Abstract:
Herein we report the molecular functionalization of promising transition metal oxide nanostructures, such as Co3O4 nanocubes, using the nontoxic and biocompatible organic ligand sodium tartrate. The electronic structural modification of the nanocubes imparted through functionalization and subsequent water solubilization reveals multiple absorption bands in the UV-vis region. Further surface modification of the solubilized nanocubes leads to the emergence of intrinsic multi-color fluorescence (from the blue, cyan and green to the red region of the spectrum) upon excitation at proper wavelengths, where the respective excitation wavelengths have a direct correlation with the observed UV-vis absorption bands. Using a multitude of spectroscopic tools, we have investigated the mechanism behind the origin of the different UV-vis absorption bands and the emergence of multicolor photoluminescence from the functionalized nanocubes. Our detailed study shows that ligand-to-metal charge transfer (LMCT) from the tartrate ligand to Co2+/Co3+ ions and d-d transitions involving Co2+/Co3+ ions are responsible for the generation of these novel optical properties. Magnetic study reveals that the antiferromagnetic nature of Co3O4 nanocubes changes to ferromagnetic behavior upon functionalization; however, the overall magnetic response was very weak. To combine strong magnetism with this novel optical property, we followed the same surface modification strategy in the case of CoFe2O4 nanoparticles, which reveals that, irrespective of size and shape, all Co-based oxides can develop intrinsic multi-color fluorescence upon facile functionalization with sodium tartrate ligands, and here the magnetic response was significantly higher. Surface-modified Co-based oxide nanostructures also show excellent catalytic activity in the degradation of biologically and environmentally harmful dyes.
We hope that our facile functionalization strategy for Co-based oxides will open up new opportunities in the field of biomedical applications such as bio-imaging and targeted drug delivery.Keywords: co-based oxide nanostructures, functionalization, multi-color fluorescence, catalysis
Procedia PDF Downloads 387
1001 Lean Production to Increase Reproducibility and Work Safety in the Laser Beam Melting Process Chain
Authors: C. Bay, A. Mahr, H. Groneberg, F. Döpper
Abstract:
Additive Manufacturing processes are becoming increasingly established in the industry for the economic production of complex prototypes and functional components. Laser beam melting (LBM), the most frequently used Additive Manufacturing technology for metal parts, has been gaining in industrial importance for several years. The LBM process chain – from material storage to machine set-up and component post-processing – requires many manual operations. These steps often depend on the manufactured component and are therefore not standardized; instead, their execution depends on the experience of the machine operator, e.g., levelling of the build plate and adjusting the first powder layer in the LBM machine. This lack of standardization limits the reproducibility of the component quality. When processing metal powders with inhalable and alveolar particle fractions, the machine operator is at high risk due to the high reactivity and the toxic (e.g., carcinogenic) effect of the various metal powders. Faulty execution of an operation or unintentional omission of safety-relevant steps can impair the health of the machine operator. In this paper, all the steps of the LBM process chain are first analysed in terms of their influence on the two aforementioned challenges: reproducibility and work safety. Standardization to avoid errors increases the reproducibility of component quality as well as the adherence to and correct execution of safety-relevant operations. The corresponding lean method 5S will therefore be applied in order to develop approaches in the form of recommended actions that standardize the work processes. These approaches will then be evaluated in terms of ease of implementation and their potential for improving reproducibility and work safety. The analysis and evaluation showed that sorting tools and spare parts as well as standardizing the workflow are likely to increase reproducibility.
Organizing the operational steps and production environment decreases the hazards of material handling and consequently improves work safety.Keywords: additive manufacturing, lean production, reproducibility, work safety
Procedia PDF Downloads 184
1000 Brachypodium: A Model Genus to Study Grass Genome Organisation at the Cytomolecular Level
Authors: R. Hasterok, A. Betekhtin, N. Borowska, A. Braszewska-Zalewska, E. Breda, K. Chwialkowska, R. Gorkiewicz, D. Idziak, J. Kwasniewska, M. Kwasniewski, D. Siwinska, A. Wiszynska, E. Wolny
Abstract:
In contrast to animals, the organisation of plant genomes at the cytomolecular level is still relatively poorly studied and understood. However, the Brachypodium genus in general and B. distachyon in particular represent exceptionally good model systems for such study. This is due not only to their highly desirable ‘model’ biological features, such as small nuclear genome, low chromosome number and complex phylogenetic relations, but also to the rapidly and continuously growing repertoire of experimental tools, such as large collections of accessions, WGS information, large insert (BAC) libraries of genomic DNA, etc. Advanced cytomolecular techniques, such as fluorescence in situ hybridisation (FISH) with evermore sophisticated probes, empowered by cutting-edge microscope and digital image acquisition and processing systems, offer unprecedented insight into chromatin organisation at various phases of the cell cycle. A good example is chromosome painting which uses pools of chromosome-specific BAC clones, and enables the tracking of individual chromosomes not only during cell division but also during interphase. This presentation outlines the present status of molecular cytogenetic analyses of plant genome structure, dynamics and evolution using B. distachyon and some of its relatives. The current projects focus on important scientific questions, such as: What mechanisms shape the karyotypes? Is the distribution of individual chromosomes within an interphase nucleus determined? Are there hot spots of structural rearrangement in Brachypodium chromosomes? Which epigenetic processes play a crucial role in B. distachyon embryo development and selective silencing of rRNA genes in Brachypodium allopolyploids? The authors acknowledge financial support from the Polish National Science Centre (grants no. 2012/04/A/NZ3/00572 and 2011/01/B/NZ3/00177)Keywords: Brachypodium, B. distachyon, chromosome, FISH, molecular cytogenetics, nucleus, plant genome organisation
Procedia PDF Downloads 351
999 Documenting the 15th Century Prints with RTI
Authors: Peter Fornaro, Lothar Schmitt
Abstract:
The Digital Humanities Lab and the Institute of Art History at the University of Basel are collaborating in the SNSF research project ‘Digital Materiality’. Its goal is to develop and enhance existing methods for the digital reproduction of cultural heritage objects in order to support art historical research. One part of the project focuses on the visualization of a small eye-catching group of early prints that are noteworthy for their subtle reliefs and glossy surfaces. Additionally, this group of objects – known as ‘paste prints’ – is characterized by its fragile state of preservation. Because of the brittle substances that were used for their production, most paste prints are heavily damaged and thus very hard to examine. These specific material properties make a photographic reproduction extremely difficult. To obtain better results we are working with Reflectance Transformation Imaging (RTI), a computational photographic method that is already used in archaeological and cultural heritage research. This technique allows documenting how three-dimensional surfaces respond to changing lighting situations. Our first results show that RTI can capture the material properties of paste prints and their current state of preservation more accurately than conventional photographs, although there are limitations with glossy surfaces because the mathematical models that are included in RTI are kept simple in order to keep the software robust and easy to use. To improve the method, we are currently developing tools for a more detailed analysis and simulation of the reflectance behavior. An enhanced analytical model for the representation and visualization of gloss will increase the significance of digital representations of cultural heritage objects. For collaborative efforts, we are working on a web-based viewer application for RTI images based on WebGL in order to make acquired data accessible to a broader international research community. 
At the ICDH Conference, we would like to present unpublished results of our work and discuss the implications of our concept for art history, computational photography and heritage science.Keywords: art history, computational photography, paste prints, reflectance transformation imaging
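RTI implementations typically fit a low-order per-pixel reflectance model to the captured image stack; the classic variant is Polynomial Texture Mapping (PTM), which solves a small least-squares problem for each pixel. The sketch below is an illustrative numpy reconstruction of that textbook PTM fit for a single pixel, not code from the Digital Materiality project; the function names are invented for this example.

```python
import numpy as np

def fit_ptm(light_dirs, intensities):
    """Fit the six PTM coefficients for one pixel by least squares.

    light_dirs:  (N, 2) array of projected light directions (lu, lv),
                 one row per photograph in the RTI capture.
    intensities: (N,) observed intensities of this pixel.
    Model: L(lu, lv) = a0*lu^2 + a1*lv^2 + a2*lu*lv + a3*lu + a4*lv + a5
    """
    lu, lv = light_dirs[:, 0], light_dirs[:, 1]
    A = np.column_stack([lu**2, lv**2, lu * lv, lu, lv, np.ones_like(lu)])
    coeffs, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return coeffs

def relight(coeffs, lu, lv):
    """Evaluate the fitted model under a new, virtual light direction."""
    return (coeffs[0] * lu**2 + coeffs[1] * lv**2 + coeffs[2] * lu * lv
            + coeffs[3] * lu + coeffs[4] * lv + coeffs[5])
```

The limitation the authors mention follows directly from this basis: a biquadratic in (lu, lv) varies smoothly, so sharp specular peaks on glossy paste prints are blurred out, which is why an enhanced analytical model for gloss is being developed.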
Procedia PDF Downloads 276
998 Evaluation of Video Quality Metrics and Performance Comparison on Contents Taken from Most Commonly Used Devices
Authors: Pratik Dhabal Deo, Manoj P.
Abstract:
With the increasing number of social media users, the amount of video content available has also significantly increased. Currently, the number of smartphone users is at its peak, and many are increasingly using their smartphones as their main photography and recording devices. There have been many developments in the field of Video Quality Assessment (VQA), and metrics like VMAF and SSIM are said to be among the best performing, but the evaluation of these metrics is predominantly done on professionally taken video content using professional tools, lighting conditions, etc. No study has specifically examined the performance of the metrics on content captured by users on very commonly available devices. Datasets that contain a huge number of videos from different high-end devices make it difficult to analyze the performance of the metrics on content from the most used devices, even if they contain content taken in poor lighting conditions using lower-end devices. These devices face a lot of distortions due to various factors, since the spectrum of content recorded on them is huge. In this paper, we present an analysis of objective VQA metrics on content taken only from the most used devices and their performance on it, focusing on full-reference metrics. To carry out this research, we created a custom dataset containing a total of 90 videos taken from the three most commonly used devices: an Android smartphone, an iOS smartphone and a DSLR. On the videos taken on each of these devices, the six most common types of distortions that users face have been applied in addition to the already existing H.264 compression, based on four reference videos. Each of these six applied distortions has three levels of degradation. The five most popular VQA metrics have been evaluated on this dataset, and the highest and the lowest values of each of the metrics on the distortions have been recorded.
Finally, it is found that blur is the artifact on which most of the metrics did not perform well. Thus, in order to understand the results better, the amount of blur in the dataset was calculated, and an additional evaluation of the metrics was done using the HEVC codec, the successor to H.264, on the camera that proved to be the sharpest among the devices. The results have shown that as the resolution increases, the performance of the metrics tends to become more accurate. The best performing metric among them is VQM, with very few inconsistencies and inaccurate results when the compression applied is H.264; when HEVC compression is applied, SSIM and VMAF perform significantly better.Keywords: distortion, metrics, performance, resolution, video quality assessment
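For context, full-reference metrics like those evaluated above score each distorted frame against its pristine reference. The sketch below is a simplified, numpy-only illustration of two such scores: a single-window (global) variant of SSIM and a PSNR helper. It is not the implementation used in the study, and production SSIM uses local sliding windows rather than whole-frame statistics.

```python
import numpy as np

def global_ssim(x, y, data_range=255.0):
    """Simplified SSIM computed from whole-frame statistics
    (illustration only; real SSIM uses local windows)."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()
    cov = ((x - mx) * (y - my)).mean()
    return ((2 * mx * my + c1) * (2 * cov + c2)) / ((mx**2 + my**2 + c1) * (vx + vy + c2))

def psnr(x, y, data_range=255.0):
    """Peak signal-to-noise ratio in dB between reference x and distorted y."""
    mse = np.mean((x - y) ** 2)
    return float('inf') if mse == 0 else 10 * np.log10(data_range**2 / mse)
```

Identical frames score SSIM 1.0 and infinite PSNR; both scores fall as distortion (noise, blur, compression) increases.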
Procedia PDF Downloads 204
997 Detecting Natural Fractures and Modeling Them to Optimize Field Development Plan in Libyan Deep Sandstone Reservoir (Case Study)
Authors: Tarek Duzan
Abstract:
Fractures are a fundamental property of most reservoirs. Despite their abundance, they remain difficult to detect and quantify. The most effective characterization of fractured reservoirs is accomplished by integrating geological, geophysical, and engineering data. Detecting fractures and defining their relative contribution is crucial in the early stages of exploration and later in the production of any field, because fractures can completely change our thoughts, efforts, and planning for producing a specific field properly. From the structural point of view, all reservoirs are fractured to some extent. The North Gialo field is thought to be a naturally fractured reservoir to some extent. Historically, naturally fractured reservoirs are more complicated in terms of their exploration and production efforts, and most geologists tend to deny the presence of fractures as an effective variable. Our aim in this paper is to determine the degree of fracturing so that our evaluation and planning can be done properly and efficiently from day one. The challenge in this field is that there are not enough data and straightforward well tests to make us completely comfortable with the idea of fracturing; however, we cannot ignore the fractures completely. Logging images, available well testing, and limited core studies are our tools at this stage to evaluate, model, and predict possible fracture effects in this reservoir. The aims of this study are both fundamental and practical: to improve the prediction and diagnosis of natural-fracture attributes in the N. Gialo hydrocarbon reservoir and to accurately simulate their influence on production. Moreover, production from this field follows a two-phase plan: self-depletion of oil, followed by a gas injection period for pressure maintenance and an increased ultimate recovery factor. Therefore, a good understanding of the fracture network is essential before proceeding with the targeted plan.
New analytical methods will lead to more realistic characterization of fractured and faulted reservoir rocks. These methods will produce data that can enhance well test and seismic interpretations, and that can readily be used in reservoir simulators.Keywords: natural fracture, sandstone reservoir, geological, geophysical, and engineering data
Procedia PDF Downloads 93
996 Numerical Simulation of Precast Concrete Panels for Airfield Pavement
Authors: Josef Novák, Alena Kohoutková, Vladimír Křístek, Jan Vodička
Abstract:
Numerical analysis software belongs to the main tools for simulating the real behavior of various concrete structures and elements. In comparison with experimental tests, it offers an affordable way to study the mechanical behavior of structures under various conditions. The contribution deals with a precast element of an innovative airfield pavement system which is being developed within an ongoing scientific project. The proposed system consists of a two-layer surface course of precast concrete panels positioned on a two-layer base of fiber-reinforced concrete with recycled aggregate. As the panels are supposed to be installed directly on the hardened base course, imperfections at the interface between the base course and surface course are expected. Considering such circumstances, three different behavior patterns could be established and considered when designing the precast element. The enormous cost of full-scale experiments makes it necessary to simulate the behavior of the element in numerical analysis software using the finite element method. The simulation was conducted on a nonlinear model in order to obtain results that could fully substitute for results from experiments. First, several loading schemes were considered with the aim of identifying the critical one, which was then used for the simulation. The main objective of the simulation was to optimize the reinforcement of the element subject to quasi-static loading from airplanes. Several parameters were considered when running the simulation, namely geometrical imperfections, manufacturing imperfections, the stress state in the reinforcement, the stress state in the concrete and the crack width. The numerical simulation revealed that the precast element should be heavily reinforced to fulfill all the assumed demands. The main cause of the high amount of reinforcement is the size of the imperfections which could occur in the real structure.
Improving manufacturing quality, installing the precast panels on a fresh base course, and using a bedding layer underneath the surface course are the main ways to reduce the size of imperfections and consequently lower the consumption of reinforcement.Keywords: nonlinear analysis, numerical simulation, precast concrete, pavement
Procedia PDF Downloads 257
995 Approach for Evaluating Wastewater Reuse Options in Agriculture
Authors: Manal Elgallal, Louise Fletcher, Barbara Evans
Abstract:
Water scarcity is a growing concern in many arid and semi-arid countries. Increasing water scarcity threatens economic development, the sustainability of human livelihoods and the environment, especially in developing countries. Globally, agriculture is the largest water-consuming sector, accounting for approximately 70% of all freshwater extraction. Growing competition between agricultural uses and higher-economic-value urban and industrial uses of high-quality freshwater supplies, especially in regions where water scarcity is a major problem, will increase the pressure on this precious resource. In this circumstance, wastewater may provide a reliable source of water for agriculture and enable freshwater to be exchanged for more economically valuable purposes. Concern regarding the risks from microbial and toxic components to human health and environmental quality is a serious obstacle to wastewater reuse, particularly in agriculture. Although powerful approaches and tools for microbial risk assessment and management for the safe use of wastewater are now available, few studies have attempted to provide any mechanism to quantitatively assess and manage the environmental risks resulting from reusing wastewater. In seeking pragmatic solutions to sustainable wastewater reuse, there remains a lack of research incorporating both health and environmental risk assessment and management with economic analysis in order to quantitatively combine costs, benefits and risks and thereby rank alternative reuse options.
This study seeks to enhance the effective reuse of wastewater for irrigation in arid and semi-arid areas. The outcome of the study is an evaluation approach that can be used to assess different reuse strategies and to determine the suitable scale at which treatment alternatives and interventions are possible, feasible and cost-effective, in order to optimise the trade-offs between reducing risks to public health and the environment and preserving the substantial benefits.Keywords: environmental risks, management, life cycle costs, waste water irrigation
Procedia PDF Downloads 263
994 Perception and Usage of Academic Social Networks among Scientists: A Cross-Sectional Study of North Indian Universities
Authors: Anita Chhatwal
Abstract:
Purpose: The purpose of this paper is to evaluate and investigate the usage of Academic Social Networking Websites (ASNs) by Science faculty members across universities of North India, viz. Panjab University, Punjabi University and University of Delhi, Delhi. Design/Methodology/Approach: The present study is based upon primary data collected from 81 science faculty participants from three universities of North India. A questionnaire was used as the survey instrument. The study is descriptive and research-based, investigating the ASNs popular amongst the participants from the three sample universities, the purposes for which they use them, and the problems they encounter while using ASNs. Findings: The findings of the study revealed that the majority of the participants were using ASNs for their academic needs. It was observed that the majority of the participants (78%) used ASNs to access scientific papers, while 73.8% of the participants used them to share their research publications. ResearchGate (60.5%) and Google Scholar (59.7%) were the two most preferred and widely used ASNs among the participants. The critical analysis of the data shows that laptops (86.3%) emerged as the major tool for accessing ASNs. A shortage of computers was found to be the chief obstacle to accessing ASNs. Results of the study demonstrate that 56.3% of participants suggested the conduct of seminars and training as the most effective method to increase awareness of ASNs. Research Limitations/Implications: The study covered 81 faculty members (Assistant Professors) from 15 science teaching departments across the three sample universities of North India. The findings of this study will help the Government of India to regulate and simultaneously make efforts to develop and enhance ASN usage among faculty, researchers, and students.
The present study will add to the existing library and information science literature and will be advantageous for information professionals as well. Originality/Value: This is an original survey based on primary data investigating the usage of ASNs by academia. It will be useful for research scholars, academicians and students all over the world.Keywords: academic social networks, awareness and usage, North India, scholarly communication, web 2.0
Procedia PDF Downloads 119
993 Enhancing Sell-In and Sell-Out Forecasting Using Ensemble Machine Learning Method
Authors: Vishal Das, Tianyi Mao, Zhicheng Geng, Carmen Flores, Diego Pelloso, Fang Wang
Abstract:
Accurate sell-in and sell-out forecasting is a ubiquitous problem in the retail industry. It is an important element of any demand planning activity. As a global food and beverage company, Nestlé has hundreds of products in each geographical location that it operates in. Each product has its sell-in and sell-out time series data, which are forecasted on a weekly and monthly scale for demand and financial planning. To address this challenge, Nestlé Chile, in collaboration with the Amazon Machine Learning Solutions Lab, has developed an in-house solution using machine learning models for forecasting. Similar products are combined such that there is one model for each product category. In this way, the models learn from a larger set of data, and there are fewer models to maintain. The solution is scalable to all product categories and is designed to be flexible enough to include any new product or eliminate any existing product in a product category based on requirements. We show how we can use the machine learning development environment on Amazon Web Services (AWS) to explore a set of forecasting models and create business intelligence dashboards that can be used with the existing demand planning tools in Nestlé. We explored recent deep neural networks (DNNs), which show promising results for a variety of time series forecasting problems. Specifically, we used a DeepAR autoregressive model that can group similar time series together and provide robust predictions. To further enhance the accuracy of the predictions and include domain-specific knowledge, we designed an ensemble approach using DeepAR and an XGBoost regression model. As part of the ensemble approach, we interlinked the sell-out and sell-in information to ensure that a future sell-out influences the current sell-in predictions. Our approach outperforms the benchmark statistical models by more than 50%.
The machine learning (ML) pipeline implemented in the cloud is currently being extended for other product categories and is getting adopted by other geomarkets.Keywords: sell-in and sell-out forecasting, demand planning, DeepAR, retail, ensemble machine learning, time-series
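The abstract does not give details of how the DeepAR and XGBoost outputs are combined, but a minimal form of such an ensemble is a convex blend of the two models' forecasts, with the blending weight chosen on a held-out validation window. The sketch below is an illustrative, numpy-only version of that idea; the function names and the grid-search weighting are assumptions for this example, not Nestlé's actual pipeline.

```python
import numpy as np

def blend_weight(val_actuals, val_pred_a, val_pred_b):
    """Pick the convex weight w for w*A + (1-w)*B that minimizes
    RMSE on a validation window, via a coarse grid search."""
    ws = np.linspace(0.0, 1.0, 101)
    errs = [np.sqrt(np.mean((w * val_pred_a + (1 - w) * val_pred_b - val_actuals) ** 2))
            for w in ws]
    return ws[int(np.argmin(errs))]

def ensemble_forecast(pred_a, pred_b, w):
    """Blend two forecast arrays with a fixed weight learned on validation data."""
    return w * pred_a + (1 - w) * pred_b
```

In a real pipeline, `pred_a` and `pred_b` would come from the trained DeepAR and XGBoost models respectively, and the weight could be refit per product category.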
Procedia PDF Downloads 276
992 Importance of Prostate Volume, Prostate Specific Antigen Density and Free/Total Prostate Specific Antigen Ratio for Prediction of Prostate Cancer
Authors: Aliseydi Bozkurt
Abstract:
Objectives: Benign prostatic hyperplasia (BPH) is the most common benign disease, and prostate cancer (PC) the most common malignant disease, of the prostate gland. Transrectal ultrasound-guided biopsy (TRUS-bx) is one of the most important tools in PC diagnosis. Identifying men at increased risk of having biopsy-detectable prostate cancer should take into account prostate specific antigen density (PSAD), the f/t PSA ratio and an estimate of prostate volume. Method: We retrospectively studied 269 patients who had a prostate specific antigen (PSA) level of 4 or above, or suspicious findings on rectal examination at any PSA level, and who underwent TRUS-bx between January 2015 and June 2018 in our clinic. TRUS-bx was performed by 12 experienced urologists using a 12-quadrant scheme. Prostate volume was calculated with TRUS prior to biopsy. Patients were classified as malignant or benign according to the final pathology. Age, PSA value, prostate volume on transrectal ultrasonography, the number of biopsy cores, biopsy pathology result, the number of cancer cores and Gleason score were evaluated in the study. The success rates of PV, PSAD, and f/tPSA in predicting prostate cancer were compared in all patients and in those with PSA 2.5-10 ng/mL and 10.1-30 ng/mL. Result: In the present study, in patients with PSA 2.5-10 ng/mL the PV cut-off value was 43.5 mL (n=42 < 43.5 mL and n=102 > 43.5 mL), while in those with PSA 10.1-30 ng/mL the prostate volume (PV) cut-off value was 61.5 mL (n=31 < 61.5 mL and n=36 > 61.5 mL). In the group with PSA 2.5-10 ng/mL, total PSA values in patients with PV > 43.5 mL were lower than in those with PV < 43.5 mL (6.0 ± 1.3 vs 6.7 ± 1.7), a difference of borderline significance (p=0.043). In the group with PSA 10.1-30 ng/mL, no significant difference was found (p=0.117) in total PSA values between patients with PV < 61.5 mL and those with PV > 61.5 mL.
In the group with PSA 2.5-10 ng/mL, the f/t PSA value in patients with PV < 43.5 mL was significantly lower than in the group with PV > 43.5 mL (0.21 ± 0.09 vs 0.26 ± 0.09, p < 0.001). Similarly, in the group with PSA 10.1-30 ng/mL, the f/t PSA value was significantly lower in patients with PV < 61.5 mL (0.16 ± 0.08 vs 0.23 ± 0.10, p=0.003). In the group with PSA 2.5-10 ng/mL, the PSAD value in patients with PV < 43.5 mL was significantly higher than in those with PV > 43.5 mL (0.17 ± 0.06 vs 0.10 ± 0.03, p < 0.001). Similarly, in the group with PSA 10.1-30 ng/mL, the PSAD value was significantly higher in patients with PV < 61.5 mL (0.47 ± 0.23 vs 0.17 ± 0.08, p < 0.001). The biopsy results showed that, in the group with PSA 2.5-10 ng/mL, cancer was detected in 29 of the patients with PV < 43.5 mL (69%) and not detected in 13 patients (31%), whereas in patients with PV > 43.5 mL cancer was found in 19 (18.6%) and not found in 83 (81.4%) (p < 0.001). In the group with PSA 10.1-30 ng/mL, cancer was observed in 21 patients with PV < 61.5 mL (67.7%) and was absent in only 10 patients (32.3%), whereas in patients with PV > 61.5 mL cancer was found in 5 (13.9%) and absent in 31 (86.1%) (p < 0.001). Conclusions: Identifying men at increased risk of having biopsy-detectable prostate cancer should take into account PSA, the f/t PSA ratio and an estimate of prostate volume. Prostate volume was found to be lower in PC patients.Keywords: prostate cancer, prostate volume, prostate specific antigen, free/total PSA ratio
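For reference, the derived quantities compared throughout this abstract are simple ratios: PSAD is total PSA divided by prostate volume, the f/t ratio is free PSA divided by total PSA, and the TRUS prostate volume is conventionally estimated with the prolate-ellipsoid formula. A small illustrative sketch (not the authors' code; the helper names are invented):

```python
import math

def ellipsoid_volume(length_cm, width_cm, height_cm):
    """Conventional TRUS prostate volume estimate (prolate ellipsoid):
    V = length * width * height * pi / 6, giving mL for cm inputs."""
    return length_cm * width_cm * height_cm * math.pi / 6.0

def psa_density(total_psa_ng_ml, prostate_volume_ml):
    """PSAD = total PSA (ng/mL) / prostate volume (mL)."""
    return total_psa_ng_ml / prostate_volume_ml

def free_to_total_ratio(free_psa_ng_ml, total_psa_ng_ml):
    """f/t PSA ratio; lower values are more suspicious for cancer."""
    return free_psa_ng_ml / total_psa_ng_ml
```

For example, a hypothetical patient with total PSA 6.8 ng/mL and a 40 mL prostate has a PSAD of 0.17, in the range reported above for the smaller-volume subgroup.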
Procedia PDF Downloads 151
991 Numerical Analysis of the Response of Thin Flexible Membranes to Free Surface Water Flow
Authors: Mahtab Makaremi Masouleh, Günter Wozniak
Abstract:
This work is part of a major research project concerning the design of a light, temporarily installable textile flood control structure. The motivation for this work is the great need for light structures to protect coastal areas from the detrimental effects of rapid water runoff. The prime objective of the study is the numerical analysis of the interaction between free surface water flow and slender, pliable structures, which plays a key role in the safety performance of the intended system. First, the behavior of a down-scaled membrane is examined under hydrostatic pressure with the Abaqus explicit solver, which is part of the finite element based, commercially available SIMULIA software. Then the procedure to achieve a stable and convergent solution for strongly coupled media including fluids and structures is explained. A partitioned strategy is imposed so that structures and fluids are discretized and solved with appropriate formulations and solvers. In this regard, the finite element method is again selected to analyze the structural domain. Moreover, computational fluid dynamics algorithms are introduced for solutions in the flow domain by means of the commercial package Star-CCM+. The SIMULIA co-simulation engine and an implicit coupling algorithm, the communication tools available in Star-CCM+, enable robust transmission of data between the two codes. This approach is discussed for two different cases and compared with available experimental records. In one case, the down-scaled membrane interacts with an open channel flow whose velocity increases with time. The second case illustrates how the full-scale flexible flood barrier behaves when a massive piece of flotsam is accelerated towards it.
Procedia PDF Downloads 187
990 Assessment of Environmental Quality of an Urban Setting
Authors: Namrata Khatri
Abstract:
The rapid growth of cities is transforming the urban environment and posing significant challenges for environmental quality. This study examines the urban environment of Belagavi in Karnataka, India, using geostatistical methods to assess the spatial pattern and land use distribution of the city and to evaluate the quality of the urban environment. The study is driven by the necessity of assessing the environmental impact of urbanisation. Satellite data was utilised to derive information on land use and land cover. The investigation revealed that land use had changed significantly over time, with a drop in plant cover and an increase in built-up areas. High-resolution satellite data was also utilised to map the city's open areas and gardens. GIS-based analysis was used to assess public green space accessibility and to identify regions with inadequate waste management practices. The findings revealed that garbage collection and disposal techniques in specific areas of the city needed to be improved. Moreover, the study evaluated the city's thermal environment using Landsat 8 land surface temperature (LST) data. The investigation found that built-up regions had higher LST values than green areas, pointing to the city's urban heat island (UHI) effect. The study's conclusions have far-reaching ramifications for urban planners and policymakers in Belagavi and other similar cities. The findings may be utilised to create sustainable urban planning strategies that address the environmental effects of urbanisation while also improving the quality of life for city dwellers. Satellite data and high-resolution satellite pictures were gathered for the study, and remote sensing and GIS tools were utilised to process and analyse the data. Ground truthing surveys were also carried out to confirm the accuracy of the remote sensing and GIS-based data.
Overall, this study provides a complete assessment of Belagavi's environmental quality and emphasises the potential of remote sensing and geographic information systems (GIS) approaches in environmental assessment and management. Keywords: environmental quality, UEQ, remote sensing, GIS
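The thermal-environment step above rests on converting Landsat 8 thermal-band digital numbers to temperature. A minimal sketch of that conversion follows, using the standard Band 10 calibration constants published in Landsat 8 metadata (MTL) files; the pixel values are toy inputs, and a full LST estimate would additionally apply an emissivity correction, which is omitted here.

```python
# Sketch of the Landsat 8 Band 10 radiance-to-brightness-temperature
# conversion underlying LST mapping. Constants are the standard MTL
# values; the digital numbers (DNs) below are illustrative only.

import math

ML, AL = 3.342e-4, 0.1               # Band 10 radiance rescaling factors
K1, K2 = 774.8853, 1321.0789         # Band 10 thermal conversion constants

def brightness_temp_c(dn):
    """Digital number -> top-of-atmosphere brightness temperature [deg C]."""
    radiance = ML * dn + AL                      # W / (m^2 * sr * um)
    kelvin = K2 / math.log(K1 / radiance + 1.0)
    return kelvin - 273.15

# Comparing mean temperatures over built-up vs. vegetated pixels (toy DNs)
built_up = [brightness_temp_c(dn) for dn in (30500, 31000, 31500)]
green = [brightness_temp_c(dn) for dn in (27500, 28000, 28500)]
uhi_intensity = sum(built_up) / len(built_up) - sum(green) / len(green)
```

A positive `uhi_intensity` (built-up pixels warmer than vegetated ones) is the signature of the UHI effect the study reports.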
Procedia PDF Downloads 81
989 Prediction of Springback in U-bending of W-Temper AA6082 Aluminum Alloy
Authors: Jemal Ebrahim Dessie, Lukács Zsolt
Abstract:
High-strength aluminum alloys have drawn a lot of attention because of the expanding demand for lightweight vehicle design in the automotive sector. Because of their poor formability at room temperature, warm and hot forming have been advised. However, warm and hot forming methods require additional production steps and an advanced tooling system. In contrast, forming sheets at room temperature in the W-temper condition is advantageous because ordinary tools can be used. However, the springback and thinning of supersaturated sheets are critical challenges that must be resolved before this technique can be applied. In this study, AA6082-T6 aluminum alloy was solution heat treated at different oven temperatures and times, using a specially designed and developed furnace, in order to optimize the W-temper heat treatment temperature. A U-shaped bending test was carried out with different time intervals between W-temper heat treatment and the forming operation. Finite element analysis (FEA) of the U-bending was conducted in AutoForm, aiming to validate the experimental results. A uniaxial tensile and unload test was performed in order to determine the kinematic hardening behavior of the material, which was calibrated in the finite element code using systematic process improvement (SPI). The simulation considered the effects of the friction coefficient and the blank holder force. Springback parameters were evaluated on the geometry adopted from the NUMISHEET '93 benchmark problem. The change of shape was larger for longer intervals between W-temper heat treatment and forming. The die radius was the most influential parameter for flange springback; however, the change of shape on the sidewall showed an overall increasing tendency as the punch radius increased relative to the die radius.
The springback angles on the flange and sidewall appear to be influenced more strongly by the coefficient of friction than by the blank holder force, and the effect increases with increasing blank holder force. Keywords: aluminum alloy, FEA, springback, SPI, U-bending, W-temper
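In the NUMISHEET '93 U-bending benchmark referenced above, springback is commonly quantified by how far the sidewall rotates away from its die-constrained orientation after unloading. A minimal sketch of that measurement follows; the section-profile coordinates are invented for illustration, not the study's measured or simulated data.

```python
# Hedged sketch of a springback-angle evaluation on a U-bent section
# profile: the sidewall is vertical while clamped in the die, and the
# angle it opens up after unloading is the springback measure.

import math

def sidewall_angle_deg(p_low, p_high):
    """Inclination of the sidewall segment from the vertical [deg]."""
    dx = p_high[0] - p_low[0]
    dy = p_high[1] - p_low[1]
    return math.degrees(math.atan2(abs(dx), dy))

# Two (x, z) profile points on the sidewall, in mm (illustrative values):
theta_loaded = sidewall_angle_deg((50.0, 0.0), (50.0, 35.0))    # in the die
theta_unloaded = sidewall_angle_deg((50.0, 0.0), (54.2, 35.0))  # after unload
springback = theta_unloaded - theta_loaded                      # opening angle
```

The same two-point construction applied to the flange segment gives the flange springback angle; in practice, both angles are read from the simulated or scanned section profile.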
Procedia PDF Downloads 100
988 A Descriptive Study of the Characteristics of Introductory Accounting Courses Offered by Community Colleges
Authors: Jonathan Nash, Allen Hartt, Catherine Plante
Abstract:
In many nations, community colleges, or similar institutions, play a crucial role in higher education. For example, in the United States more than half of all undergraduate students enroll in a community college at some point during their academic career. Similar statistics have been reported for Australia and Canada. Recognizing the important role these institutions play in educating future accountants, the American Accounting Association has called for research that contributes to a better understanding of these members of the academic community. Although previous literature has shown that community colleges and four-year institutions differ on many levels, the extant literature has provided data on the characteristics of introductory accounting courses for four-year institutions but not for community colleges. We fill a void in the literature by providing data on the characteristics of introductory accounting courses offered by community colleges in the United States. Data are collected on several dimensions, including course size and staffing, pedagogical orientation, standardization of course elements, textbook selection, and use of technology-based course management tools. Many of these dimensions have been used in previous research examining four-year institutions, thereby facilitating comparisons. The resulting data should be of interest to instructors, regulators and administrators, researchers, and the accounting profession. The data provide information on the introductory accounting courses completed by the average community college student, which can help instructors identify areas where transfer students' experiences might differ from those of their contemporaries at four-year colleges. Regulators and administrators may be interested in the differences between accounting courses offered by two- and four-year institutions when implementing standardized transfer programs.
Researchers might use the data to motivate future research into whether differences between two- and four-year institutions affect outcomes like the probability of students choosing to major in accounting and their performance within the major. Accounting professionals may use our findings as a springboard for facilitating discussions related to the accounting labor supply. Keywords: accounting curricula, community college, descriptive study, introductory accounting
Procedia PDF Downloads 102
987 Neighbor Caring Environment System (NCE) Using Parallel Replication Mechanism
Authors: Ahmad Shukri Mohd Noor, Emma Ahmad Sirajudin, Rabiei Mamat
Abstract:
Pertaining to a particular marine interest, the process of data sampling can take years before a study is concluded; the need for a robust backup system for the data is therefore implicit. In recent advancements of marine applications, more functionalities and tools are integrated to assist the work of researchers. This modality is anticipated to continue as research scope widens and intensifies, and to keep pace with current technologies and lifestyles. The convenience of collecting and sharing information these days also applies to work in marine research. Marine system designers should therefore be aware that high availability, along with a robust backup system for the data, is a necessary attribute of marine repository applications. In this paper, the approach to high availability concerns both hardware and software, but the focus is on software. We consider a NABTIC repository system that is primitively built on a single server and has no replicated components. First, the system is decomposed into separate modules. The modules are placed on multiple servers to create a distributed system. Redundancy is added by placing copies of the modules on different servers using the Neighbor Caring Environment System (NCE) technique, which utilizes a parallel replication mechanism. Background monitoring checks servers' heartbeats to confirm their aliveness. At the same time, a critical adaptive threshold is maintained to make sure a failure is detected in a timely manner using Adaptive Fault Detection (AFD). A confirmed failure triggers the recovery mode, in which a selection process is carried out before a fail-over server is instructed. In effect, the marine repository service continues as the fail-over masks the recent failure. The performance of the new prototype was tested and confirmed to provide higher availability.
Furthermore, the downtime is not noticeable, as service is restored automatically and immediately. The marine repository system can thus be said to have achieved fault tolerance. Keywords: availability, fault detection, replication, fault tolerance, marine application
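The heartbeat monitoring with an adaptive threshold described in the abstract can be sketched as follows. The monitor keeps a window of recent heartbeat inter-arrival times and suspects a server once the silence since its last heartbeat exceeds an adaptive timeout derived from that window; a fail-over replica is then selected. The window size, margin factor, and server names are assumptions for illustration, not details of the NCE implementation.

```python
# Minimal sketch of heartbeat-based adaptive fault detection (AFD) with
# fail-over selection. The timeout adapts to observed heartbeat timing
# rather than being fixed; margin and window size are assumed values.

from collections import deque

class AdaptiveDetector:
    def __init__(self, window=8, margin=3.0):
        self.intervals = deque(maxlen=window)  # recent inter-arrival times
        self.last = None                       # time of last heartbeat
        self.margin = margin                   # safety factor on the mean

    def heartbeat(self, t):
        """Record a heartbeat arriving at time t (seconds)."""
        if self.last is not None:
            self.intervals.append(t - self.last)
        self.last = t

    def threshold(self):
        """Adaptive timeout: margin times the mean inter-arrival time."""
        return self.margin * (sum(self.intervals) / len(self.intervals))

    def suspected(self, now):
        """True once the silence since the last heartbeat is too long."""
        return now - self.last > self.threshold()

def failover(replicas, failed):
    """Select the first alive neighbor replica to take over the service."""
    return next(r for r in replicas if r != failed)

det = AdaptiveDetector()
for t in (0.0, 1.0, 2.0, 3.0, 4.0):    # steady 1 s heartbeats
    det.heartbeat(t)
alive_at_5 = not det.suspected(5.0)    # 1 s of silence: within threshold
dead_at_9 = det.suspected(9.0)         # 5 s of silence: failure suspected
backup = failover(["srv-A", "srv-B", "srv-C"], failed="srv-A")
```

Because the threshold tracks observed timing, a naturally slow network raises the timeout instead of producing false suspicions, while the confirmed-failure path hands the service to the neighbor replica, masking the outage as described.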
Procedia PDF Downloads 323