Search results for: Web 2.0 tools
2837 Biomarkers, A Reliable Tool for Delineating Spill Trajectory
Authors: Okpor Victor, Selegha Abrakasa
Abstract:
Oil (petroleum) spills occur frequently, and in this era of heightened awareness it is pertinent that the trajectory of a spill be properly defined, so that the area impacted by the spill can be established with certainty. In this study, biomarkers, known as the custodians of palaeo-information in oils, are proposed as reliable tools for defining the pathway of a spill. Samples were collected as tills, alongside the GPS coordinates of the sample points suspected to have been impacted by a spill. Oils in the samples were extracted and analyzed as whole oil using GC–MS. Some biomarker parametric ratios were derived, and the ratios showed consistent values along the sample trail from sample 1 to sample 20. This consistency indicates that the oils at each sample point are the same, hence the same value. This method can be used to validate the trajectory/pathway of a spill and also to define or establish a suspected pathway for a spill. The oleanane/C30-hopane ratio showed good consistency and is suggested as a reliable parameter for establishing the trajectory of an oil spill.
Keywords: spill, biomarkers, trajectory, pathway
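To illustrate the consistency check this abstract describes, a minimal sketch follows; the peak areas and the coefficient-of-variation threshold below are hypothetical, not values from the study:

```python
from statistics import mean, stdev

def ratio_consistency(peak_pairs, cv_threshold=0.10):
    """Given (oleanane, C30-hopane) peak areas per sample point, return the
    ratios, their coefficient of variation, and whether they are consistent
    along the trail (CV below the threshold)."""
    ratios = [ole / hop for ole, hop in peak_pairs]
    cv = stdev(ratios) / mean(ratios)
    return ratios, cv, cv < cv_threshold

# Hypothetical peak areas for five sample points along a suspected trajectory.
samples = [(12.1, 60.2), (11.8, 59.5), (12.4, 61.0), (12.0, 60.1), (11.9, 59.8)]
ratios, cv, consistent = ratio_consistency(samples)
```

A near-constant ratio along the trail, as in this toy data, is the signal the authors interpret as the same oil at every point; a sudden jump in the ratio would mark a point outside the spill pathway.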
Procedia PDF Downloads 65

2836 Energy Performance of Buildings Due to Downscaled Seasonal Models
Authors: Anastasia K. Eleftheriadou, Athanasios Sfetsos, Nikolaos Gounaris
Abstract:
The present work examines the suitability of a seasonal forecasting model, downscaled with a very high spatial resolution, for assessing the energy performance and requirements of buildings. The developed model is applied to Greece with a forecast horizon of 5 months into the future. Greece, as a country in the middle of a financial crisis and facing serious societal challenges, is also very sensitive to climate change. The commonly used method for correlating climate change with building energy consumption is the concept of Degree Days (DD). This method can be applied to heating and cooling systems for better management of environmental, economic and energy crises, and can be used as a medium-term (3-6 months) planning tool to predict building needs and the country's requirements for residential energy use.
Keywords: downscaled seasonal models, degree days, energy performance
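The degree-day concept the abstract relies on can be sketched in a few lines; the daily temperatures and the 18 °C base below are hypothetical illustrations, not the study's data:

```python
def degree_days(daily_mean_temps, base=18.0):
    """Heating and cooling degree days over a period: the standard proxy
    linking outdoor climate to building heating/cooling energy demand."""
    hdd = sum(max(0.0, base - t) for t in daily_mean_temps)  # heating demand
    cdd = sum(max(0.0, t - base) for t in daily_mean_temps)  # cooling demand
    return hdd, cdd

# Hypothetical daily mean temperatures (deg C) from a downscaled forecast.
forecast = [12.0, 15.0, 20.0, 25.0]
hdd, cdd = degree_days(forecast)  # -> (9.0, 9.0)
```

Feeding a 3-6 month downscaled forecast through such a calculation yields the DD totals from which building energy requirements are then estimated.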
Procedia PDF Downloads 453

2835 Advanced Concrete Crack Detection Using Light-Weight MobileNetV2 Neural Network
Authors: Li Hui, Riyadh Hindi
Abstract:
Concrete structures frequently suffer from crack formation, a critical issue that can significantly reduce their lifespan by allowing damaging agents to enter. Traditional methods of crack detection depend on manual visual inspections, which rely heavily on the experience and expertise of inspectors and their tools. In this study, a more efficient, computer vision-based approach is introduced, using the lightweight MobileNetV2 neural network. A dataset of 40,000 images was used to develop a specialized crack evaluation algorithm. The analysis indicates that MobileNetV2 matches the accuracy of traditional CNN methods but is more efficient due to its smaller size, making it well suited for mobile device applications. The effectiveness and reliability of this new method were validated through experimental testing, highlighting its potential as an automated solution for crack detection in concrete structures.
Keywords: concrete crack, computer vision, deep learning, MobileNetV2 neural network
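The "smaller size" claim comes from MobileNetV2's use of depthwise separable convolutions. A sketch of the parameter arithmetic (example channel counts are illustrative, not the network's actual layers):

```python
def standard_conv_params(k, c_in, c_out):
    # k x k standard convolution: every output channel filters all input channels
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # depthwise k x k filter per input channel, then a 1x1 pointwise mix
    return k * k * c_in + c_in * c_out

std = standard_conv_params(3, 32, 64)        # 18432 weights
sep = depthwise_separable_params(3, 32, 64)  # 2336 weights
ratio = std / sep                            # roughly 8x fewer parameters
```

Repeated across all layers, this factor-of-several reduction is what makes the network light enough for the mobile inspection scenario the abstract targets.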
Procedia PDF Downloads 66

2834 Dynamic Soil Structure Interaction in Buildings
Authors: Shreya Thusoo, Karan Modi, Ankit Kumar Jha, Rajesh Kumar
Abstract:
Since the evolution of computational tools and simulation software, there has been a considerable increase in research on Soil Structure Interaction (SSI) to decrease computational time and increase accuracy of results. To aid the designer with a proper understanding of the response of structures in different soil types, this paper compares the deformation, shear stress, acceleration and other parameters of multi-storey buildings for a specific input ground motion using the Response Spectrum Analysis (RSA) method. The responses of models of different heights have been compared across soil types. The finite element simulation software ANSYS has been used for all computational purposes. Overall, a higher response is observed with SSI, and it increases with decreasing soil stiffness.
Keywords: soil-structure interaction, response spectrum analysis, finite element method, multi-storey buildings
Procedia PDF Downloads 480

2833 Delineato: Designing Distraction-Free GUIs
Authors: Fernando Miguel Campos, Fernando Jesus Aguiar Campos, Pedro Filipe Campos
Abstract:
Many software products offer a wide range and number of features. This is called featuritis or creeping featurism, and it tends to grow with each release of the product. Featuritis often adds unnecessary complexity to software, leading to longer learning curves, confusing users and degrading their experience. We look at an emerging design approach, the so-called "What You Get Is What You Need" concept, which argues that products should be very focused, simple and minimalistic in their interfaces in order to help users carry out their tasks in distraction-free environments. This is not as simple to implement as it might sound, as developers need to cut down features. Our contribution illustrates and evaluates this design method through a novel distraction-free diagramming tool named Delineato Pro for Mac OS X, in which the user is confronted with an empty canvas when launching the software and where tools only show up when really needed.
Keywords: diagramming, HCI, usability, user interface
Procedia PDF Downloads 527

2832 Terrorism: A Threat in Constant Evolution Still Misunderstood
Authors: M. J. Gazapo Lapayese
Abstract:
It is a well-established fact that terrorism is one of the foremost threats to present-day international security. The creation of tools or mechanisms for confronting it in an effective and efficient manner will only be possible by way of an objective assessment of the phenomenon. In order to achieve this, this paper has the following three main objectives: Firstly, setting out to find the reasons that have prevented the establishment of a universally accepted definition of terrorism, and consequently trying to outline the main features defining the face of the terrorist threat in order to discover the fundamental goals of what is now a serious blight on world society. Secondly, trying to explain the differences between a terrorist movement and a terrorist organisation, and the reasons for which a terrorist movement can be led to transform itself into an organisation. After analysing these motivations and the characteristics of a terrorist organisation, an example of the latter will be succinctly analysed to help the reader understand the ideas expressed. Lastly, discovering and exposing the factors that can lead to the appearance of terrorist tendencies, and discussing the most efficient and effective responses that can be given to this global security threat.
Keywords: responses, resilience, security, terrorism
Procedia PDF Downloads 453

2831 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
Procedia PDF Downloads 226

2830 Economic Policy of Achieving National Competitive Advantage
Authors: Gulnaz Erkomaishvili, Eteri Kharaishvili, Marina Chavleishvili
Abstract:
The paper discusses the economic policy of increasing national competitiveness and the tools and means which help a country improve its competitiveness. The sectors of the economy in which the country can achieve a competitive advantage are studied. It is noted that the country's economic policy plays an important role in obtaining and maintaining a competitive advantage: authorities should take measures to ensure a high level of education; scientific and research activities should be funded by the state; foreign direct investment should be attracted mainly to science-intensive industries; and adaptation to the latest scientific achievements of the modern world and deepening of scientific and technical cooperation should be pursued. A stable business environment and an export-oriented strategy are the basis for the country's economic growth. The studies have shown that institutional reforms in Georgia are not enough to significantly improve the country's competitiveness.
Keywords: competitiveness, economic policy, competitiveness improvement strategy, competitiveness of Georgia
Procedia PDF Downloads 128

2829 Chronically Ill Patient Satisfaction: An Indicator of Quality of Service Provided at Primary Health Care Settings in Alexandria
Authors: Alyaa Farouk Ibrahim, Gehan ElSayed, Ola Mamdouh, Nazek AbdelGhany
Abstract:
Background: Primary health care (PHC) can be considered the first contact between the patient and the health care system. It includes all the basic health care services to be provided to the community. Patient satisfaction regarding health care has often improved the provision of care, and it is considered one of the most important measures for evaluating health care. Objective: This study aims to identify patients' satisfaction with services provided at primary health care settings in Alexandria. Setting: Seven primary health care settings, representing the seven zones of Alexandria governorate, were selected randomly and included in the study. Subjects: The study comprised 386 patients who had attended the selected settings at least twice before the time of the study. Tools: Two tools were utilized for data collection: a sociodemographic characteristics and health status structured interview schedule, and a patient satisfaction scale. A reliability test for the scale was done using Cronbach's alpha; the results ranged between 0.717 and 0.967. The overall satisfaction was computed and divided into high, medium, and low satisfaction. Results: The age of the studied sample ranged between 19 and 62 years; more than half (54.2%) were aged 40 to less than 60 years. More than half (52.8%) of the patients included in the study were diabetic, 39.1% were hypertensive, 19.2% had cardiovascular diseases, and the rest of the sample had tumors, liver diseases, and orthopedic/neurological disorders (6.5%, 5.2% and 3.2%, respectively). The vast majority of the study group reported high satisfaction with overall service cost, environmental conditions, medical staff attitude and health education given at the PHC settings (87.8%, 90.7%, 86.3% and 90.9%, respectively); however, medium satisfaction was mostly reported concerning medical checkup procedures, follow-up data and the referral system (41.2%, 28.5% and 28.9%, respectively).
The score level of patient satisfaction with health services provided at the assessed primary health care settings proved to be significantly associated with patients' social status (P=0.003, X²=14.2), occupation (P=0.011, X²=11.2), and monthly income (P=0.039, X²=6.50). In addition, a significant association was observed between score level of satisfaction and type of illness (P=0.007, X²=9.366), type of medication (P=0.014, X²=9.033), and prior knowledge about the health center (P=0.050, X²=3.346), and a highly significant association with the administrative zone (P=0.001, X²=55.294). Conclusion: The current study revealed that overall service cost, environmental conditions, staff attitude and health education at the assessed primary health care settings gained a high patient satisfaction level, while medical checkup procedures, follow-up, and the referral system produced a medium level of satisfaction among the assessed patients. Moreover, social status, occupation, monthly income, type of illness, type of medication and administrative zone are all factors influencing patient satisfaction with services provided at the health facilities.
Keywords: patient satisfaction, chronic illness, quality of health service, quality of service indicators
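The X² values reported above come from Pearson chi-square tests of association on contingency tables. A minimal sketch of the statistic; the counts below are hypothetical, not the study's data:

```python
def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for an r x c
    contingency table (e.g. satisfaction level vs. patient group)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    chi2 = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = row_totals[i] * col_totals[j] / grand  # expected count
            chi2 += (obs - exp) ** 2 / exp
    df = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, df

# Hypothetical counts: rows = high/medium/low satisfaction, columns = two groups.
observed = [[30, 10], [20, 20], [10, 30]]
chi2, df = chi_square(observed)  # -> (20.0, 2)
```

The statistic is then compared against the chi-square distribution with the given degrees of freedom to obtain the P-values quoted in the abstract.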
Procedia PDF Downloads 352

2828 A Literature Review on Community Awareness, Education in Disaster Risk Reduction and Best Practices
Authors: Alwyn John Lim
Abstract:
The Philippines is one of the areas most vulnerable to natural disasters in the world. Almost every year, different types of natural disasters occur in the Philippines and destroy many lives and resources. Although it is not possible to prevent the occurrence of disasters influenced by natural causes, proper planning and management, such as disaster risk reduction, may minimize the damage caused by natural disasters. Based on a literature review, this paper analyzes literature on public/community awareness and education in disaster risk reduction that would help promote a country-wide public disaster awareness and education program in the Philippines. This includes best practices and the importance of community disaster awareness and education. The paper also tackles ICT tools that can help boost the process and effectiveness of community/public disaster awareness and education.
Keywords: community awareness, disaster education, disaster risk reduction, Philippines
Procedia PDF Downloads 503

2827 All-or-None Principle and Weakness of Hodgkin-Huxley Mathematical Model
Authors: S. A. Sadegh Zadeh, C. Kambhampati
Abstract:
Mathematical and computational modelling are necessary tools for reviewing, analysing, and predicting processes and events across the wide spectrum of scientific fields. Therefore, in a field as rapidly developing as neuroscience, the combination of these two kinds of modelling can play a significant role in helping to guide the direction the field takes. This paper combines mathematical and computational modelling to demonstrate a weakness in a highly valued model in neuroscience: it analyses the all-or-none principle in the Hodgkin-Huxley mathematical model. By implementing the computational model of the Hodgkin-Huxley equations and applying the concept of the all-or-none principle, an investigation of this mathematical model has been performed. The results clearly show that the Hodgkin-Huxley mathematical model does not observe this fundamental law of neurophysiology when generating action potentials. This study shows that further mathematical work on the Hodgkin-Huxley model is needed in order to create a model without this weakness.
Keywords: all-or-none, computational modelling, mathematical model, transmembrane voltage, action potential
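The kind of computational implementation the abstract describes can be sketched with the textbook Hodgkin-Huxley equations and a forward-Euler integrator; this is a generic sketch of the model, not the authors' code, and the stimulus amplitudes are illustrative:

```python
import math

def hh_peak(i_stim, t_end=50.0, dt=0.01):
    """Forward-Euler integration of the standard Hodgkin-Huxley equations;
    returns the peak membrane voltage (mV) for a sustained current step
    (uA/cm^2). Comparing peaks across stimuli probes the all-or-none claim."""
    c_m, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3
    e_na, e_k, e_l = 50.0, -77.0, -54.387

    def a_m(v):
        d = 1.0 - math.exp(-(v + 40.0) / 10.0)
        return 0.1 * (v + 40.0) / (d if d != 0.0 else 1e-9)
    def b_m(v): return 4.0 * math.exp(-(v + 65.0) / 18.0)
    def a_h(v): return 0.07 * math.exp(-(v + 65.0) / 20.0)
    def b_h(v): return 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    def a_n(v):
        d = 1.0 - math.exp(-(v + 55.0) / 10.0)
        return 0.01 * (v + 55.0) / (d if d != 0.0 else 1e-9)
    def b_n(v): return 0.125 * math.exp(-(v + 65.0) / 80.0)

    v = -65.0  # start at rest, gates at their steady-state values
    m = a_m(v) / (a_m(v) + b_m(v))
    h = a_h(v) / (a_h(v) + b_h(v))
    n = a_n(v) / (a_n(v) + b_n(v))
    peak = v
    for _ in range(int(t_end / dt)):
        i_ion = (g_na * m ** 3 * h * (v - e_na)
                 + g_k * n ** 4 * (v - e_k) + g_l * (v - e_l))
        v += dt * (i_stim - i_ion) / c_m
        m += dt * (a_m(v) * (1.0 - m) - b_m(v) * m)
        h += dt * (a_h(v) * (1.0 - h) - b_h(v) * h)
        n += dt * (a_n(v) * (1.0 - n) - b_n(v) * n)
        peak = max(peak, v)
    return peak

sub = hh_peak(1.0)     # subthreshold stimulus: small graded response
supra = hh_peak(15.0)  # suprathreshold stimulus: full action potential
```

Sweeping `i_stim` finely and plotting the resulting peaks is the kind of experiment that reveals whether the model's response is strictly binary or varies continuously with stimulus strength.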
Procedia PDF Downloads 617

2826 Study of Land Use Land Cover Change of Bhimbetka with Temporal Satellite Data and Information Systems
Authors: Pranita Shivankar, Devashree Hardas, Prabodhachandra Deshmukh, Arun Suryavanshi
Abstract:
The Bhimbetka Rock Shelters are a UNESCO World Heritage Site located about 45 kilometers south of Bhopal in the state of Madhya Pradesh, India. Rapid changes in land use land cover (LULC) adversely affect the environment, and in the recent past significant changes have been found in this cultural landscape over time. The objective of the paper was to study the changes in land use land cover (LULC) of Bhimbetka and its peripheral region. For this purpose, supervised classification was carried out using satellite images from Landsat and IRS LISS III for the years 2000 and 2013. The use of remote sensing in combination with a geographic information system is one of the most effective information technology tools for generating land use land cover (LULC) change information.
Keywords: IRS LISS III, Landsat, LULC, UNESCO, World Heritage Site
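Once both dates have been classified, change information is typically derived by cross-tabulating the two class maps pixel by pixel. A minimal sketch, using hypothetical class labels rather than the study's rasters:

```python
from collections import Counter

def change_matrix(labels_t1, labels_t2):
    """Cross-tabulate per-pixel class labels from two dates into a
    'from -> to' transition count: the core of post-classification
    LULC change detection."""
    return Counter(zip(labels_t1, labels_t2))

# Hypothetical classified pixels for 2000 and 2013.
lulc_2000 = ["forest", "forest", "agriculture", "urban", "water"]
lulc_2013 = ["forest", "urban", "urban", "urban", "water"]
changes = change_matrix(lulc_2000, lulc_2013)
# e.g. changes[("forest", "urban")] counts pixels that went forest -> urban
```

Off-diagonal entries of this matrix (class at date 1 differing from class at date 2) quantify exactly the kind of landscape change the paper reports around the heritage site.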
Procedia PDF Downloads 350

2825 Mechanisms and Process of an Effective Public Policy Formulation in Islamic Economic System
Authors: Md Abu Saieed
Abstract:
Crafting and implementing public policy is one of the indispensable tasks in any form of state and government. But the policy objectives, methods of formulation and tools of implementation might differ based on ideological nature, historical legacy, the structure and capacity of administration and management, and other push and pull factors. Public policy in an Islamic economic system needs to be based on the key guidelines of the divine scriptures along with the other sources of sharia'h. As a representative of Allah (SWT), the governor and other apparatus of the state will formulate and implement public policies which enable the establishment of a true welfare state based on justice, equity and equality. The whole life of Prophet Muhammad (pbuh), and his policy in operating the affairs of the state of Madina, constitutes practical guidelines for policy actors and professionals in the Islamic system of economics. Moreover, policy makers need to be meticulous in formulating Islamic public policy which meets the needs and demands of the contemporary world as well.
Keywords: formulation, Islam, public policy, policy factors, Sharia'h
Procedia PDF Downloads 351

2824 Control of Hybrid System Using Fuzzy Logic
Authors: Faiza Mahi, Fatima Debbat, Mohamed Fayçal Khelfi
Abstract:
This paper proposes a control approach using a fuzzy logic system. More precisely, the study focuses on improving user service in a transportation system, in terms of the analysis and control of passenger waiting times at exchange platforms. Many studies have been developed in the literature for this problem, and many control tools have been proposed. In this paper, we focus on the use of the fuzzy logic technique to control the system during its evolution, in order to minimize the arrival gap of connected transportation means at the passenger exchange points. An important area of research is the modeling and simulation of such systems; we describe an approach to their analysis using fuzzy logic. An illustrative example is worked out and the obtained results are reported. The hybrid simulator, developed as a Matlab toolbox, computes the waiting times of the transportation modes.
Keywords: fuzzy logic, hybrid system, waiting time, transportation system, control
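As a toy stand-in for the fuzzy controller described above (the membership functions, rule consequents, and minute values below are invented for illustration, not taken from the paper), a Sugeno-style rule base mapping the arrival gap between connecting vehicles to a recommended hold time might look like:

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def hold_time(gap):
    """Fuzzy rules: small gap -> no hold, medium gap -> short hold,
    large gap -> long hold; output by weighted average defuzzification."""
    mu_small = max(0.0, (5.0 - gap) / 5.0)          # shoulder at gap = 0
    mu_medium = tri(gap, 2.0, 6.0, 10.0)
    mu_large = min(1.0, max(0.0, (gap - 8.0) / 4.0))  # shoulder beyond 12
    w = mu_small + mu_medium + mu_large
    if w == 0.0:
        return 0.0
    # rule consequents (minutes of hold): 0 for small, 2 for medium, 5 for large
    return (mu_small * 0.0 + mu_medium * 2.0 + mu_large * 5.0) / w
```

The overlapping memberships give a smooth control surface: intermediate gaps blend the rules rather than switching abruptly, which is the usual motivation for fuzzy control of waiting times.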
Procedia PDF Downloads 555

2823 Model Order Reduction Using Hybrid Genetic Algorithm and Simulated Annealing
Authors: Khaled Salah
Abstract:
Model order reduction has been one of the most challenging topics of the past years. In this paper, a hybrid of a genetic algorithm (GA) and a simulated annealing algorithm (SA) is used to approximate high-order transfer functions (TFs) by lower-order TFs. In this approach, the hybrid algorithm is applied to model order reduction while taking into consideration two important issues: improving accuracy and preserving the properties of the original model. Both are essential for improving the performance of simulation and computation and for maintaining the behavior of the original complex model being reduced. Compared to the conventional mathematical methods that have been used to obtain reduced order models of high-order complex models, our proposed method provides better results in terms of run-time. Thus, the proposed technique could be used in electronic design automation (EDA) tools.
Keywords: genetic algorithm, simulated annealing, model reduction, transfer function
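A toy version of such a hybrid can be sketched by fitting a first-order response a·exp(-b·t) to time-response samples of a higher-order model, with GA-style crossover/mutation and SA-style acceptance; everything below (the target response, population size, cooling schedule) is a hypothetical illustration, not the paper's algorithm:

```python
import math, random

random.seed(7)

ts = [0.1 * k for k in range(60)]
# Samples of a hypothetical higher-order response to be reduced.
target = [math.exp(-0.5 * t) + 0.5 * math.exp(-2.0 * t) for t in ts]

def error(a, b):
    """Sum of squared deviations between reduced model a*exp(-b*t)
    and the full-order response samples."""
    return sum((a * math.exp(-b * t) - y) ** 2 for t, y in zip(ts, target))

def hybrid_reduce(generations=200, pop_size=12, temp=1.0, cooling=0.97):
    pop = [(random.uniform(0.1, 2.0), random.uniform(0.1, 3.0))
           for _ in range(pop_size)]
    best = min(pop, key=lambda p: error(*p))
    best_err = initial_err = error(*best)
    for _ in range(generations):
        p1, p2 = random.sample(pop, 2)                         # GA: selection
        child = ((p1[0] + p2[0]) / 2 + random.gauss(0, 0.1),   # crossover +
                 abs((p1[1] + p2[1]) / 2 + random.gauss(0, 0.1)) + 1e-6)  # mutation
        worst_i = max(range(pop_size), key=lambda i: error(*pop[i]))
        delta = error(*child) - error(*pop[worst_i])
        # SA: always accept improvements, accept worse moves with prob exp(-delta/T)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            pop[worst_i] = child
        temp *= cooling
        if error(*child) < best_err:
            best, best_err = child, error(*child)
    return best, best_err, initial_err

(a, b), err, err0 = hybrid_reduce()
```

The GA part supplies population diversity while the SA acceptance rule lets the search escape local minima early and hardens as the temperature cools, which is the usual rationale for hybridizing the two.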
Procedia PDF Downloads 143

2822 Defining a Holistic Approach for Model-Based System Engineering: Paradigm and Modeling Requirements
Authors: Hycham Aboutaleb, Bruno Monsuez
Abstract:
Current systems complexity has reached a degree that requires addressing conception and design issues while taking all the necessary aspects into account. Therefore, one of the main challenges is the way complex systems are specified and designed. The exponentially growing effort, cost and time invested in the modeling phase of complex systems emphasize the need for a paradigm, a framework and an environment to handle system model complexity. For that, it is necessary to understand the expectations of the human user of the model and his limits. This paper presents a generic framework for designing complex systems, highlights the requirements a system model needs to fulfill to meet human user expectations, and defines the refined functional as well as non-functional requirements that modeling tools need to meet to be useful in model-based system engineering.
Keywords: system modeling, modeling language, modeling requirements, framework
Procedia PDF Downloads 531

2821 Socio-Cultural Adaptation Approach to Enhance Intercultural Collaboration and Learning
Authors: Fadoua Ouamani, Narjès Bellamine Ben Saoud, Henda Hajjami Ben Ghézala
Abstract:
Over the last decades, there has been a growing interest in the development of Computer Supported Collaborative Learning (CSCL) environments. However, existing systems ignore the variety of learners and their socio-cultural differences, especially in the case of distant and networked learning. In fact, within such collaborative learning environments, learners from different socio-cultural backgrounds may interact together. These learners evolve within various cultures and social contexts and acquire different socio-cultural values and behaviors. Thus, they should be assisted while communicating and collaborating, especially in an intercultural group. Besides, the communication and collaboration tools provided to each learner must depend on and be adapted to her/his socio-cultural profile. The main goal of this paper is to present a socio-cultural adaptation approach, based on and guided by ontologies, to adapt CSCL environments to the socio-cultural profiles of their users (learners or others).
Keywords: CSCL, socio-cultural profile, adaptation, ontology
Procedia PDF Downloads 360

2820 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility and stability. This issue is critical for automotive industry suppliers, who are required to be certified under the IATF 16949:2016 standard (which replaces ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of its mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all statistical tests proposed in the MSA-4 reference manual from AIAG as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations in the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented.
Firstly, a benchmarking analysis of the current competitors and commercial solutions linked to MSA was performed with respect to the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was done. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to run the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of the 'big data'. The main results of this R&D project are: the web and cloud-based MSA tool; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical and food industries are already validating it.
Keywords: automotive industry, Industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
Procedia PDF Downloads 214

2819 Technology Computer Aided Design Simulation of Space Charge Limited Conduction in Polycrystalline Thin Films
Authors: Kunj Parikh, S. Bhattacharya, V. Natarajan
Abstract:
TCAD numerical simulation is one of the most tried and tested tools for designing devices in semiconductor foundries worldwide. It has also been used to explain conduction in organic thin films, where the processing temperature is often enough to make homogeneous samples (often imperfect, but homogeneously imperfect). In this report, we present the results of TCAD simulation of multi-grain thin films. The work addresses inhomogeneity in one dimension, but can easily be extended to two and three dimensions. The effect of grain boundaries has mainly been approximated as barriers located at the junction between two adjacent grains. The effects of the grain boundary barrier height, the bulk traps, and the measurement temperature have been investigated.
Keywords: polycrystalline thin films, space charge limited conduction, Technology Computer-Aided Design (TCAD) simulation, traps
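As a rough analytic companion to such simulations, trap-free space-charge-limited conduction follows the Mott-Gurney law J = (9/8)·ε·μ·V²/L³, and a grain-boundary barrier is often approximated as a thermally activated reduction of the effective mobility. The sketch below uses these textbook relations with invented film parameters; it is not the paper's TCAD model:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
KB_EV = 8.617e-5   # Boltzmann constant, eV/K

def mott_gurney_j(v, thickness, eps_r, mu):
    """Trap-free space-charge-limited current density (A/m^2):
    J = (9/8) * eps * mu * V^2 / L^3."""
    return 9.0 / 8.0 * eps_r * EPS0 * mu * v ** 2 / thickness ** 3

def effective_mobility(mu0, barrier_ev, temp=300.0):
    """Thermionic-emission-style suppression of mobility by a grain-boundary
    barrier of height barrier_ev (eV): mu_eff = mu0 * exp(-phi_b / kT)."""
    return mu0 * math.exp(-barrier_ev / (KB_EV * temp))

mu_eff = effective_mobility(1e-8, 0.15)        # hypothetical film, m^2/Vs
j1 = mott_gurney_j(1.0, 200e-9, 3.5, mu_eff)
j2 = mott_gurney_j(2.0, 200e-9, 3.5, mu_eff)   # quadratic in V: j2 = 4 * j1
```

The quadratic J-V dependence and the exponential sensitivity to barrier height and temperature are the signatures one would look for when comparing TCAD output for a multi-grain film against the homogeneous-film limit.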
Procedia PDF Downloads 214

2818 HD-WSComp: Hypergraph Decomposition for Web Services Composition Based on QoS
Authors: Samah Benmerbi, Kamal Amroun, Abdelkamel Tari
Abstract:
The increasing number of Web service (WS) providers throughout the globe has produced numerous Web services providing the same or similar functionality. Therefore, there is a need for tools that develop the best answer to queries by selecting and composing services with total transparency. This paper reviews various QoS-based Web service selection mechanisms and architectures which facilitate qualitatively optimal selection. On the other hand, Web service composition is required when a request cannot be fulfilled by a single Web service. In such cases, it is preferable to integrate existing Web services to satisfy the user's request. We introduce an automatic Web service composition method based on hypergraph decomposition, using the hypertree decomposition method. The problem of selecting and composing Web services is transformed into a resolution over a hypertree, exploring the dependency relations between Web services to obtain a composite Web service via an execution order of WSs satisfying the global request.
Keywords: web service, web service selection, web service composition, QoS, hypergraph decomposition, BE hypergraph decomposition, hypertree resolution
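The composition idea (chaining services whose outputs feed later services' inputs until the request is satisfied) can be shown with a much simpler search than the paper's hypertree resolution; the sketch below is a flat breadth-first search over data-availability sets, with an invented three-service registry:

```python
from collections import deque

# Hypothetical service registry: name -> (required inputs, produced outputs)
SERVICES = {
    "geocode": ({"address"}, {"coordinates"}),
    "weather": ({"coordinates"}, {"forecast"}),
    "traffic": ({"coordinates"}, {"congestion"}),
}

def compose(available, goal):
    """Breadth-first search over sets of available data items; returns the
    shortest sequence of services whose chained outputs satisfy the goal."""
    start = frozenset(available)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, plan = queue.popleft()
        if goal <= state:          # all goal items are now available
            return plan
        for name, (needs, gives) in SERVICES.items():
            if needs <= state:     # service is invocable in this state
                nxt = frozenset(state | gives)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, plan + [name]))
    return None                    # no composition satisfies the request

plan = compose({"address"}, {"forecast"})  # -> ["geocode", "weather"]
```

The hypertree decomposition of the paper serves the same purpose at scale: it structures the dependency relations so that the execution order of a composite service can be resolved efficiently, and QoS scores can then rank alternative plans.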
Procedia PDF Downloads 509

2817 Effect of Cost Control and Cost Reduction Techniques in Organizational Performance
Authors: Babatunde Akeem Lawal
Abstract:
In any organization, the primary aim is to maximize profit, but a major challenge is the rising cost of operations. The resulting increase in production costs makes cost control and cost reduction schemes inevitable, and it is difficult for most organizations to operate at the cost-efficient frontier. The study critically examines and evaluates the application of cost control and cost reduction to organizational performance, and also reviews the budget as an effective tool for cost control and cost reduction. A descriptive survey design was adopted. A total of 40 retrieved responses were used for the study. The data collected were analyzed with appropriate statistical tools; regression analysis in SPSS was used to test the hypotheses. The findings show that cost control has a positive impact on organizational performance and that the style of management also has a positive impact on organizational performance.
Keywords: organization, cost reduction, cost control, performance, budget, profit
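The regression step described above (performed in SPSS in the study) can be sketched for the single-predictor case with ordinary least squares; the data below are hypothetical, for illustration only.

```python
def ols_slope_intercept(x, y):
    """Ordinary least squares for one predictor:
    slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical data: cost-control score vs. performance score.
cost_control = [1, 2, 3, 4, 5]
performance = [2.1, 3.9, 6.2, 7.8, 10.1]
slope, intercept = ols_slope_intercept(cost_control, performance)
# A positive slope is what "cost control has a positive impact on
# performance" looks like in regression terms.
```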
Procedia PDF Downloads 603
2816 The Problem of the Use of Learning Analytics in Distance Higher Education: An Analytical Study of the Open and Distance University System in Mexico
Authors: Ismene Ithai Bras-Ruiz
Abstract:
Learning Analytics (LA) is employed by universities not only as a tool but as a specialized field for supporting students and professors. However, not all academic programs apply LA with the same goal or use the same tools. In fact, LA comprises five main fields of study (academic analytics, action research, educational data mining, recommender systems, and personalized systems). These fields can not only inform academic authorities about the situation of a program, but also detect at-risk students, professors in need of support, or general problems. At the highest level, Artificial Intelligence techniques are applied to support learning practices. LA has adopted different techniques: statistics, ethnography, data visualization, machine learning, natural language processing, and data mining. Each academic program is expected to decide which field to utilize on the basis of its academic interests, its capacities in terms of professors, administrators, systems, logistics, and data analysts, and its academic goals. The Open and Distance University System (SUAYED in Spanish) of the National Autonomous University of Mexico (UNAM) has been operating for forty years as an alternative to traditional programs; one of its main supports has been the use of new information and communication technologies (ICT). Today, UNAM has one of the largest networks of higher education programs, with twenty-six academic programs across different faculties. This means that every faculty works with heterogeneous populations and academic problems. Accordingly, every program has developed its own Learning Analytics techniques to improve academic issues. In this context, an investigation was carried out to determine how LA is applied across the academic programs in the different faculties.
The premise of the study was that not all faculties have utilized advanced LA techniques, and it is probable that they do not know which field of study is closest to their program goals. Consequently, not all programs know about LA, but this does not mean they do not work with LA in a veiled or less explicit sense. It is important to gauge the level of knowledge about LA for two reasons: 1) it allows appreciation of the administration's work to improve the quality of teaching, and 2) it reveals whether other LA techniques could be adopted. For this purpose, three instruments were designed to determine experience and knowledge of LA. These were applied to ten faculty coordinators and their personnel; thirty members were consulted (academic secretary, systems manager or data analyst, and coordinator of the program). The final report showed that almost all programs work with basic statistical tools and techniques. This helps the administration only to know what is happening inside the academic program; they are not ready to move up to the next level, that is, applying Artificial Intelligence or recommender systems to reach a personalized learning system. This situation is related not to knowledge of LA, but to the clarity of long-term goals.
Keywords: academic improvements, analytical techniques, learning analytics, personnel expertise
Procedia PDF Downloads 128
2815 Scalable Performance Testing: Facilitating the Assessment of Application Performance under Substantial Loads and Mitigating the Risk of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement such a framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools such as JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (file storage), and security groups offers several key benefits: optimized resource utilization, effective benchmarking, increased reliability, and cost savings from resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services such as EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance.
The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of execution, the slave instances transmit their results back to the master, which consolidates them into a comprehensive report detailing metrics such as the number of requests sent, errors encountered, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identifying crashes of applications under heavy load, JMeter with cloud services, scalable performance testing, JMeter master and slave using cloud services
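The consolidation step the master performs can be sketched as a simple aggregation of per-slave summaries. The field names and sample numbers below are illustrative, not JMeter's actual report format.

```python
def consolidate(slave_results):
    """Merge per-slave result summaries into one report, as the master
    node does after a distributed run. Mean response time is weighted by
    each slave's request count; throughput uses the longest run duration."""
    total_requests = sum(r["requests"] for r in slave_results)
    total_errors = sum(r["errors"] for r in slave_results)
    mean_rt = sum(r["mean_response_ms"] * r["requests"]
                  for r in slave_results) / total_requests
    duration = max(r["duration_s"] for r in slave_results)
    return {
        "requests": total_requests,
        "errors": total_errors,
        "error_rate": total_errors / total_requests,
        "mean_response_ms": mean_rt,
        "throughput_rps": total_requests / duration,
    }

# Two hypothetical slaves, each reporting its own summary back to the master.
report = consolidate([
    {"requests": 1000, "errors": 10, "mean_response_ms": 120.0, "duration_s": 60},
    {"requests": 3000, "errors": 30, "mean_response_ms": 80.0, "duration_s": 60},
])
```

Weighting the mean by request count matters here: a plain average of 120 ms and 80 ms would overstate latency, since the faster slave handled three times the load.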
Procedia PDF Downloads 27
2814 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading
Authors: Robert Caulk
Abstract:
A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training dataset and using that parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-Lib, pandas-ta).
The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain more than 10,000 features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, scikit-learn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration
Procedia PDF Downloads 89
2813 Social Media as an Interactive Learning Tool Applied to the Faculty of Tourism and Hotels, Fayoum University
Authors: Islam Elsayed Hussein
Abstract:
The aim of this paper is to examine students' attitudes towards social media and the skills required to adopt social media as a university e-learning (2.0) platform. In addition, it measures the effect of social media adoption on the effectiveness of interactive learning. The population of this study was the students of the Faculty of Tourism and Hotels, Fayoum University. A questionnaire was used to collect data from randomly selected respondents, and the data were analyzed quantitatively. Findings showed that students have a positive attitude towards adopting social networking in the learning process and good skills for the effective use of social networking tools. In addition, adopting social media positively affects the interactive learning environment.
Keywords: attitude, skills, e-learning 2.0, interactive learning, Egypt
Procedia PDF Downloads 524
2812 Innovative Business Education Pedagogy: A Case Study of Action Learning at NITIE, Mumbai
Authors: Sudheer Dhume, T. Prasad
Abstract:
There are distinct signs of business education losing its sheen, more so in developing countries. One of the reasons is that the value added by the end of a two-year MBA program does not match the requirements of present times or the expectations of students. Against this backdrop, pedagogical innovation has become a prerequisite for making MBA programs relevant and useful. This paper describes and analyzes the innovative Action Learning pedagogical approach adopted by a group of faculty members at NITIE Mumbai. It not only promotes multidisciplinary research but also enhances the integration of functional-area skill sets in students. The paper discusses the theoretical bases of this pedagogy and evaluates its effectiveness vis-à-vis conventional pedagogical tools. Evaluation research using the framework of Bloom's taxonomy showed that this blended method of business education is much superior to conventional pedagogy.
Keywords: action learning, Bloom's taxonomy, business education, innovation, pedagogy
Procedia PDF Downloads 270
2811 A Petri Net Model to Obtain the Throughput of Unreliable Production Lines in the Buffer Allocation Problem
Authors: Joselito Medina-Marin, Alexandr Karelin, Ana Tarasenko, Juan Carlos Seck-Tuoh-Mora, Norberto Hernandez-Romero, Eva Selene Hernandez-Gress
Abstract:
A production line designer faces several challenges in manufacturing system design. One of them is the assignment of buffer slots between the machines of the production line in order to maximize the throughput of the whole line, known as the Buffer Allocation Problem (BAP). The BAP is a combinatorial problem that depends on the number of machines and the total number of slots to be distributed along the production line. In this paper, we propose a Petri Net (PN) model to obtain the throughput of unreliable production lines, based on PN mathematical tools and the decomposition method. The results obtained by this methodology are similar to those presented in previous works, and the number of machines is not a hard restriction.
Keywords: buffer allocation problem, Petri Nets, throughput, production lines
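As a rough stand-in for the Petri-net evaluation, the throughput of a small unreliable serial line can be estimated by Monte Carlo simulation, and the BAP then solved by enumerating buffer allocations. The line model (one buffer after each non-final machine, per-step machine reliabilities) and all parameter values are illustrative assumptions, not the paper's PN/decomposition method.

```python
import itertools
import random

def simulate_throughput(reliabilities, buffers, steps=20000, seed=1):
    """Monte Carlo throughput of a serial line: machine i works in a time
    step with probability reliabilities[i], and moves a part only if its
    upstream buffer has stock and its downstream buffer has space."""
    rng = random.Random(seed)
    stock = [0] * len(buffers)  # buffer i sits after machine i
    done = 0
    for _ in range(steps):
        m = len(reliabilities) - 1
        # Process last-to-first so downstream space frees up within a step.
        if stock[m - 1] > 0 and rng.random() < reliabilities[m]:
            stock[m - 1] -= 1
            done += 1
        for i in range(m - 1, 0, -1):
            if stock[i - 1] > 0 and stock[i] < buffers[i] and rng.random() < reliabilities[i]:
                stock[i - 1] -= 1
                stock[i] += 1
        if stock[0] < buffers[0] and rng.random() < reliabilities[0]:
            stock[0] += 1  # first machine is never starved
    return done / steps

def best_allocation(reliabilities, total_slots):
    """Enumerate all ways to split total_slots over the intermediate
    buffers and keep the allocation with the highest simulated throughput."""
    n_buf = len(reliabilities) - 1
    best = None
    for alloc in itertools.product(range(total_slots + 1), repeat=n_buf):
        if sum(alloc) != total_slots or 0 in alloc:
            continue
        tp = simulate_throughput(reliabilities, list(alloc))
        if best is None or tp > best[1]:
            best = (alloc, tp)
    return best

# Illustrative 3-machine line with 4 slots to distribute over 2 buffers.
alloc, tp = best_allocation([0.9, 0.8, 0.9], total_slots=4)
```

Enumeration only works for tiny instances; it is exactly the combinatorial blow-up noted above that motivates the PN-plus-decomposition approach.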
Procedia PDF Downloads 307
2810 A Fast Multi-Scale Finite Element Method for Geophysical Resistivity Measurements
Authors: Mostafa Shahriari, Sergio Rojas, David Pardo, Angel Rodriguez-Rozas, Shaaban A. Bakr, Victor M. Calo, Ignacio Muga
Abstract:
Logging-While-Drilling (LWD) is a technique to record down-hole logging measurements while drilling the well. Nowadays, LWD devices (e.g., nuclear, sonic, resistivity) are mostly used commercially for geo-steering applications. Modern borehole resistivity tools are able to measure all components of the magnetic field by incorporating tilted coils. The depth of investigation of LWD tools is limited compared to the thickness of the geological layers. Thus, it is common practice to approximate the Earth's subsurface with a sequence of 1D models. For a 1D model, we can reduce the dimensionality of the problem using a Hankel transform. We can solve the resulting system of ordinary differential equations (ODEs) either (a) analytically, which results in a so-called semi-analytic method after performing a numerical inverse Hankel transform, or (b) numerically. Semi-analytic methods are used by the industry due to their high performance. However, they have major limitations. First, the analytical solution of the aforementioned system of ODEs exists only for piecewise constant resistivity distributions; for arbitrary resistivity distributions, the solution of the system of ODEs is unknown to date. Second, in geo-steering we need to solve inverse problems with respect to the inversion variables (e.g., the constant resistivity value of each layer and the bed boundary positions) using a gradient-based inversion method, and thus we need to compute the corresponding derivatives; however, the analytical derivatives for cross-bedded formations and the analytical derivatives with respect to the bed boundary positions have not been published, to the best of our knowledge. The main contribution of this work is to overcome these limitations of semi-analytic methods by solving each 1D model (associated with each Hankel mode) using an efficient multi-scale finite element method.
The main idea is to divide the computations into two parts: (a) offline computations, which are independent of the tool positions and are precomputed only once and reused for all logging positions, and (b) online computations, which depend upon the logging position. With this method, (a) we can consider arbitrary resistivity distributions along the 1D model, and (b) we can easily and rapidly compute the derivatives with respect to any inversion variable at negligible additional cost by using an adjoint state formulation. Although the proposed method is slower than semi-analytic methods, its computational efficiency is still high. In the presentation, we shall derive the mathematical variational formulation, describe the proposed multi-scale finite element method, and verify the accuracy and efficiency of our method by performing a wide range of numerical experiments and comparing the numerical solutions to semi-analytic ones when the latter are available.
Keywords: Logging-While-Drilling, resistivity measurements, multi-scale finite elements, Hankel transform
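The offline/online split can be illustrated with a generic factor-once, solve-many pattern: a system matrix that does not depend on the tool position is factored once offline, after which each logging position costs only a cheap triangular solve with a new right-hand side. This toy LU example is a sketch of that idea, not the paper's multi-scale FEM.

```python
def lu_factor(a):
    """Doolittle LU factorization (no pivoting) of a small dense matrix,
    done once 'offline'. Returns L (unit lower) and U (upper) as lists."""
    n = len(a)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):
            U[i][j] = a[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):
            L[j][i] = (a[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

def lu_solve(L, U, b):
    """Cheap 'online' step: reuse the factors for each new right-hand
    side (one per logging position)."""
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][k] * y[k] for k in range(i))
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]     # tool-independent system (illustrative)
L, U = lu_factor(A)              # offline: done once
x1 = lu_solve(L, U, [1.0, 2.0])  # online: one solve per logging position
x2 = lu_solve(L, U, [0.0, 1.0])
```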
Procedia PDF Downloads 386
2809 Compensation of Power Quality Disturbances Using DVR
Authors: R. Rezaeipour
Abstract:
One of the key aspects of power quality improvement in power systems is the mitigation of voltage sags/swells and flicker. Custom power devices are known as the best tools for mitigating voltage disturbances as well as for reactive power compensation. The dynamic voltage restorer (DVR), the most efficient and effective modern custom power device, can provide the most commercial solution to several power quality problems in distribution networks. This paper deals with the analysis and simulation of a DVR based on instantaneous power theory, which enables fast detection of disturbance signals. The main purpose of this work is to remove three important disturbances: voltage sags, voltage swells, and flicker. Simulation of the proposed method was carried out on two sample systems in the MATLAB software environment, and the results show that the proposed method is able to provide the desired power quality in the presence of a wide range of disturbances.
Keywords: DVR, power quality, voltage sags, voltage swells, flicker
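The instantaneous power theory behind the detection scheme can be sketched with the p-q quantities computed from a Clarke transform: for a balanced sinusoidal system, the instantaneous real power p is constant in time, so a sudden change in p flags a sag, swell, or flicker event. The sampling and control details of an actual DVR are omitted here, and the balanced test signal is illustrative.

```python
import math

SQRT23 = math.sqrt(2.0 / 3.0)

def clarke(a, b, c):
    """Power-invariant Clarke transform of three phase quantities."""
    alpha = SQRT23 * (a - 0.5 * b - 0.5 * c)
    beta = SQRT23 * (math.sqrt(3.0) / 2.0) * (b - c)
    return alpha, beta

def instantaneous_pq(v_abc, i_abc):
    """Instantaneous real (p) and imaginary (q) power in the p-q theory,
    the quantities a DVR-style controller monitors for disturbances."""
    va, vb = clarke(*v_abc)
    ia, ib = clarke(*i_abc)
    p = va * ia + vb * ib
    q = vb * ia - va * ib
    return p, q

def balanced(amplitude, wt):
    """Balanced three-phase sinusoidal set at electrical angle wt."""
    return (amplitude * math.sin(wt),
            amplitude * math.sin(wt - 2 * math.pi / 3),
            amplitude * math.sin(wt + 2 * math.pi / 3))

# For a balanced unit-amplitude set with v == i, p stays at 3/2 * Vm^2
# at every instant, so any deviation signals a disturbance.
p1, _ = instantaneous_pq(balanced(1.0, 0.0), balanced(1.0, 0.0))
p2, _ = instantaneous_pq(balanced(1.0, 1.0), balanced(1.0, 1.0))
```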
Procedia PDF Downloads 345
2808 The Development of Speaking Using Folk Tales Based on Performance Activities for Early-Childhood Students
Authors: Ms Yaowaluck Ruampol
Abstract:
This research on the development of speaking skills using folk tales and performance-based activities aimed to (1) study the development of speaking skills of early-childhood students, and (2) evaluate the development of speaking skills before and after the speaking activities. Ten Kindergarten level 2 students, enrolled in the speaking development course in semester 2 of 2013, were purposively selected as the research cohort. The research tools were lesson plans for the speaking activities and a pre-/post-test of speaking development, approved for content validity and reliability (IOC = 0.66-1.00, 0.967). The research found that the speaking skills of the sample before the performance-based folk-tale activities were at a moderately high level. Additionally, the results revealed that after the performance-based activities, the preschoolers also used their speaking skills imaginatively.
Keywords: speaking development, folk tales, performance activities, communication engineering
Procedia PDF Downloads 291