Search results for: CBSD (component based software development)

39906 Information System Development for Online Journal System Using Online Journal System for Journal Management of Suan Sunandha Rajabhat University

Authors: Anuphan Suttimarn, Natcha Wattanaprapa, Suwaree Yordchim

Abstract:

The aim of this study is to develop an online journal system, using a web application, to manage the journal service of Suan Sunandha Rajabhat University and thereby improve the university's journal management. The main structures of the system process consist of 1) a journal content management system, 2) a journal membership system, and 3) an online submission and review process. The investigators developed the system as a web application using the open source OJS software and phpMyAdmin to manage the research database. The system test showed that this 'Online Journal System (OJS)' could shorten the time from article submission to journal publication and helped manage journal procedures efficiently and accurately. The quality evaluation of the Suan Sunandha Rajabhat online journal system (SSRUOJS), undertaken by experts and researchers across 5 aspects (design, usability, security, time reduction, and accuracy), showed the highest average value (X=4.30) on the aspect of time reduction. Meanwhile, the system efficiency evaluation was at an excellent level (X=4.13).

Keywords: online journal system, journal management, information system development, OJS

Procedia PDF Downloads 157
39905 Estimation of Energy Losses of Photovoltaic Systems in France Using Real Monitoring Data

Authors: Mohamed Amhal, Jose Sayritupac

Abstract:

Photovoltaic (PV) systems have risen as one of the modern renewable energy sources widely used to produce electricity and deliver it to the electrical grid. In parallel, monitoring systems have been deployed as a key element to track energy production and to forecast production for the coming days. The reliability of PV energy production has become a crucial point in the analysis of PV systems. A deeper understanding of each phenomenon that causes a gain or a loss of energy is needed to better design, operate and maintain PV systems. This work analyzes the current losses distribution in PV systems, starting from the available solar energy, going through the DC side and the AC side, to the delivery point. Most of the phenomena linked to energy losses and gains are considered and modeled, based on real-time monitoring data and datasheets of the PV system components. The order of magnitude of each loss is compared to the current literature and to commercial software. To date, the analysis of PV system performance based on a breakdown structure of energy losses and gains is not well covered in the literature, except in some software where the concept is common. The novelty of the current analysis is the implementation of software tools for energy loss estimation in PV systems based on several energy loss definitions and estimation techniques. The developed tools have been validated and tested on several PV plants in France, which have been operating for years. Among the major findings of the current study: First, PV plants in France show very low rates of soiling and aging. Second, the distribution of other losses is comparable to the literature. Third, all reported losses are correlated to operational and environmental conditions. For future work, an extended analysis of further PV plants in France and abroad will be performed.
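
As a rough illustration of this kind of loss accounting (a generic sketch, not the authors' tool), the performance ratio and a simple multiplicative loss waterfall can be computed from monitored energy and irradiance; all plant figures and loss fractions below are hypothetical.

```python
# Hypothetical sketch of a PV loss breakdown from monitoring data.
# The plant parameters and loss factors below are illustrative,
# not taken from the study.

P_STC = 100.0      # plant nameplate DC power (kWp) at STC
G_STC = 1.0        # reference irradiance (kW/m^2)

# Monitored quantities over one day (hypothetical):
H_poa = 5.8        # plane-of-array insolation (kWh/m^2)
E_ac = 468.0       # AC energy delivered to the grid (kWh)

# Reference yield vs. final yield gives the performance ratio (PR):
Y_r = H_poa / G_STC          # reference yield (h)
Y_f = E_ac / P_STC           # final yield (h)
PR = Y_f / Y_r
print(f"Performance ratio: {PR:.2%}")

# A simple multiplicative loss waterfall from ideal DC energy to AC:
E_ideal = P_STC * Y_r        # energy with no losses (kWh)
losses = {"soiling": 0.01, "temperature": 0.05,
          "DC wiring": 0.01, "inverter": 0.03}
E = E_ideal
for name, frac in losses.items():
    lost = E * frac
    E -= lost
    print(f"{name:12s} -{lost:6.1f} kWh")
print(f"Modelled AC energy: {E:.1f} kWh vs measured {E_ac:.1f} kWh")
```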

Keywords: energy gains, energy losses, losses distribution, monitoring, photovoltaic, photovoltaic systems

Procedia PDF Downloads 150
39904 Development of Basic Patternmaking Using Parametric Modelling and AutoLISP

Authors: Haziyah Hussin, Syazwan Abdul Samad, Rosnani Jusoh

Abstract:

This study is aimed at the automation of basic patternmaking for traditional clothes for the purpose of mass production, using AutoCAD with the AutoLISP feature under the Hazi Attire software. A standard dress form (industrial form) in small (S), medium (M) and large (L) sizes is measured using a full body scanning machine. The pattern for the clothes is then designed parametrically based on the measured dress form. The Hazi Attire program is used within the framework of AutoCAD to generate the basic pattern of the front bodice, back bodice, front skirt, back skirt and sleeve block (sloper). The generation of the pattern is based on the parameters input by the user; in this study, the parameters were determined from the measured size of the dress form. The finalized pattern parameters show that the pattern fits perfectly on the dress form. Since the pattern is generated almost instantly, this proves that, using AutoLISP programming, the manufacturing lead time for the mass production of traditional clothes can be decreased.

Keywords: apparel, AutoLISP, Malay traditional clothes, pattern generation

Procedia PDF Downloads 234
39903 Clinicians’ Experiences with IT Systems in a UK District General Hospital: A Qualitative Analysis

Authors: Sunny Deo, Eve Barnes, Peter Arnold-Smith

Abstract:

Introduction: Healthcare technology is a rapidly expanding field, with enthusiasts suggesting a revolution in the quality and efficiency of healthcare delivery based on better e-healthcare, including the move to paperless healthcare. The role and use of computers and programmes for healthcare have been increasing over the past 50 years. Despite this, there is no standardised method of assessing the quality of the hardware and software used by frontline healthcare workers. Methods and subjects: Based on standard Patient Reported Outcome Measures, a questionnaire was devised with the aim of providing quantitative and qualitative data on clinicians' perspectives of their hospital's Information Technology (IT). The survey was distributed via the institution's intranet to all contracted doctors, and its qualitative results were analysed. Qualitative opinions were grouped as positive, neutral, or negative and further sub-grouped into speed/usability, software/hardware, integration, IT staffing, clinical risk, and wellbeing. Analysis was undertaken by doctor seniority and by specialty. Results: There were 196 responses, with 51% from senior doctors (consultant grades) and the rest from junior grades; the largest group of respondents (52%) came from medicine specialties. Differences in the proportions of the principal groups and sub-groups were noted by seniority and specialty. Negative themes were by far the most common opinion type, occurring in almost two-thirds of responses (63%), while positive comments occurred in fewer than 1 in 10 (8%). Conclusions: This survey confirms strongly negative attitudes to the current state of electronic documentation and IT in a large single-centre cohort of hospital-based frontline physicians, after two decades of so-called progress towards a paperless healthcare system. Wider use of the survey would provide further insights and could help focus development and delivery to improve the quality and effectiveness of IT for clinicians and their patients.

Keywords: information technology, electronic patient records, digitisation, paperless healthcare

Procedia PDF Downloads 62
39902 Effects of Different Meteorological Variables on Reference Evapotranspiration Modeling: Application of Principal Component Analysis

Authors: Akinola Ikudayisi, Josiah Adeyemo

Abstract:

The correct estimation of reference evapotranspiration (ETₒ) is required for effective irrigation water resources planning and management. However, several variables must be considered while estimating and modeling ETₒ. This study therefore presents a multivariate analysis of the correlated variables involved in the estimation and modeling of ETₒ at the Vaalharts irrigation scheme (VIS) in South Africa, using the Principal Component Analysis (PCA) technique. Weather and meteorological data between 1994 and 2014 were obtained from the South African Weather Service (SAWS) and the Agricultural Research Council (ARC) in South Africa. Average monthly data of minimum and maximum temperature (°C), rainfall (mm), relative humidity (%), and wind speed (m/s) were the inputs to the PCA-based model, while ETₒ is the output. The PCA technique was adopted to extract the most important information from the dataset and to analyze the relationship between the five variables and ETₒ, in order to determine the most significant variables affecting ETₒ estimation at VIS. From the model performance, two principal components with a variance of 82.7% were retained after the eigenvector extraction. The results of the two principal components were compared, and the model output shows that minimum temperature, maximum temperature and wind speed are the most important variables in ETₒ estimation and modeling at VIS. In other words, ETₒ increases with temperature and wind speed. Other variables, such as rainfall and relative humidity, are less important and cannot provide enough information about ETₒ estimation at VIS. The outcome of this study has helped to reduce the input variable dimensionality from five to the three most significant variables in ETₒ modelling at VIS, South Africa.
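
A minimal sketch of the variable-screening step described above, assuming a monthly table of the five inputs; the column names and synthetic data are placeholders, not the SAWS/ARC records.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical monthly records; the real data came from SAWS/ARC.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "t_min": rng.normal(12, 4, 240),      # min temperature (degC)
    "t_max": rng.normal(27, 5, 240),      # max temperature (degC)
    "rain": rng.gamma(2, 15, 240),        # rainfall (mm)
    "rel_hum": rng.normal(55, 10, 240),   # relative humidity (%)
    "wind": rng.normal(2.5, 0.8, 240),    # wind speed (m/s)
})

# Standardize, then extract principal components.
X = StandardScaler().fit_transform(df)
pca = PCA().fit(X)

# Keep enough components to explain ~80%+ of the variance, then
# inspect the loadings to see which variables dominate them.
cum = np.cumsum(pca.explained_variance_ratio_)
k = int(np.searchsorted(cum, 0.80)) + 1
loadings = pd.DataFrame(pca.components_[:k].T,
                        index=df.columns,
                        columns=[f"PC{i+1}" for i in range(k)])
print(f"{k} components explain {cum[k-1]:.1%} of variance")
print(loadings.round(2))  # large |loading| -> influential variable
```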

Keywords: irrigation, principal component analysis, reference evapotranspiration, Vaalharts

Procedia PDF Downloads 231
39901 An Experience of Translating an Excerpt from Sophie Adonon’s Echos de Femmes from French to English, Using Reverso

Authors: Michael Ngongeh Mombe

Abstract:

This paper seeks to investigate an assertion made by some colleagues that there is no need to pay a human translator to translate their literary texts, since software such as Reverso can be used to do the translation. The main objective of this study is to examine the veracity of this assertion by using Reverso to translate a literary text without any post-editing by a human translator. The work is based on two theories of translation: the Skopos and Communicative theories. It is a documentary research in which data were collected from published documents in libraries, on the internet, and from the translation produced by Reverso. We carried out a comparative text analysis of both source and target texts in a bid to highlight the weaknesses and strengths of the software. The findings of this work revealed that those who advocate the use of machine translation alone do so in ignorance of the translation mistakes usually made by the software. From the review of all 268 translation segments, we found that the translation produced by Reverso is fraught with errors. We therefore recommend the use of human translators either to translate literary texts themselves or to revise the machine-produced translation to conform to the skopos of the work. This paper is based on Reverso translation; similar works in the near future will be based on other translation software to determine their weaknesses and strengths.

Keywords: machine translation, human translator, Reverso, literary text

Procedia PDF Downloads 74
39900 Assessment of Social Vulnerability of Urban Population to Floods – A Case Study of Mumbai

Authors: Sherly M. A., Varsha Vijaykumar, Subhankar Karmakar, Terence Chan, Christian Rau

Abstract:

This study proposes an indicator-based framework for assessing the social vulnerability of any coastal megacity to floods. The final set of social vulnerability indicators is chosen from a set of feasible and available indicators, prepared using a Geographic Information System (GIS) framework on a 1-km grid cell scale to provide insight into the spatial variability of vulnerability. The optimal weight for each individual indicator is assigned using data envelopment analysis (DEA), as it avoids subjective weights and improves confidence in the results obtained. In order to de-correlate and reduce the dimension of the multivariate data, principal component analysis (PCA) has been applied. The proposed methodology is demonstrated on twenty-four wards of Mumbai under the jurisdiction of the Municipal Corporation of Greater Mumbai (MCGM). This vulnerability assessment framework is not limited to the present study area and may be applied to other urban damage centers.
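
The weighting step can be illustrated with the common 'benefit of the doubt' DEA formulation for composite indicators, solved as a small linear program per ward; the indicator matrix below is hypothetical, and this formulation is a plausible reading of the abstract, not necessarily the authors' exact model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical normalized vulnerability indicators for 6 wards
# (rows) and 3 indicators (columns); the study used 24 wards.
X = np.array([[0.8, 0.6, 0.9],
              [0.4, 0.7, 0.5],
              [0.9, 0.9, 0.7],
              [0.3, 0.2, 0.4],
              [0.6, 0.5, 0.8],
              [0.7, 0.8, 0.6]])
n_wards, n_ind = X.shape

# "Benefit of the doubt" DEA: each ward picks the indicator weights
# most favourable to itself, subject to no ward scoring above 1.
scores = []
for j in range(n_wards):
    res = linprog(c=-X[j],              # maximize weighted indicators
                  A_ub=X, b_ub=np.ones(n_wards),
                  bounds=[(0, None)] * n_ind)
    scores.append(-res.fun)
print(np.round(scores, 3))  # composite vulnerability per ward
```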

Keywords: urban floods, vulnerability, data envelopment analysis, principal component analysis

Procedia PDF Downloads 340
39899 Entrepreneurial Support Ecosystem: Role of Research Institutes

Authors: Ayna Yusubova, Bart Clarysse

Abstract:

This paper explores the role of research institutes in the creation of a support ecosystem for new technology-based ventures. Previous literature has introduced research institutes as part of business and knowledge ecosystems, but very few studies consider a research institute as an ecosystem that supports high-tech startups at every stage of development. Based on a resource-based view and a stage-based model of high-tech startup growth, this study aims to analyze how a research institute builds a startup support ecosystem by attracting different stakeholders in order to help startups overcome resource gaps. The paper is based on an in-depth case study of a public research institute that focuses on the development of an entrepreneurial ecosystem in a developed region. The analysis shows, first, that the idea generation stage of high-tech startups, related to the invention and development of a product or technology for commercialization, is associated with a lack of critical knowledge resources. Second, at the growth phase, related to market entry, high-tech startups face challenges associated with the development of their business networks. Accordingly, the study shows that the support ecosystem the research institute creates helps high-tech startups overcome resource gaps in order to achieve a successful transition from one phase of growth to the next.

Keywords: new technology-based firms, ecosystems, resources, business incubators, research institutes

Procedia PDF Downloads 241
39898 Examining Audiology Students’ Clinical Reasoning Skills When Using Virtual Audiology Cases Aided with No Collaboration, Live Collaboration, and Virtual Collaboration

Authors: Ramy Shaaban

Abstract:

The purpose of this study was to examine the difference in students' clinical reasoning skills when using virtual audiology cases with and without collaborative assistance, drawing on major learning approaches important to clinical reasoning skills and computer-based learning models: Situated Learning Theory, Social Development Theory, Scaffolding, and Collaborative Learning. A quasi-experimental design was conducted at two United States universities to examine whether there is a significant difference in clinical reasoning skills between three treatment groups using the IUP Audiosim software. Two computer-based audiology case simulations were developed, and participants were randomly placed into the three groups: no collaboration, virtual collaboration, and live collaboration. The clinical reasoning data were analyzed using one-way ANOVA and Tukey post hoc analyses. The results show that there was a significant difference in clinical reasoning skills between the three treatment groups. The score obtained by the no-collaboration group was significantly lower than the scores obtained by the virtual and live collaboration groups. Collaboration, whether virtual or in person, has a positive effect on students' clinical reasoning. These results with audiology students indicate that combining collaboration models with scaffolding, and embedding situated learning and social development theories into the design of future virtual patients, has the potential to improve students' clinical reasoning skills.
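
A minimal sketch of the reported statistical pipeline (one-way ANOVA followed by Tukey's post hoc test), run on hypothetical clinical reasoning scores for the three groups, not the study's data:

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
# Hypothetical clinical reasoning scores (0-100) per group.
none    = rng.normal(62, 8, 20)
virtual = rng.normal(74, 8, 20)
live    = rng.normal(76, 8, 20)

# One-way ANOVA: is there any difference among the group means?
f, p = stats.f_oneway(none, virtual, live)
print(f"F = {f:.2f}, p = {p:.4f}")

# Tukey HSD: which pairs of groups differ?
scores = np.concatenate([none, virtual, live])
labels = ["none"] * 20 + ["virtual"] * 20 + ["live"] * 20
print(pairwise_tukeyhsd(scores, labels))
```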

Keywords: clinical reasoning, virtual patients, collaborative learning, scaffolding

Procedia PDF Downloads 192
39897 Integrating Sustainable Construction Principles into Curriculum Design for Built Environment Professional Programs in Nigeria

Authors: M. Yakubu, M. B. Isah, S. Bako

Abstract:

This paper presents the findings of research that investigated the readiness to integrate sustainable construction principles into curriculum design for built environment professional programs in Nigerian universities. Developing construction professionals' knowledge and understanding of sustainable construction practice leads to considerable improvement in the environmental performance of the construction sector. Integrating sustainable environmental issues within built environment education curricula provides the basis of this research. An integration of sustainable development principles into the universities' built environment professional programmes was carried out with a view to finding solutions to the key issues identified. The perspectives of academia were assessed, and the findings tested for validity through the analysis of the primary quantitative data collected. The secondary data generated show that there are significant differences in the approach to curriculum design within built environment professional programmes, revealing that no clearly identifiable 'best practice' exists. Following from the above, this research reveals that engaging all stakeholders would be a useful component of built environment curriculum development, and that the curriculum should be negotiated with interested parties. These parties have been identified as academia, government, the construction industry and built environment professionals.

Keywords: built environment, curriculum development, sustainable construction, sustainable development

Procedia PDF Downloads 402
39896 Reductions of Control Flow Graphs

Authors: Robert Gold

Abstract:

Control flow graphs are a well-known representation of the sequential control flow structure of programs, with a multitude of applications. Not only single functions but also sets of functions or complete programs can be modelled by control flow graphs. In this case, the size of the graphs can grow considerably, which makes it difficult for software engineers to analyse the control flow. Graph reductions are helpful in this situation. In this paper, we define reductions to subsets of nodes. Since executions of programs are represented by paths through the control flow graphs, paths should be preserved. Furthermore, the composition of reductions makes a stepwise analysis approach possible.
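
A small sketch of one plausible path-preserving reduction (an assumption, not necessarily the paper's exact definition): keep a subset of nodes, and connect u to v whenever the original graph has a path from u to v that visits no other kept node.

```python
import networkx as nx

def reduce_cfg(g: nx.DiGraph, keep: set) -> nx.DiGraph:
    """Reduce a control flow graph to a subset of nodes.

    An edge u -> v is added to the reduced graph whenever the
    original graph has a path from u to v that visits no other
    kept node, so paths through the kept nodes are preserved.
    """
    reduced = nx.DiGraph()
    reduced.add_nodes_from(keep)
    for u in keep:
        for v in keep:
            if u == v:
                continue
            # Remove the other kept nodes, then test reachability.
            sub = g.subgraph((set(g) - keep) | {u, v})
            if nx.has_path(sub, u, v):
                reduced.add_edge(u, v)
    return reduced

# Tiny CFG: entry -> branch -> (then | else) -> join -> exit
g = nx.DiGraph([("entry", "branch"), ("branch", "then"),
                ("branch", "else"), ("then", "join"),
                ("else", "join"), ("join", "exit")])
r = reduce_cfg(g, {"entry", "branch", "join", "exit"})
print(sorted(r.edges()))  # branch collapses straight onto join
```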

Keywords: control flow graph, graph reduction, software engineering, software applications

Procedia PDF Downloads 528
39895 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have looked to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, which extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to the complexity of the models and features used. Such insights could be of practical significance to the many practitioners who use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawing on data from 2014, 2015 and 2016. Our findings reveal significant differences in model performance and quality given the type of features and complexity of the models used. Researchers examining classifier performance and quality and feature complexity may leverage these findings in selecting suitable techniques when developing prediction models.
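
As a sketch of this kind of modeling setup, a random forest (one of the classifier families named in the keywords) can be trained to predict answer acceptance; the features and labels below are invented for illustration, whereas the real studies mine the Stack Overflow data dump.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(2)
n = 2000
# Hypothetical per-answer features; real work would use text,
# code, and user-level attributes mined from the data dump.
X = np.column_stack([
    rng.integers(20, 2000, n),     # answer length (chars)
    rng.integers(0, 2, n),         # contains a code snippet?
    rng.integers(1, 100000, n),    # answerer reputation
    rng.exponential(60, n),        # minutes until posted
])
# Synthetic acceptance label loosely tied to the features.
logits = 0.4*X[:, 1] + 0.00001*X[:, 2] - 0.004*X[:, 3]
y = (logits + rng.normal(0, 0.5, n) > 0.2).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)
print(classification_report(y_te, clf.predict(X_te)))
print("feature importances:", clf.feature_importances_.round(3))
```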

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 116
39894 Technology Futures in Global Militaries: A Forecasting Method Using Abstraction Hierarchies

Authors: Mark Andrew

Abstract:

Geopolitical tensions are at a thirty-year high, and the pace of technological innovation is driving asymmetry in force capabilities between nation states and between non-state actors. Technology futures are a vital component of defence capability growth, and investments in technology futures need to be informed by accurate and reliable forecasts of the options for 'systems of systems' innovation, development, and deployment. This paper describes a method for forecasting technology futures, developed through an analysis of four key stages of systems development, namely: technology domain categorisation; scanning results examining novel systems' signals and signs; potential system-of-systems implications in warfare theatres; and political ramifications in terms of funding and development priorities. The method has been applied to several technology domains, including physical systems (e.g., nano weapons, loitering munitions, in-flight charging, and hypersonic missiles), biological systems (e.g., molecular virus weaponry, genetic engineering, brain-computer interfaces, and trans-human augmentation), and information systems (e.g., sensor technologies supporting situation awareness, cyber-driven social attacks, and goal-specification challenges to proliferation and alliance testing). Although the method has so far been applied in a team-centred way, using paper-based rapid prototyping and iteration, the application of autonomous language models (such as GPT-3) is anticipated as a next-stage operating platform. Forecasting accuracy and reliability are considered vital in guiding technology development to afford stronger contingencies, as ideological changes are forecast to expand threats to ecology and earth systems, possibly eclipsing the traditional vulnerabilities of nation states. The early results from the method will be subjected to ground truthing using longitudinal investigation.

Keywords: forecasting, technology futures, uncertainty, complexity

Procedia PDF Downloads 96
39893 Design of a Professional Development Framework in Teaching and Learning for Engineering Educators

Authors: Orla McConnell, Cormac MacMahon, Jen Harvey

Abstract:

Ireland’s national professional development framework for those who teach in higher education aims to provide guidance and leadership in planning, developing and engaging in professional development practices. A series of pilot projects has been initiated to help explore the framework's likely utility and acceptance by educators and their institutions. These projects require engagement with staff in the interpretation and adaptation of the framework within their working contexts. The purpose of this paper is to outline the development of one such project with engineering educators at three Institutes of Technology seeking designation as a technological university. The initiative aims to gain traction in the acceptance of the framework within the engineering education community by linking core and discipline-specific teaching and learning competencies with the professional development activities most valued by engineering educators. Informed by three strands of literature (professional development in higher education, engineering education, and teaching and learning training provision), the project begins with a survey of all those involved in teaching and learning in engineering across the three institutes. Based on engagement with key stakeholders, subsequent qualitative research informs the contextualization of the national framework for discipline-specific and institutional piloting. The paper concludes by exploring engineering educators' perceptions of the national framework's utility based on their engagement with the pilot process. Feedback from the pilot indicates a significant gap between the professional development needs of engineering educators and the current professional development provision in teaching and learning.

Keywords: engineering education, pilot, professional development, teaching and learning

Procedia PDF Downloads 312
39892 Software Architecture Optimization Using Swarm Intelligence Techniques

Authors: Arslan Ellahi, Syed Amjad Hussain, Fawaz Saleem Bokhari

Abstract:

Software architecture can be optimized with respect to quality attributes (QAs). In this paper, we analyze multiple research papers, from different dimensions, that have been used to classify those attributes. We propose a swarm intelligence metaheuristic, the ant colony optimization algorithm, as a contribution to solving this critical optimization problem of software architecture. We rank the quality attributes, run our algorithm on each QA, and then rank the QAs on the basis of accuracy. At the end, we select the most accurate quality attributes. The ant colony algorithm is effective and performs well in optimizing and ranking the QAs.
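
A minimal, heavily simplified sketch of the ant colony mechanics involved (probabilistic selection driven by pheromone, evaporation, and reinforcement by solution quality); the accuracy figures and the ranking encoding below are hypothetical, since the abstract does not specify the problem encoding.

```python
import random

# Hypothetical "accuracy" of optimizing each quality attribute;
# in the paper this would come from running the optimizer per QA.
accuracy = {"performance": 0.82, "reliability": 0.76,
            "security": 0.88, "modifiability": 0.69}

qas = list(accuracy)
pheromone = {q: 1.0 for q in qas}
alpha, rho, n_ants, n_iters = 1.0, 0.1, 10, 50

random.seed(3)
for _ in range(n_iters):
    # Each ant picks a QA with probability proportional to pheromone.
    weights = [pheromone[q] ** alpha for q in qas]
    choices = random.choices(qas, weights=weights, k=n_ants)
    # Evaporate, then reinforce the chosen QAs by their accuracy.
    for q in qas:
        pheromone[q] *= (1 - rho)
    for q in choices:
        pheromone[q] += accuracy[q]

# Rank quality attributes by accumulated pheromone.
for q in sorted(qas, key=pheromone.get, reverse=True):
    print(f"{q:13s} pheromone={pheromone[q]:.2f}")
```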

Keywords: complexity, rapid evolution, swarm intelligence, dimensions

Procedia PDF Downloads 235
39891 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

This paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. A statistical study based on regression and Taguchi's design allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. The response surface methodology (RSM) showed a simultaneous influence of the burnishing parameters and identified the optimal processing parameters. ANOVA analysis of the results validated the prediction models, with determination coefficients R=90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P=10 kgf, i=3 passes, and f=0.074 mm/rev, which favours minimum roughness and maximum hardness. The result was validated by desirabilities of D=0.99 and 0.95 for roughness and hardness, respectively.
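
The quoted desirability values can be illustrated with the standard Derringer-Suich desirability functions commonly used in RSM multi-objective optimization; the response bounds and predicted values below are hypothetical, not the paper's measurements.

```python
# Derringer-Suich desirability functions, a standard way to combine
# multiple responses in RSM. Bounds below are hypothetical, not the
# paper's measured limits.

def d_minimize(y, y_best, y_worst):
    """Desirability of a smaller-is-better response (e.g. roughness)."""
    return min(max((y_worst - y) / (y_worst - y_best), 0.0), 1.0)

def d_maximize(y, y_worst, y_best):
    """Desirability of a larger-is-better response (e.g. hardness)."""
    return min(max((y - y_worst) / (y_best - y_worst), 0.0), 1.0)

# Hypothetical model predictions at the optimal regime
# (P=10 kgf, i=3 passes, f=0.074 mm/rev):
d_r = d_minimize(y=0.22, y_best=0.20, y_worst=1.50)     # Ra in um
d_h = d_maximize(y=395.0, y_worst=250.0, y_best=400.0)  # hardness HV

# Composite desirability: geometric mean of the individual values.
D = (d_r * d_h) ** 0.5
print(f"d_roughness={d_r:.2f}, d_hardness={d_h:.2f}, D={D:.2f}")
```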

Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA

Procedia PDF Downloads 176
39890 Some Trends in Analysis of Two-Way Solid Slabs

Authors: Reem I. Al-Ya' Goub, Nasim Shatarat

Abstract:

This paper presents the results of an analytical and comparative study of software program outputs in the analysis of two-way solid slabs: flat plate, flat slab with beams, and flat slab with drop panels problems that have already been analyzed using the Classical Equivalent Frame Method (CEFM) by several reinforced concrete book authors. The primary objective of this research is to determine the moment results using various software programs, and then to summarize the results and the percentage differences, showing how the analysis procedure affects the outputs of the calculations, which vary from one software program to another, when compared with the results of the CEFM. Moment values were obtained using either the Equivalent Frame Method (EFM) or the Finite Element Method (FEM), the methods used in many software programs. The results of the analyses demonstrate that software programs vary markedly in terms of the information they provide to the structural designer regarding the values of model insertion, stiffness, the effective moment of inertia used, and especially the moment values.

Keywords: two-way solid slabs, flat plate, flat slab with beams, flat slab with drop panels, analysis, modeling, EFM, CEFM, FEM

Procedia PDF Downloads 396
39889 Infrared Thermography as an Informative Tool in Energy Audit and Software Modelling of Historic Buildings: A Case Study of the Sheffield Cathedral

Authors: Ademuyiwa Agbonyin, Stamatis Zoras, Mohammad Zandi

Abstract:

This paper investigates the extent to which building energy modelling can be informed by preliminary information provided by infrared thermography, using a thermal imaging camera in a walkthrough audit. The case-study building is Sheffield Cathedral, built in the early 1400s. Based on a qualitative report generated from the thermal images taken at the site, the regions showing significant heat loss are input into a computer model of the cathedral within the Integrated Environmental Solutions (IES) Virtual Environment software, which performs an energy simulation to determine quantitative heat losses through the building envelope. Building data such as material thermal properties and building plans were provided by the architects, Thomas Ford and Partners Ltd. The modelling revealed the portions of the building with the highest heat loss, and these aligned with those suggested by the thermal camera. Retrofit options for the building are also considered, although they may not see implementation due to a desire to conserve the architectural heritage of the building. The results show that thermal imaging in a walk-through audit serves as a useful guide for the energy modelling process. Hand calculations were also performed to serve as a 'control' for estimating losses, providing a second set of data points for comparison.
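
A sketch of the kind of 'control' hand calculation mentioned, using the steady-state fabric heat loss formula Q = U·A·ΔT per envelope element; all U-values, areas and temperatures below are hypothetical, not the cathedral's.

```python
# Steady-state fabric heat loss: Q = U * A * dT, summed per element.
# All U-values (W/m^2.K) and areas (m^2) below are hypothetical.
elements = {
    # name: (U-value, area)
    "solid stone wall": (1.7, 950.0),
    "single-glazed window": (4.8, 120.0),
    "roof": (2.3, 600.0),
}
t_inside, t_outside = 18.0, 4.0   # deg C
dT = t_inside - t_outside

total = 0.0
for name, (u, area) in elements.items():
    q = u * area * dT              # watts lost through this element
    total += q
    print(f"{name:22s} {q/1000:6.1f} kW")
print(f"{'total fabric loss':22s} {total/1000:6.1f} kW")
```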

Keywords: historic buildings, energy retrofit, thermal comfort, software modelling, energy modelling

Procedia PDF Downloads 146
39888 Conceptual Model for Logistics Information System

Authors: Ana María Rojas Chaparro, Cristian Camilo Sarmiento Chaves

Abstract:

Given the growing importance of logistics as a discipline for the efficient management of material and information flows, the adoption of tools that facilitate decision-making based on a global perspective of the system studied has become essential. This article shows how a concept-based model makes it possible to organize and represent reality in an appropriate way, showing accurate and timely information; these features make this kind of model an ideal component to support an information system, recognizing that such information is relevant for establishing the particularities that allow better performance in the evaluated sector.

Keywords: system, information, conceptual model, logistics

Procedia PDF Downloads 471
39887 Prediction of Incompatibility Between Excipients and API in Gliclazide Tablets Using Infrared Spectroscopy and Principal Component Analysis

Authors: Farzad Khajavi

Abstract:

Recognition of the interaction between active pharmaceutical ingredients (APIs) and excipients is a pivotal factor in the development of all pharmaceutical dosage forms. By predicting the interaction between an API and excipients, we can prevent the advent of impurities, or at least lessen their amount. In this study, we used principal component analysis (PCA) to predict the interaction between Gliclazide, a secondary amine, and Lactose in pharmaceutical solid dosage forms. The infrared spectra of binary mixtures of Gliclazide with Lactose at different mole ratios were recorded, and the obtained matrix was analyzed with PCA. By plotting the score columns of the analyzed matrix, the incompatibility between Gliclazide and Lactose was observed. This incompatibility was confirmed experimentally: we observed the appearance of the impurity originating from the Maillard reaction between Gliclazide and Lactose in the chromatograms of the manufactured tablets at room temperature and under accelerated stability conditions, and this impurity increases over the stability months. By changing Lactose to Mannitol and using Dibasic Calcium Phosphate in the tablet formulation, the amount of the impurity decreased and was within the acceptance range defined by the British Pharmacopoeia for Gliclazide tablets. This method is a fast and simple way to predict the existence of incompatibility between excipients and active pharmaceutical ingredients.
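
A toy sketch of the score-plot idea: for an inert two-component blend, the mixture spectra lie on a line in feature space, so PCA scores vary linearly with the mole ratio, whereas an interaction band bends the second component's scores. The spectra below are synthetic Gaussians, not real IR data.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
wavenumbers = np.linspace(4000, 400, 600)

def spectrum(center):
    """Toy IR band: a Gaussian absorbance peak."""
    return np.exp(-((wavenumbers - center) / 40.0) ** 2)

api, excipient = spectrum(1700), spectrum(1050)  # toy pure spectra
ratios = np.linspace(0, 1, 11)                   # API mole fraction

# If the mixture were inert, each spectrum would be a pure linear
# blend; an interaction adds a band neither component has alone.
interaction = spectrum(1250)
mixtures = np.array([
    r * api + (1 - r) * excipient
    + 0.3 * r * (1 - r) * interaction            # interaction term
    + rng.normal(0, 0.01, wavenumbers.size)
    for r in ratios
])

scores = PCA(n_components=2).fit_transform(mixtures)
# For an inert blend the scores fall on a straight line across the
# mole ratios; curvature in PC2 signals the interaction band.
print(np.round(scores[:, 1], 3))
```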

Keywords: PCA, gliclazide, impurity, infrared spectroscopy, interaction

Procedia PDF Downloads 190
39886 An Enhanced Approach in Validating Analytical Methods Using Tolerance-Based Design of Experiments (DoE)

Authors: Gule Teri

Abstract:

The effective validation of analytical methods forms a crucial component of pharmaceutical manufacturing. However, traditional validation techniques can occasionally fail to fully account for inherent variations within datasets, which may result in inconsistent outcomes. This deficiency in validation accuracy is particularly noticeable when quantifying low concentrations of active pharmaceutical ingredients (APIs), excipients, or impurities, introducing a risk to the reliability of the results and, subsequently, the safety and effectiveness of the pharmaceutical products. In response to this challenge, we introduce an enhanced, tolerance-based Design of Experiments (DoE) approach for the validation of analytical methods. This approach explicitly measures variability with reference to tolerance or design margins, enhancing the precision and trustworthiness of the results. It provides a systematic, statistically grounded validation technique and offers an essential tool for industry professionals aiming to guarantee the accuracy of their measurements, particularly for low-concentration components. By incorporating this method, pharmaceutical manufacturers can substantially advance their validation processes and thereby improve the overall quality and safety of their products. This paper details the development, application, and advantages of the tolerance-based DoE approach, demonstrates its effectiveness using High-Performance Liquid Chromatography (HPLC) data for verification, and discusses the method's potential implications and future applications in enhancing pharmaceutical manufacturing practices and outcomes.

Keywords: tolerance-based design, design of experiments, analytical method validation, quality control, biopharmaceutical manufacturing

Procedia PDF Downloads 54
39885 GeoWeb at the Service of Household Waste Collection in Urban Areas

Authors: Abdessalam Hijab, Eric Henry, Hafida Boulekbache

Abstract:

The complexity of the city makes sustainable management of the urban environment more difficult. Managers are required to make significant human and technical investments, particularly in household waste collection (the focus of our research). The aim of this communication is to propose a multi-actor collaborative geographic device (MGCD) based on the link between information and communication technologies (ICT) and geo-web tools, in order to involve urban residents in household waste collection processes. Our method is based on a collaborative/motivational concept between the city and its residents. It is a form of geographic collaboration dedicated to the general public (citizens, residents, and any other participant), based on the real-time allocation and geographic location of topological, geographic, and multimedia data in the form of local geo-alerts (location-specific problems) related to household waste in an urban environment. This contribution allows us to understand the extent to which residents can assist and contribute to the development of household waste collection processes for a better protected urban environment. It also provides a good idea of how residents can contribute to the data bank for future uses, and it will contribute to the transformation of the population into smart inhabitants, an essential component of a smart city. The proposed model will be tested in the Lamkansa sampling district of Casablanca, Morocco.

Keywords: information and communication technologies, ICTs, GeoWeb, geo-collaboration, city, inhabitant, waste collection, environment

Procedia PDF Downloads 102
39884 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic shaping functionality, agents shape the forwarded traffic by dynamic and real-time allocation of the token generation rate in a Token Bucket algorithm; with the buffer allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to work better under burst and busier conditions. The agents work intelligently based on a Reinforcement Learning (RL) algorithm and consider the effective parameters in their decision process. As RL is limited in how many parameters it can consider, due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL and gives the algorithm the high-dimensional ability to consider as many parameters as needed in its decision process. When this implementation is compared to our previous work, where traffic shaping was done without any sharing and dynamic allocation of buffer size for each port, lower packet drop across the whole network, and specifically in the source routers, can be seen. These methods are implemented in our previously proposed intelligent simulation environment to enable better comparison of the performance metrics. The results obtained from this simulation environment show an efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
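
A generic Token Bucket shaper of the kind the agents tune (this is the textbook algorithm; the agent framework, RL policy and PCA step are outside this sketch). The `rate` parameter is what dynamic allocation would adjust at run time.

```python
import time

class TokenBucket:
    """Token Bucket traffic shaper: tokens accrue at `rate` per
    second up to `capacity`; a packet may be sent only if enough
    tokens are available. `rate` is the knob an agent would tune."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # bucket depth (max burst size)
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self, packet_size: float) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed interval, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_size:
            self.tokens -= packet_size
            return True
        return False                # packet must wait or be dropped

bucket = TokenBucket(rate=1000.0, capacity=1500.0)  # bytes/s, bytes
for size in (500, 500, 500, 500):
    print(size, "sent" if bucket.allow(size) else "shaped")
```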

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 490
39883 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification

Authors: Hung-Sheng Lin, Cheng-Hsuan Li

Abstract:

Over the past few years, kernel-based algorithms have been widely used to extend some linear feature extraction methods, such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE), to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect the nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and has good performance in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique: for each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term 'nearest proportion' used here considers both local information and more global information. With these settings, the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small data-size situations; hence, an improved estimator using shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). In addition to the advantages of DNP, KDNP surpasses DNP in the experimental results. According to experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.
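
KDNP itself is not available in standard libraries; as a minimal illustration of the kernel extension idea the paper builds on, kernel PCA can recover a nonlinear direction that linear PCA misses, shown here on toy two-ring data.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

# Toy data: two concentric rings, a classic case where no linear
# projection separates the classes but a kernel map does.
rng = np.random.default_rng(5)
n = 200
r = np.r_[np.full(n, 1.0), np.full(n, 3.0)]      # ring radii
theta = rng.uniform(0, 2 * np.pi, 2 * n)
X = np.c_[r * np.cos(theta), r * np.sin(theta)]
X += rng.normal(0, 0.1, X.shape)
y = np.r_[np.zeros(n), np.ones(n)]

lin = PCA(n_components=1).fit_transform(X)
ker = KernelPCA(n_components=1, kernel="rbf",
                gamma=0.5).fit_transform(X)

# Class separation along the first extracted feature:
for name, z in [("linear PCA", lin), ("kernel PCA", ker)]:
    gap = abs(z[y == 0].mean() - z[y == 1].mean()) / z.std()
    print(f"{name}: standardized class gap = {gap:.2f}")
```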

Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest feature extraction

Procedia PDF Downloads 324
39882 Comprehensive Analysis and Optimization of Alkaline Water Electrolysis for Green Hydrogen Production: Experimental Validation, Simulation Study, and Cost Analysis

Authors: Umair Ahmed, Muhammad Bin Irfan

Abstract:

This study focuses on the design and optimization of an alkaline water electrolyser for the production of green hydrogen. The aim is to enhance the durability and efficiency of this technology while simultaneously reducing the cost associated with the production of green hydrogen. The experimental results obtained from the alkaline water electrolyser are compared with simulated results from Aspen Plus software, allowing a comprehensive analysis and evaluation. To achieve the aforementioned goals, several design and operational parameters are investigated. The electrode material, electrolyte concentration, and operating conditions are carefully selected to maximize the efficiency and durability of the electrolyser. Additionally, cost-effective materials and manufacturing techniques are explored to decrease the overall production cost of green hydrogen. The experimental setup includes a carefully designed alkaline water electrolyser in which various performance parameters (such as hydrogen production rate, current density, and voltage) are measured. These experimental results are then compared with simulated data obtained using Aspen Plus software. The simulation model is developed from fundamental principles and validated against the experimental data. The comparison between experimental and simulated results provides valuable insight into the performance of the alkaline water electrolyser; it helps to identify the areas where improvements can be made, both in design and in operation, to enhance the durability and efficiency of the system. Furthermore, the simulation results allow a cost analysis, providing an estimate of the overall production cost of green hydrogen. This study aims to develop a comprehensive understanding of alkaline water electrolysis technology. The findings of this research can contribute to the development of more efficient and durable electrolyser technology while reducing the associated cost. Ultimately, these advancements can pave the way for a more sustainable and economically viable hydrogen economy.
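
As a back-of-the-envelope complement to this kind of analysis (not taken from the study), Faraday's law gives the ideal hydrogen output of a stack from its current, and the operating voltage gives the specific energy; all stack figures below are hypothetical.

```python
# Ideal hydrogen production from Faraday's law: n_H2 = I*t / (z*F),
# with z = 2 electrons per H2 molecule. Stack figures are hypothetical.
F = 96485.0          # Faraday constant (C/mol)
M_H2 = 2.016e-3      # molar mass of H2 (kg/mol)

I = 500.0            # stack current (A)
n_cells = 20         # cells in series
V_cell = 1.9         # operating cell voltage (V)
eta_f = 0.98         # Faradaic efficiency

hours = 1.0
charge = I * 3600 * hours                        # coulombs per cell
mol_h2 = eta_f * n_cells * charge / (2 * F)      # mol H2 per hour
kg_h2 = mol_h2 * M_H2

energy_kwh = n_cells * V_cell * I * hours / 1000
print(f"H2 produced: {kg_h2*1000:.1f} g/h")
print(f"Specific energy: {energy_kwh / kg_h2:.1f} kWh/kg H2")
```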

Keywords: sustainable development, green energy, green hydrogen, electrolysis technology

Procedia PDF Downloads 60
39881 Development of the Integrated Quality Management System of Cooked Sausage Products

Authors: Liubov Lutsyshyn, Yaroslava Zhukova

Abstract:

Over the past twenty years, there has been a drastic change in the mode of nutrition in many countries, which has been reflected in the development of new products and production techniques and has also led to the expansion of sales markets for food products. Studies have shown that solving food safety problems is almost impossible without the active and systematic work of the organizations directly involved in the production, storage and sale of food products, and without the management of end-to-end traceability and exchange of information. The aim of this research is the development of an integrated system of quality management and safety assurance based on the principles of HACCP, traceability and the system approach, with the creation of an algorithm for the identification and monitoring of parameters of the technological process of cooked sausage manufacture. A methodology for implementing the integrated system, based on the principles of HACCP, traceability and the system approach during the manufacture of cooked sausage products, has been developed to effectively provide for the defined properties of the finished product. As a result of the research, an evaluation technique and performance criteria for the implementation and operation of the quality management and safety assurance system based on the HACCP principles have been developed and substantiated. The paper reveals regularities of the influence of applying the HACCP principles, traceability and the system approach on the quality and safety parameters of the finished product, as well as regularities in the identification of critical control points. The algorithm of functioning of the integrated quality management and safety assurance system is also described, and key requirements are defined for software allowing the prediction of finished product properties, the timely correction of the technological process, and the traceability of manufacturing flows. Based on the results obtained, a typical scheme of the integrated quality management and safety assurance system based on the HACCP principles, with elements of end-to-end traceability and the system approach, has been developed for the manufacture of cooked sausage products. Quantitative criteria for evaluating the performance of the quality management and safety assurance system have been developed, along with a set of guidance documents for the implementation and evaluation of the integrated system in meat processing plants. The research demonstrated the effectiveness of continuous monitoring of the manufacturing process at the identified critical control points, and the optimal number of critical control points for the manufacture of cooked sausage products has been substantiated. The main results of the research were appraised during 2013-2014 at seven enterprises of the meat processing industry and have been implemented at JSC «Kyiv meat processing plant».

Keywords: cooked sausage products, HACCP, quality management, safety assurance

Procedia PDF Downloads 231
39880 Strategic Partnerships for Sustainable Tourism Development in Papua New Guinea

Authors: Zainab Olabisi Tairu

Abstract:

Strategic partnerships are a core requirement for delivering sustainable tourism development in developing nations like Papua New Guinea. This paper unveils the strategic partnerships for sustainable tourism development in Papua New Guinea. Much emphasis is placed by tourism stakeholders on the importance of strategic partnership and positioning in developing sustainable tourism. Through interviews and observations with tourism stakeholders in Papua New Guinea, the paper engages with stakeholders' ecotourism differentiation and power relations. Collaborative approaches to sustaining the tourism industry, with milestones of achieved plans, are needed for tourism growth and development. This paper adds new insight to the body of knowledge on stakeholder identification, formation, power relations, and an integrated approach to successful tourism development. In order to achieve responsible tourism planning and management outcomes, partnerships must be holistic in perspective and based on sustainable development principles.

Keywords: stakeholders, sustainable tourism, Papua New Guinea, partnerships

Procedia PDF Downloads 641
39879 Working Improvement of Modern Finance in Millennium World

Authors: Saeed Mohammadirad

Abstract:

Financing activities involve long-term liabilities, stockholders' equity (or owner's equity), and changes to short-term borrowings. Finance is very important for every business activity. To perform the finance function, we have to follow the accounting language appropriate to the nature of the business. If everything is in one software package, it is easy to handle, monitor, control, plan, organize, direct and budget the finances. Let us take up the challenge of building computer software covering the whole finance package for every business-related activity. This article discusses the finance functions at the various levels of business activity and how they should be maintained properly to avoid unethical events.

Keywords: financing activities, business activities, computer software, unethical events

Procedia PDF Downloads 334
39878 A Methodological Approach to Development of Mental Script for Mental Practice of Micro Suturing

Authors: Vaikunthan Rajaratnam

Abstract:

Introduction: Motor imagery (MI) and mental practice (MP) can be an alternative way to acquire mastery of surgical skills. One component of this technique is the use of a mental script. The aim of this study was to design and develop a mental script for basic micro suturing training for skill acquisition using a low-fidelity rubber glove model, and to describe the detailed methodology for this process. Methods: This study was based on a design and development research framework. The mental script was developed with 5 expert surgeons performing a cognitive walkthrough of the repair of a vertical opening in a rubber glove model using 8/0 nylon, followed by a hierarchical task analysis. A draft script was created, and its face and content validity assessed with a checking-back process. The final script was validated with the recruitment of 28 participants, assessed using the Mental Imagery Questionnaire (MIQ). Results: The creation of the mental script is detailed in the full text. After assessment by the expert panel, the mental script had good face and content validity. The average overall MIQ score was 5.2 ± 1.1, demonstrating the validity of generating mental imagery from the mental script developed in this study for micro suturing in the rubber glove model. Conclusion: The methodological approach described in this study is based on an instructional design framework for teaching surgical skills. This MP model is inexpensive and easily accessible, addressing the challenge of reduced opportunities to practice surgical skills. However, while motor skills are important, other non-technical expertise required by the surgeon is not addressed with this model. Thus, this model should augment surgical training, not replace it.

Keywords: mental script, motor imagery, cognitive walkthrough, verbal protocol analysis, hierarchical task analysis

Procedia PDF Downloads 82
39877 Technology Impact on the Challenge between Human Rights and Cyber Terrorism

Authors: Abanoub Zare Zakaria Herzalla

Abstract:

The link between terrorism and human rights has become a major challenge in the fight against terrorism around the world. This is based on the fact that terrorism and human rights are so closely linked that when the former starts, the latter are violated. This direct connection was recognized in the Vienna Declaration and Programme of Action adopted by the World Conference on Human Rights in Vienna on June 25, 1993, which recognizes that acts of terrorism, in all their forms and manifestations, aim to destroy people's human rights. Terrorism therefore represents an attack on our most basic human rights. To this end, the first part of this article focuses on the connections between terrorism and human rights and seeks to highlight the interdependence between these two concepts. The second part discusses the emerging concept of cyberterrorism and its manifestations, and analyses the fight against cyberterrorism in the context of human rights.

Keywords: sustainable development, human rights, the right to development, the human rights-based approach to development, environmental rights, economic development, social sustainability, human rights protection, human rights violations, workers' rights, justice, security

Procedia PDF Downloads 24