Search results for: latent Dirichlet allocation
502 A Review on Application of Phase Change Materials in Textiles Finishing
Authors: Mazyar Ahrari, Ramin Khajavi, Mehdi Kamali Dolatabadi, Tayebeh Toliyat, Abosaeed Rashidi
Abstract:
Fabric, as the first and most common layer in permanent contact with human skin, is a very good interface to provide coverage as well as insulation against heat and cold. Phase change materials (PCMs) are organic and inorganic compounds capable of absorbing and releasing noticeable amounts of latent heat during phase transitions between the solid and liquid states over a narrow temperature range. Because PCMs undergo these liquid-solid and solid-liquid transitions while absorbing and releasing heat, they must be encapsulated in polymeric shells, so-called microcapsules, to remain usable over long periods. Microencapsulation and nanoencapsulation methods have been developed to reduce the reactivity of a PCM with the outside environment, ease handling, and decrease diffusion and evaporation rates. Methods of incorporating PCMs into textiles, such as electrospinning, and of determining their thermal properties are summarized. Paraffin waxes attract particular attention due to their high thermal storage density, repeatable phase change, thermal and chemical stability, small volume change during phase transition, non-toxicity, non-flammability, non-corrosiveness, and low cost, and they seem to play a key role in confronting climate change and global warming. In this article, we review the research concentrating on the characteristics of PCMs and on new materials and methods of microencapsulation.
Keywords: thermoregulation, microencapsulation, phase change materials, thermal energy storage, nanoencapsulation
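The latent-heat mechanism the abstract describes can be sketched numerically. The figures used below (specific heats around 2 kJ/(kg·K) and a latent heat of fusion near 200 kJ/kg for paraffin) are illustrative assumptions, not values taken from the review:

```python
def pcm_energy_stored(mass_kg, dT_solid, dT_liquid,
                      c_solid=2.0, c_liquid=2.2, latent=200.0):
    """Total heat stored (kJ): sensible heating on both sides of the
    melting point plus the latent heat absorbed during the transition.
    Default property values are rough, illustrative paraffin figures."""
    return mass_kg * (c_solid * dT_solid + latent + c_liquid * dT_liquid)

# 0.1 kg of microencapsulated paraffin warmed 5 K below and 5 K above melting:
q = pcm_energy_stored(0.1, 5, 5)   # latent term dominates the total
```

The latent term (here 20 kJ of the roughly 22 kJ total) is why a PCM layer stores far more heat across its transition than sensible heating of the same fabric mass alone.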
Procedia PDF Downloads 387
501 A Gauge Repeatability and Reproducibility Study for Multivariate Measurement Systems
Authors: Jeh-Nan Pan, Chung-I Li
Abstract:
Measurement system analysis (MSA) plays an important role in helping organizations to improve their product quality. Generally speaking, the gauge repeatability and reproducibility (GRR) study is performed according to the MSA handbook stated in the QS9000 standards. Usually, a GRR study for assessing the adequacy of gauge variation needs to be conducted prior to process capability analysis. Traditional MSA considers only a single quality characteristic. With the advent of modern technology, industrial products have become very sophisticated, with more than one quality characteristic. Thus, it becomes necessary to perform multivariate GRR analysis for a measurement system when collecting data with multiple responses. In this paper, we take the correlation coefficients among tolerances into account to revise the multivariate precision-to-tolerance (P/T) ratio proposed by Majeske (2008). We then compare the performance of our revised P/T ratio with that of the existing ratios. The simulation results show that our revised P/T ratio outperforms the others in terms of robustness and proximity to the actual value. Moreover, the optimal allocation of several parameters, such as the number of quality characteristics (v), sample size of parts (p), number of operators (o), and replicate measurements (r), is discussed using the confidence interval of the revised P/T ratio. Finally, a standard operating procedure (S.O.P.) to perform the GRR study for multivariate measurement systems is proposed based on the research results. Hopefully, it can serve as a useful reference for quality practitioners when conducting such studies in industry.
Keywords: gauge repeatability and reproducibility, multivariate measurement system analysis, precision-to-tolerance ratio
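For the univariate case the paper builds on, the precision-to-tolerance ratio compares gauge spread against the tolerance width. A minimal sketch follows; the k = 6 multiplier and the crude variance decomposition are common conventions, not the paper's revised multivariate statistic:

```python
import numpy as np

def pt_ratio(measurements, lsl, usl, k=6.0):
    """Univariate precision-to-tolerance ratio: k * sigma_gauge / (USL - LSL).
    measurements has shape (parts, operators, replicates). Gauge variance is
    estimated crudely as pooled within-cell (repeatability) variance plus the
    variance of operator means (reproducibility); a full ANOVA-based estimate
    would be used in practice."""
    m = np.asarray(measurements, dtype=float)
    repeatability = m.var(axis=2, ddof=1).mean()        # within part-operator cells
    reproducibility = m.mean(axis=(0, 2)).var(ddof=1)   # across operator means
    sigma_gauge = np.sqrt(repeatability + reproducibility)
    return k * sigma_gauge / (usl - lsl)
```

Ratios below roughly 0.1 are conventionally taken to indicate an adequate gauge; the paper's contribution is extending this ratio to correlated multivariate tolerances, which this univariate sketch does not capture.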
Procedia PDF Downloads 261
500 The Impact of Corporate Social Responsibility Information Disclosure on the Accuracy of Analysts' Earnings Forecasts
Authors: Xin-Hua Zhao
Abstract:
In recent years, the number of social responsibility reports disclosed by Chinese corporations has grown rapidly, and the economic effects of these reports have become a hot topic. The article takes the listed chemical engineering corporations that disclose social responsibility reports in China as a sample and, based on information asymmetry theory, examines the economic effect generated by corporate social responsibility disclosure using the method of ordinary least squares. The research is conducted from the perspective of analysts' earnings forecasts and studies the impact of corporate social responsibility information disclosure on improving the accuracy of analysts' earnings forecasts. The results show a statistically significant negative correlation between the corporate social responsibility disclosure index and analysts' earnings forecast error. The conclusions confirm that enterprises can reduce the asymmetry of social and environmental information by disclosing social responsibility reports and thus improve the accuracy of analysts' earnings forecasts, which can promote the effective allocation of resources in the market.
Keywords: analysts' earnings forecasts, corporate social responsibility disclosure, economic effect, information asymmetry
Procedia PDF Downloads 156
499 Norms and Laws: Fate of Community Forestry in Jharkhand
Authors: Pawas Suren
Abstract:
The conflict between livelihood and forest protection has been a perpetual phenomenon in India. In the era of climate change, the problem is expected to aggravate the declining trend of dense forest in the country, creating impediments to climate change adaptation by forest-dependent communities. To assess the complexity of the problem, the Hazarinagh and Chatra districts of Jharkhand were selected as a case study. To identify the norms practiced by communities to manage community forestry, an ethnographic study was designed to understand the values, traditions, and cultures of forest-dependent communities, most of whom were tribal. It was observed that internalization of efficient forest norms is reflected in the pride and honor attached to such behavior, while violators are sanctioned through guilt and shame. The study analyzes the effect of the norms being practiced on the management and ecology of community forestry as a common property resource. The findings point toward gaps in the prevalent forest laws in addressing the efficient allocation of property rights. The conclusion calls for reconsidering the accepted factors of forest degradation in India.
Keywords: climate change, common property resource, community forestry, norms
Procedia PDF Downloads 341
498 Advocacy for Increasing Health Care Budget in Parepare City with DALY Approach: Case Study on Improving Public Health Insurance Budget
Authors: Kasman, Darmawansyah, Alimin Maidin, Amran Razak
Abstract:
Background: Under decentralization, advocacy is needed to increase the health budget in Parepare District. One of the advocacy methods recommended by the World Bank is the economic-loss approach. Methods: This is observational research in the field of health economics that directly quantifies the magnitude of the economic loss to the community and the government and provides advocacy to the executive and legislative branches to make the resulting harm visible. Results: The results show the direct cost, consisting of household expenditure for transport, to be Rp 295,865,500. The indirect cost comprises a YLD of Rp 14,688,000,000 and a YLL of Rp 28,986,336,000, so the DALY amounts to Rp 43,674,336,000. The total economic loss is Rp 43,970,201,500. These huge economic losses can be prevented by increasing the allocation of health budgets for promotive and preventive efforts and expanding the coverage of health insurance for the community. Conclusion: There is a need to advocate to the executive and legislative branches the importance of guaranteed public health financing by conducting studies of the economic losses, so that all strategic alliances believe that health is an investment.
Keywords: advocacy, economic loss, health insurance
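The reported cost components can be cross-checked with simple arithmetic; note that the published DALY and grand totals are internally consistent only if the YLD figure is read as Rp 14,688,000,000:

```python
# Re-deriving the reported economic-loss totals (all values in rupiah,
# as given in the abstract; YLD read as Rp 14,688,000,000 so that the
# published subtotals add up).
direct_cost = 295_865_500      # household transport expenditure
yld = 14_688_000_000           # years lived with disability, monetized
yll = 28_986_336_000           # years of life lost, monetized

daly = yld + yll               # reported as Rp 43,674,336,000
total_loss = direct_cost + daly  # reported as Rp 43,970,201,500
```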
Procedia PDF Downloads 113
497 Anomaly Detection in Financial Markets Using Tucker Decomposition
Authors: Salma Krafessi
Abstract:
The financial markets have a multifaceted, intricate environment, and enormous volumes of data are produced every day. To find investment possibilities, possible fraudulent activity, and market oddities, accurate anomaly identification in this data is essential. Conventional methods for detecting anomalies frequently fail to capture the complex organization of financial data. To improve the identification of abnormalities in financial time series data, this study presents Tucker Decomposition as a reliable multi-way analysis approach. We start by gathering closing prices for the S&P 500 index across a number of decades. The information is converted to a three-dimensional tensor format, which contains internal characteristics and temporal sequences in a sliding window structure. The tensor is then broken down using Tucker Decomposition into a core tensor and matching factor matrices, allowing latent patterns and relationships in the data to be captured. The reconstruction error from the Tucker Decomposition serves as a possible sign of abnormalities: by setting a statistical threshold, we are able to identify large deviations that indicate unusual behavior. A thorough examination that contrasts the Tucker-based method with traditional anomaly detection approaches validates our methodology. The outcomes demonstrate the superiority of Tucker Decomposition in identifying intricate and subtle abnormalities that are otherwise missed. This work opens the door for more research into multi-way data analysis approaches across a range of disciplines and emphasizes the value of tensor-based methods in financial analysis.
Keywords: Tucker decomposition, financial markets, financial engineering, artificial intelligence, decomposition models
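A minimal sketch of the core step, using a truncated higher-order SVD, which is one standard way to compute a Tucker decomposition; the sliding-window construction and thresholding details are assumptions, not the paper's exact pipeline:

```python
import numpy as np

def unfold(t, mode):
    """Mode-n unfolding: move the mode axis to the front, flatten the rest."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

def mode_dot(t, m, mode):
    """Multiply tensor t by matrix m along the given mode."""
    return np.moveaxis(np.tensordot(m, np.moveaxis(t, mode, 0), axes=1),
                       0, mode)

def tucker_hosvd(t, ranks):
    """Truncated HOSVD: per-mode factors from the unfoldings' left singular
    vectors, then the core tensor via multilinear projection."""
    factors = [np.linalg.svd(unfold(t, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    core = t
    for n, u in enumerate(factors):
        core = mode_dot(core, u.T, n)
    return core, factors

def reconstruction_error(t, ranks):
    """Frobenius norm of the residual; large values flag anomalous data."""
    core, factors = tucker_hosvd(t, ranks)
    rec = core
    for n, u in enumerate(factors):
        rec = mode_dot(rec, u, n)
    return np.linalg.norm(t - rec)
```

Stacking sliding windows of the price series into a (window, time, feature) tensor and flagging windows whose reconstruction error exceeds, say, three standard deviations is one plausible realization of the thresholding step described above.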
Procedia PDF Downloads 68
496 Use of Improved Genetic Algorithm in Cloud Computing to Reduce Energy Consumption in Migration of Virtual Machines
Authors: Marziyeh Bahrami, Hamed Pahlevan Hsseini, Behnam Ghamami, Arman Alvanpour, Hamed Ezzati, Amir Salar Sadeghi
Abstract:
One of the ways to increase the efficiency of services in agent-based systems and, of course, in the world of cloud computing is to use virtualization techniques. The aim of this research is to introduce changes in cloud computing services that reduce, as much as possible, the energy consumption related to the migration of virtual machines and to the allocation of resources, thereby reducing the resulting pollution. So far, several methods have been proposed to increase the efficiency of cloud computing services in order to save energy in the cloud environment. The method presented in this article tries to prevent, as much as possible, energy consumption by data centers and the subsequent production of carbon and biological pollutants by increasing the efficiency of cloud computing services. The results show that the proposed algorithm, using improved virtualization techniques and with the help of a genetic algorithm, improves the efficiency of cloud services in migrating virtual machines and ultimately saves energy.
Keywords: consumption reduction, cloud computing, genetic algorithm, live migration, virtual machine
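The abstract does not give the algorithm's details, but the general idea of searching VM-to-host assignments with a genetic algorithm, scoring candidates by an energy proxy, can be sketched as follows; the fitness function and operators here are illustrative assumptions, not the authors' improved algorithm:

```python
import random

def ga_vm_placement(vm_loads, n_hosts, cap, pop=30, gens=200, seed=1):
    """Toy genetic algorithm for VM-to-host allocation. Energy is proxied
    by the number of powered-on hosts, with a heavy penalty for hosts
    loaded beyond capacity. Operators: elitist selection + point mutation."""
    rng = random.Random(seed)
    n_vms = len(vm_loads)

    def energy(assign):
        loads = [0.0] * n_hosts
        for vm, host in enumerate(assign):
            loads[host] += vm_loads[vm]
        active = sum(1 for load in loads if load > 0)
        overload = sum(max(0.0, load - cap) for load in loads)
        return active + 10.0 * overload

    population = [[rng.randrange(n_hosts) for _ in range(n_vms)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=energy)        # elitist: keep the fittest half
        survivors = population[: pop // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n_vms)] = rng.randrange(n_hosts)  # mutate one VM
            children.append(child)
        population = survivors + children
    best = min(population, key=energy)
    return best, energy(best)

# Four half-loaded VMs pack onto two hosts of capacity 1.0:
placement, energy_proxy = ga_vm_placement([0.5, 0.5, 0.5, 0.5],
                                          n_hosts=4, cap=1.0)
```

Consolidating the VMs onto fewer hosts lets the remaining hosts be powered down, which is the energy-saving lever the abstract targets.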
Procedia PDF Downloads 58
495 Genomic Sequence Representation Learning: An Analysis of K-Mer Vector Embedding Dimensionality
Authors: James Jr. Mashiyane, Risuna Nkolele, Stephanie J. Müller, Gciniwe S. Dlamini, Rebone L. Meraba, Darlington S. Mapiye
Abstract:
When performing language tasks in natural language processing (NLP), the dimensionality of word embeddings is chosen either ad hoc or by optimizing the Pairwise Inner Product (PIP) loss. The PIP loss is a metric that measures the dissimilarity between word embeddings, and it is obtained through matrix perturbation theory by utilizing the unitary invariance of word embeddings. In genomics, especially in genome sequence processing, unlike in natural language processing, there is no notion of a “word”; rather, there are sequence substrings of length k called k-mers. K-mer sizes matter, and they vary depending on the goal of the task at hand. The dimensionality of word embeddings in NLP has been studied using matrix perturbation theory and the PIP loss. In this paper, the sufficiency and reliability of applying word-embedding algorithms to various genomic sequence datasets are investigated to understand the relationship between the k-mer size and the embedding dimension. This is done by studying the scaling capability of three embedding algorithms, namely Latent Semantic Analysis (LSA), Word2Vec, and Global Vectors (GloVe), with respect to the k-mer size. Utilising the PIP loss as a metric to train embeddings on different datasets, we also show that Word2Vec outperforms LSA and GloVe at accurately computing embeddings as both the k-mer size and the vocabulary increase. Finally, the shortcomings of natural language processing embedding algorithms in performing genomic tasks are discussed.
Keywords: word embeddings, k-mer embedding, dimensionality reduction
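The two core ingredients, k-mer tokenization and the PIP loss, can be sketched directly; the unitary invariance mentioned above is what makes the loss well-defined for comparing embeddings:

```python
import numpy as np

def kmers(seq, k):
    """Overlapping substrings of length k -- the genomic analogue of words."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def pip_loss(e1, e2):
    """Pairwise Inner Product loss: Frobenius norm of the difference between
    the Gram matrices of two embedding matrices (one row per k-mer). Because
    it compares inner products, it is invariant to unitary rotations of
    either embedding."""
    return np.linalg.norm(e1 @ e1.T - e2 @ e2.T)

tokens = kmers("ATGCA", 3)   # ['ATG', 'TGC', 'GCA']
```

Rotating an embedding by any orthogonal matrix leaves its Gram matrix, and hence the PIP loss against the original, unchanged, which is why PIP loss can compare embeddings from different training runs or dimensionalities.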
Procedia PDF Downloads 135
494 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation, and task migration. Various parameters to analyze the performance of a load balancing approach are response time, cost, data processing time, and throughput. This paper demonstrates a two-level load balancing approach that combines the join-idle-queue and join-shortest-queue policies. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer. The results are analyzed and compared with the existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling
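One plausible reading of the combined policy can be sketched in a few lines; the exact coordination between the two levels in the paper is not specified here, so this is an illustrative simplification:

```python
def dispatch(queues, idle_servers):
    """Two-level dispatch sketch: Join-Idle-Queue first (send the task to a
    server known to be idle), falling back to Join-Shortest-Queue across all
    servers when no idle server is reported. Mutates both arguments."""
    if idle_servers:
        server = idle_servers.pop()
    else:
        server = min(range(len(queues)), key=lambda s: queues[s])
    queues[server] += 1
    return server

queues = [2, 0, 1]
idle = [1]                       # server 1 has reported itself idle
first = dispatch(queues, idle)   # level 1: idle-queue hit
second = dispatch(queues, idle)  # level 2: JSQ fallback over [2, 1, 1]
```

The idle-queue level avoids scanning all queues on the common path; the JSQ fallback guarantees a sensible choice when no idleness information is available.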
Procedia PDF Downloads 431
493 Highway Capacity and Level of Service
Authors: Kidist Mesfin Nguse
Abstract:
Ethiopia is the second most populous nation in Africa, with about 121 million people according to 2022 population estimates. In recent years, the Ethiopian government (GOE) has been gradually growing its road network. With 138,127 kilometers (85,825 miles) of all-weather roads as of the end of 2018-19, Ethiopia possessed just 39% of the nation's necessary road network and lacked a well-organized system. The Ethiopian urban population report recorded that about 21% of the population lives in urban areas, and this large population, coupled with growth in various infrastructures, has led to the migration of the workforce from rural areas to cities across the country. On main roads, the heterogeneous traffic flow with various operational features makes conditions more unfavorable, causing frequent congestion along stretches of road. The Level of Service (LOS), a qualitative measure of traffic, is categorized based on the operating conditions in the traffic stream. Determining the capacity and LOS for this city is crucial, as this affects the planning, design, and operation of traffic systems and the allocation of route selection for infrastructure building projects, in order to provide a reasonably good level of service.
Keywords: capacity, level of service, traffic volume, free flow speed
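The LOS grading the abstract refers to can be sketched from traffic density; the thresholds below are the commonly cited HCM density bands for basic freeway segments, and an actual study would follow the locally adopted manual:

```python
def los_from_density(density_pc_per_mi_ln):
    """Level of Service for a basic freeway segment from traffic density
    (passenger cars per mile per lane). Bands follow commonly cited HCM
    thresholds; local design manuals may differ."""
    for los, limit in [("A", 11), ("B", 18), ("C", 26), ("D", 35), ("E", 45)]:
        if density_pc_per_mi_ln <= limit:
            return los
    return "F"  # breakdown / forced flow

# Density = flow / speed: 1300 pc/h/ln at 60 mi/h is about 21.7 pc/mi/ln.
grade = los_from_density(1300 / 60)
```

Because density rises as flow approaches capacity, the same road can move from LOS A at free flow to LOS F under congestion, which is the planning concern the abstract raises.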
Procedia PDF Downloads 47
492 Simultaneous Determination of Six Characterizing/Quality Parameters of Biodiesels via 1H NMR and Multivariate Calibration
Authors: Gustavo G. Shimamoto, Matthieu Tubino
Abstract:
The characterization and quality of biodiesel samples are checked by determining several parameters. Considering the large number of analyses to be performed, as well as the disadvantages of the use of toxic solvents and waste generation, multivariate calibration is suggested to reduce the number of tests. In this work, hydrogen nuclear magnetic resonance (1H NMR) spectra were used to build multivariate models, from partial least squares (PLS) regression, in order to determine simultaneously six important characterizing and/or quality parameters of biodiesels: density at 20 ºC, kinematic viscosity at 40 ºC, iodine value, acid number, oxidative stability, and water content. Biodiesels from twelve different oil sources were used in this study: babassu, brown flaxseed, canola, corn, cottonseed, macauba almond, microalgae, palm kernel, residual frying, sesame, soybean, and sunflower. 1H NMR reflects the structures of the compounds present in biodiesel samples and showed suitable correlations with the six parameters. The PLS models were constructed with between 5 and 7 latent variables, and the obtained values of r(cal) and r(val) were greater than 0.994 and 0.989, respectively. In addition, the models were considered suitable to predict all six parameters for external samples, taking into account the analytical speed with which this can be performed. Thus, the alliance between 1H NMR and PLS proved appropriate for characterizing and evaluating the quality of biodiesels, significantly reducing analysis time, the consumption of reagents/solvents, and waste generation. Therefore, the proposed methods can be considered to adhere to the principles of green chemistry.
Keywords: biodiesel, multivariate calibration, nuclear magnetic resonance, quality parameters
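The modeling step can be sketched with a minimal single-response PLS (the NIPALS algorithm); real chemometrics work would use a validated package and cross-validated selection of the 5-7 latent variables reported above, so this is only a sketch of the idea:

```python
import numpy as np

def pls1(X, y, n_comp):
    """Minimal single-response PLS (NIPALS). Extracts latent components
    that maximize covariance between the spectra X (rows = samples) and a
    property y, then returns a linear prediction function."""
    Xm, ym = X.mean(axis=0), y.mean()
    Xr, yr = X - Xm, y - ym
    W, P, Q = [], [], []
    for _ in range(n_comp):
        w = Xr.T @ yr                      # weight: covariance direction
        w = w / np.linalg.norm(w)
        t = Xr @ w                         # scores
        tt = t @ t
        p = Xr.T @ t / tt                  # X loadings
        q = (yr @ t) / tt                  # y loading
        Xr = Xr - np.outer(t, p)           # deflate
        yr = yr - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)    # regression vector
    return lambda Xnew: (Xnew - Xm) @ B + ym
```

In the paper's setting, each row of X would be a 1H NMR spectrum and y one of the six parameters, with one PLS model fitted per parameter.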
Procedia PDF Downloads 538
491 BOFSC: A Blockchain Based Decentralized Framework to Ensure the Transparency of Organic Food Supply Chain
Authors: Mifta Ul Jannat, Raju Ahmed, Al Mamun, Jannatul Ferdaus, Ritu Costa, Milon Biswas
Abstract:
Blockchain is an internet-based invention valued for its capacity to openly accept, record, and distribute transactions in a permanent, tamper-evident ledger. In a traditional supply chain, there are no trustworthy participants vouching for an organic product, yet blockchain engineering may provide confidence, transparency, and traceability. Blockchain changes how companies obtain genuine, verified, and durable information from their supply chain and engage customers. By storing each link in a chain of cryptographic blocks, blockchain digitizes every transaction: no single party may alter the documents, and any change to the agreement is visible to all. The resulting record is tamper-proof and unchanging, offering a complete history of the object's life cycle and minimizing opportunities for fraud. The primary aim of this analysis is to identify the underlying problems that the customer faces. In this paper, we minimize the circulation of fraudulent data through a 'smart contract' and include a certificate of quality assurance.
Keywords: blockchain technology, food supply chain, Ethereum, smart contract, quality assurance, trustability, security, transparency
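The tamper-evidence property described above rests on hash chaining, which can be illustrated without any blockchain platform; the lot/step fields are made-up examples, and a real deployment would use Ethereum smart contracts as the abstract states:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Append-only ledger sketch: each block commits to its payload and to
    the previous block's hash, so altering any earlier record breaks the
    chain of hashes."""
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    return {"data": data, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = json.dumps({"data": block["data"], "prev": block["prev"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"lot": "A1", "origin": "organic-farm-01"}, "0" * 64)]
chain.append(make_block({"lot": "A1", "step": "packaging"}, chain[-1]["hash"]))
```

Editing any field of an earlier block changes its hash, so every later block's `prev` pointer no longer matches and verification fails, which is the traceability guarantee the supply-chain framework relies on.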
Procedia PDF Downloads 152
490 Experts' Perception of Secondary Education Quality Management Challenges in Ethiopia
Authors: Aklilu Alemu, Tak Cheung Chan
Abstract:
Following the intensification of secondary education in the developing world, Ethiopia's attention has currently shifted to the quality of this education and its management. This study aims to explore experts' perceptions of quality management challenges in secondary education in Ethiopia. The researchers employed a case study design, recruiting participating supervisors from the Ministry of Education and from region, zone, wereda, and cluster levels by using a purposeful sampling technique. Twenty-six interviewees took part in this study. The researchers employed NVivo 8 together with a thematic analysis process to analyze the data. This study revealed that the major problems that affected quality management practices in Ethiopia were: lack of qualified experts at all levels; lack of accountability in every echelon; the changing nature of teacher education; the ineffectiveness of teacher-licensing programs; and lack of an educational budget and the problem of utilizing this limited budget. The study concluded that the experts at different levels were not genuinely fulfilling their roles and responsibilities. Therefore, the Ministry of Finance and Economic Development, together with the concerned parties, needs to reconsider budget allocation for secondary education.
Keywords: education quality, Ethiopia, quality challenge, quality management, secondary education
Procedia PDF Downloads 215
489 The Viability of Islamic Finance and Its Impact on Global Financial Stability: Evidence from Practical Implications
Authors: Malik Shahzad Shabbir, Muhammad Saarim Ghazi, Amir Khalil ur Rehman
Abstract:
This study examines the factors which influence and contribute to the financial viability of Islamic finance and its impact on global financial stability. The purpose of this paper is to differentiate the practical implications of Islamic and conventional finance for global financial stability. Islamic finance is asset-backed financing which creates wealth through trade and commerce and believes in risk and return sharing. Islamic banking is asset-driven, as opposed to conventional banking, which is liability-driven. In order to introduce new financial products to the market, financial innovation in Islamic finance must remain within the Shari'ah parameters that are tested against the 'Maqasid al-Shari'ah'. An interest-based system leads to income and wealth inequalities and misallocation of resources. Moreover, this system lacks the just and equitable aspect of distribution and may exploit either the debt holder or the financier. Such implications have reached a tipping point that leaves only one choice: change, or face continued decline and misery.
Keywords: viability, global financial stability, practical implications, asset driven, tipping point
Procedia PDF Downloads 302
488 Teacher Support and Academic Resilience in Vietnam: An Analysis of Low Socio-Economic Status Students in Programme for International Student Assessment 2018
Authors: My Ha, Suwei Lin, Huiying Zou
Abstract:
This study aimed at investigating the association between teacher support and academic resilience in a developing country. Using the data from the PISA 2018 Student Questionnaire and Cognitive Tests, the study provided evidence of the significant impact teacher support had on reading literacy among 15-year-old students from low socio-economic status (SES) homes in Vietnam. From a total of 5773 Vietnamese participants from all backgrounds, a sample of 1765 disadvantaged students was drawn for analysis. As a result, 32 percent of the low-SES sample was identified as resilient. Based on their responses to the PISA items regarding the frequency of support they received from teachers, a Latent Class Analysis (LCA) divided students into three subgroups: High Support (74.6%), Fair Support (21.6%), and Low Support (3.8%). The high-support group reported the highest proportion of resilient students. Meanwhile, the low-support group scored the lowest mean on the reading test and had the lowest rate of resilience. Also, as the level of support increases, reading achievement becomes less dependent on socioeconomic status, reflected in the decrease in both the slope and magnitude of their correlation. Logistic regression revealed that a 1-unit increase in standardized teacher support would lead to an increase of 29.1 percent in the odds of a student becoming resilient. The study emphasizes the role of supportive teachers in promoting resilience, as well as lowering educational inequity in general.
Keywords: academic resilience, disadvantaged students, teacher support, inequity, PISA
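The reported effect size translates between a logistic-regression coefficient and a percent change in odds as follows:

```python
import math

# exp(beta) is the odds multiplier for a one-unit increase in the
# standardized teacher-support predictor. The reported 29.1% increase in
# the odds corresponds to a coefficient of about ln(1.291) ~= 0.256.
odds_multiplier = 1.291
beta = math.log(odds_multiplier)
percent_increase = (math.exp(beta) - 1) * 100
```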
Procedia PDF Downloads 88
487 Research on Planning Strategy of Characteristic Town from the Perspective of Ecological Concept: A Case Study on Hangzhou Dream Town in Zhejiang
Authors: Xiaohan Ye
Abstract:
Under the new normal, some urban spaces with an industrial base and regional features in Zhejiang, China have been selected to build characteristic towns, a kind of environmentally friendly development platform integrating city and industry, in an attempt to achieve the most optimized layout of productivity with the least space resource. After analyzing the connotation, mechanism, and mode of the characteristic town in Zhejiang, this paper suggests that characteristic towns should take improving the regional ecological environment as an important objective in planning strategy from the perspective of the ecological concept. Improved environmental quality, optimized resource allocation, and compact industrial distribution should be realized so as to drive regional green and sustainable development. Finally, this paper analyzes location selection, industrial distribution, spatial organization, and environment construction based on an exploration of the Dream Town of Zhejiang Province, one of the first batch of provincial-level characteristic towns, to demonstrate how to apply the ecological concept to the design of a characteristic town.
Keywords: characteristic town, ecological concept, Hangzhou dream town, planning strategy
Procedia PDF Downloads 310
486 Hidden Markov Model for Financial Limit Order Book and Its Application to Algorithmic Trading Strategy
Authors: Sriram Kashyap Prasad, Ionut Florescu
Abstract:
This study models intraday asset prices as driven by a Markov process. This work identifies the latent states of a Hidden Markov model, using limit order book data (trades and quotes) to continuously estimate the states throughout the day. This work builds a trading strategy that uses the estimated states to generate signals. The strategy utilizes the current state to recalibrate buy/sell levels and the transitions between states to trigger a stop-loss when adverse price movements occur. The proposed trading strategy is tested on the Stevens High Frequency Trading (SHIFT) platform. SHIFT is a highly realistic market simulator with functionalities for creating an artificial market simulation by deploying agents, trading strategies, distributing initial wealth, etc. In the implementation, several assets on the NASDAQ exchange are used for testing. In comparison to a strategy with static buy/sell levels, this study shows that the number of limit orders that get matched and executed can be increased. Executing limit orders earns rebates on NASDAQ. The system can capture jumps in the limit order book prices, provide dynamic buy/sell levels, and trigger stop-loss signals to improve the PnL (profit and loss) performance of the strategy.
Keywords: algorithmic trading, Hidden Markov model, high frequency trading, limit order book learning
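The continuous state estimation described above is typically done with the HMM forward algorithm; the two-state "calm/volatile" setup and the discretized observations below are illustrative assumptions, not the paper's fitted model:

```python
import numpy as np

def forward_filter(obs, pi, A, B):
    """HMM forward algorithm: running posterior over latent states given a
    sequence of discretized observations (e.g. binned order-book features).
    pi: initial state probabilities, A: transition matrix (rows sum to 1),
    B: emission matrix, B[s, o] = P(observation o | state s)."""
    alpha = pi * B[:, obs[0]]
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        alpha /= alpha.sum()          # normalize for numerical stability
    return alpha                      # current state probabilities

# Two latent market states ("calm", "volatile") with distinct emissions:
A = np.array([[0.95, 0.05],
              [0.10, 0.90]])
B = np.array([[0.8, 0.2],
              [0.2, 0.8]])
state_probs = forward_filter([1, 1, 1], np.array([0.5, 0.5]), A, B)
```

A strategy along the lines of the abstract would recompute `state_probs` on each order-book update, shift its buy/sell levels with the dominant state, and treat a sharp posterior swing between states as the stop-loss trigger.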
Procedia PDF Downloads 150
485 The Diffusion of Telehealth: System-Level Conditions for Successful Adoption
Authors: Danika Tynes
Abstract:
Telehealth is a promising advancement in health care, though there are certain conditions under which telehealth has a greater chance of success. This research sought to further the understanding of what conditions compel the success of telehealth adoption at the systems level, applying Diffusion of Innovations (DoI) theory (Rogers, 1962). System-level indicators were selected to represent four components of DoI theory (relative advantage, compatibility, complexity, and observability) and regressed on five types of telehealth (teleradiology, teledermatology, telepathology, telepsychology, and remote monitoring) using multiple logistic regression. The analyses supported relative advantage and compatibility as the strongest influencers of telehealth adoption, remote monitoring in particular. These findings help to quantitatively clarify the factors influencing the adoption of innovation and advance the ability to make recommendations on the viability of state telehealth adoption. In addition, the results indicate when DoI theory is most applicable to the understanding of telehealth diffusion. Ultimately, this research may contribute to a more focused allocation of scarce health care resources through consideration of the existing state conditions available to foster innovation.
Keywords: adoption, diffusion of innovation theory, remote monitoring, system-level indicators
Procedia PDF Downloads 134
484 Trade and Environmental Policy Strategies
Authors: Olakunle Felix Adekunle
Abstract:
In recent years, several non-tariff trade provisions have been regarded as means of holding back transboundary environmental damages. Affected countries have then increasingly come up with trade policies to compensate for or to enforce the adoption of environmental policies elsewhere. These non-tariff trade constraints are claimed to threaten the freedom of trading across nations, as well as the harmonization sought in the distribution of income and policy measures. Therefore, the 'greening' of world trade issues essentially turns on whether there ought or ought not to be a trade-off between trade and environmental policies. The impacts of free trade and environmental policies on major economic variables (such as trade flows, balances of trade, resource allocation, output, consumption, and welfare) are thus studied here, as is the EKC hypothesis, when such variables are played against the resulting emission levels. The policy response is seen as a political game, played here by two representative parties named North and South. Whether their policy choices, simulated by four scenarios, are right or wrong depends on their policy goals, split into economic and environmental ones.
Keywords: environmental, policies, strategies, constraint
Procedia PDF Downloads 331
483 i2kit: A Tool for Immutable Infrastructure Deployments
Authors: Pablo Chico De Guzman, Cesar Sanchez
Abstract:
Microservice architectures are increasingly used in distributed cloud applications due to their advantages in software composition, development speed, release cycle frequency and business logic time to market. On the other hand, these architectures also introduce some challenges in the testing and release phases of applications. Container technology solves some of these issues by providing reproducible environments, ease of software distribution and isolation of processes. However, other issues remain unsolved in current container technology when dealing with multiple machines, such as networking for multi-host communication, service discovery, load balancing or data persistency (even though some of these challenges are already solved by traditional cloud vendors in a very mature and widespread manner). Container cluster management tools, such as Kubernetes, Mesos or Docker Swarm, attempt to solve these problems by introducing a new control layer where the unit of deployment is the container (or the pod, a set of strongly related containers that must be deployed on the same machine). These tools are complex to configure and manage, and they do not follow a pure immutable infrastructure approach, since servers are reused between deployments. Indeed, these tools introduce dependencies at execution time for solving networking or service discovery problems. If an error occurs on the control layer, which would affect running applications, specific expertise is required to perform ad-hoc troubleshooting. As a consequence, it is not surprising that container cluster support is becoming a source of revenue for consulting services. This paper presents i2kit, a deployment tool based on the immutable infrastructure pattern, where the virtual machine is the unit of deployment. The input for i2kit is a declarative definition of a set of microservices, where each microservice is defined as a pod of containers.
Microservices are built into machine images using linuxkit, a tool for creating minimal Linux distributions specialized in running containers. These machine images are then deployed to one or more virtual machines, which are exposed through a cloud vendor load balancer. Finally, the load balancer endpoint is set into other microservices using an environment variable, providing service discovery. The toolkit i2kit reuses the best ideas from container technology to solve problems like reproducible environments, process isolation, and software distribution, and at the same time relies on mature, proven cloud vendor technology for networking, load balancing and persistency. The result is a more robust system with no learning curve for troubleshooting running applications. We have implemented an open source prototype that transforms i2kit definitions into AWS CloudFormation templates, where each microservice AMI (Amazon Machine Image) is created on the fly using linuxkit. Even though container cluster management tools have more flexibility for resource allocation optimization, we argue that adding a new control layer implies more significant disadvantages. Resource allocation is greatly improved by using linuxkit, which introduces a very small footprint (around 35MB). Also, the system is more secure, since linuxkit installs the minimum set of dependencies needed to run containers. The toolkit i2kit is currently under development at the IMDEA Software Institute.
Keywords: container, deployment, immutable infrastructure, microservice
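As a rough sketch of the idea of the prototype's definition-to-template transformation, the following turns a declarative service map into a CloudFormation-style template fragment with one Auto Scaling group of VMs and one load balancer per microservice. The manifest field (`replicas`) and the resource naming are invented for illustration and are not i2kit's actual format:

```python
# Hypothetical sketch of the i2kit transformation: one Auto Scaling
# group (the VMs running the service's AMI) plus one load balancer per
# microservice. Field names and resource shapes are simplified and are
# not i2kit's real output format.
def to_cloudformation(services):
    resources = {}
    for name, spec in services.items():
        replicas = spec.get("replicas", 1)
        resources[f"{name}ASG"] = {
            "Type": "AWS::AutoScaling::AutoScalingGroup",
            "Properties": {"MinSize": replicas, "MaxSize": replicas},
        }
        resources[f"{name}LB"] = {
            "Type": "AWS::ElasticLoadBalancingV2::LoadBalancer",
            "Properties": {"Name": f"{name}-lb"},
        }
    return {"AWSTemplateFormatVersion": "2010-09-09", "Resources": resources}

template = to_cloudformation({"api": {"replicas": 2}, "worker": {}})
```

Note the immutable-infrastructure property: every deployment produces a fresh template, so servers are never reused or mutated between releases.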
Procedia PDF Downloads 177
482 Application of VE in Healthcare Services: An Overview of Healthcare Facility
Authors: Safeer Ahmad, Pratheek Sudhakran, M. Arif Kamal, Tarique Anwar
Abstract:
In healthcare facility design, efficient MEP services are crucial because the built environment affects not only patients and their families but also healthcare staff and their outcomes. This paper covers the basics of value engineering (VE) and the different phases in which it can be applied to the MEP design stage for healthcare facility optimization; VE can also reduce product cost by eliminating the unnecessary costs associated with healthcare services. The paper explores healthcare facility services and their value engineering job plan. For successful application of the VE technique, a workshop with end-users, the design team and associated experts is carried out using concepts, tools, methods and mechanisms developed to select what is actually appropriate and ideal among the many value engineering processes and tools that have long proven their ability to enhance value. This follows the concept of total quality management while achieving the most efficient resource allocation to satisfy the key functions and requirements of the project without sacrificing the targeted level of service for any design metric. A detailed study is discussed, with analysis carried out through this process to achieve a better outcome; various tools are used for analysis of the product at the different phases, and finally the results obtained after implementation of the techniques are discussed.
Keywords: value engineering, healthcare facility, design, services
Procedia PDF Downloads 196
481 Land Suitability Analysis for Maize Production in Egbeda Local Government Area of Oyo State Using GIS Techniques
Authors: Abegunde Linda, Adedeji Oluwatayo, Tope-Ajayi Opeyemi
Abstract:
Maize constitutes a major agrarian product for use by the vast population, but despite its economic importance, it has not been produced to meet the economic needs of the country. Achieving optimum yield in maize can meaningfully be supported by land suitability analysis in order to guarantee self-sufficiency for future production optimization. This study examines land suitability for maize production through the analysis of the physico-chemical variations in soil properties over space using a Geographic Information System (GIS) framework. The physico-chemical parameters selected include slope, land use, and the physical and chemical properties of the soil. Landsat imagery was used to categorize the land use, Shuttle Radar Topography Mission (SRTM) data generated the slope, and soil samples were analyzed for their physical and chemical components. Suitability was categorized into highly, moderately and marginally suitable based on the Food and Agriculture Organization (FAO) classification, using the Analytical Hierarchy Process (AHP) technique within GIS. The result can be used by small-scale farmers for efficient decision making in the allocation of land for maize production.
Keywords: AHP, GIS, MCE, suitability, Zea mays
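The AHP weighting step can be illustrated as follows. The pairwise comparison values and the criterion ordering below are hypothetical, not the study's actual judgments:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria
# (say slope, land use, soil properties); entries use Saaty's 1-9 scale
# and A[j][i] = 1 / A[i][j] by construction.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# AHP criterion weights: the normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
v = np.abs(eigvecs[:, k].real)
weights = v / v.sum()

# Consistency ratio (random index RI = 0.58 for a 3x3 matrix, from
# Saaty's table); CR < 0.1 means the judgments are acceptably consistent.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58
```

The resulting weights would then multiply the reclassified criterion layers in the GIS overlay to produce the highly/moderately/marginally suitable classes.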
Procedia PDF Downloads 395
480 Implementation of Lean Production in Business Enterprises: A Literature-Based Content Analysis of Implementation Procedures
Authors: P. Pötters, A. Marquet, B. Leyendecker
Abstract:
The objective of this paper is to investigate different approaches for the implementation of Lean Production in companies and to provide a structured overview of those approaches. The present work is therefore intended to answer the following research question: What differences and similarities exist between the various systematic approaches and phase models for the implementation of Lean Production? To present the various implementation approaches discussed in the literature, a qualitative content analysis was conducted: within the framework of a qualitative survey, a selection of texts dealing with Lean Production and its introduction was examined. The analysis presents different implementation approaches from the literature, covering the descriptive aspect of the study. The study also provides insights into similarities and differences among the implementation approaches, drawn from the analysis of latent text contents and author interpretations. The comparison takes into account the main object of consideration, the objectives pursued, the starting point, the procedure, and the endpoint of each implementation approach. The study defines the concept of Lean Production and presents various approaches described in the literature that companies can use to implement Lean Production successfully, distinguishing between five systematic implementation approaches and seven phase models to help companies choose the most suitable approach for their implementation project. The findings of this study can contribute to enhancing transparency regarding the existing approaches for implementing Lean Production. This can enable companies to compare and contrast the available implementation approaches and choose the most suitable one for their specific project.
Keywords: implementation, lean production, phase models, systematic approaches
Procedia PDF Downloads 103
479 Comparing Performance of Irrigation System in Nepal by Collective Action and Decision-Making Capacity of the Farmers
Authors: Manita Ale, Ganesh P. Shivakoti, Ram C. Bastakoti
Abstract:
An irrigation system, a system for enhancing agricultural productivity, requires regular maintenance in order to avoid irregular allocation of water. For maintenance of the system in the long run, farmers' participation plays a key role in increasing the performance of the system. The performance of any irrigation system mainly relies on various factors that affect collective action and decision making, as well as their shared impacts. The paper draws on system-level information collected from 12 irrigation systems (IS) in three sampled districts of Nepal and household information collected from 160 irrigation water users. The results reveal that, out of the 12 sampled irrigation systems, only 4 show high performance levels. The high performance of those systems was characterized on the basis of adequate availability of water, good maintenance of system infrastructure, and conformance to existing rules. In addition, the paper compares different irrigation systems based on trust, reciprocity, cropping intensity, command area and yield as tools to indicate the importance of collective action in the performance of irrigation systems.
Keywords: collective action, decision-making, farmers' participation, performance
Procedia PDF Downloads 404
478 Growth Performance of New Born Holstein Calves Supplemented with Garlic (Allium sativum) Powder and Probiotics
Authors: T. W. Kekana, J. J. Baloyi, M. C. Muya, F. V. Nherera
Abstract:
Secondary metabolites (thiosulphinates) from Allium sativum are able to stimulate the production of volatile fatty acids. This study was carried out to investigate the effects of feeding garlic powder, probiotics, or a combination of both on feed intake and growth performance of Holstein calves. Neonatal calves were randomly allocated, according to birth weight, to four dietary treatments, each with 8 calves: control with no additive (C), 5 g/d garlic powder (G), 4 g/d probiotics (P), or 5 g/d garlic powder plus 4 g/d probiotics (GP), with a total viable count of 1.3 x 10^7 cfu/g. Garlic and probiotics were diluted in the daily milk allocation from day 4. Commercial (17.5% CP) starter feed and fresh water were available ad libitum from day 4 until day 42 of age. Calves fed GP (0.27 kg day-1) tended (P=0.055) to have higher DMI than C (0.22 kg day-1). Milk, water, CP and fat intake and FCR were not affected (P>0.05) by the treatments. Metabolisable energy (ME) intake for the GP group tended (P=0.058) to be higher than for C calves. Final BW for the GP group (60.3 kg) tended (P=0.056) to be higher than for C calves (56.0 kg). Garlic, probiotics or their combination did not affect calves' HG, ADG and BL (P>0.05). The results of the current study indicate that the combination of garlic and probiotics may improve nutrient intake and body weight when fed to calves during the first 42 days of life.
Keywords: garlic powder, probiotics, intake, growth, Holstein calves
Procedia PDF Downloads 669
477 Global Indicators of Successful Remote Monitoring Adoption Applying Diffusion of Innovation Theory
Authors: Danika Tynes
Abstract:
Innovations in technology have implications for sustainable development in health and wellness. Remote monitoring is one innovation for which the evidence base has grown to support its viability as a quality healthcare delivery adjunct. This research reviews global data on telehealth adoption, in particular remote monitoring, and the conditions under which its success becomes more likely. System-level indicators were selected to represent four constructs of DoI theory (relative advantage, compatibility, complexity, and observability) and assessed against 5 types of telehealth (teleradiology, teledermatology, telepathology, telepsychology, and remote monitoring) using ordinal logistic regression. Analyses include data from 84 countries, as extracted from the World Health Organization, World Bank, ICT (Information and Communications Technology) Index, and HDI (Human Development Index) datasets. The analyses supported relative advantage and compatibility as the strongest influencers of remote monitoring adoption. Findings from this research may help focus the allocation of resources, as a sustainability concern, through consideration of the systems-level factors that may influence the success of remote monitoring adoption.
Keywords: remote monitoring, diffusion of innovation, telehealth, digital health
Procedia PDF Downloads 132
476 Development of Expanded Perlite-Caprylic Acid Composite for Temperature Maintenance in Buildings
Authors: Akhila Konala, Jagadeeswara Reddy Vennapusa, Sujay Chattopadhyay
Abstract:
Humankind's energy consumption is growing day by day due to increases in population, industrialization and living needs. Fossil fuels, which are non-renewable energy resources, are the major source of energy for satisfying these needs, so there is a need to develop green resources for energy production and storage. Phase change materials (PCMs) derived from plants (green resources) are well known for their capacity to store thermal energy as latent heat during their phase change from solid to liquid. This property of PCMs can be used for storage of thermal energy. In this study, a composite with a fatty acid (caprylic acid; m.p. 15°C, enthalpy 179 kJ/kg) as the phase change material and expanded perlite as a supporting porous matrix was prepared through the direct impregnation method for thermal energy storage applications. The prepared composite was characterized using differential scanning calorimetry (DSC), field emission scanning electron microscopy (FESEM), thermal gravimetric analysis (TGA), and Fourier transform infrared (FTIR) spectrometry. The melting point of the prepared composite was 15.65°C, and the melting enthalpy was 82 kJ/kg. The surface of the perlite was observed through FESEM: there are micro-sized pores in the perlite surface, which were responsible for the absorption of PCM into the perlite. In the TGA thermogram, loss of PCM from the composite started at ~90°C. FTIR curves proved there was no chemical interaction between the perlite and the caprylic acid. The PCM composite prepared in this work could therefore be effective for temperature maintenance of buildings.
Keywords: caprylic acid, composite, phase change materials, PCM, perlite, thermal energy
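As a back-of-envelope illustration of the storage figures reported above (melting enthalpy 82 kJ/kg for the composite versus 179 kJ/kg for pure caprylic acid), the latent heat stored in a full solid-to-liquid transition is simply mass times enthalpy; the 10 kg mass is an arbitrary example:

```python
# Latent heat stored during a complete solid-to-liquid transition:
# Q = m * dH, with Q in kJ, m in kg and dH in kJ/kg.
def latent_heat_kj(mass_kg, enthalpy_kj_per_kg):
    return mass_kg * enthalpy_kj_per_kg

q_composite = latent_heat_kj(10.0, 82.0)    # composite, from the abstract
q_pure_pcm = latent_heat_kj(10.0, 179.0)    # pure caprylic acid
```

Ten kilograms of the composite would thus absorb about 820 kJ around 15.65°C, roughly 46% of what the same mass of pure PCM stores; the reduction is the trade-off for fixing the liquid PCM inside the perlite pores.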
Procedia PDF Downloads 122
475 Airline Choice Model for Domestic Flights: The Role of Airline Flexibility
Authors: Camila Amin-Puello, Lina Vasco-Diaz, Juan Ramirez-Arias, Claudia Munoz, Carlos Gonzalez-Calderon
Abstract:
Operational flexibility is a fundamental aspect in the field of airlines because, although demand is constantly changing, it is the duty of companies to provide a service that satisfies users' needs in an efficient manner without sacrificing factors such as comfort, safety and other perception variables. The objective of this research is to understand the factors that describe and explain operational flexibility by implementing advanced analytical methods such as exploratory factor analysis and structural equation modeling, examining multiple levels of operational flexibility and understanding how this variable influences users' decision-making when choosing an airline and, in turn, how it affects the airlines themselves. The use of a hybrid model and latent variables improves the efficiency and accuracy of airline performance prediction in the unpredictable Colombian market. This pioneering study delves into traveler motivations and their impact on domestic flight demand, offering valuable insights to optimize resources and improve the overall traveler experience. Applying these methods, it was identified that low-cost airlines are not perceived as useful in terms of flexibility, while users, especially women, found airlines with greater flexibility in ticket costs and flight schedules to be more useful. All of this allows airlines to anticipate and adapt to their customers' needs efficiently: to plan flight capacity appropriately, adjust pricing strategies and improve the overall passenger experience.
Keywords: hybrid choice model, airline, business travelers, domestic flights
Procedia PDF Downloads 10
474 Imputing Missing Data in Electronic Health Records: A Comparison of Linear and Non-Linear Imputation Models
Authors: Alireza Vafaei Sadr, Vida Abedi, Jiang Li, Ramin Zand
Abstract:
Missing data is a common challenge in medical research and can lead to biased or incomplete results. When data bias leaks into models, it further exacerbates health disparities: biased algorithms can lead to misclassification and reduced resource allocation and monitoring as part of prevention strategies for certain minorities and vulnerable segments of patient populations, which in turn further reduces the data footprint from the same population, a vicious cycle. This study compares the performance of six imputation techniques, grouped into linear and non-linear models, on two different real-world electronic health record (EHR) datasets representing 17,864 patient records. The mean absolute percentage error (MAPE) and root mean squared error (RMSE) are used as performance metrics, and the results show that the linear models outperformed the non-linear models on both metrics. These results suggest that linear models can sometimes be the optimal choice for imputation of laboratory variables in terms of imputation efficiency and the uncertainty of predicted values.
Keywords: EHR, machine learning, imputation, laboratory variables, algorithmic bias
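A minimal sketch of this kind of comparison, assuming synthetic data with a genuinely linear relationship (the actual study evaluates six techniques on EHR laboratory variables; the data, the masking rate and the two imputers here are illustrative stand-ins):

```python
import numpy as np

# Synthetic "laboratory variable" with a linear dependence on a
# covariate, plus noise; 20% of values are masked out as missing.
rng = np.random.default_rng(1)
n = 400
x = rng.uniform(1.0, 10.0, size=n)
y = 2.0 * x + 3.0 + rng.normal(scale=0.5, size=n)

mask = rng.uniform(size=n) < 0.2
x_obs, y_obs = x[~mask], y[~mask]       # complete rows only

# Linear imputer: least-squares fit on the complete rows.
slope, intercept = np.polyfit(x_obs, y_obs, 1)
y_linear = slope * x[mask] + intercept

# Non-linear imputer: mean of the k nearest neighbours in x.
def knn_impute(xq, k=3):
    nearest = np.argsort(np.abs(x_obs - xq))[:k]
    return y_obs[nearest].mean()

y_knn = np.array([knn_impute(v) for v in x[mask]])

# Score both against the held-out true values with MAPE and RMSE.
def mape(t, p): return float(np.mean(np.abs((t - p) / t)) * 100)
def rmse(t, p): return float(np.sqrt(np.mean((t - p) ** 2)))

truth = y[mask]
linear_rmse, knn_rmse = rmse(truth, y_linear), rmse(truth, y_knn)
```

On a variable whose relationship really is linear, the linear imputer should score lower on both metrics, mirroring the paper's finding for laboratory variables.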
Procedia PDF Downloads 83
473 Monthly River Flow Prediction Using a Nonlinear Prediction Method
Authors: N. H. Adenan, M. S. M. Noorani
Abstract:
River flow prediction is essential to ensure that proper management of water resources can optimally distribute water to consumers. This study presents an analysis and prediction using a nonlinear prediction method applied to monthly river flow data for Tanjung Tualang from 1976 to 2006. The nonlinear prediction method involves phase space reconstruction and a local linear approximation approach. The phase space reconstruction embeds the one-dimensional observed series (287 months of data) in a multidimensional phase space to reveal the dynamics of the system. The reconstructed phase space is then used to predict the next 72 months. Prediction performance, based on the correlation coefficient (CC) and root mean square error (RMSE), was compared for the nonlinear prediction method, ARIMA and SVM. The comparisons show that the prediction results using the nonlinear prediction method are better than those of ARIMA and SVM. Therefore, the results of this study could be used to develop an efficient water management system to optimize the allocation of water resources.
Keywords: river flow, nonlinear prediction method, phase space, local linear approximation
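The two steps above, delay embedding followed by a local approximation over nearest phase-space neighbours, can be sketched as below. The embedding dimension, delay and neighbour count are illustrative assumptions, and a sine series stands in for the river flow record; a zeroth-order neighbour average is used here for brevity, where the paper's local linear approximation would fit a linear map on the neighbours instead:

```python
import numpy as np

def embed(series, m, tau):
    """Delay-embed a 1-D series into m-dimensional phase space:
    row i is (s[i], s[i+tau], ..., s[i+(m-1)*tau])."""
    n = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(m)])

def predict_next(series, m=3, tau=1, k=4):
    """Predict the next value as the mean successor of the k nearest
    phase-space neighbours of the current state (zeroth-order local
    approximation)."""
    X = embed(series, m, tau)
    query, past = X[-1], X[:-1]
    successors = series[(m - 1) * tau + 1 :]   # next value of each past state
    dist = np.linalg.norm(past - query, axis=1)
    nearest = np.argsort(dist)[:k]
    return successors[nearest].mean()

flow = np.sin(np.linspace(0.0, 20.0, 500))     # stand-in periodic series
forecast = predict_next(flow)
```

Iterating `predict_next`, appending each forecast and predicting again, produces a multi-step forecast analogous to the 72-month horizon in the study.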
Procedia PDF Downloads 409