Search results for: common information model
28898 Multi-Path Signal Synchronization Model with Phase Length Constraints
Authors: Tzu-Jung Huang, Hsun-Jung Cho, Chien-Chia Liäm Huang
Abstract:
To improve the level of service (LoS) of urban arterial systems containing a series of signalized intersections, a proper design of offsets for all associated intersections is of great importance. The MAXBAND model has been the most common approach for this purpose. In this paper, we propose a MAXBAND model with phase constraints so that the lengths of the phases in a cycle are variable. In other words, the length of a cycle is also variable in our setting. For numerical evaluation, we conduct experiments on a real-world traffic network in Taiwan that has several major paths. Actual traffic data were collected through on-site experiments. Numerical evidence suggests that the improvements are around 32%, on average, in terms of total delay of the entire network.
Keywords: arterial progression, MAXBAND, signal control, offset
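The offset-design idea in the abstract can be illustrated with a toy brute-force search (not the MAXBAND mixed-integer formulation itself); the cycle, green, and travel times below are invented for illustration:

```python
# Toy illustration of arterial offset design (NOT the MAXBAND MILP):
# pick the downstream offset that lets the largest share of a platoon
# released during intersection 1's green arrive during intersection 2's
# green window.

CYCLE = 90    # common cycle length, seconds (assumed)
GREEN = 40    # green duration at each intersection, seconds (assumed)
TRAVEL = 25   # travel time between the intersections, seconds (assumed)

def progression(offset):
    """Number of one-second departure slots in intersection 1's green
    that arrive at intersection 2 during its green, given its offset."""
    good = 0
    for k in range(GREEN):
        arrival = (k + TRAVEL - offset) % CYCLE
        if arrival < GREEN:
            good += 1
    return good

# brute-force search over integer offsets within one cycle
best_offset = max(range(CYCLE), key=progression)
print(best_offset, progression(best_offset))
```

In this loss-free toy, the best offset simply equals the travel time, so the whole platoon rides the green wave; MAXBAND generalizes this to many intersections, two directions, and (in the paper's extension) variable phase and cycle lengths.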
Procedia PDF Downloads 358
28897 Proximal Method of Solving Split System of Minimization Problem
Authors: Anteneh Getachew Gebrie, Rabian Wangkeeree
Abstract:
The purpose of this paper is to introduce an iterative algorithm for solving a split system of minimization problems, given as the task of finding a common minimizer point of a finite family of proper, lower semicontinuous convex functions whose image under a bounded linear operator is also a common minimizer point of another finite family of proper, lower semicontinuous convex functions. We obtain strong convergence of the sequence generated by our algorithm under some suitable conditions on the parameters. The iterative schemes are developed with a way of selecting the step sizes such that information about the operator norm is not necessary. Some applications and numerical experiments are given to analyse the efficiency of our algorithm.
Keywords: Hilbert space, minimization problems, Moreau-Yosida approximate, split feasibility problem
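A minimal scalar sketch of a split minimization problem, assuming a generic proximal-gradient scheme rather than the algorithm proposed in the paper:

```python
# Scalar toy split minimization problem: find x minimizing f(x) = |x - 2|
# such that A*x also minimizes g(y) = |y - 4|, with the bounded linear
# operator A = 2.  This is a generic proximal-gradient scheme, not the
# strongly convergent algorithm proposed in the paper.

def prox_abs(v, centre, lam):
    """Proximal operator of lam * |. - centre| evaluated at v."""
    d = v - centre
    shrunk = max(abs(d) - lam, 0.0)
    return centre + (shrunk if d > 0 else -shrunk)

A, lam, gamma = 2.0, 0.5, 0.2   # gamma < 2 / A**2, the usual step bound
x = 0.0
for _ in range(50):
    y = A * x
    # gradient step on the split term, then a proximal step on f
    x = prox_abs(x - gamma * A * (y - prox_abs(y, 4.0, lam)), 2.0, lam)

print(x)  # approaches the common solution x* = 2
```

Here the step size uses the known operator A; the paper's contribution is precisely a step-size rule that avoids needing the operator norm.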
Procedia PDF Downloads 144
28896 Participatory Approach for Urban Sustainability through Ostrom’s Principles
Authors: Kuladeep Kumar Sadevi
Abstract:
The rising global urban population has intense implications for the sustainability of urban livelihoods. Rapid urbanization has made governments, companies and civil societies recognize that they are barely equipped to deal with growing urban demands, especially water, waste and energy management. Effective management of land, water, energy and waste at the community level should be addressed well to attain greener cities. In pursuit of green livelihoods, various norms, codes, and green rating programmes have been followed by stakeholders at various levels. While sustainability is being adopted in smaller-scale developments, greening the urban environment at the community or city level is still finding its path to reality. This is due to the citizens' lack of a sense of ownership of their immediate neighborhoods and the city as a whole. This phenomenon can be well connected to the theory of the 'tragedy of the commons' with respect to community engagement in managing common pool resources. Common pool resource management has been well addressed by Elinor Ostrom, who shared the Nobel Prize in Economics in 2009 for her lifetime of scholarly work investigating how communities succeed or fail at managing common pool (finite) resources. This paper examines the applicability of Elinor Ostrom's eight principles for managing a commons to meeting urban sustainability. The key objective of this paper is to come up with a model for effective urban common pool resource management, which ultimately leads to sustainability as a whole. The paper brings out a methodology to understand the various parameters involved in urban sustainability, examine the synergies among these parameters, and apply Ostrom's principles to correlate them in order to attain effective urban resource management.
Keywords: common pool resources, green cities, green communities, participatory management, sustainable development, urban resource management, urban sustainability
Procedia PDF Downloads 356
28895 Information and Cooperativity in Fiction: The Pragmatics of David Baboulene’s “Knowledge Gaps”
Authors: Cara DiGirolamo
Abstract:
In his 2017 Ph.D. thesis, script doctor David Baboulene presented a theory of fiction in which differences in the knowledge states of participants in a literary experience, including reader, author, and characters, create many story elements, among them suspense, expectations, subtext, theme, metaphor, and allegory. This theory can be adjusted and modeled by incorporating a formal pragmatic approach that understands narrative as a speech act with a conversational function. This approach requires both the Speaker and the Listener to be understood as participants in the discourse. It also uses theories of cooperativity and the Question Under Discussion (QUD) to identify the existence of implicit questions. This approach predicts what an effective literary narrative must do: provide a conversational context early in the story so the reader can engage with the text as a conversational participant. In addition, this model incorporates schema theory, a cognitive model for learning and processing information about the world and transforming it into functional knowledge. Using this approach can extend the QUD model: instead of describing conversation as a form of information gathering restricted to question-answer sets, the QUD can include knowledge modeling and understanding as possible outcomes of a conversation. With this model, Baboulene's "Knowledge Gaps" can provide real insight into storytelling as a conversational move, and the QUD can be extended to apply simply and effectively to a more diverse set of conversational interactions as well as to narrative texts.
Keywords: literature, speech acts, QUD, literary theory
Procedia PDF Downloads 3
28894 How to Apply Knowledge Management in a Security Environment within the Scope of the Optimum Balance Model
Authors: Hakan Erol, Altan Elibol, Ömer Eryılmaz, Mehmet Şimşek
Abstract:
Organizations aim to manage information as effectively as possible for sustainment and development, and in doing so they apply various procedures and methods. The very same situation holds for each service of the Armed Forces. During long-lasting endeavors such as shaping and maintaining the security environment and supporting and securing peace, knowledge management is a crucial asset. The Optimum Balance Model aims to promote the system from a decisive point to a higher decisive point. In this context, this paper analyses the application of the Optimum Balance Model to knowledge management in the Armed Forces and tries to answer the question of how the Optimum Balance Model is integrated into knowledge management.
Keywords: optimum balance model, knowledge management, security environment, supporting peace
Procedia PDF Downloads 398
28893 Exploring the Activity Fabric of an Intelligent Environment with Hierarchical Hidden Markov Theory
Authors: Chiung-Hui Chen
Abstract:
The Internet of Things (IoT) was designed for widespread convenience. With smart tags and the sensing network, a large quantity of dynamic information is immediately presented in the IoT. Through internal communication and interaction, meaningful objects provide real-time services for users. Therefore, service with appropriate decision-making has become an essential issue. Based on the science of human behavior, this study employed an environment model to record the time sequences and locations of different behaviors and adopted the probability module of the hierarchical Hidden Markov Model for inference. Statistical analysis was conducted to achieve the following objectives. First, define user behaviors and predict the user behavior routes with the environment model to analyze user purposes. Second, construct the hierarchical Hidden Markov Model according to the logic framework, and establish the sequential intensity among behaviors to get acquainted with the use and activity fabric of the intelligent environment. Third, establish the intensity of the relation between objects and the probability of their being used; this indicator can describe the possible limitations of the mechanism. As the process is recorded in the information system created in this study, these data can be reused to adjust the procedure of intelligent design services.
Keywords: behavior, big data, hierarchical hidden Markov model, intelligent object
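The inference step can be sketched with a flat (non-hierarchical) HMM and the forward algorithm; the states, observations, and probabilities below are invented for illustration, not taken from the study:

```python
# A flat HMM sketch of activity inference: given a sequence of sensed
# locations, the forward algorithm computes the likelihood of the
# observations and the posterior over the current activity.

states = ["cooking", "resting"]
start = {"cooking": 0.5, "resting": 0.5}
trans = {"cooking": {"cooking": 0.7, "resting": 0.3},
         "resting": {"cooking": 0.4, "resting": 0.6}}
emit = {"cooking": {"kitchen": 0.8, "sofa": 0.2},
        "resting": {"kitchen": 0.1, "sofa": 0.9}}

def forward(observations):
    """Forward pass: alpha[s] = P(observations so far, current state = s)."""
    alpha = {s: start[s] * emit[s][observations[0]] for s in states}
    for obs in observations[1:]:
        alpha = {s: emit[s][obs] * sum(alpha[p] * trans[p][s] for p in states)
                 for s in states}
    return alpha

alpha = forward(["kitchen", "kitchen", "sofa"])
likelihood = sum(alpha.values())
posterior = {s: alpha[s] / likelihood for s in states}
print(posterior)  # "resting" dominates after the sofa reading
```

A hierarchical HMM adds super-states (e.g. "morning routine") whose sub-chains emit these activity sequences, but the forward recursion above is the shared core.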
Procedia PDF Downloads 233
28892 Mistuning in Radial Inflow Turbines
Authors: Valentina Futoryanova, Hugh Hunt
Abstract:
One of the common failure modes of diesel engine turbochargers is high-cycle fatigue of the turbine wheel blades. Mistuning of the blades due to the casting process is believed to contribute to this failure mode. A laser vibrometer is used to characterize mistuning for a population of turbine wheels through analysis of the blade response to piezo-speaker-induced noise. The turbine wheel design under investigation is radial and is typically used in 6-12 L diesel engine applications. Amplitudes and resonance frequencies are reviewed and summarized. The study also includes test results for a paddle wheel that represents a perfectly tuned system and acts as a reference. A mass-spring model is developed for the paddle wheel, and the model's suitability is tested against the actual data. Randomization is applied to the stiffness matrix to model the mistuning effect in the turbine wheels. The experimental data are shown to have good agreement with the model.
Keywords: vibration, radial turbines, mistuning, turbine blades, modal analysis, periodic structures, finite element
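The randomized-stiffness idea can be sketched with a two-blade mass-spring model whose eigenvalues are computed analytically; all numbers below are illustrative assumptions, not the wheel studied in the paper:

```python
import math

# Minimal 2-blade mass-spring sketch of mistuning: two identical blades
# of mass m, each with grounding stiffness k, coupled through the disc
# by kc.  A small stiffness perturbation (casting scatter) detunes
# blade 1 and changes the split between the two natural frequencies.

m, k, kc = 1.0, 100.0, 5.0   # assumed mass and stiffnesses

def natural_freqs(eps):
    """Natural frequencies (rad/s) of the 2-DOF system with blade 1
    stiffness perturbed to k*(1+eps); 2x2 symmetric eigenproblem."""
    a, d, b = k * (1 + eps) + kc, k + kc, -kc
    mean, radius = (a + d) / 2, math.hypot((a - d) / 2, b)
    return tuple(math.sqrt(lam / m) for lam in (mean - radius, mean + radius))

tuned = natural_freqs(0.0)      # split set by the coupling alone
mistuned = natural_freqs(0.03)  # a 3% blade detuning widens the split
print(tuned, mistuned)
```

In a full bladed-disc model the stiffness matrix is cyclic and each entry is randomized; the frequency splitting (and mode localization) seen here is the same mechanism at larger scale.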
Procedia PDF Downloads 432
28891 Information Technology and Business Alignments among Different Divisions: A Comparative Analysis of Japan and South Korea
Authors: Michiko Miyamoto
Abstract:
This paper empirically investigates whether information technology (IT) strategies, business strategies, and divisions are aligned to meet overall business goals for Korean small and medium-sized enterprises (SMEs), based on the structure-based Strategic Alignment Model, and makes a comparison with Japanese SMEs. Using 2,869 valid responses from the Korean Human Capital Corporate Panel survey, the results of this study suggest that Korean human resources (HR) departments have a major influence over IT strategy, the same as in Japanese SMEs, even though their management styles are quite different. As for IT strategy, it is not related to other departments at all for Korean SMEs. Korean management seems to possess great power over each division, such as Sales/Service, Research and Development/Technical Experts, HR, and Production.
Keywords: IT-business alignment, structure-based strategic alignment model, structural equation model, human resources department
Procedia PDF Downloads 271
28890 Surface to the Deeper: A Universal Entity Alignment Approach Focusing on Surface Information
Authors: Zheng Baichuan, Li Shenghui, Li Bingqian, Zhang Ning, Chen Kai
Abstract:
Entity alignment (EA) tasks often play a pivotal role in the integration of knowledge graphs, where structural differences commonly exist between the source and target graphs, such as the presence or absence of attribute information and the types of attribute information (text, timestamps, images, etc.). However, most current research efforts are focused on improving alignment accuracy, often with an increased reliance on specific structures, a dependency that inevitably diminishes their practical value and causes difficulties when facing knowledge graph alignment tasks with varying structures. Therefore, we propose a universal knowledge graph alignment approach that utilizes only the common basic structures shared by knowledge graphs. We have demonstrated through experiments that our method achieves state-of-the-art performance in fair comparisons.
Keywords: knowledge graph, entity alignment, transformer, deep learning
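The idea of relying only on structure every knowledge graph shares can be illustrated with a far simpler, non-neural sketch than the paper's transformer approach; the toy graphs and seed alignments below are invented:

```python
# Structure-only entity alignment sketch: score candidate pairs across
# two knowledge graphs by the Jaccard overlap of their already-aligned
# (seed) neighbours - the one "basic structure" both graphs share.
# This is an illustration of the principle, not the paper's method.

g1 = {"Paris": {"France", "Seine"}, "Lyon": {"France", "Rhone"}}
g2 = {"paris_fr": {"france", "seine"}, "lyon_fr": {"france", "rhone"}}
seeds = {"France": "france", "Seine": "seine", "Rhone": "rhone"}

def score(e1, e2):
    """Jaccard similarity of e1's seed-mapped neighbours and e2's neighbours."""
    n1 = {seeds[n] for n in g1[e1] if n in seeds}  # map into g2's namespace
    n2 = g2[e2]
    return len(n1 & n2) / len(n1 | n2) if n1 | n2 else 0.0

alignment = {e1: max(g2, key=lambda e2: score(e1, e2)) for e1 in g1}
print(alignment)  # {'Paris': 'paris_fr', 'Lyon': 'lyon_fr'}
```

Attribute-specific signals (text, timestamps, images) could be layered on top, but this neighbourhood signal exists in every graph, which is the universality the abstract argues for.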
Procedia PDF Downloads 45
28889 Inter Religion Harmony and World Peace: Theory from Shah Wali Ullah's Philosophy
Authors: Muhammad Usman Ghani
Abstract:
Religious tolerance is essential for the establishment of peace in the world. In the system created by Almighty Allah, where much diversity is found, this world still holds unity in itself. In today's world, human beings have been divided by clashes of civilizations or on the basis of religious or lingual differences. A religious scholar of the Indo-Pak subcontinent, Shah Wali Ullah, describes four ethics on the basis of which all religions of the world can unite. He says in his philosophy of religion that a number of elements are common to all religions, but four are very common: cleanliness, noble deeds, relation to the Almighty (the existence of the Almighty), and justice. He says that this universe also holds its integrity in itself. All humans are different in their attributes, but being human is common to them. Similarly, all species of the universe are different in their nature, but being creatures of God is commonly shared by all of them.
Keywords: inter-religious relation, peace and harmony, unity, four common ethics/virtues
Procedia PDF Downloads 343
28888 The Design of a Mixed Matrix Model for Activity Levels Extraction and Sub Processes Classification of a Work Project (Case: Great Tehran Electrical Distribution Company)
Authors: Elham Allahmoradi, Bahman Allahmoradi, Ali Bonyadi Naeini
Abstract:
Complex systems have many aspects. A variety of methods have been developed to analyze these systems. The most efficient of these methods should not only be simple but also provide useful and comprehensive information about many aspects of the system. Matrix methods are considered the methods most commonly used to analyze and design systems. Each matrix method can examine a particular aspect of the system. If these methods are combined, managers can access more comprehensive and broader information about the system. This study was conducted in four steps. In the first step, a process model of a real project was extracted through IDEF3. In the second step, activity levels were attained by writing the process model in the form of a design structure matrix (DSM) and sorting it through the triangulation algorithm (TA). In the third step, sub-processes were obtained by writing the process model in the form of an interface structure matrix (ISM) and clustering it through the cluster identification algorithm (CIA). In the fourth step, a mixed model was developed to provide a unified picture of the project structure through the simultaneous presentation of activities and sub-processes. Finally, the paper is completed with a conclusion.
Keywords: integrated definition for process description capture (IDEF3) method, design structure matrix (DSM), interface structure matrix (ISM), mixed matrix model, activity level, sub-process
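The DSM sorting step can be sketched with a simple level-partitioning pass (a common way of sequencing a DSM; the paper's exact triangulation algorithm may differ) on an invented task network:

```python
# DSM level partitioning: tasks with no unresolved predecessors form
# level 1, are removed, and the process repeats.  The result orders the
# matrix toward lower-triangular form; any leftover cycle is a coupled
# block.  The dependency network below is invented for illustration.

deps = {            # task -> set of tasks it depends on
    "A": set(),
    "B": {"A"},
    "C": {"A"},
    "D": {"B", "C"},
}

def levels(deps):
    remaining = {t: set(p) for t, p in deps.items()}
    result = []
    while remaining:
        ready = sorted(t for t, p in remaining.items() if not p)
        if not ready:                     # dependency cycle: coupled block
            result.append(sorted(remaining))
            break
        result.append(ready)
        for t in ready:
            del remaining[t]
        for p in remaining.values():
            p.difference_update(ready)
    return result

print(levels(deps))  # [['A'], ['B', 'C'], ['D']]
```

In the full method, a coupled (cyclic) block detected here would be handed to the clustering step (the ISM/CIA stage) rather than forced into a sequence.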
Procedia PDF Downloads 494
28887 Information Communication Technology (ICT) Using Management in Nursing College under the Praboromarajchanok Institute
Authors: Suphaphon Udomluck, Pannathorn Chachvarat
Abstract:
Information Communication Technology (ICT) usage management is essential for effective decision making in an organization. The Concerns-Based Adoption Model (CBAM) was employed as the conceptual framework. The purpose of the study was to assess the situation of ICT usage management in the Colleges of Nursing under the Praboromarajchanok Institute. The samples were drawn by multi-stage sampling from 10 participating colleges of nursing; the participants included directors, vice directors, heads of learning groups, teachers, system administrators and those responsible for ICT, 280 in total. The instruments used were questionnaires comprising four parts: general information, ICT usage management, the Stages of Concern (SoC) questionnaire, and the Levels of Use (LoU) of ICT questionnaire. Reliability coefficients were tested; alpha coefficients were 0.967 for ICT usage management, 0.884 for SoC and 0.945 for LoU. The data were analyzed by frequency, percentage, mean, standard deviation, Pearson product-moment correlation and multiple regression. The findings were as follows. The overall score of ICT usage management was at a high level, covering administration, hardware, software, and peopleware. The overall score of the Stages of Concern (SoC) about ICT was at a high level, and the overall score of the Levels of Use (LoU) of ICT was at a moderate level. ICT usage management had a positive relationship with both the Stages of Concern (SoC) about ICT and the Levels of Use (LoU) of ICT (p < .01). The results of the multiple regression revealed that administration, hardware, software and peopleware could predict SoC of ICT (18.5%) and LoU of ICT (20.8%). The factor that significantly influenced SoC was peopleware.
The factors that significantly influenced LoU of ICT were administration, hardware and peopleware.
Keywords: information communication technology (ICT), management, the concerns-based adoption model (CBAM), stages of concern (SoC), levels of use (LoU)
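The correlation analysis reported above can be sketched with a plain Pearson product-moment coefficient; the rating data below are hypothetical, not the study's:

```python
import math

# Pearson product-moment correlation, as used in the study to relate
# ICT-usage-management scores to Stage-of-Concern scores.  The two
# rating lists are invented for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

management = [3.2, 4.1, 3.8, 4.5, 2.9, 3.6]   # hypothetical ratings
concern = [3.0, 4.3, 3.5, 4.4, 3.1, 3.4]
r = pearson_r(management, concern)
print(round(r, 3))  # a positive r, consistent with the reported relationship
```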
Procedia PDF Downloads 318
28886 Multi-Level Air Quality Classification in China Using Information Gain and Support Vector Machine
Authors: Bingchun Liu, Pei-Chann Chang, Natasha Huang, Dun Li
Abstract:
Machine learning and data mining are two important tools for extracting useful information and knowledge from large datasets. In machine learning, classification is a widely used technique to predict qualitative variables and is generally preferred over regression from an operational point of view. Due to the enormous increase in air pollution in various countries, especially China, air quality classification has become one of the most important topics in air quality research and modelling. This study aims at introducing a hybrid classification model based on information theory and the Support Vector Machine (SVM), using the air quality data of four cities in China, namely Beijing, Guangzhou, Shanghai and Tianjin, from Jan 1, 2014 to April 30, 2016. China's Ministry of Environmental Protection has classified daily air quality into six levels, namely Serious Pollution, Severe Pollution, Moderate Pollution, Light Pollution, Good and Excellent, based on their respective Air Quality Index (AQI) values. Using information theory, the information gain (IG) is calculated and feature selection is performed for both categorical features and continuous numeric features. Then the SVM machine learning algorithm is implemented on the selected features with cross-validation. The final evaluation reveals that the IG and SVM hybrid model performs better than the SVM (alone), Artificial Neural Network (ANN) and K-Nearest Neighbours (KNN) models in terms of accuracy as well as complexity.
Keywords: machine learning, air quality classification, air quality index, information gain, support vector machine, cross-validation
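The information-gain step can be sketched on a toy categorical dataset; the records below are invented, and real AQI data would replace them:

```python
import math
from collections import Counter

# Information gain for feature selection, as in the hybrid model above:
# IG(feature) = H(label) - sum_v p(v) * H(label | feature = v).
# A feature with high IG reduces uncertainty about the AQI class.

def entropy(labels):
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(records, feature, target):
    base = entropy([r[target] for r in records])
    n = len(records)
    cond = 0.0
    for value in {r[feature] for r in records}:
        subset = [r[target] for r in records if r[feature] == value]
        cond += len(subset) / n * entropy(subset)
    return base - cond

data = [  # hypothetical daily records
    {"wind": "low",  "season": "winter", "aqi": "Pollution"},
    {"wind": "low",  "season": "summer", "aqi": "Pollution"},
    {"wind": "high", "season": "winter", "aqi": "Good"},
    {"wind": "high", "season": "summer", "aqi": "Good"},
]
print(info_gain(data, "wind", "aqi"))    # 1.0: wind fully determines AQI here
print(info_gain(data, "season", "aqi"))  # 0.0: season carries no information
```

Features ranked this way are then fed to the SVM; continuous features would first be discretized (or handled with a split-point search) before computing IG.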
Procedia PDF Downloads 235
28885 Dual-Network Memory Model for Temporal Sequences
Authors: Motonobu Hattori
Abstract:
In neural networks, when new patterns are learned by a network, they radically interfere with previously stored patterns. This drawback is called catastrophic forgetting. We have already proposed a biologically inspired dual-network memory model which can greatly reduce this forgetting for static patterns. In this model, information is first stored in the hippocampal network, and thereafter it is transferred to the neocortical network using pseudo patterns. Because temporal sequence learning is more important than static pattern learning in the real world, in this study we improve our conventional dual-network memory model so that it can deal with temporal sequences without catastrophic forgetting. The computer simulation results show the effectiveness of the proposed dual-network memory model.
Keywords: catastrophic forgetting, dual-network, temporal sequences, hippocampal
Procedia PDF Downloads 270
28884 Understanding Stock-Out of Pharmaceuticals in Timor-Leste: A Case Study in Identifying Factors Impacting on Pharmaceutical Quantification in Timor-Leste
Authors: Lourenco Camnahas, Eileen Willis, Greg Fisher, Jessie Gunson, Pascale Dettwiller, Charlene Thornton
Abstract:
Stock-out of pharmaceuticals is a common issue at all levels of health services in Timor-Leste, a small post-conflict country. This led to the research questions: what are the current methods used to quantify pharmaceutical supplies, and what factors contribute to the ongoing pharmaceutical stock-outs? The study examined factors that influence the pharmaceutical supply chain system. Methodology: The Privett and Goncalvez dependency model was adopted for the design of the qualitative interviews. The model examines pharmaceutical supply chain management at three management levels: management of individual pharmaceutical items, health facilities, and health systems. The interviews were conducted in order to collect information on inventory management, the logistics management information system (LMIS) and the provision of pharmaceuticals. Andersen's behavioural model of healthcare utilization also informed the interview schedule, specifically factors linked to the environment (healthcare system and external environment) and the population (enabling factors). Forty health professionals (bureaucrats, clinicians) and six senior officers from a United Nations agency, a global multilateral agency and a local non-governmental organization were interviewed on their perceptions of the factors (healthcare system/supply chain and wider environment) impacting on stock-outs. Additionally, policy documents for the entire healthcare system, along with population data, were collected. Findings: An analysis using Pozzebon's critical interpretation identified a range of difficulties within the system, from poor coordination to failure to adhere to policy guidelines, along with major difficulties in inventory management, quantification, forecasting, and budgetary constraints. A weak logistics management information system and lack of capacity in inventory management, monitoring and supervision are additional organizational factors that also contributed to the issue.
Various methods of quantification of pharmaceuticals were applied in the government sector and by non-governmental organizations. Lack of reliable data is one of the major problems in pharmaceutical provision. The Global Fund has the best quantification methods, fed by consumption data and malaria cases. Other issues worsen stock-outs: political intervention, work ethic, and basic infrastructure such as unreliable internet connectivity. Major issues impacting on pharmaceutical quantification have thus been identified. However, the current data collection identified limitations within the Andersen model, specifically a failure to take account of predictors in the healthcare system and the environment (culture, politics, social factors). The next steps are to (a) compare the models used by three non-governmental agencies with the government model; (b) run the Andersen explanatory model for pharmaceutical expenditure for 2 to 5 drug items used by these three development partners in order to see how it correlates with the present model in terms of quantification and forecasting of needs; (c) repeat objectives (a) and (b) using the government model; and (d) draw a conclusion about the strengths of each approach.
Keywords: inventory management, pharmaceutical forecasting and quantification, pharmaceutical stock-out, pharmaceutical supply chain management
Procedia PDF Downloads 244
28883 Media Richness Perspective on Web 2.0 Usage for Knowledge Creation: The Case of the Cocoa Industry in Ghana
Authors: Albert Gyamfi
Abstract:
Cocoa plays a critical role in the socio-economic development of Ghana. Meanwhile, smallholder farmers, most of whom are illiterate, dominate the industry. According to the cocoa-based agricultural knowledge and information system (AKIS) model, knowledge is created and transferred in the industry between three key actors: cocoa researchers, extension experts, and cocoa farmers. Drawing on the SECI model, the media richness theory (MRT), and the AKIS model, a conceptual model of a web 2.0-based AKIS (AKIS 2.0) is developed and used to assess the possible effects of social media usage on knowledge creation in the Ghanaian cocoa industry. A mixed-methods approach with a survey questionnaire was employed, and a second-order multi-group structural equation model (SEM) was used to analyze the data. The study concludes that the use of web 2.0 applications for knowledge creation would lead to sustainable interactions among the key knowledge actors for effective knowledge creation in the cocoa industry in Ghana.
Keywords: agriculture, cocoa, knowledge, media, web 2.0
Procedia PDF Downloads 333
28882 A Dynamic Model for Assessing the Advanced Glycation End Product Formation in Diabetes
Authors: Victor Arokia Doss, Kuberapandian Dharaniyambigai, K. Julia Rose Mary
Abstract:
Advanced Glycation End (AGE) products are the end products of the reaction between the excess reducing sugar present in diabetes and free amino groups in proteins, lipids and nucleic acids. Thus, non-enzymatic glycation of molecules such as hemoglobin, collagen, and other structurally and functionally important proteins adds to pathogenic complications such as diabetic retinopathy, neuropathy, nephropathy, vascular changes, atherosclerosis, Alzheimer's disease, rheumatoid arthritis, and chronic heart failure. The most common non-cross-linking AGE, carboxymethyl lysine (CML), is formed by the oxidative breakdown of fructosyllysine, which is a product of glucose and lysine. CML is formed in a wide variety of tissues and is an index for assessing the extent of glycoxidative damage. We have therefore constructed a mathematical and computational model that predicts the effect of temperature differences in vivo on the formation of CML, which is now being considered an important index of the intracellular milieu. This hybrid model, which has been tested for its parameter fitting and its sensitivity against available experimental data, paves the way for designing novel laboratory experiments that would throw more light on the pathological formation of AGE adducts and on the pathophysiology of diabetic complications.
Keywords: advanced glycation end-products, CML, mathematical model, computational model
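A minimal kinetic sketch of temperature-dependent CML formation, assuming simple first-order Arrhenius kinetics with invented rate parameters (not the parameters fitted in the study):

```python
import math

# Two-step kinetic sketch: glucose + lysine -> fructosyllysine (FL),
# which oxidatively degrades to CML.  Temperature enters through an
# Arrhenius factor on each rate constant.  All constants are assumed
# for illustration only.

R = 8.314                    # gas constant, J/(mol*K)
EA1, EA2 = 6.0e4, 8.0e4      # assumed activation energies, J/mol
A1, A2 = 1.0e7, 1.0e9        # assumed pre-exponential factors, 1/h

def simulate(temp_k, hours, dt=0.01):
    """Forward-Euler integration of normalised concentrations."""
    k1 = A1 * math.exp(-EA1 / (R * temp_k))   # glycation rate
    k2 = A2 * math.exp(-EA2 / (R * temp_k))   # oxidative breakdown rate
    substrate, fl, cml = 1.0, 0.0, 0.0
    for _ in range(int(hours / dt)):
        d_fl = k1 * substrate - k2 * fl
        cml += k2 * fl * dt
        substrate -= k1 * substrate * dt
        fl += d_fl * dt
    return cml

# a small temperature rise accelerates CML accumulation
print(simulate(310.15, 100), simulate(312.15, 100))
```

The study's hybrid model fits such parameters against data and analyses sensitivity; this sketch only shows how a temperature differential propagates through the two-step kinetics.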
Procedia PDF Downloads 129
28881 Modern Imputation Technique for Missing Data in Linear Functional Relationship Model
Authors: Adilah Abdul Ghapor, Yong Zulina Zubairi, Rahmatullah Imon
Abstract:
The missing value problem is common in statistics and has been of interest for years. This article considers two modern techniques for handling missing data in the linear functional relationship model (LFRM), namely the Expectation-Maximization (EM) algorithm and the Expectation-Maximization with Bootstrapping (EMB) algorithm, using three performance indicators: the mean absolute error (MAE), the root mean square error (RMSE) and the estimated bias (EB). In this study, we applied these methods of imputing missing values in the LFRM. Results of the simulation study suggest that the EMB algorithm performs much better than the EM algorithm in both models. We also illustrate the applicability of the approach on a real data set.
Keywords: expectation-maximization, expectation-maximization with bootstrapping, linear functional relationship model, performance indicators
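A simplified EM-style regression imputation (a stand-in for the EM/EMB schemes discussed, on invented data) illustrates the impute-refit cycle and the MAE/RMSE indicators:

```python
import math

# EM-style regression imputation for a linear relationship.
# E-step: impute missing y from the current fitted line.
# M-step: refit the line on observed plus imputed values.
# Data and held-out truth are invented for illustration.

xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [3.1, 4.9, 7.2, None, 11.0, None]   # None marks a missing value
truth = {3: 9.0, 5: 13.1}                # held-out true values, by index

def ols(points):
    """Ordinary least squares fit; returns (intercept, slope)."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxy = sum((x - mx) * (y - my) for x, y in points)
    sxx = sum((x - mx) ** 2 for x, _ in points)
    b = sxy / sxx
    return my - b * mx, b

filled = list(ys)
a, b = ols([(x, y) for x, y in zip(xs, filled) if y is not None])
for _ in range(10):
    for i, y in enumerate(ys):
        if y is None:
            filled[i] = a + b * xs[i]    # E-step: impute
    a, b = ols(list(zip(xs, filled)))    # M-step: refit

errors = [filled[i] - truth[i] for i in truth]
mae = sum(abs(e) for e in errors) / len(errors)
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
print(round(mae, 3), round(rmse, 3))
```

The EMB variant would wrap this cycle in bootstrap resamples to reflect imputation uncertainty; the LFRM additionally treats x as error-prone, which plain OLS here does not.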
Procedia PDF Downloads 399
28880 Inclusion and Changes of a Research Criterion in the Institute for Quality and Accreditation of Computing, Engineering and Technology Accreditation Model
Authors: J. Daniel Sanchez Ruiz
Abstract:
The paper explains why and how a research criterion was included in an accreditation system for undergraduate engineering programs, despite this not being a common practice of accreditation agencies at a global level. This paper is divided into three parts. The first presents the context and the motivations that led the Institute for Quality and Accreditation of Computing, Engineering and Technology Programs (ICACIT) to add a research criterion. The second describes the criterion adopted and the feedback received during the 2017 accreditation cycle. In the third, the author proposes changes to the accreditation criteria that respond in a pertinent way to the results-based accreditation model and the national context. The author seeks to reconcile an outcome-based accreditation model, aligned with that established by the International Engineering Alliance, with the particular context of higher education in Peru.
Keywords: accreditation, engineering education, quality assurance, research
Procedia PDF Downloads 281
28879 Universiti Sains Malaysia
Authors: Eisa A. Alsafran, Francis T. Edum-Fotwe, Wayne E. Lord
Abstract:
The degree to which a public client actively participates in Public Private Partnership (PPP) schemes is seen as a determinant of the success of the arrangement and, in particular, of efficiency in the delivery of the assets of any infrastructure development. Asset delivery is often an early barometer for judging the overall performance of the PPP. Currently, there are no defined descriptors for the degree of such participation. The lack of defined descriptors makes the association between the degree of participation and the efficiency of asset delivery difficult to establish, particularly if an optimum effect is desired. In addition, such an association is important for the strategic decision to embark on any PPP initiative. This paper presents a conceptual model of the different levels of participation that characterise PPP schemes. The modelling was achieved by a systematic review of reported sources that address essential aspects and structures of PPP schemes, published from 2001 to 2015. As a precursor to the modelling, the common areas of Public Client Participation (PCP) were investigated. Equity and risk emerged as the two dominant factors in the common areas of PCP and were therefore adopted to form the foundation of the modelling. The resultant conceptual model defines the different states of combined PCP. The defined states provide a more rational basis for establishing how the degree of PCP affects the efficiency of asset delivery in PPP schemes.
Keywords: asset delivery, infrastructure development, public private partnership, public client participation
Procedia PDF Downloads 265
28878 The Use of Medicinal Plants among Middle Aged People in Rural Area, West Java, Indonesia
Authors: Rian Diana, Naufal Muharam Nurdin, Faisal Anwar, Hadi Riyadi, Ali Khomsan
Abstract:
The use of traditional medicine (herbs and medicinal plants) is common among Indonesian people, especially the elderly. Few studies explore the use of medicinal plants by middle-aged people. This study aims to collect information on the use of medicinal plants by middle-aged people in rural areas. This cross-sectional study included 224 subjects aged 45-59 years old and was conducted in Cianjur District, West Java, in 2014. Semi-structured questionnaires were used to collect information about preferences in the treatment of illness, the use of medicinal plants, and their purposes. The information recorded also covered plant names, parts used, mode of preparation, and dosage. Buying drugs at a stall (83.9%) was the first preference in the treatment of illness, followed by modern treatment at 19.2% (doctors) and traditional treatment at 17.0% (herbs/medicinal plants). Eighty-seven subjects (38.8%) were using herbs and medicinal plants for curative (66.7%), preventive (31.2%), and rehabilitative (2.1%) purposes. In this study, 48 species were used by the subjects. Physalis minima L. 'cecenet', Orthosiphon aristatus Mic. 'kumis kucing', and Annona muricata 'sirsak' are commonly used for the treatment of hypertension and stiffness. Leaves (64.6%) are the most common part used. Medicinal plants were washed and boiled in hot water. Subjects drank the herbs at different dosages, varying between 1-3 glasses/day for treatment and 1-2 glasses/month for the prevention of diseases. One in three middle-aged people used herbs and medicinal plants for curative and preventive treatment, particularly of hypertension and stiffness. Increasing knowledge about herbal and medicinal plant dosages and their interactions with medical drugs is important.
Keywords: herbs, hypertension, medicinal plants, middle age, rural
Procedia PDF Downloads 243
28877 Architectural Framework to Preserve Information of Cardiac Valve Control
Authors: Lucia Carrion Gordon, Jaime Santiago Sanchez Reinoso
Abstract:
Taking the relation between digital preservation and the health field as a case study, an architectural model helps to explain these definitions. The principal goal of data preservation is to keep information for the long term. Regarding medical information, in order to perform a heart transplant, physicians need to preserve the organ in an adequate way. This approach between the two perspectives, the medical and the technological, allows checking the similarities between the concepts of preservation. Digital preservation and medical advances are related at the same level of knowledge improvement.
Keywords: medical management, digital, data, heritage, preservation
Procedia PDF Downloads 420
28876 Development of a Technology Assessment Model by Patents and Customers' Review Data
Authors: Kisik Song, Sungjoo Lee
Abstract:
Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for investment decisions, the existing methodology still has some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent's citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development, based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect the customers' needs, and we determine how many of these keywords are covered by the new technology. Finally, we construct a portfolio (based on a technology assessment from patent information) and a customer-based marketability assessment (based on review data), and we use them to visualize the characteristics of the new technologies.
Keywords: technology assessment, patents, citation information, opinion mining
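The keyword-coverage step the abstract describes can be sketched minimally: rank frequent terms in customer reviews, then measure what fraction of them a technology description mentions. The review texts, stopword list, and scoring rule below are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of the review-keyword coverage step. All data and the
# scoring rule are illustrative assumptions.
from collections import Counter

STOPWORDS = {"the", "a", "is", "and", "of", "to", "with", "very", "but"}

def review_keywords(reviews, top_n=3):
    """Rank non-stopword terms in the reviews by frequency."""
    words = Counter()
    for review in reviews:
        for token in review.lower().split():
            if token not in STOPWORDS:
                words[token] += 1
    return [w for w, _ in words.most_common(top_n)]

def coverage(tech_text, keywords):
    """Fraction of customer-need keywords mentioned in the technology text."""
    text = tech_text.lower()
    return sum(1 for kw in keywords if kw in text) / len(keywords)

reviews = ["battery life is very short", "love the camera but battery drains",
           "camera quality is great", "short battery life"]
kws = review_keywords(reviews)
score = coverage("improved battery cell with fast charging", kws)
```

Here "battery" dominates the reviews, so a battery-related technology covers at least one of the extracted need keywords.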
Procedia PDF Downloads 466
28875 Effect of Ecologic Fertilizers on Productivity and Yield Quality of Common and Spelt Wheat
Authors: Danutė Jablonskytė-Raščė, Audronė Mankevičienė, Laura Masilionytė
Abstract:
During the period 2009-2015, at the Joniškėlis Experimental Station of the Lithuanian Research Centre for Agriculture and Forestry, the effect of the ecological fertilizer Ekoplant, the bio-activators Biokal 01 and Terra Sorb Foliar, and their combinations on the formation of productivity elements, grain yield, and quality of winter wheat, spelt (Triticum spelta L.), and common wheat (Triticum aestivum L.) was analysed in an ecological agro-system. The soil under the FAO classification was Endocalcari-Endohypogleyic Cambisol. In a clay loam soil, ecological fertilizer produced from sunflower hull ash, and this fertilizer in combination with plant extracts and bio-humus, exerted an influence on the grain yield of spelt, common wheat, and their mixture (the grain yield increased by 10.0% compared with the unfertilized crops). Spelt grain yield was on average 16.9% lower than that of common wheat and 11.7% lower than that of the mixture, but the role of spelt in organic production systems is important because, with no mineral fertilization, it produced grains with a higher (by 4%) gluten content and exhibited a greater ability to suppress weeds (on average 61.9% lower weed weight) compared with common wheat and the mixture. Spelt cultivation in a mixture with common wheat significantly improved the quality indicators of the mixture (its grain contained a 2.0% higher protein content and a 4.0% higher gluten content than common wheat grain), reduced disease incidence (by 2-8%), and lowered the weed infestation level (by 34-81%).
Keywords: common and spelt wheat, ecological fertilizers, bio-activators, productivity elements, yield, quality
Procedia PDF Downloads 300
28874 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods
Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer
Abstract:
Model assessment, in the Bayesian context, involves evaluating the goodness-of-fit and comparing several alternative candidate models for predictive accuracy and improvement. In posterior predictive checks, data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction that penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive of the CV variants, as it fits as many models as there are observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV; they utilise the existing MCMC results and thereby avoid the expensive computation. The reciprocals of the predictive densities, calculated over the posterior draws for each observation, are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior predictive densities. In IS-LOO, the raw weights are used directly; in contrast, the larger weights are replaced by modified truncated weights in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect goodness-of-fit in an absolute sense, the differences can be used to measure the relative performance of the models of interest.
However, the use of these measures is only valid under specific circumstances. This study developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These models comprise four with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modelled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered approximations of the exact LOO-CV, the study observed some drastic deviations in their results. However, there are some interesting relationships among the logarithms of the pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles were observed for the models, conditional on equal posterior variances in lppds. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, together with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
Keywords: cross-validation, importance sampling, information criteria, predictive accuracy
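The quantities the abstract works with (lppd, the WAIC penalty, and importance-sampling LOO with truncated weights) can all be sketched from a matrix of pointwise log-likelihoods over posterior draws. The synthetic log-likelihood matrix and the specific truncation rule (capping each weight at sqrt(S) times its mean) are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pointwise log-likelihood matrix from an MCMC run:
# S posterior draws x N observations (synthetic, for illustration only).
S, N = 2000, 50
log_lik = rng.normal(loc=-1.0, scale=0.3, size=(S, N))

def logmeanexp(a, axis=0):
    """Numerically stable log of the mean of exp(a) along an axis."""
    m = a.max(axis=axis)
    return m + np.log(np.mean(np.exp(a - m), axis=axis))

# lppd: log pointwise predictive density (one term per observation).
lppd_i = logmeanexp(log_lik, axis=0)

# WAIC: lppd minus a complexity penalty, here the posterior variance
# of the pointwise log-likelihood.
p_waic = log_lik.var(axis=0, ddof=1).sum()
waic = -2 * (lppd_i.sum() - p_waic)

# IS-LOO: the raw importance weights are the reciprocals of the predictive
# densities, so the pointwise elpd reduces to a harmonic-mean identity.
log_w = -log_lik
elpd_is = -logmeanexp(log_w, axis=0)

# TIS-LOO: truncate each weight at sqrt(S) times the mean weight before
# forming the weighted average of predictive densities.
cap = logmeanexp(log_w, axis=0) + 0.5 * np.log(S)
log_w_t = np.minimum(log_w, cap)
elpd_tis = logmeanexp(log_w_t + log_lik, axis=0) - logmeanexp(log_w_t, axis=0)
```

Because the raw weights down-weight exactly the draws with high likelihood, both LOO estimates are never above the in-sample lppd, which is the bias correction the abstract refers to.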
Procedia PDF Downloads 392
28873 Bivariate Time-to-Event Analysis with Copula-Based Cox Regression
Authors: Duhania O. Mahara, Santi W. Purnami, Aulia N. Fitria, Merissa N. Z. Wirontono, Revina Musfiroh, Shofi Andari, Sagiran Sagiran, Estiana Khoirunnisa, Wahyudi Widada
Abstract:
For assessing interventions in numerous disease areas, the use of multiple time-to-event outcomes is common. An individual may experience two different events, giving bivariate time-to-event data; the events may be correlated because they come from the same subject and are also influenced by individual characteristics. The bivariate time-to-event case can be handled with a copula-based bivariate Cox survival model, using the Clayton and Frank copulas to analyse the dependence structure of the two events as well as the effects of covariates. Applying this method to model recurrent infection events at the hemodialysis insertion site in chronic kidney disease (CKD) patients, the AIC and BIC values indicate that the Clayton copula model is the best model, with Kendall's tau τ = 0.02.
Keywords: bivariate Cox, bivariate event, copula function, survival copula
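As a sketch of the dependence structure involved, the Clayton copula couples two marginal survival functions and links its parameter θ to Kendall's τ through τ = θ/(θ+2). The exponential margins below merely stand in for fitted marginal Cox survival curves and are an illustrative assumption.

```python
import math

def clayton_copula(u, v, theta):
    """Clayton copula C(u, v) = (u^-θ + v^-θ - 1)^(-1/θ), θ > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def clayton_theta_from_tau(tau):
    """Invert Kendall's τ = θ / (θ + 2) for the Clayton family."""
    return 2.0 * tau / (1.0 - tau)

def joint_survival(t1, t2, lam1, lam2, theta):
    """Joint survival S(t1, t2) = C(S1(t1), S2(t2)), with exponential
    margins standing in for fitted Cox survival curves (an assumption)."""
    s1 = math.exp(-lam1 * t1)
    s2 = math.exp(-lam2 * t2)
    return clayton_copula(s1, s2, theta)

# τ = 0.02, as reported for the hemodialysis data, implies θ close to zero,
# i.e. the two infection events are only weakly dependent.
theta = clayton_theta_from_tau(0.02)
```

A θ this small means the Clayton joint survival is close to the independence product S1(t1)·S2(t2), consistent with the weak dependence the abstract reports.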
Procedia PDF Downloads 82
28872 Quantifying Uncertainties in an Archetype-Based Building Stock Energy Model by Use of Individual Building Models
Authors: Morten Brøgger, Kim Wittchen
Abstract:
Focus on reducing energy consumption in existing buildings at large scale, e.g. in cities or countries, has been increasing in recent years. In order to reduce energy consumption in existing buildings, political incentive schemes are put in place and large-scale investments are made by utility companies. Prioritising these investments requires a comprehensive overview of the energy consumption in the existing building stock, as well as of the potential energy savings. However, a building stock comprises thousands of buildings with different characteristics, making it difficult to model energy consumption accurately. Moreover, the complexity of the building stock makes it difficult to convey model results to policymakers and other stakeholders. In order to manage this complexity, building archetypes are often employed in building stock energy models (BSEMs). Building archetypes are formed by segmenting the building stock according to specific characteristics. Segmenting according to building type and building age is common, among other reasons because this information is often readily available, and this segmentation makes it easy to convey results to non-experts. However, using a single archetypical building to represent all buildings in a segment of the building stock is associated with a loss of detail: thermal characteristics are aggregated, while other characteristics that could affect the energy efficiency of a building are disregarded. Thus, using a simplified representation of the building stock could come at the expense of model accuracy. The present study evaluates the accuracy of a conventional archetype-based BSEM that segments the building stock according to building type and age. The accuracy is evaluated in terms of the archetypes' ability to emulate the average energy demands of the buildings they are meant to represent.
This is done for the buildings' energy demands as a whole, as well as for relevant sub-demands, both evaluated in relation to the type and age of the building. This should provide researchers who use archetypes in BSEMs with an indication of the expected accuracy of the conventional archetype model, as well as of the accuracy lost in specific parts of the calculation due to use of the archetype method.
Keywords: building stock energy modelling, energy-savings, archetype
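The archetype evaluation can be illustrated with a minimal sketch: group synthetic buildings into (type, age) segments, let each archetype's mean demand stand in for its members, and measure the emulation error. All building data, segment labels, and demand values below are illustrative assumptions.

```python
from collections import defaultdict
import random

random.seed(42)

# Hypothetical building stock: a (type, construction-period) segment plus an
# individual annual heat demand in kWh/m2 (synthetic values).
TYPES = ["single-family", "apartment"]
PERIODS = ["pre-1960", "1960-1990", "post-1990"]
BASE = {"single-family": 140.0, "apartment": 110.0}
AGE_FACTOR = {"pre-1960": 1.3, "1960-1990": 1.0, "post-1990": 0.7}

stock = []
for _ in range(500):
    t, p = random.choice(TYPES), random.choice(PERIODS)
    demand = BASE[t] * AGE_FACTOR[p] * random.uniform(0.8, 1.2)
    stock.append(((t, p), demand))

# Build archetypes: one mean demand per (type, period) segment.
segments = defaultdict(list)
for seg, demand in stock:
    segments[seg].append(demand)
archetypes = {seg: sum(d) / len(d) for seg, d in segments.items()}

# Accuracy of the archetype model: mean absolute error from replacing each
# building's demand with its archetype's mean demand.
mae = sum(abs(archetypes[seg] - d) for seg, d in stock) / len(stock)
total_true = sum(d for _, d in stock)
total_model = sum(archetypes[seg] for seg, _ in stock)
```

Note that the archetype model reproduces the stock total exactly by construction; the loss of detail the abstract discusses shows up only in the per-building error.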
Procedia PDF Downloads 154
28871 Derivation of Bathymetry from High-Resolution Satellite Images: Comparison of Empirical Methods through Geographical Error Analysis
Authors: Anusha P. Wijesundara, Dulap I. Rathnayake, Nihal D. Perera
Abstract:
Bathymetric information is of fundamental importance to coastal and marine planning and management, nautical navigation, and scientific studies of marine environments. Satellite-derived bathymetry data provide detailed information in areas where conventional sounding data are lacking and conventional surveys are inaccessible. Two empirical approaches, a log-linear bathymetric inversion model and a non-linear bathymetric inversion model, are applied to derive bathymetry from high-resolution multispectral satellite imagery. This study compares the two approaches by means of geographical error analysis for the Kankesanturai site using WorldView-2 satellite imagery. The parameters of the non-linear inversion model were calibrated with the Levenberg-Marquardt method, and multiple linear regression was applied to calibrate the log-linear inversion model. To calibrate both models, Single Beam Echo Sounding (SBES) data in the study area were used as reference points. Residuals were calculated as the difference between the derived depth values and the validation echo-sounder bathymetry data, and the geographical distribution of the model residuals was mapped. The spatial autocorrelation was calculated, the performance of the bathymetric models was compared, and the results show the geographic errors for both models. A spatial error model was constructed from the initial bathymetry estimates and the estimates of autocorrelation. This spatial error model is used to generate more reliable estimates of bathymetry by quantifying the autocorrelation of the model error and incorporating it into an improved regression model. The log-linear model (R²=0.846) performs better than the non-linear model (R²=0.692). Finally, the spatial error models improved the bathymetric estimates derived from the linear and non-linear models up to R²=0.854 and R²=0.704, respectively. The Root Mean Square Error (RMSE) was calculated for all reference points in various depth ranges.
The magnitude of the prediction error increases with depth for both the log-linear and the non-linear inversion models. The overall RMSE values for the log-linear and non-linear inversion models were ±1.532 m and ±2.089 m, respectively.
Keywords: log-linear model, multispectral, residuals, spatial error model
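A log-linear inversion of the band-ratio type can be sketched as follows: regress reference depths on the log-ratio of two water-leaving reflectances, then compute the residual statistics. The synthetic reflectances, the decay constants, and the scaling constant n = 1000 are illustrative assumptions standing in for the WorldView-2 bands and SBES calibration points.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical calibration data: blue/green water-leaving reflectances that
# attenuate with depth, plus single-beam echo-sounder (SBES) reference depths.
n_pts = 200
true_depth = rng.uniform(1.0, 20.0, n_pts)
r_blue = 0.02 * np.exp(-0.08 * true_depth) + 0.005
r_green = 0.02 * np.exp(-0.12 * true_depth) + 0.005

# Log-ratio predictor: ln(n * R_blue) / ln(n * R_green), with n = 1000
# (an assumed scaling constant keeping both logarithms positive).
n_const = 1000.0
x = np.log(n_const * r_blue) / np.log(n_const * r_green)

# Calibrate the log-linear model z = m1 * x + m0 by least squares
# against the SBES depths.
A = np.column_stack([x, np.ones_like(x)])
(m1, m0), *_ = np.linalg.lstsq(A, true_depth, rcond=None)
z_hat = m1 * x + m0

# Residuals and error statistics, as in the geographical error analysis.
residuals = z_hat - true_depth
rmse = np.sqrt(np.mean(residuals ** 2))
r2 = 1.0 - residuals.var() / true_depth.var()
```

Mapping `residuals` against position would give the geographical error distribution the study feeds into its spatial error model.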
Procedia PDF Downloads 297
28870 Who Am I at Work: Work Identity Formation
Authors: Carol Belle-Hallsworth
Abstract:
Human interaction at work evolves over time and, with it, work identity. A social identity is built upon the development of its underpinning and preceding stages, and work identity can be viewed in the same way: it will shift with changes in the work environment and with challenges to the work identity (threats to the four stages). This paper provides an analysis of how the stages of trust, autonomy, initiative, and industry are related to employee identity at work, describing how they relate to each other and to the development of identity. It has become common to notice changes in employee behaviour during and after major operational changes in an organization, and previous studies suggest that there are emotional triggers that result in the new behaviours displayed. This study seeks to test a theoretical model by testing the relationships between the first four Erikson stages as constructs. A randomized sample of participants undertook a self-administered survey to capture information on trust, autonomy, initiative, and industry.
Keywords: work identity, change management, organizational management, technology implementation
Procedia PDF Downloads 306
28869 Development of a 3D Model of Real Estate Properties in Fort Bonifacio, Taguig City, Philippines Using Geographic Information Systems
Authors: Lyka Selene Magnayi, Marcos Vinas, Roseanne Ramos
Abstract:
As the real estate industry continually grows in the Philippines, Geographic Information Systems (GIS) provide advantages in generating spatial databases for the efficient delivery of information and services. The real estate sector not only provides qualitative data about real estate properties but also utilizes various spatial aspects of these properties for different applications, such as hazard mapping and assessment. In this study, a three-dimensional (3D) model and a spatial database of real estate properties in Fort Bonifacio, Taguig City are developed using GIS and SketchUp. The spatial datasets include political boundaries, buildings, the road network, a digital terrain model (DTM) derived from an Interferometric Synthetic Aperture Radar (IFSAR) image, Google Earth satellite imagery, and hazard maps. Multiple model layers were created based on property listings by a partner real estate company, including existing and future property buildings. Actual building dimensions, building facades, and building floor plans are incorporated in these 3D models for geovisualization. Hazard model layers are determined through spatial overlays, and different hazard scenarios are also presented in the models. Animated maps and walkthrough videos were created for company presentation and evaluation. Model evaluation was conducted through client surveys requiring scores for the appropriateness, information content, and design of the 3D models. Survey results show very satisfactory ratings, with the highest average evaluation score equivalent to 9.21 out of 10. The output maps and videos obtained passing rates based on the criteria and standards set by the intended users of the partner real estate company. The methodologies presented in this study were found useful and have remarkable advantages in the real estate industry.
This work may be extended to automated mapping and the creation of online spatial databases for better storage of and access to real property listings, as well as an interactive platform using web-based GIS.
Keywords: geovisualization, geographic information systems, GIS, real estate, spatial database, three-dimensional model
Procedia PDF Downloads 158