Search results for: distance based analysis
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 46394

44774 Effects of Boiling Temperature and Time on Colour, Texture and Sensory Properties of Volutharpa ampullacea perryi Meat

Authors: Xianbao Sun, Jinlong Zhao, Shudong He, Jing Li

Abstract:

Volutharpa ampullacea perryi is a high-protein marine shellfish. However, few data are available on the effects of boiling temperature and time on the quality of its meat. In this study, the colour, texture and sensory characteristics of Volutharpa ampullacea perryi meat during boiling (75-100 °C, 5-60 min) were investigated by colour analysis, texture profile analysis (TPA), scanning electron microscopy (SEM) and sensory evaluation. Cooking loss gradually increased with increasing temperature and time. The meat became lighter and yellower from 85 °C to 95 °C at short times (5-20 min), but turned brown after a 30 min treatment. TPA results showed that the meat was firmer and less cohesive after higher-temperature (95-100 °C) treatment, even for short periods (5-15 min). SEM analysis showed that the myofibril structure was destroyed at higher temperatures (85-100 °C). Sensory data revealed that meat cooked at 85-90 °C for 10-20 min scored higher in overall acceptance, as well as in colour, hardness and taste. Based on these results, it can be concluded that Volutharpa ampullacea perryi meat should be boiled under suitable conditions (such as 85 °C for 15 min or 90 °C for 10 min) to ensure better acceptability.

Keywords: Volutharpa ampullacea perryi meat, boiling cooking, colour, sensory, texture

Procedia PDF Downloads 275
44773 Design and Evaluation of an Online Case-Based Library for Technology Integration in Teacher Education

Authors: Mustafa Tevfik Hebebci, Ismail Sahin, Sirin Kucuk, Ismail Celik, Ahmet Oguz Akturk

Abstract:

ADDIE is an instructional design model with five core elements: analyze, design, develop, implement, and evaluate. The ADDIE approach provides a systematic process for the analysis of instructional needs, the design and development of instructional programs and materials, the implementation of a program, and the evaluation of the effectiveness of an instruction. The case-based study is an instructional design model that is a variant of project-oriented learning. Instructional designers can use collected and analyzed stories in two primary ways: to perform task analysis and as learning support during instruction. In addition, teachers use technology to develop students’ thinking, enrich the learning environment, and support lasting learning. The purpose of this paper is to introduce an interactive online case-study library website developed in a national project. The design goal of the website is to provide an interactive, enhanced, case-based and online educational resource for educators within the scope of the national project. The ADDIE instructional design model was used in the development of the website for the interactive case-based library. This web-based library contains the following navigation menus: “Homepage”, "Registration", "Branches", "Aim of The Research", "About TPACK", "National Project", "Contact Us", etc. The library is developed on a web-based platform, which is important in terms of manageability, accessibility, and updateability of data. Users are able to sort the displayed case studies by title, date, rating, view count, etc. In addition, they are encouraged to rate and comment on the case studies. A usability test and expert opinion were used for the evaluation of the website. This website is a tool for integrating technology in education. It is believed that it will be beneficial for pre-service and in-service teachers in terms of their professional development.

Keywords: design, ADDIE, case based library, technology integration

Procedia PDF Downloads 474
44772 Modeling of the Heat and Mass Transfer in Fluids through Thermal Pollution in Pipelines

Authors: V. Radulescu, S. Dumitru

Abstract:

Introduction: Determination of the temperature field inside a fluid in motion has many practical applications, especially in the case of turbulent flow. The phenomenon is more pronounced when the solid walls have a different temperature than the fluid. Turbulent heat and mass transfer play an essential role in thermal pollution, as was recorded during the damage at the Thermoelectric Power Plant Oradea (still closed today). Basic Methods: Solving the theoretical turbulent thermal pollution problem is particularly difficult. By using semi-empirical theories or simplifying assumptions based on experimental measurements, a mathematical model can be elaborated for further numerical simulations. The three zones of the flow are analyzed separately: the vicinity of the solid wall, the turbulent transition zone, and the turbulent core. For each zone, the temperature distribution law is determined. The dependence between the Stanton and Prandtl numbers is determined with correction factors based on experimental measurements. Major Findings/Results: The limit of the laminar thermal sublayer was determined based on the theory of Landau and Levich, using the assumption that the longitudinal component of the velocity pulsation and the pulsation frequency vary proportionally with the distance to the wall. For the calculation of the average temperature, a solution similar to that used for the velocity is applied, by analogous averaging. On these assumptions, numerical modeling was performed with a temperature gradient for turbulent flow in pipes (intact or damaged, with cracks) of 4 different diameters, between 200-500 mm, as found at the Thermoelectric Power Plant Oradea. Conclusions: A superposition was made between the molecular viscosity and the turbulent one, followed by the addition of the molecular and turbulent transfer coefficients, necessary to elaborate the theoretical and numerical models. The laminar boundary layer has a different thickness when flow with heat transfer is compared with flow without a temperature gradient. The obtained results are within a margin of error of 5% between the classical semi-empirical theories and the developed model, based on the experimental data. Finally, a general correlation between the Stanton number and the Prandtl number is obtained for a specific flow (with its associated Reynolds number).
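
For orientation, a widely used correlation of the Stanton-Prandtl family is the Chilton-Colburn analogy, quoted below as a textbook reference point rather than the specific correlation derived in this work:

```latex
% Chilton-Colburn analogy (standard form, roughly valid for turbulent flow with 0.6 < Pr < 60):
St \cdot Pr^{2/3} = \frac{C_f}{2},
\qquad
St = \frac{Nu}{Re \, Pr} = \frac{h}{\rho \, c_p \, U}
```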

Keywords: experimental measurements, numerical correlations, thermal pollution through pipelines, turbulent thermal flow

Procedia PDF Downloads 162
44771 Binary Decision Diagram Based Methods to Evaluate the Reliability of Systems Considering Failure Dependencies

Authors: Siqi Qiu, Yijian Zheng, Xin Guo Ming

Abstract:

In many reliability and risk analyses, component failures are assumed to be independent. However, in reality, ignoring failure dependencies among components may render the results of reliability and risk analysis incorrect. There are two principal ways to incorporate failure dependencies in system reliability and risk analysis: implicit and explicit methods. In the implicit method, failure dependencies can be modeled by joint probabilities, correlation values or conditional probabilities. In the explicit method, certain types of dependencies can be modeled in a fault tree as mutually independent basic events for specific component failures. In this paper, explicit and implicit methods based on binary decision diagrams (BDD) are proposed to evaluate the reliability of systems considering failure dependencies. The obtained results prove the equivalence of the proposed implicit and explicit methods. It is found that the consideration of failure dependencies decreases the reliability of systems. This observation is intuitive, because more components fail due to failure dependencies. Accounting for failure dependencies helps designers reduce the dependencies between components during the design phase to make the system more reliable.
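
As a minimal numerical illustration of the implicit treatment of a failure dependency (the component names and probability values below are assumptions for illustration, not the paper's case study):

```python
# Implicit treatment of a failure dependency for a 1-out-of-2 redundant system.
# The system fails only if BOTH components fail.  All numbers are illustrative
# assumptions, not values from the paper's case study.

p_fail_A = 0.05            # marginal failure probability of component A
p_fail_B = 0.05            # marginal failure probability of component B
p_fail_B_given_A = 0.40    # P(B fails | A fails): common-cause dependency

# Independent components: P(system fails) = P(A fails) * P(B fails)
unreliability_indep = p_fail_A * p_fail_B

# Dependent components: P(A and B fail) = P(A fails) * P(B fails | A fails)
unreliability_dep = p_fail_A * p_fail_B_given_A

print(f"reliability, independent failures: {1 - unreliability_indep:.4f}")  # 0.9975
print(f"reliability, with dependency     : {1 - unreliability_dep:.4f}")    # 0.9800
```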

Keywords: reliability assessment, risk assessment, failure dependencies, binary decision diagram

Procedia PDF Downloads 469
44770 Review, Analysis and Simulation of Advanced Technology Solutions of Selected Components in Power Electronics Systems (PES) of More Electric Aircraft

Authors: Lucjan Setlak, Emil Ruda

Abstract:

The subject of this paper is the review, comparative analysis and simulation of selected components of power electronics systems (PES), consistent with the concept of the more electric aircraft (MEA). The comparative analysis and simulation in the MATLAB/Simulink software environment were carried out for a group of representative civil (B-787, A-380) and military (F-22 Raptor, F-35) aircraft, in the context of the multi-pulse converters used in them (6- and 12-pulse, and 18- and 24-pulse), which are key components of the high-tech on-board autonomous electrical power systems (ASE) of modern aircraft (the airplanes of the future).

Keywords: converters, electric machines, MEA (more electric aircraft), PES (power electronics systems)

Procedia PDF Downloads 489
44769 Development of Automated Quality Management System for the Management of Heat Networks

Authors: Nigina Toktasynova, Sholpan Sagyndykova, Zhanat Kenzhebayeva, Maksat Kalimoldayev, Mariya Ishimova, Irbulat Utepbergenov

Abstract:

Any business needs stable operation and continuous improvement; therefore, it is necessary to constantly interact with the environment, to analyze the work of the enterprise from the perspective of employees, executives and consumers, and to correct any inconsistencies in certain types of processes and their aggregate. In the case of heat supply organizations, in addition to suppliers, local legislation must be considered, as it is often the main regulator of service pricing. In this context, the process approach used to build a functional organizational structure in these types of businesses in Kazakhstan is a challenge not only in implementation but also in the way employee remuneration is analyzed. To solve these problems, we investigated the management system of a heating enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the Quality Management System (QMS) standard ISO 9001, and analysis of the system based on expert judgment using fuzzy inference. To carry out our work we used the theory of fuzzy sets, the QMS in accordance with ISO 9001, the BSC according to the method of Kaplan and Norton, the construction of business processes according to the IDEF0 notation, and modeling using MATLAB simulation tools and LabVIEW graphical programming. The results of the work are as follows: we determined possibilities for improving the management of a heat-supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of functions for management, for the reduction of resources and for keeping the system up to date; and an application for the analysis of the QMS based on fuzzy inference was created, with a novel organization of the communication software, enabling the analysis of relevant enterprise management system data.

Keywords: balanced scorecard, heat supply, quality management system, the theory of fuzzy sets

Procedia PDF Downloads 365
44768 Providing Tailored Access to Justice as a Human Rights Obligation: Feminist Lawyering as an Alternative Practice to Address Gender-Based Violence Against Women Refugees

Authors: Maelle Noir

Abstract:

International human rights norms prescribe the obligation to protect refugee women against violence, which requires, inter alia, state provision of justiciable, accessible, affordable and non-discriminatory access to justice. However, the interpretation and application of the law still lack gender sensitivity, intersectionality and a trauma-informed approach. Consequently, many refugee survivors face important structural obstacles preventing access to justice and often experience secondary traumatisation when navigating the legal system. This paper argues that the unique nature of refugee women's experiences of gender-based violence, exacerbated throughout the migration journey, calls for a tailored practice of the law to ensure adequate access to justice. The argument developed here is that the obligation to provide survivors with justiciable, accessible, affordable and non-discriminatory access to justice implies radically transforming the practice of the law altogether. This paper, therefore, proposes feminist lawyering as an alternative approach to the practice of the law when addressing gender-based violence against women refugees. First, this paper discusses the specific nature of gender-based violence against refugees, with a particular focus on two aspects of the power-violence nexus: the analysis of the shift in gender roles and expectations following displacement as one of the causes of gender-based violence against women refugees, and the argument that the asylum situation itself constitutes a form of state-sponsored and institutional violence. Second, the re-traumatising and re-victimising nature of the legal system is explored, with the objective of demonstrating States’ failure to comply with their legal obligation to provide refugee women with effective access to justice. Third, this paper discusses some key practical strategies that have been proposed and implemented to transform the practice of the law when dealing with gender-based violence outside of the refugee context. Lastly, this analysis is applied to the specificities of the experiences of refugee survivors of gender-based violence.

Keywords: feminist lawyering, feminist legal theory, gender-based violence, human rights law, intersectionality, refugee protection

Procedia PDF Downloads 180
44767 Robust Model Predictive Controller for Uncertain Nonlinear Wheeled Inverted Pendulum Systems: A Tube-Based Approach

Authors: Tran Gia Khanh, Dao Phuong Nam, Do Trong Tan, Nguyen Van Huong, Mai Xuan Sinh

Abstract:

This work addresses the design of a tube-based robust model predictive controller for a class of continuous-time systems in the presence of input disturbances. The main objective is to show that the state trajectory of the closed-loop system is maintained inside a sequence of tubes. An estimate of the region of attraction of the closed-loop system is derived based on input-to-state stability (ISS) theory and a linearized model in each time interval. Theoretical analysis and simulation results demonstrate the performance of the proposed algorithm for a wheeled inverted pendulum system.

Keywords: input-to-state stability (ISS), tube-based robust MPC, continuous-time nonlinear systems, wheeled inverted pendulum

Procedia PDF Downloads 216
44766 The Analysis of Deceptive and Truthful Speech: A Computational Linguistic Based Method

Authors: Seham El Kareh, Miramar Etman

Abstract:

Recently, detecting liars and extracting features which distinguish them from truth-tellers have been the focus of a wide range of disciplines. To the authors’ best knowledge, most of the work has been done on facial expressions and body gestures, but only a few works have addressed the language used by liars and truth-tellers. This paper sheds light on four axes. The first axis concerns building an audio corpus of deceptive and truthful speech by Egyptian Arabic speakers. The second axis focuses on examining human perception of lies and demonstrating the need for computational linguistic methods to extract features which characterize truthful and deceptive speech. The third axis is concerned with building a linguistic analysis program that can extract from the corpus the inter- and intra-linguistic cues of deceptive and truthful speech. The program built here is based on selected categories from the Linguistic Inquiry and Word Count (LIWC) program. Our results demonstrated that, when lying, Egyptian Arabic speakers preferred first-person pronouns and the present tense over the past tense, and their lies lacked second-person pronouns; when telling the truth, they preferred verbs related to motion and nouns related to time. The results also showed that larger data sets are needed to establish the significance of words related to emotions and numbers.
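
A minimal sketch of the LIWC-style category counting described above; the word lists are invented English placeholders, since the actual study worked on Egyptian Arabic transcripts with categories selected from LIWC:

```python
import re

# Toy LIWC-style category counter.  The category word lists below are invented
# English placeholders for illustration; the study itself used Egyptian Arabic
# transcripts and categories selected from the LIWC program.
CATEGORIES = {
    "first_person":  {"i", "me", "my", "mine", "we", "our"},
    "second_person": {"you", "your", "yours"},
    "motion_verbs":  {"go", "went", "walk", "drive", "move"},
    "time_nouns":    {"today", "yesterday", "hour", "week", "morning"},
}

def category_rates(transcript: str) -> dict:
    """Return each category's frequency relative to the transcript length."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    total = max(len(tokens), 1)
    return {name: sum(t in words for t in tokens) / total
            for name, words in CATEGORIES.items()}

sample = "I go to the market every morning and we drive back before the hour ends"
for name, rate in category_rates(sample).items():
    print(f"{name:14s} {rate:.3f}")
```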

Keywords: Egyptian Arabic corpus, computational analysis, deceptive features, forensic linguistics, human perception, truthful features

Procedia PDF Downloads 203
44765 Fuzzy Optimization Multi-Objective Clustering Ensemble Model for Multi-Source Data Analysis

Authors: C. B. Le, V. N. Pham

Abstract:

In modern data analysis, multi-source data appears more and more often in real applications, and multi-source data clustering has emerged as an important issue in the data mining and machine learning community. Different data sources provide different information about the data; therefore, linking multi-source data is essential to improve clustering performance. However, in practice multi-source data is often heterogeneous, uncertain, and large, which is considered a major challenge of multi-source data analysis. Ensemble learning is a versatile machine learning approach in which learning techniques can work in parallel on big data. Clustering ensembles have been shown to outperform any standard clustering algorithm in terms of accuracy and robustness. However, most traditional clustering ensemble approaches are based on a single-objective function and single-source data. This paper proposes a new clustering ensemble method for multi-source data analysis, the fuzzy optimized multi-objective clustering ensemble (FOMOCE) method. Firstly, a clustering ensemble mathematical model based on the structure of the multi-objective clustering function, multi-source data, and dark knowledge is introduced. Then, rules for extracting dark knowledge from the input data, clustering algorithms, and base clusterings are designed and applied. Finally, a clustering ensemble algorithm is proposed for multi-source data analysis. Experiments were performed on standard sample data sets. The experimental results demonstrate the superior performance of the FOMOCE method compared to existing clustering ensemble methods and multi-source clustering methods.
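
A generic co-association clustering ensemble, sketched below on synthetic two-source data, conveys the basic ensemble step; it does not reproduce FOMOCE's fuzzy multi-objective optimization or its dark-knowledge extraction rules:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)
# Two "sources" describing the same 150 objects with different features (synthetic).
X1 = rng.normal(size=(150, 4)) + np.repeat([[0], [3], [6]], 50, axis=0)
X2 = rng.normal(size=(150, 2)) + np.repeat([[0], [2], [4]], 50, axis=0)

# Base clusterings: several k-means runs per source form the ensemble.
base_labels = []
for X in (X1, X2):
    for seed in range(5):
        base_labels.append(KMeans(n_clusters=3, n_init=5, random_state=seed).fit_predict(X))

# Co-association matrix: fraction of base clusterings that put objects i and j together.
n = X1.shape[0]
coassoc = np.zeros((n, n))
for labels in base_labels:
    coassoc += (labels[:, None] == labels[None, :])
coassoc /= len(base_labels)

# Consensus clustering: average-linkage agglomeration on the co-association "distance".
dist = 1.0 - coassoc
np.fill_diagonal(dist, 0.0)
consensus = fcluster(linkage(squareform(dist, checks=False), method="average"),
                     t=3, criterion="maxclust")
print("consensus cluster sizes:", np.bincount(consensus)[1:])
```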

Keywords: clustering ensemble, multi-source, multi-objective, fuzzy clustering

Procedia PDF Downloads 184
44764 Study of Gait Stability Evaluation Technique Based on Linear Inverted Pendulum Model

Authors: Kang Sungjae

Abstract:

This research proposes a gait stability evaluation technique based on the linear inverted pendulum model and the Zero Moment Point of the moving support foot. With this, an improvement in the gait analysis of orthosis walking is validated. Applying a Lagrangian mechanics approximation to the solution of the dynamics equations of the linear inverted pendulum not only simplifies the solution, but also provides a smooth Zero Moment Point for the double-support phase. The Zero Moment Point gait analysis technique mentioned above validates reference trajectories for the center of mass of the gait orthosis, the timing of the steps, and landing position references for the swing foot. The stability evaluation technique is tested with a 6 DOF powered gait orthosis. The results obtained are promising for implementation.
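
A short sketch of the linear inverted pendulum relation between the centre of mass and the Zero Moment Point used above; the CoM height and step timing are illustrative assumptions:

```python
import numpy as np

# Linear inverted pendulum model (LIPM) relation between the centre of mass (CoM)
# and the Zero Moment Point (ZMP):   x_zmp = x_com - (z_c / g) * x_com_dd
# Parameter values below (CoM height, step timing) are illustrative assumptions.

g = 9.81          # gravity [m/s^2]
z_c = 0.80        # constant CoM height of the pendulum [m]
T_c = np.sqrt(z_c / g)

def com_trajectory(x0, v0, x_zmp, t):
    """Analytic LIPM CoM motion over one support phase with a fixed ZMP."""
    return (x_zmp
            + (x0 - x_zmp) * np.cosh(t / T_c)
            + T_c * v0 * np.sinh(t / T_c))

t = np.linspace(0.0, 0.4, 200)           # single-support duration: 0.4 s (assumed)
x = com_trajectory(x0=-0.05, v0=0.3, x_zmp=0.0, t=t)

# Recover the ZMP numerically from the CoM trajectory as a consistency check.
x_dd = np.gradient(np.gradient(x, t), t)
x_zmp_est = x - (z_c / g) * x_dd
print("max |ZMP error| [m]:", np.abs(x_zmp_est[2:-2]).max())   # ~0 away from the edges
```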

Keywords: locomotion, center of mass, gait stability, linear inverted pendulum model

Procedia PDF Downloads 512
44763 Game-Theory-Based Downlink Spectrum Allocation in Two-Tier Networks

Authors: Yu Zhang, Ye Tian, Fang Ye, Yixuan Kang

Abstract:

The capacity of conventional cellular networks has reached its upper bound, and this can be addressed by introducing low-cost, easy-to-deploy femtocells. The spectrum interference issue becomes more critical as value-added multimedia services grow in two-tier cellular networks. Spectrum allocation is one of the effective methods of interference mitigation. This paper proposes a game-theory-based OFDMA downlink spectrum allocation scheme aimed at reducing co-channel interference in two-tier femtocell networks. The framework is formulated as a non-cooperative game, wherein the femto base stations are the players and the available frequency channels are the strategies. The scheme takes full account of competitive behavior and fairness among stations. In addition, the utility function essentially reflects interference from the standpoint of the channels. This work focuses on co-channel interference and puts forward a negative-logarithm interference function of the distance-weight ratio, aimed at suppressing co-channel interference within the same network layer. This scenario is well suited to actual network deployment, and the system possesses high robustness. According to the proposed mechanism, interference exists only when players employ the same channel for data communication. This paper focuses on implementing spectrum allocation in a distributed fashion. Numerical results show that the signal-to-interference-and-noise ratio can be clearly improved through the spectrum allocation scheme and that the users' downlink quality of service can be satisfied. Besides, the average spectrum efficiency of the cellular network is significantly improved, as the simulation results show.
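
A compact sketch of distributed best-response channel selection with a distance-weighted co-channel interference utility; the 1/d weighting and the synthetic topology below only approximate the paper's negative-logarithm distance-weight-ratio function:

```python
import numpy as np

rng = np.random.default_rng(1)
n_femto, n_channels = 12, 4
pos = rng.uniform(0, 100, size=(n_femto, 2))          # femto base station positions [m]
dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
np.fill_diagonal(dist, np.inf)                        # no self-interference

channel = rng.integers(0, n_channels, size=n_femto)   # initial random channel choice

def interference(i, ch, channels):
    """Co-channel interference seen by station i: closer co-channel users hurt more.
    A simple 1/d weighting stands in for the paper's distance-weight-ratio function."""
    same = (channels == ch)
    same[i] = False
    return np.sum(1.0 / dist[i, same])

# Iterative best response: each station picks the channel minimising its interference.
for _ in range(20):
    changed = False
    for i in range(n_femto):
        best = min(range(n_channels), key=lambda ch: interference(i, ch, channel))
        if best != channel[i]:
            channel[i] = best
            changed = True
    if not changed:                                    # pure-strategy equilibrium reached
        break

print("channel assignment:", channel)
print("total interference:", sum(interference(i, channel[i], channel) for i in range(n_femto)))
```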

Keywords: femtocell networks, game theory, interference mitigation, spectrum allocation

Procedia PDF Downloads 155
44762 Nest-Site Selection of Crested Lark (Galerida cristata) in Yazd Province, Iran

Authors: Shirin Aghanajafizadeh

Abstract:

Nest site selection of the Crested Lark was investigated in the Boroyeh wildlife sanctuary of Harat during spring 2014. Habitat variables such as the number of plant species, soil texture, and distance to the nearest water resources, farms and roads were compared between plots where the species was present and absence (control) plots. Our analysis showed that the average numbers of Zygophyllum atriplicoides and Artemisia sieberi were higher, while the percent cover of fine-textured soil (with very little sand and gravel) was lower, in presence plots than in control plots. We concluded that the factors most affecting nest site selection by the species are the number of Z. atriplicoides plants and soil texture. Z. atriplicoides and A. sieberi can provide cover for nests and chicks against predators and harsh environmental conditions such as sunshine and wind. The stability of the built nest forces the birds to select sites without fine-textured soil. Some nests were detected in alfalfa farms, which can be related to the cover this crop provides.

Keywords: habitat selection, Yazd Province, presence and absence plots, habitat variables

Procedia PDF Downloads 181
44761 YOLO-IR: Infrared Small Object Detection in High Noise Images

Authors: Yufeng Li, Yinan Ma, Jing Wu, Chengnian Long

Abstract:

Infrared object detection aims to separate small and dim targets from cluttered backgrounds, and its capabilities extend beyond the limits of visible light, making it invaluable in a wide range of applications for improving safety, security, efficiency, and functionality. However, existing methods are usually sensitive to noise in the input infrared image, leading to a decrease in target detection accuracy and an increase in the false alarm rate in high-noise environments. To address this issue, an infrared small target detection algorithm called YOLO-IR is proposed in this paper to improve robustness to high infrared noise. To address the problem that high noise significantly reduces the clarity and reliability of target features in infrared images, we design a soft-threshold coordinate attention mechanism to improve the model’s ability to extract target features and its robustness to noise. Since the noise may overwhelm the local details of the target, resulting in the loss of small target features during depth down-sampling, we propose a deep and shallow feature fusion neck to improve detection accuracy. In addition, because generalized Intersection over Union (IoU)-based loss functions may be sensitive to noise and lead to unstable training in high-noise environments, we introduce a Wasserstein-distance-based loss function to improve the training of the model. The experimental results show that YOLO-IR achieves a 5.0% improvement in recall and a 6.6% improvement in F1-score over existing state-of-the-art models.
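
One common Wasserstein-distance-based bounding-box loss for small objects is sketched below (boxes modelled as 2-D Gaussians); the exact formulation and the normalisation constant used in YOLO-IR may differ:

```python
import torch

def wasserstein_box_loss(pred, target, C=12.8):
    """Wasserstein-distance-based bounding-box loss (one common small-object
    formulation: boxes are modelled as 2-D Gaussians and compared with the
    2-Wasserstein distance, then mapped to (0, 1] with an exponential).
    Boxes are (cx, cy, w, h); C is a normalisation constant (assumed value)."""
    d_center = (pred[..., 0] - target[..., 0]) ** 2 + (pred[..., 1] - target[..., 1]) ** 2
    d_shape = ((pred[..., 2] - target[..., 2]) ** 2 + (pred[..., 3] - target[..., 3]) ** 2) / 4.0
    w2 = torch.sqrt(d_center + d_shape + 1e-9)      # 2-Wasserstein distance between Gaussians
    nwd = torch.exp(-w2 / C)                        # normalised similarity in (0, 1]
    return 1.0 - nwd                                # loss: 0 for identical boxes

pred   = torch.tensor([[10.0, 10.0, 4.0, 4.0]])
target = torch.tensor([[11.0, 10.5, 4.0, 5.0]])
print(wasserstein_box_loss(pred, target))
```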

Keywords: infrared small target detection, high noise, robustness, soft-threshold coordinate attention, feature fusion

Procedia PDF Downloads 60
44760 An Evaluation Model for Automatic Map Generalization

Authors: Quynhan Tran, Hong Fan, Quockhanh Pham

Abstract:

Automatic map generalization is a well-known problem in cartography, and research on it has accompanied the development of the discipline. Traditionally, maps are plotted manually by cartographic experts. This paper studies the non-scale automated generalization of residential polygons and house marker symbols and proposes a methodology to evaluate the resulting maps based on the minimal spanning tree. The minimal spanning tree before and after map generalization is compared to evaluate whether the generalization result maintains the geographical distribution of the features. The minimal spanning tree in vector format is first converted into a raster format with a grid size of 2 mm (distance on the map). The number of matching grid cells before and after map generalization and the ratio of overlapping grid cells to the total number of cells are calculated. Evaluation experiments were conducted to verify the results. The experiments show that this methodology can give an objective evaluation of the feature distribution and assist specialists when they evaluate the results of non-scale automated generalization visually.
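
A sketch of the evaluation pipeline described above: build the minimal spanning tree before and after generalization, rasterise it, and compute the overlap ratio. The point sets and cell size below are synthetic stand-ins:

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def mst_edges(points):
    """Edges of the minimal spanning tree over a set of 2-D feature points."""
    mst = minimum_spanning_tree(cdist(points, points)).tocoo()
    return list(zip(mst.row, mst.col))

def rasterise(points, edges, cell=2.0, extent=100.0):
    """Burn MST edges into a boolean grid with the given cell size (map units)."""
    n = int(extent / cell)
    grid = np.zeros((n, n), dtype=bool)
    for i, j in edges:
        for t in np.linspace(0.0, 1.0, 200):        # sample points along the edge
            x, y = (1 - t) * points[i] + t * points[j]
            ix = min(max(int(x / cell), 0), n - 1)
            iy = min(max(int(y / cell), 0), n - 1)
            grid[iy, ix] = True
    return grid

rng = np.random.default_rng(3)
before = rng.uniform(0, 100, size=(40, 2))                 # building centroids (synthetic)
after = before[::2] + rng.normal(scale=1.0, size=(20, 2))  # generalised map keeps every 2nd one

g1 = rasterise(before, mst_edges(before))
g2 = rasterise(after, mst_edges(after))
matching_ratio = np.logical_and(g1, g2).sum() / np.logical_or(g1, g2).sum()
print(f"overlapping grid cells / total occupied cells: {matching_ratio:.2f}")
```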

Keywords: automatic cartography generalization, evaluation model, geographic feature distribution, minimal spanning tree

Procedia PDF Downloads 631
44759 How Polarization and Ideological Divisiveness Increase the Likelihood of Executive Action: Evidence from the Italian Case

Authors: Umberto Platini

Abstract:

This paper analyses the role of government fragmentation as a predictor of the use of emergency decrees in parliamentary democracies. In particular, it focuses on the relationship between ideological divisiveness within cabinets and the choice by executives to issue emergency decrees rather than initiate ordinary legislative procedures. A Bayesian multilevel analysis conducted on the population of government-initiated legislation in Italy between 1996 and 2018 finds significant evidence that legislative proposals which are further away from the ideological centre of gravity of the executive are around three times more likely to be issued as emergency decrees. Likewise, legislative projects regulating more contentious policy areas are significantly more likely to be issued by decree. However, for more contentious issues the importance of ideological distance as a predictor diminishes. This evidence suggests that cabinets prefer decrees to ordinary legislative procedures when they expect the bargaining environment in Parliament to be more hostile. These results persist regardless of the fluctuations of the political-economic cycle. Their robustness is also tested against a battery of controls and against fixed effects both at the government level and at the legislature level.
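
A schematic form of the kind of Bayesian multilevel logit model described above (covariate names and the random-effects structure are illustrative; the paper's exact specification and priors may differ):

```latex
\begin{align*}
  y_{i} &\sim \mathrm{Bernoulli}(p_{i}), \qquad
  y_{i} = 1 \text{ if bill } i \text{ is issued as an emergency decree} \\
  \operatorname{logit}(p_{i}) &= \beta_0
      + \beta_1\, \mathrm{IdeologicalDistance}_{i}
      + \beta_2\, \mathrm{Contentiousness}_{i}
      + \beta_3\, (\mathrm{IdeologicalDistance}_{i} \times \mathrm{Contentiousness}_{i})
      + u_{g[i]} + v_{l[i]} \\
  u_{g} &\sim \mathcal{N}(0, \sigma_u^2), \qquad
  v_{l} \sim \mathcal{N}(0, \sigma_v^2)
      \quad \text{(government- and legislature-level intercepts)}
\end{align*}
```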

Keywords: Bayesian multilevel logit models, executive action, executive decrees, ideology, legislative studies, polarization

Procedia PDF Downloads 101
44758 Predicting Success and Failure in Drug Development Using Text Analysis

Authors: Zhi Hao Chow, Cian Mulligan, Jack Walsh, Antonio Garzon Vico, Dimitar Krastev

Abstract:

Drug development is resource-intensive, time-consuming, and increasingly expensive with each developmental stage. The success rates of drug development are also relatively low, and the resources committed are wasted with each failed candidate. As such, a reliable method of predicting the success of drug development is in demand. The hypothesis was that some failed drug candidates are pushed through developmental pipelines based on false confidence and may possess common linguistic features identifiable through sentiment analysis. Here, the concept of using text analysis to discover such features in research publications and investor reports as predictors of success was explored. RStudio was used to perform text mining and lexicon-based sentiment analysis to identify affective phrases and determine their frequency in each document, and SPSS was then used to determine the relationship between the defined variables and the accuracy of predicting outcomes. A total of 161 publications were collected and categorised into 4 groups: (i) cancer treatment, (ii) neurodegenerative disease treatment, (iii) vaccines, and (iv) others (containing all other drugs that do not fit into the first 3 categories). Text analysis was then performed on each document using 2 separate lexicons (BING and AFINN) within each drug category to determine the frequency of positive or negative phrases in each document. Relative positivity and negativity values were then calculated by dividing the frequency of such phrases by the word count of each document. Regression analysis was then performed in SPSS on each dataset (values obtained using the BING or AFINN lexicon during text analysis), using a random selection of 61 documents to construct a model. The remaining documents were then used to determine the predictive power of the models. The model constructed from BING predicts the outcome of drug performance in clinical trials with an overall accuracy of 65.3%. The AFINN model had a lower accuracy at predicting outcomes than the BING model, at 62.5%, and was not effective at predicting the failure of drugs in clinical trials. Overall, the study did not show significant efficacy of the models at predicting the outcomes of drugs in development, and many improvements may be needed in later iterations of the model to sufficiently increase the accuracy.
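
A minimal sketch of the lexicon-based scoring step; the study itself used R with the BING and AFINN lexicons, whereas the tiny word lists here are invented for illustration:

```python
import re

# Toy lexicon-based scoring analogous to the BING/AFINN step described above.
# The study itself was done in R; the tiny lexicons here are illustrative only.
POSITIVE = {"significant", "improved", "effective", "promising", "robust"}
NEGATIVE = {"failed", "adverse", "toxicity", "discontinued", "inconclusive"}

def relative_sentiment(document: str) -> tuple[float, float]:
    """Return (relative positivity, relative negativity): phrase counts / word count."""
    tokens = re.findall(r"[a-z]+", document.lower())
    total = max(len(tokens), 1)
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return pos / total, neg / total

report = ("The candidate showed promising and significant improvements, "
          "although mild toxicity was reported in two patients.")
print(relative_sentiment(report))
```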

Keywords: data analysis, drug development, sentiment analysis, text-mining

Procedia PDF Downloads 154
44757 Innovation Management in E-Health Care: The Implementation of New Technologies for Health Care in Europe and the USA

Authors: Dariusz M. Trzmielak, William Bradley Zehner, Elin Oftedal, Ilona Lipka-Matusiak

Abstract:

The use of new technologies should create new value for all stakeholders in the healthcare system. The article focuses on demonstrating that technologies or products typically enable new functionality, a higher standard of service, or a higher level of knowledge and competence for clinicians. It also highlights the key benefits that can be achieved through the use of artificial intelligence, such as relieving clinicians of many tasks and enabling the expansion and greater specialisation of healthcare services. The comparative analysis allowed the authors to create a classification of new technologies in e-health according to health needs and benefits for patients, doctors, and healthcare systems, i.e., the main stakeholders in the implementation of new technologies and products in healthcare. The added value of the development of new technologies in healthcare is also assessed. The work is both theoretical and practical in nature. The primary research methods are bibliographic analysis and analysis of research data and of the market potential of new solutions for healthcare organisations. The bibliographic analysis is complemented by the authors' case studies of implemented technologies, mostly based on artificial intelligence or telemedicine. In the past, patients were often passive recipients, the end point of the service delivery system, rather than stakeholders in the system. One of the dangers of powerful new technologies is that patients may become even more marginalised, with healthcare provided and delivered in an increasingly administrative, programmed way. The doctor may also become a robot, carrying out programmed activities - using 'non-human services'. An alternative approach is to put the patient at the centre, using technologies, products, and services that allow them to design and control technologies based on their own needs. An important contribution to the discussion is to open up the different dimensions of the user (carer and patient) and to make healthcare units implementing new technologies aware of them. The authors of this article outline the importance of three types of patients in the successful implementation of new medical solutions. The impact of implemented technologies is analysed based on: 1) "informed users", who are able to use the technology based on a better understanding of it; 2) "engaged users", who play an active role in the broader healthcare system as a result of the technology; 3) "innovative users", who bring their own ideas to the table based on a deeper understanding of healthcare issues. The authors' research hypothesis is that the distinction between informed, engaged, and innovative users has an impact on the perceived and actual quality of healthcare services. The analysis is based on case studies of new solutions implemented in different medical centres. In addition, based on the observations of the Polish author, who is a manager at the largest medical research institute in Poland, with analytical input from American and Norwegian partners, the added value of the implementations for patients, clinicians, and the healthcare system is demonstrated.

Keywords: innovation, management, medicine, e-health, artificial intelligence

Procedia PDF Downloads 13
44756 Project Based Learning in Language Lab: An Analysis in ESP Learning Context

Authors: S. Priya

Abstract:

This paper analyses a project-based learning (PBL) assignment in an English for Specific Purposes (ESP) context, based on the Communicative English course prescribed in the university syllabus for engineering students, and its learning outcomes in that context. The PBL task was conducted in the digital language lab, which had audio-visual aids to support the team presentations. A total of 48 students of the Mechanical branch were divided into 6 groups of 8 students each, with group members selected by random numbering. Each group was given the task of preparing a PowerPoint presentation on a topic related to their core branch; they had to discuss the issue, choose their topic, and present it in a given format that specified the individual role of each member in the presentation. A brief overview of the project and the outcome of its technical aspects also had to be included, and each group had to highlight the contributions of the chosen technology through its presentation. The PowerPoint had to be submitted on a CD. The variations in the choice of subjects, the use of digital technologies, coordination within the team, the experience of a first-time stage presentation, and the challenges of team cohesiveness were some of the aspects observed as part of the learning experience. For many students, going through the stages of planning, preparation and practice for the presentation was itself a learning outcome, as reported in their feedback forms. The evaluation pattern covers individual contribution and group effectiveness, which promotes the quality of the presentation. The evaluated skills are communication skills, group cohesiveness, audience response, technical quality, and usage of technical terms. This paper thus analyses how project-based learning improves communication skills, life skills and technical skills in an ESP learning context.

Keywords: language lab, ESP context, communicative skills, life skills

Procedia PDF Downloads 238
44755 Fabrication of ZnO Nanorods Based Biosensor via Hydrothermal Method

Authors: Muhammad Tariq, Jafar Khan Kasi, Samiullah, Ajab Khan Kasi

Abstract:

Biosensors play a vital role in industrial, clinical, and chemical analysis applications. Among other techniques, ZnO-based biosensors are an attractive approach due to the exceptional chemical and electrical properties of ZnO. ZnO nanorods have a high isoelectric point, which helps immobilize the negatively charged enzyme glucose oxidase (GOx). Here, we report ZnO nanorod based biosensors for the immobilization of GOx. The ZnO nanorods were grown by the hydrothermal method on an indium tin oxide (ITO) substrate. The fabrication of the biosensors was carried out through batch processing using conventional photolithography. The buffer solutions of GOx were prepared in phosphate buffer with a pH value of around 7.3. The biosensors effectively immobilized the GOx, and the result was analyzed by measuring the voltage and current on the nanostructures.

Keywords: hydrothermal growth, sol-gel, zinc oxide, biosensors

Procedia PDF Downloads 296
44754 Improving Taint Analysis of Android Applications Using Finite State Machines

Authors: Assad Maalouf, Lunjin Lu, James Lynott

Abstract:

We present a taint analysis that can automatically detect when string operations result in a string that is free of taints, where all the tainted patterns have been removed. This is an improvement on the conservative behavior of previous taint analyzers, where a string operation on a tainted string always leads to a tainted string unless the operation is manually marked as a sanitizer. The taint analysis is built on top of a string analysis that uses finite state automata to approximate the sets of values that string variables can take during the execution of a program. The proposed approach has been implemented as an extension of FlowDroid and experimental results show that the resulting taint analyzer is much more precise than the original FlowDroid.
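
A toy illustration of the underlying idea, with the tainted pattern expressed as a regular expression (i.e., a finite automaton); unlike the paper's analysis, this sketch works on concrete strings rather than on automata approximating sets of string values:

```python
import re

# Minimal illustration of the idea: a tainted pattern is described by a regular
# expression (i.e., a finite automaton), and a string operation is considered
# sanitising when no string it can produce still matches the pattern.  This toy
# version works on concrete strings rather than on automata over sets of strings.
TAINTED = re.compile(r"<script\b.*?>.*?</script>", re.IGNORECASE | re.DOTALL)

def is_tainted(value: str) -> bool:
    return TAINTED.search(value) is not None

def strip_scripts(value: str) -> str:
    """A string operation that removes every occurrence of the tainted pattern."""
    return TAINTED.sub("", value)

user_input = 'hello <script>alert("x")</script> world'
print(is_tainted(user_input))                  # True  -> conservative analysers stop here
print(is_tainted(strip_scripts(user_input)))   # False -> the result is free of taints
```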

Keywords: android, static analysis, string analysis, taint analysis

Procedia PDF Downloads 175
44753 Comprehensive Expert and Social Assessment of the Urban Environment of Almaty in the Process of Training Master's and Doctoral Students on Architecture and Urban Planning

Authors: Alexey Abilov

Abstract:

The article highlights the experience of training master's and doctoral students at Satbayev University by preparing their course works for disciplines "Principles of Sustainable Architecture", "Energy Efficiency in Urban planning", "Urban planning analysis, "Social foundations of Architecture". The purpose of these works is the acquisition by students of practical skills necessary in their future professional activities, which are achieved through comprehensive assessment of individual sections of the Almaty urban environment. The methodology of student’s researches carried out under the guidance of the author of this publication is based on an expert assessment of the territory through its full-scale survey, analysis of project documents and statistical data, as well as on a social assessment of the territory based on the results of a questionnaire survey of residents. A comprehensive qualitative and quantitative assessment of the selected sites according to the criteria of the quality of the living environment also allows to formulate specific recommendations for designers who carry out a pre-project analysis of the city territory in the process of preparing draft master plans and detailed planning projects.

Keywords: urban environment, expert/social assessment of the territory, questionnaire survey, comprehensive approach

Procedia PDF Downloads 66
44752 Soil Degradation Mapping Using Geographic Information System, Remote Sensing and Laboratory Analysis in the Oum Er Rbia High Basin, Middle Atlas, Morocco

Authors: Aafaf El Jazouli, Ahmed Barakat, Rida Khellouk

Abstract:

Soil degradation mapping is derived from field observations, laboratory measurements, and remote sensing data, integrated with quantitative methods to map the spatial characteristics of soil properties at different spatial and temporal scales and provide up-to-date information on the field. Since soil salinity, texture and organic matter play a vital role in assessing topsoil characteristics and soil quality, remote sensing can be considered an effective method for studying these properties. The main objective of this research is to assess soil degradation by combining remote sensing data and laboratory analysis. To achieve this goal, soil samples were taken at 50 locations in the upper basin of the Oum Er Rbia in the Middle Atlas, Morocco. These samples were dried, sieved to 2 mm and analyzed in the laboratory. Landsat 8 OLI imagery was analyzed using physical or empirical methods to derive soil properties, and remote sensing also served as a supporting data source. Deterministic (spline and inverse distance weighting) and probabilistic (ordinary kriging and universal kriging) interpolation methods were used to produce maps of each grain size class and of the soil properties using GIS software. As a result, a correlation was found between soil texture and soil organic matter content. This approach, developed in ongoing research, will improve the prospects for the use of remote sensing data for mapping soil degradation in arid and semi-arid environments.
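
A minimal sketch of the inverse distance weighting step; the sample locations and organic matter values below are synthetic, standing in for the 50 laboratory-analysed field samples:

```python
import numpy as np

def idw(sample_xy, sample_values, query_xy, power=2.0):
    """Inverse-distance-weighted interpolation of a soil property at query points.
    Sample locations/values below are synthetic; the study interpolated laboratory
    measurements (texture fractions, organic matter, salinity) from 50 field samples."""
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=-1)
    d = np.maximum(d, 1e-12)                     # avoid division by zero at sample points
    w = 1.0 / d ** power
    return (w * sample_values[None, :]).sum(axis=1) / w.sum(axis=1)

rng = np.random.default_rng(7)
samples = rng.uniform(0, 10_000, size=(50, 2))           # 50 sampling locations [m]
organic_matter = rng.uniform(0.5, 3.5, size=50)          # measured OM content [%]
grid_x, grid_y = np.meshgrid(np.linspace(0, 10_000, 50), np.linspace(0, 10_000, 50))
queries = np.column_stack([grid_x.ravel(), grid_y.ravel()])

om_map = idw(samples, organic_matter, queries).reshape(grid_x.shape)
print("interpolated OM range [%]:", om_map.min().round(2), "-", om_map.max().round(2))
```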

Keywords: Soil degradation, GIS, interpolation methods (spline, IDW, kriging), Landsat 8 OLI, Oum Er Rbia high basin

Procedia PDF Downloads 158
44751 Study of the Stability of the Slope of Open-Pit Mines: Case of the Mine of Phosphates – Tebessa, Algeria

Authors: Mohamed Fredj, Abdallah Hafsaoui, Radouane Nakache

Abstract:

The study of the stability of mining works in fractured rock masses is a major concern of the operating engineer. For geotechnical works in mines and quarries, there is today no general methodology for the analysis and quantification of the risks relating to the dangers inherent in these types of works (falling boulders, landslides, etc.). The reasons for this are the uncertainty that weighs on the available data and the lack of knowledge of the values of the parameters required for this type of analysis. Stability calculations must be based on reliable knowledge of the distribution of the discontinuities that dissect the rock massif and of the shear resistance of the intact rock and of the discontinuities. This study addresses the slope stability of the Kef Sennoun phosphate mine (Tebessa, Algeria). The problem is analyzed using a numerical model based on finite elements (Plaxis 3D software).

Keywords: stability, discontinuities, finite elements, rock mass, open-pit mine

Procedia PDF Downloads 315
44750 Graph Cuts Segmentation Approach Using a Patch-Based Similarity Measure Applied for Interactive CT Lung Image Segmentation

Authors: Aicha Majda, Abdelhamid El Hassani

Abstract:

Lung CT image segmentation is a prerequisite for lung CT image analysis. Most conventional methods need post-processing to deal with abnormal lung CT scans, such as those containing lung nodules or other lesions. The simplest similarity measure in the standard graph cuts algorithm consists of directly comparing the pixel values of two neighboring regions, which is not accurate because this kind of metric is extremely sensitive to minor perturbations such as noise or other artifacts. In this work, we propose an improved version of the standard graph cuts algorithm based on a patch-based similarity metric. The boundary penalty term in the graph cut algorithm is defined based on a patch-based similarity measurement instead of the simple intensity measurement used in the standard method. The weights between each pixel and its neighboring pixels are based on this new term, and the graph is then created using these weights between its nodes. Finally, the segmentation is completed with the min-cut/max-flow algorithm. Experimental results show that the proposed method is very accurate and efficient and can directly provide explicit lung regions without any post-processing operations, in contrast to the standard method.
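
A sketch of a patch-based boundary weight of the kind described above, replacing the single-intensity comparison; the Gaussian form and its parameter are illustrative assumptions rather than the paper's exact metric:

```python
import numpy as np

def patch_weight(image, p, q, radius=2, sigma=10.0):
    """Boundary weight between neighbouring pixels p and q based on the similarity
    of the patches centred on them (instead of their single intensity values).
    The Gaussian form and sigma are illustrative; the paper's exact metric may differ."""
    def patch(c):
        y, x = c
        return image[max(y - radius, 0):y + radius + 1,
                     max(x - radius, 0):x + radius + 1].astype(float)
    a, b = patch(p), patch(q)
    h = min(a.shape[0], b.shape[0]); w = min(a.shape[1], b.shape[1])
    ssd = np.mean((a[:h, :w] - b[:h, :w]) ** 2)          # patch dissimilarity
    return np.exp(-ssd / (2.0 * sigma ** 2))             # high weight = likely same region

rng = np.random.default_rng(0)
ct_slice = rng.integers(0, 255, size=(64, 64)).astype(np.uint8)   # stand-in for a CT slice
print(patch_weight(ct_slice, (30, 30), (30, 31)))   # weight used as an n-link capacity
```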

Keywords: graph cuts, lung CT scan, lung parenchyma segmentation, patch-based similarity metric

Procedia PDF Downloads 165
44749 Right Solution of Geodesic Equation in Schwarzschild Metric and Overall Examination of Physical Laws

Authors: Kwan U. Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim

Abstract:

One hundred and eight years have passed since a great number of physicists began explaining astronomical and physical phenomena by solving the geodesic equations in the Schwarzschild metric. However, when solving the geodesic equations in the Schwarzschild metric, they did not correctly solve one branch of the spatial components, among the spatial and temporal components of the four-dimensional force, and did not correctly derive physical laws through physical analysis of the results obtained by solving the geodesic equations. In addition, they did not treat astronomical and physical phenomena in a physical way based on the correct physical laws obtained from the solution of the geodesic equations in the Schwarzschild metric. Therefore, some earlier scholars mentioned that Einstein’s theoretical basis for the general theory of relativity was obscure and incorrect, but they did not give a correct physical solution to the problems. Furthermore, since the general theory of relativity has not given a quantitative solution to these obscure and incorrect problems, the generalization of gravitational theory has not yet been successfully completed, although earlier scholars considered it and attempted it. In order to solve these problems, it is necessary to examine the obscure and incorrect problems in the general theory of relativity on the basis of physical laws and to find a methodology for solving them. Therefore, as the first step towards this purpose, the correct solution of the geodesic equation in the Schwarzschild metric is presented. Next, the physical laws found by making a physical analysis of the results are presented, the obscure and incorrect problems are shown, and an analysis of them is made based on those physical laws. In addition, an experimental verification of the physical laws found by us is presented.
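
For reference, the conventional textbook starting point that the paper re-examines (the Schwarzschild line element, the geodesic equation, and the standard radial equation for timelike equatorial motion), quoted in geometrized units; this is the received form, not the authors' revised solution:

```latex
% Geometrized units G = c = 1, with Schwarzschild radius r_s = 2M.
\begin{align*}
  ds^2 &= -\Bigl(1 - \tfrac{2M}{r}\Bigr) dt^2
          + \Bigl(1 - \tfrac{2M}{r}\Bigr)^{-1} dr^2
          + r^2 \bigl(d\theta^2 + \sin^2\theta\, d\varphi^2\bigr), \\
  0 &= \frac{d^2 x^{\mu}}{d\tau^2}
       + \Gamma^{\mu}_{\;\alpha\beta}\,
         \frac{dx^{\alpha}}{d\tau}\frac{dx^{\beta}}{d\tau}, \\
  \Bigl(\frac{dr}{d\tau}\Bigr)^2 &= E^2
       - \Bigl(1 - \tfrac{2M}{r}\Bigr)\Bigl(1 + \tfrac{L^2}{r^2}\Bigr)
  \quad \text{(timelike equatorial motion; } E, L \text{ per unit mass).}
\end{align*}
```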

Keywords: equivalence principle, general relativity, geometrodynamics, Schwarzschild, Poincaré

Procedia PDF Downloads 73
44748 Extended Boolean Petri Nets Generating N-Ary Trees

Authors: Riddhi Jangid, Gajendra Pratap Singh

Abstract:

Petri nets are a mathematical tool used for modeling in different areas of computer science, biological networks, chemical systems and many other disciplines. A Petri net model of a given system is created through a graphical representation that describes the properties and behavior of the system. When studying the behavior of a system, 1-safe Petri nets are of particular interest in many applications. Boolean Petri nets are the class of 1-safe Petri nets that generate all binary n-vectors in their reachability analysis. We study this class by varying different parameters, such as the token counts in the places, and by examining how the structure of the tree changes in the reachability analysis. We discuss here an extended class of Boolean Petri nets that generates n-ary trees in their reachability-based analysis.
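
A minimal reachability sketch for a 1-safe net whose markings are binary vectors; the two-place toy net below illustrates the Boolean property and is not the extended construction studied in the paper:

```python
from itertools import product

# Reachability analysis of a 1-safe Petri net: markings are binary n-vectors.
# The tiny example net (one source transition per place) is an illustrative toy
# whose reachability set is all binary 2-vectors.
PLACES = ("p1", "p2")
TRANSITIONS = {              # transition: (input places, output places)
    "t1": ((), ("p1",)),
    "t2": ((), ("p2",)),
}

def enabled(marking, t):
    ins, outs = TRANSITIONS[t]
    # enabled iff every input place is marked and firing keeps the net 1-safe
    return all(marking[PLACES.index(p)] for p in ins) and \
           all(marking[PLACES.index(p)] == 0 for p in outs if p not in ins)

def fire(marking, t):
    ins, outs = TRANSITIONS[t]
    m = list(marking)
    for p in ins:
        m[PLACES.index(p)] = 0
    for p in outs:
        m[PLACES.index(p)] = 1
    return tuple(m)

def reachability_set(initial):
    seen, frontier = {initial}, [initial]
    while frontier:
        m = frontier.pop()
        for t in TRANSITIONS:
            if enabled(m, t):
                m2 = fire(m, t)
                if m2 not in seen:
                    seen.add(m2)
                    frontier.append(m2)
    return seen

reachable = reachability_set((0, 0))
print(sorted(reachable))                                       # [(0,0), (0,1), (1,0), (1,1)]
print(reachable == set(product((0, 1), repeat=len(PLACES))))   # True: all binary 2-vectors
```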

Keywords: marking vector, n-vector, petri nets, reachability

Procedia PDF Downloads 76
44747 Analysis of Maintenance Operations in an Industrial Bakery Line

Authors: Mehmet Savsar

Abstract:

This paper presents a practical case application of simulation modeling and analysis in a specific industrial setting. Various maintenance-related parameters of the equipment in the system under consideration are determined, and a simulation model is developed to study system behavior. System performance is determined based on the established parameters and operational policies, which include system operation with and without preventive maintenance. The results show that preventive maintenance practice has significant effects on improving system productivity. The simulation procedures outlined in this paper can be used by operations managers to perform production line analysis under different maintenance policies in various industrial settings.
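
A small Monte-Carlo sketch comparing availability with and without periodic preventive maintenance; the failure, repair and wear parameters below are assumptions for illustration, not the bakery line's measured values:

```python
import random

# Monte-Carlo sketch of a single machine's availability with and without periodic
# preventive maintenance (PM).  Failure/repair parameters are illustrative
# assumptions, not the bakery line's measured values.
MTBF_NEW, MTBF_WORN = 40.0, 10.0     # mean time between failures [h], before/after wear
CORRECTIVE, PREVENTIVE = 4.0, 1.0    # corrective repair / PM durations [h]
WEAR_AFTER, PM_INTERVAL = 60.0, 50.0 # machine degrades after 60 h of running without PM

def simulate(horizon=10_000.0, use_pm=False, seed=0):
    rng = random.Random(seed)
    t = downtime = since_renewal = 0.0
    while t < horizon:
        if use_pm and since_renewal >= PM_INTERVAL:        # scheduled PM renews the machine
            t += PREVENTIVE
            downtime += PREVENTIVE
            since_renewal = 0.0
            continue
        mtbf = MTBF_NEW if since_renewal < WEAR_AFTER else MTBF_WORN
        run = rng.expovariate(1.0 / mtbf)                  # time to next failure
        t += run
        since_renewal += run
        t += CORRECTIVE                                    # corrective repair after failure
        downtime += CORRECTIVE
    return 1.0 - downtime / t

print(f"availability without PM: {simulate(use_pm=False):.3f}")
print(f"availability with PM   : {simulate(use_pm=True):.3f}")
```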

Keywords: simulation, production line, machine failures, maintenance, industrial bakery

Procedia PDF Downloads 481
44746 A Hybrid LES-RANS Approach to Analyse Coupled Heat Transfer and Vortex Structures in Separated and Reattached Turbulent Flows

Authors: C. D. Ellis, H. Xia, X. Chen

Abstract:

Experimental and computational studies investigating heat transfer in separated flows have been of increasing importance over the last 60 years, as efforts are made to understand and improve the efficiency of components such as combustors, turbines, heat exchangers, nuclear reactors and cooling channels. Understanding not only the time-mean heat transfer properties but also the unsteady properties is vital for the design of these components. As computational power increases, more sophisticated methods of modelling these flows become available. The hybrid LES-RANS approach has been applied to a blunt-leading-edge flat plate, utilising a structured grid at a moderate Reynolds number of 20300 based on the plate thickness. In the region close to the wall, the RANS method is implemented for two turbulence models: the one-equation Spalart-Allmaras model and Menter’s two-equation SST k-ω model. The LES region occupies the flow away from the wall and is formulated without any explicit subgrid-scale LES modelling. Hybridisation between the two methods is achieved by blending based on the nearest wall distance. Validation of the flow was obtained by assessing the mean velocity profiles in comparison to similar studies. The vortex structures of the flow were identified by utilising the λ2 criterion to locate vortex cores. The qualitative structure of the flow was compared with experiments at similar Reynolds numbers. This identified the 2D roll-up of the shear layer, breaking down via the Kelvin-Helmholtz instability. Through this instability the flow progressed into hairpin-like structures, elongating as they advanced downstream. Proper Orthogonal Decomposition (POD) analysis has been performed on the full flow field and on the surface temperature of the plate. As expected, the breakdown of POD modes for the full field revealed a relatively slow decay compared to the surface temperature field. Both POD fields identified that the most energetic fluctuations occurred in the separated and recirculation region of the flow. Latter modes of the surface temperature identified these levels of fluctuation as dominating the time-mean region of maximum heat transfer and flow reattachment. In addition to the current research, work will be conducted on tracking the movement of the vortex cores and the location and magnitude of temperature hot spots on the plate. This information will support the POD and statistical analysis performed, to further identify qualitative relationships between the vortex dynamics and the response of the surface heat transfer.
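
A compact sketch of snapshot POD via the singular value decomposition, the kind of decomposition applied above to the flow field and the surface temperature; the snapshot matrix here is synthetic:

```python
import numpy as np

# Snapshot Proper Orthogonal Decomposition (POD) via the SVD, as applied above to the
# flow field and the surface-temperature field.  The snapshot matrix here is synthetic.
rng = np.random.default_rng(0)
n_points, n_snapshots = 5_000, 200
snapshots = rng.normal(size=(n_points, n_snapshots))        # columns = flow snapshots

mean_field = snapshots.mean(axis=1, keepdims=True)
fluctuations = snapshots - mean_field                        # POD acts on the fluctuating part

# Thin SVD: columns of U are spatial POD modes, S**2 ~ modal (kinetic) energy,
# rows of Vt give the temporal coefficients of each mode.
U, S, Vt = np.linalg.svd(fluctuations, full_matrices=False)
energy = S**2 / np.sum(S**2)

print("energy captured by first 5 modes:", energy[:5].sum().round(3))
cumulative = np.cumsum(energy)
print("modes needed for 90% of the energy:", int(np.searchsorted(cumulative, 0.90) + 1))
```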

Keywords: heat transfer, hybrid LES-RANS, separated and reattached flow, vortex dynamics

Procedia PDF Downloads 226
44745 Online Consortium of Independent Colleges and Universities (OCICU): Using Cluster Analysis to Grasp Student and Institutional Value of Consolidated Online Offerings in Higher Education

Authors: Alex Rodriguez, Adam Guerrero

Abstract:

Purpose: This study is designed to examine the institutions that comprise the Online Consortium of Independent Colleges and Universities (OCICU) in order to better understand the types of higher education institutions that make up its membership. The literature on this topic is extensive in analyzing the current economic environment around higher education, which is largely considered to be negative for independent, tuition-driven institutions and is forcing colleges and universities to reexamine how the college-attending population defines value and how institutions can best utilize their existing resources (and those of other institutions) to meet that value expectation. The results of this analysis are intended to give OCICU the ability to better target its current customer base, based on their most notable differences, and to show other institutions how best to approach consolidation within higher education. Design/Methodology: This study utilized k-means cluster analysis to explore the possibility that different segments exist within the seventy-one colleges and universities that have comprised OCICU. It analyzed fifty different variables, whose selection was based on the previous literature, collected by the Integrated Postsecondary Education Data System (IPEDS), whose data are self-reported by individual institutions. Findings: OCICU member institutions are partitioned into two clusters, "access institutions" and "conventional institutions", based largely on the student profile they target. Value: The methodology of the study is relatively novel, as few studies within the field of higher education marketing have employed cluster analysis, and this type of analysis has never been conducted on OCICU members specifically or on any consolidated offering in higher education. OCICU can use the findings of this study to obtain a better grasp of the specific needs of the two market segments it currently serves and to develop measurable marketing programs around how those segments are defined that communicate the value sought by current and potential OCICU members or those of similar institutions. Other consolidation efforts within higher education can also employ the same methodology to determine their own market segments.
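
A sketch of the clustering step on standardised variables with k = 2; the data below are synthetic stand-ins for the fifty IPEDS variables describing the seventy-one member institutions:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Sketch of the clustering step: k-means with k = 2 on standardised institutional
# variables.  The matrix below is synthetic; the study used 50 IPEDS variables
# for the 71 OCICU member institutions.
rng = np.random.default_rng(42)
n_institutions, n_variables = 71, 50
X = rng.normal(size=(n_institutions, n_variables))           # stand-in for IPEDS data

X_scaled = StandardScaler().fit_transform(X)                  # put variables on one scale
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_scaled)

labels = kmeans.labels_                                       # 0/1 ~ "access" vs "conventional"
print("cluster sizes:", np.bincount(labels))
# Variables whose cluster centres differ most are the strongest differentiators.
separation = np.abs(kmeans.cluster_centers_[0] - kmeans.cluster_centers_[1])
print("top differentiating variable indices:", np.argsort(separation)[::-1][:5])
```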

Keywords: consolidation, colleges, enrollment, higher education, marketing, strategy, universities

Procedia PDF Downloads 128