Search results for: steps
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1599

429 The Examination of Cement Effect on Isotropic Sands during Static, Dynamic, Melting and Freezing Cycles

Authors: Mehdi Shekarbeigi

Abstract:

The consolidation of loose substrates and substrate layers by introducing stabilizing materials is one of the most commonly used road construction techniques. Cement, lime, and fly ash, as well as asphalt emulsion, are common materials used for soil stabilization to enhance the soil's strength and durability properties. Cement can readily stabilize permeable materials such as sand within a relatively short time. In this research, ordinary Portland cement is selected for the stabilization of isotropic sand, and the effect of static and cyclic loading on the behavior of these soils is examined for various percentages of Portland cement. First, the soil's general features are investigated, and then static tests, including direct shear, density, uniaxial compression, and California Bearing Ratio tests, are performed on the samples. After that, the dynamic behavior of cement on silica sand with the same grain size is analyzed. These experiments are conducted on samples with cement contents of 3, 6, and 9 percent under effective confining pressures of 0 to 1200 kPa, in 200 kPa steps, according to the American Society for Testing and Materials (ASTM) D 3999 standard. Also, to test the effect of temperature, melting and freezing tests are carried out over 0, 5, 10, and 20 cycles. Results of the static tests showed that increasing the cement percentage increases the soil density and shear strength. The increase in uniaxial compressive strength is higher for samples with higher cement content and lower densities. The results also illustrate the relationship between uniaxial compressive strength and cement content by weight. Results of the dynamic experiments indicate that increasing the number of loading cycles and of melting and freezing cycles enhances permeability and decreases the applied pressure. According to the results of this research, it can be stated that samples containing 9% cement have the highest shear modulus and, therefore, decrease the permeability of the soil; this content can be considered optimal. Also, increasing the effective confining pressure from 400 to 800 kPa increased the shear modulus of the samples by an average of 20 to 30 percent at small strains.

Keywords: cement, isotropic sands, static load, cyclic triaxial test, melting and freezing cycles

Procedia PDF Downloads 62
428 Adaption of the Design Thinking Method for Production Planning in the Meat Industry Using Machine Learning Algorithms

Authors: Alica Höpken, Hergen Pargmann

Abstract:

Resource-efficient planning of the complex production planning processes in the meat industry and the reduction of food waste are permanent challenges. The complexity of the production planning process occurs in every part of the supply chain, from agriculture to the end consumer, and arises from long and uncertain planning phases. Uncertainties such as stochastic yields, fluctuations in demand, and resource variability are part of this process. In the meat industry, waste mainly relates to incorrect storage, technical causes in production, or overproduction. The high amount of food waste along the complex supply chain in the meat industry has so far not been reduced by simple solutions, and resource-efficient production planning by conventional methods is currently only partially feasible. Intelligent, automated production planning is, in principle, possible through the application of machine learning algorithms, such as those of reinforcement learning. By adapting the design thinking method to this application area, machine learning methods (especially reinforcement learning algorithms) are applied to the complex production planning process in the meat industry. This standardized approach makes a resource-efficient production planning process available and offers new possibilities for handling the complexity and high time consumption of planning, serving as a tool to support efficient production planning in the meat industry. This paper shows an elegant adaptation of the design thinking method to apply reinforcement learning to a resource-efficient production planning process in the meat industry. Subsequently, the steps necessary to introduce machine learning algorithms into the production planning of the food industry are determined. This is achieved based on a case study which is part of the research project "REIF - Resource Efficient, Economic and Intelligent Food Chain", supported by the German Federal Ministry for Economic Affairs and Climate Action and the German Aerospace Center. Through this structured approach, planning results are achieved that would be too complex or very time-consuming to obtain using conventional methods.
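
As an illustration of the reinforcement learning component, the following is a minimal Q-learning sketch for a toy production planning problem; the state and action spaces, demand distribution, and reward terms (waste and shortage penalties) are illustrative assumptions, not the project's actual planning model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_levels = 11                        # production quantities 0..10 units
Q = np.zeros((n_levels, n_levels))   # state = last observed demand, action = quantity
alpha, gamma, eps = 0.1, 0.9, 0.2    # learning rate, discount, exploration

state = 5
for _ in range(50_000):
    # epsilon-greedy choice of how much to produce
    action = int(rng.integers(n_levels)) if rng.random() < eps else int(Q[state].argmax())
    demand = int(np.clip(rng.normal(5, 2), 0, 10))        # stochastic demand
    # penalize overproduction (waste) more mildly than unmet demand (shortage)
    reward = -1.0 * max(action - demand, 0) - 2.0 * max(demand - action, 0)
    Q[state, action] += alpha * (reward + gamma * Q[demand].max() - Q[state, action])
    state = demand

print("learned quantity per observed demand level:", Q.argmax(axis=1))
```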

Keywords: change management, design thinking method, machine learning, meat industry, reinforcement learning, resource-efficient production planning

Procedia PDF Downloads 112
427 Predictability of Thermal Response in Housing: A Case Study in Australia, Adelaide

Authors: Mina Rouhollahi, J. Boland

Abstract:

Changes in cities' heat balance due to rapid urbanization and the urban heat island (UHI) have increased energy demands for space cooling and have resulted in uncomfortable living conditions for urban residents. Climate resilience and comfortable living spaces can be addressed through well-designed urban development, and sustainable housing can be effective in controlling high levels of urban heat. In Australia, to mitigate the effects of UHIs and summer heat waves, one approach to sustainable housing has been the trend towards compact housing design and the construction of energy-efficient dwellings. This paper analyses whether current housing configurations and orientations are effective in avoiding increased demand for air conditioning and in achieving an energy-efficient residential neighborhood. A significant amount of energy is consumed to ensure thermal comfort in houses. This paper reports on the modelling of heat transfer within homes, using measurements of radiation, convection, and conduction between exterior/interior wall surfaces and the outdoor/indoor environment, respectively. The simulation was tested on selected 7.5-star energy-efficient houses constructed of typical material elements and insulation in Adelaide, Australia. The chosen dwellings were analyzed over one year, including extremely hot weather. The data were obtained via a thermal circuit to accurately model the fundamental heat transfer mechanisms on both boundaries of the house and through the multi-layered wall configurations. The lumped capacitance model was formulated in discrete time steps, adopting a non-linear model method. The simulation results focused on the effects of solar radiation on the dynamic thermal characteristics of the house orientations. A high star rating did not necessarily coincide with a decrease in peak cooling demand. A more effective approach to avoiding increased air conditioning and energy demand may be to integrate solar-climatic data when evaluating the performance of energy-efficient houses.
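
A minimal sketch of a discrete-time lumped capacitance update is given below; the single resistance, capacitance, and solar gain values are illustrative assumptions, whereas the paper's model resolves multi-layered wall configurations with a non-linear formulation.

```python
import numpy as np

R = 0.005    # overall envelope thermal resistance [K/W] (assumed)
C = 5.0e6    # indoor thermal capacitance [J/K] (assumed)
dt = 60.0    # time step [s]

def step_indoor_temp(T_in, T_out, Q_solar):
    """One explicit step of C * dT_in/dt = (T_out - T_in)/R + Q_solar."""
    return T_in + dt * ((T_out - T_in) / R + Q_solar) / C

# Simulate one hot day at one-minute steps with a sinusoidal outdoor swing
T_in = 24.0
for minute in range(24 * 60):
    phase = 2 * np.pi * minute / (24 * 60)
    T_out = 30.0 + 8.0 * np.sin(phase)           # outdoor temperature [deg C]
    Q_solar = max(0.0, 600.0 * np.sin(phase))    # solar gain [W]
    T_in = step_indoor_temp(T_in, T_out, Q_solar)
print(f"indoor temperature after one day: {T_in:.1f} C")
```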

Keywords: energy-efficient residential building, heat transfer, neighborhood orientation, solar–climatic data

Procedia PDF Downloads 119
426 Investigation of Dry-Blanching and Freezing Methods of Fruits

Authors: Epameinondas Xanthakis, Erik Kaunisto, Alain Le-Bail, Lilia Ahrné

Abstract:

Fruits and vegetables are characterized as perishable food matrices due to their short shelf life, as several deterioration mechanisms are involved. Prior to common preservation methods like freezing or canning, fruits and vegetables are blanched in order to inactivate deteriorative enzymes. Both conventional blanching pretreatments and conventional freezing methods hide drawbacks behind their beneficial impacts on the preservation of those matrices. Conventional blanching methods may require long processing times and cause leaching of minerals and nutrients due to contact with warm water, which in turn leads to effluent production with large BOD. An important issue with freezing technologies is the size of the formed ice crystals, which is critical for the final quality of the frozen food, as large crystals can cause irreversible damage to the cellular structure and subsequently degrade the texture and the colour of the product. Herein, the developed microwave blanching methodology and the results regarding quality aspects and enzyme inactivation will be presented. Moreover, heat transfer phenomena, mass balance, temperature distribution, and enzyme inactivation (such as Pectin Methyl Esterase and Ascorbic Acid Oxidase) of our microwave blanching approach will be evaluated based on measurements and computer modelling. The present work is part of the COLDμWAVE project, which aims at the development of an innovative, environmentally sustainable process for blanching and freezing of fruits and vegetables with improved textural and nutritional quality. In this context, COLDμWAVE will develop tailored equipment for MW blanching of vegetables that has very high energy efficiency and no water consumption. Furthermore, the next steps of this project regarding the development of innovative pathways in MW-assisted freezing to improve the quality of frozen vegetables, exploring in depth previous results acquired by the authors, will be presented. The application of the MW-assisted freezing process to fruits and vegetables is expected to lead to improved quality characteristics compared to conventional freezing. Acknowledgments: COLDμWAVE has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 660067.

Keywords: blanching, freezing, fruits, microwave blanching, microwave

Procedia PDF Downloads 251
425 Magnitude and Determinants of Overweight and Obesity among High School Adolescents in Addis Ababa, Ethiopia

Authors: Mulugeta Shegaze, Mekitie Wondafrash, Alemayehu A. Alemayehu, Shikur Mohammed, Zewdu Shewangezaw, Mukerem Abdo, Gebresilasea Gendisha

Abstract:

Background: The 2004 World Health Assembly called for specific actions to halt the overweight and obesity epidemic that is currently penetrating urban populations in the developing world. Adolescents require particular attention due to their vulnerability to developing obesity and the fact that adolescent weight tracks strongly into adulthood. However, there is a scarcity of information on the modifiable risk factors to be targeted for primary intervention among urban adolescents in Ethiopia. This study aimed to determine the magnitude and risk factors of overweight and obesity among high school adolescents in Addis Ababa. Methods: An institution-based cross-sectional study was conducted in February and March 2014 on 456 randomly selected adolescents from 20 high schools in Addis Ababa city. Demographic data and other risk factors of overweight and obesity were collected using a self-administered structured questionnaire, whereas anthropometric measurements of weight and height were taken using calibrated equipment and standardized techniques. The WHO STEPS instrument for chronic disease risk was applied to assess dietary habits and physical activity. Overweight and obesity status was determined based on BMI-for-age percentiles of the WHO 2007 reference population. Results: The prevalence rates of overweight, obesity, and overall overweight/obesity among high school adolescents in Addis Ababa were 9.7% (95%CI = 6.9-12.4%), 4.2% (95%CI = 2.3-6.0%), and 13.9% (95%CI = 10.6-17.1%), respectively. Overweight/obesity prevalence was highest among female adolescents, in private schools, and in the higher wealth category. In the multivariable regression model, being female [AOR(95%CI) = 5.4(2.5,12.1)], being from a private school [AOR(95%CI) = 3.0(1.4,6.2)], having >3 regular meals [AOR(95%CI) = 4.0(1.3,13.0)], consumption of sweet foods [AOR(95%CI) = 5.0(2.4,10.3)] and spending >3 hours/day sitting [AOR(95%CI) = 3.5(1.7,7.2)] were found to increase overweight/obesity risk, whereas a high total physical activity level [AOR(95%CI) = 0.21(0.08,0.57)] and better nutrition knowledge [AOR(95%CI) = 0.16(0.07,0.37)] were found protective. Conclusions: More than one in ten of the high school adolescents were affected by overweight/obesity, with dietary habits and physical activity being important modifiable risk factors. A well-tailored nutrition education program targeting lifestyle change should be initiated, with more emphasis on female adolescents and students in private schools.
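
For readers unfamiliar with how the adjusted odds ratios (AORs) above are obtained, the following is a minimal sketch of a multivariable logistic regression; the file name and variable names are illustrative assumptions, not the study's actual dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data frame: one row per adolescent, 0/1 outcome 'overweight'
df = pd.read_csv("adolescents.csv")
X = sm.add_constant(df[["female", "private_school", "sweet_foods",
                        "sitting_hours_gt3", "high_physical_activity"]])
model = sm.Logit(df["overweight"], X).fit()

aor = np.exp(model.params)       # exponentiated coefficients = adjusted odds ratios
ci = np.exp(model.conf_int())    # 95% confidence intervals on the same scale
print(pd.concat([aor.rename("AOR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```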

Keywords: adolescents, NCDs, overweight, obesity

Procedia PDF Downloads 291
424 Familiarity with Engineering Project Management and Their Duties in Projects

Authors: Mokhtar Nikgoo

Abstract:

Today's industrial world has undergone tremendous changes in certain periods. These changes, called environmental changes, have a direct impact on organizations and bodies, so the importance of recognizing them is clear. This importance has caused manufacturing organizations to move towards multiple products and to constantly change and expand their systems. This research tries to show how an organization moves in this category by defining the basic steps of implementing a project. One of the most important features of a make-to-order production organization is the definition of different production projects by different customers. A lack of sufficient understanding of the type of work being defined means that the managers of the organization (at every organizational level) are constantly involved with different projects. In the implementation of the production projects of such organizations, directing the facilities and people of the organization towards the implementation of the project is of particular importance; it is therefore necessary to define the project manager and his basic duties. Considering the importance of this topic, the paper deals with project management and its importance, and examines all the different issues in that category from the perspective of implementation. A project comprises certain activities of the organization that require the use of different resources and that must be carried out with defined facilities and at designated times. Project management is the planning, organizing, and controlling of the organization's resources for short-term and medium-term goals and objectives. Project management has the important task of centering and integrating (coordinating) task and line managers. In other words, project management requires a strong and appropriate relationship with the internal people of the system to carry out the assigned activities, and the project manager must have general and technical knowledge related to the various activities in the project environment. It seems that everything in project management comes down to communication. One of the characteristics of make-to-order production organizations is the relationship between the customer(s) and the organization until the completion of the defined project. Due to the nature of the work, a person must establish this relationship between the client and the organization's people, and in such a way that it does not cause a lack of coordination in the organization's activities. Therefore, project management has a very important role at this stage: any problems or points of view that the client has must be reported to management so that, after analysis, they can be transferred through the appropriate processes to the other departments and line managers.

Keywords: project management, crisis management, project delays bill, project duration

Procedia PDF Downloads 33
423 A Geo DataBase to Investigate the Maximum Distance Error in Quality of Life Studies

Authors: Paolino Di Felice

Abstract:

The background and significance of this study come from papers already published in the literature that measured the impact of public services (e.g., hospitals, schools, ...) on citizens' needs satisfaction (one of the dimensions of QOL studies) by calculating the distance between the place where citizens live and the location of the services on the territory. Those studies assume that a citizen's dwelling coincides with the centroid of the polygon that expresses the boundary of the administrative district, within the city, they belong to. Such an assumption "introduces a maximum measurement error equal to the greatest distance between the centroid and the border of the administrative district." The case study this abstract reports on investigates the implications of adopting such an approach at geographical scales greater than the urban one, namely at the three levels of nesting of the Italian administrative units: the (20) regions, the (110) provinces, and the 8,094 municipalities. To carry out this study, two things had to be decided: a) how to store the huge amount of (spatial and descriptive) input data, and b) how to process it. The latter aspect involves: b.1) the design of algorithms to investigate the geometry of the boundaries of the Italian administrative units; b.2) their coding in a programming language; b.3) their execution and, eventually, b.4) archiving the results in permanent storage. The IT solution we implemented is centered around a (PostgreSQL/PostGIS) Geo DataBase structured in terms of three tables that fit the hierarchy of nesting of the Italian administrative units: municipality(id, name, provinceId, istatCode, regionId, geometry); province(id, name, regionId, geometry); region(id, name, geometry). The adoption of DBMS technology allows us to implement steps "a)" and "b)" easily. In particular, step "b)" is simplified dramatically by calling spatial operators and built-in spatial User Defined Functions within SQL queries against the Geo DB. The major findings from our experiments can be summarized as follows. The approximation that, on average, results from assimilating the residence of the citizens to the centroid of the administrative unit of reference is a few kilometers (4.9) at the municipality level, while it becomes conspicuous at the other two levels (28.9 and 36.1 km, respectively). Therefore, studies such as those mentioned above can be extended up to the municipal level without affecting the correctness of the interpretation of the results, but not further. The IT framework implemented to carry out the experiments can be replicated for studies referring to the territory of other countries all over the world.
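
As a sketch of step "b)", the maximum centroid-to-border distance can be computed directly in the Geo DB with PostGIS built-ins; the connection string below is a placeholder, while the table and column names are those listed above (a projected CRS with meter units is assumed).

```python
import psycopg2

conn = psycopg2.connect("dbname=geodb user=postgres")  # placeholder DSN
cur = conn.cursor()
cur.execute("""
    SELECT name,
           ST_MaxDistance(ST_Centroid(geometry), ST_Boundary(geometry)) AS max_err
    FROM municipality
    ORDER BY max_err DESC;
""")
for name, max_err in cur.fetchmany(10):
    print(f"{name}: {max_err:.0f} m")  # largest centroid-to-border errors
```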

Keywords: quality of life, distance measurement error, Italian administrative units, spatial database

Procedia PDF Downloads 356
422 An Exploratory Study on the Impact of Video-stimulated Reflection on Novice EFL Teachers’ Professional Development

Authors: Ibrahima Diallo

Abstract:

The literature on teacher education foregrounds reflection as an important aspect of professional practice. Reflection for a teacher consists of retrospectively and critically analysing and evaluating a lesson to see what worked, what did not work, and how to improve it in the future. Many teacher education programmes worldwide now consider the ability to reflect one of the hallmarks of an effective educator. However, in some contexts, like Senegal, reflection has not been given due consideration in teacher education programmes. In contexts where it has been part of the education landscape for some time, reflection is mostly depicted as an individual written activity, and many teacher trainees have become disenchanted by repeated enactments of a task that is solely intended to satisfy course requirements. This has resulted in whitewashing weaknesses or even 'faking' reflection. Besides, the 'one-size-fits-all' approach to reflection could not flourish because how reflection impacts practice is still unproven. Therefore, reflective practice needs to be contextualised and made more thought-provoking through dialogue and the use of classroom data. There is also a need to highlight the change that reflection brings to teachers' practice. So, this study introduces reflection in a new context and aims to show evidenced change in novice EFL teachers' practice through dialogic, data-led reflection. The purpose of this study is also to contribute to the scarce literature on reflection in sub-Saharan Africa by bringing new perspectives on contextualised, teacher-led reflection. Eight novice EFL teachers participated in this qualitative longitudinal study, and data were gathered online through post-lesson reflection recordings and lesson videos over a period of four months. The data were then thematically analysed using NVivo to systematically organize and manage the large amount of data, following the six-step approach to thematic analysis. Major themes related to teachers' classroom practice and their conception of reflection emerged from the analysis. The results showed that post-lesson reflection with a peer can help novice EFL teachers gain more awareness of their classroom practice. Dialogic reflection also helped them evaluate their lessons and seek improvement. The analysis of the data also gave insight into teachers' conceptions of reflection in an EFL context. It was found that teachers were more engaged in reflection when using their lesson video recordings, and change in teaching behaviour as a result of reflection was evidenced by the analysis of the lesson video recordings. This study has shown that video-stimulated reflection is a practical form of professional development that can be embedded in teachers' professional life.

Keywords: novice EFL teachers, practice, professional development, video-stimulated reflection

Procedia PDF Downloads 87
421 Adversarial Attacks and Defenses on Deep Neural Networks

Authors: Jonathan Sohn

Abstract:

Deep neural networks (DNNs) have shown state-of-the-art performance for many applications, including computer vision, natural language processing, and speech recognition. Recently, adversarial attacks have been studied in the context of deep neural networks; they aim to alter the results of a DNN by modifying its inputs slightly. For example, an adversarial attack on a DNN used for object detection can cause the DNN to miss certain objects. As a result, the reliability of DNNs is undermined by their lack of robustness against adversarial attacks, raising concerns about their use in safety-critical applications such as autonomous driving. In this paper, we focus on studying adversarial attacks and defenses on DNNs for image classification. Two types of adversarial attacks are studied: the fast gradient sign method (FGSM) attack and the projected gradient descent (PGD) attack. A DNN forms decision boundaries that separate the input images into different categories; an adversarial attack slightly alters an image to move it over a decision boundary, causing the DNN to misclassify it. The FGSM attack obtains the gradient of the loss with respect to the image and updates the image once, based on the gradient, to cross the decision boundary. The PGD attack, instead of taking one big step, repeatedly modifies the input image with multiple small steps. There is also another type of attack, called the targeted attack, which is designed to make the machine classify an image into a class chosen by the attacker. We can defend against adversarial attacks by incorporating adversarial examples in training: instead of training the neural network only with clean examples, we explicitly let the neural network learn from adversarial examples. In our experiments, the digit recognition accuracy on the MNIST dataset drops from 97.81% to 39.50% and 34.01% when the DNN is attacked by FGSM and PGD attacks, respectively. If we utilize FGSM training as a defense method, the classification accuracy greatly improves from 39.50% to 92.31% for FGSM attacks and from 34.01% to 75.63% for PGD attacks. To further improve the classification accuracy under adversarial attacks, we can also use the stronger PGD training method, which improves the accuracy by 2.7% under FGSM attacks and 18.4% under PGD attacks over FGSM training. It is worth mentioning that neither FGSM nor PGD training affects the accuracy on clean images. In summary, we find that PGD attacks can greatly degrade the performance of DNNs, and PGD training is a very effective way to defend against such attacks; PGD attacks and defenses are overall significantly more effective than FGSM methods.
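
A minimal PyTorch sketch of the two attacks described above follows; the epsilon, step size, and iteration count are illustrative assumptions, not the exact settings used in the experiments, and inputs are assumed to lie in [0, 1].

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, eps=0.1):
    """One gradient-sign step that perturbs x to increase the loss."""
    x_adv = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x_adv), y).backward()
    return (x_adv + eps * x_adv.grad.sign()).clamp(0, 1).detach()

def pgd_attack(model, x, y, eps=0.1, alpha=0.02, steps=10):
    """Repeated small gradient-sign steps, projected back into the eps-ball."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        x_adv = x_adv.detach() + alpha * grad.sign()
        # project onto the L-infinity ball of radius eps around x
        x_adv = torch.max(torch.min(x_adv, x + eps), x - eps).clamp(0, 1)
    return x_adv
```

Adversarial (FGSM or PGD) training then simply replaces the clean batch with `fgsm_attack(model, x, y)` or `pgd_attack(model, x, y)` inside the usual training loop.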

Keywords: deep neural network, adversarial attack, adversarial defense, adversarial machine learning

Procedia PDF Downloads 177
420 Trial Version of a Systematic Material Selection Tool in Building Element Design

Authors: Mine Koyaz, M. Cem Altun

Abstract:

Selecting materials that satisfy the expected performance requirements is critically important for any design. Today, with constantly evolving and developing technologies, the range of material options is so wide that support tools are becoming necessary in the selection process. Therefore, as a sub-process of building element design, a systematic material selection tool is developed that defines four main steps of material selection: definition, research, comparison, and decision. The main purpose of the tool is to serve as an educational instrument that shows architecture students a methodical way of selecting materials in architectural detailing. The tool predefines the possible uses of various material databases and other sources of information on material properties. Hence, it is to be used as guidance for designers, especially those with limited material knowledge and experience. The material selection tool embraces not only the technical properties of materials related to building elements' functional requirements, but also their sensory properties related to the identity of the design and their environmental impacts with respect to the sustainability of the design. The method followed in the development of the tool has two main sections: first, the examination and application of existing methods, and second, the development of trial versions and their application. Within the scope of the existing methods, design support tools, methodic approaches to building element design and the material selection process, material properties, material databases, and methodic approaches to decision making are examined. The existing methods were applied by architecture students and newly graduated architects to different design problems. Based on the results of these applications, the strong and weak sides of the existing material selection tools are presented. A main flow chart of the material selection tool has been developed with the objective of applying the strong aspects of the existing methods and improving on their weak sides. At each stage, a different aspect of the material selection process was investigated, and the tool took its final form. The systematic material selection tool, within the building element design process, guides users with minimal background information to practically and accurately determine the ideal material satisfying the needs of their design. The tool has a flexible structure that answers the different needs of different designs and designers. The trial version presented in this paper shows one of the paths that could be followed and illustrates its application to a design problem.

Keywords: architectural education, building element design, material selection tool, systematic approach

Procedia PDF Downloads 328
419 Leveraging xAPI in a Corporate e-Learning Environment to Facilitate the Tracking, Modelling, and Predictive Analysis of Learner Behaviour

Authors: Libor Zachoval, Daire O Broin, Oisin Cawley

Abstract:

E-learning platforms such as Blackboard have two major shortcomings: limited data capture as a result of the limitations of SCORM (Shareable Content Object Reference Model), and lack of incorporation of artificial intelligence (AI) and machine learning algorithms, which could lead to better course adaptations. With the recent development of the Experience Application Programming Interface (xAPI), many additional types of data can be captured, and that opens a window of possibilities from which online education can benefit. In a corporate setting, where companies invest billions in the learning and development of their employees, some learner behaviours can be troublesome, for they can hinder the knowledge development of a learner. Behaviours that hinder knowledge development, specifically those related to gaming the system, also raise ambiguity about a learner's knowledge mastery. Furthermore, a company receives little benefit from its investment if employees pass courses without possessing the required knowledge, and potential compliance risks may arise. Using xAPI and rules derived from a state-of-the-art review, we identified three learner behaviours, primarily related to guessing, in a corporate compliance course: trying each option for a question, specifically for multiple-choice questions; selecting a single option for all the questions on the test; and continuously repeating tests upon failing, as opposed to going over the learning material. These behaviours were detected in learners who repeated the test at least 4 times before passing the course. These findings suggest that gauging the mastery of a learner from multiple-choice test scores alone is a naive approach. Thus, next steps will consider the incorporation of additional data points, knowledge estimation models to model the knowledge mastery of a learner more accurately, and analysis of the data for correlations between knowledge development and the identified learner behaviours. Additional work could explore how learner behaviours could be utilised to make changes to a course, for example, modifications to the course content (certain sections of the learning material may prove unhelpful for many learners in mastering the intended learning outcomes) or to the course design (such as the type and duration of feedback).
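
A minimal sketch of one such detection rule is shown below: counting failed attempts per learner from xAPI statements and flagging those with at least four retries. The statements follow the standard xAPI actor/verb layout, while the retry threshold and the use of mbox as the learner identifier are assumptions for illustration.

```python
from collections import Counter

FAILED = "http://adlnet.gov/expapi/verbs/failed"  # standard ADL xAPI verb IRI

def flag_probable_guessers(statements, min_retries=4):
    """statements: list of xAPI statement dicts for a single assessment."""
    fails = Counter()
    for s in statements:
        if s["verb"]["id"] == FAILED:
            fails[s["actor"]["mbox"]] += 1  # identify learners by mbox (assumed)
    # learners who failed at least `min_retries` times before eventually passing
    return {learner for learner, n in fails.items() if n >= min_retries}
```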

Keywords: artificial intelligence, corporate e-learning environment, knowledge maintenance, xAPI

Procedia PDF Downloads 108
418 Comparison of Finite Difference Schemes for Numerical Study of Ripa Model

Authors: Sidrah Ahmed

Abstract:

River and lake flows are modelled mathematically by the shallow water equations, which are depth-averaged Reynolds-averaged Navier-Stokes equations under the Boussinesq approximation. Temperature stratification dynamics influence the water quality and mixing characteristics, mainly due to atmospheric conditions including air temperature, wind velocity, and radiative forcing. Experimental observations are commonly taken along vertical scales and are not sufficient to estimate the small turbulence effects that temperature variations induce in shallow flows. Wind shear stress over the water surface influences flow patterns, heat fluxes, and the thermodynamics of water bodies as well. Hence it is crucial to couple temperature gradients with the shallow water model to estimate atmospheric effects on flow patterns. The Ripa system has been introduced to study ocean currents as a variant of the shallow water equations with the addition of temperature variations within the flow. The Ripa model is a hyperbolic system of partial differential equations, because all the eigenvalues of the system's Jacobian matrix are real and distinct, and the time steps of a numerical scheme are estimated from these eigenvalues. The solution to the Riemann problem of the Ripa model is composed of shocks, contact discontinuities, and rarefaction waves, and solving the Ripa model with Riemann initial data using central schemes is difficult due to the eigenstructure of the system. This work presents a comparison of four different finite difference schemes for the numerical solution of the Riemann problem for the Ripa model: the Lax-Friedrichs, Lax-Wendroff, and MacCormack schemes, and a higher-order finite difference scheme with the WENO method. The numerical flux functions in both dimensions are approximated according to these methods, and temporal accuracy is achieved by employing the TVD Runge-Kutta method. Numerical tests are presented to examine the accuracy and robustness of the applied methods. It is revealed that the Lax-Friedrichs scheme produces results with oscillations, while the Lax-Wendroff and higher-order difference schemes produce considerably better results.
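
A minimal sketch of the Lax-Friedrichs update for the one-dimensional Ripa system is given below; the dam-break-style initial data and the CFL number are illustrative assumptions.

```python
import numpy as np

g = 9.81

def flux(q):
    """Flux of the 1D Ripa system, with q = (h, hu, h*theta) stacked row-wise."""
    h, hu, hth = q
    u, th = hu / h, hth / h
    return np.array([hu, hu * u + 0.5 * g * th * h**2, hu * th])

N, dx, cfl = 400, 1.0 / 400, 0.45
half = np.arange(N) < N // 2
q = np.ones((3, N))
q[0] = np.where(half, 2.0, 1.0)          # depth h
q[1] = 0.0                               # momentum hu
q[2] = q[0] * np.where(half, 1.5, 1.0)   # h * theta

t, t_end = 0.0, 0.05
while t < t_end:
    h, u, th = q[0], q[1] / q[0], q[2] / q[0]
    smax = np.max(np.abs(u) + np.sqrt(g * th * h))   # fastest wave speed
    dt = min(cfl * dx / smax, t_end - t)             # CFL-limited time step
    f = flux(q)
    # Lax-Friedrichs: neighbour average plus centered flux difference
    q[:, 1:-1] = 0.5 * (q[:, 2:] + q[:, :-2]) - 0.5 * dt / dx * (f[:, 2:] - f[:, :-2])
    q[:, 0], q[:, -1] = q[:, 1], q[:, -2]            # outflow boundaries
    t += dt

print("depth range after the dam break:", q[0].min(), q[0].max())
```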

Keywords: finite difference schemes, Riemann problem, shallow water equations, temperature gradients

Procedia PDF Downloads 190
417 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general, and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of these have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, the three polysemantic synonyms true, loyal, and faithful have been analyzed in terms of differences and similarities in their semantic structure. The corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special requirements regarding genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g., true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like 'An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders' or ''True,' said Phoebe, 'but I'd probably get to be a Union Official immediately'' were left out, as in the first example the faithful is a substantivized adjective, and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distributional groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
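
The co-occurrence extraction step can be sketched in a few lines of code; the tokenized text and window size are illustrative assumptions, since BNC and COCA are normally queried through their own interfaces.

```python
from collections import Counter

def collocates(tokens, node, window=4):
    """Count words occurring within `window` tokens of the node word."""
    counts = Counter()
    for i, tok in enumerate(tokens):
        if tok.lower() == node:
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            counts.update(t.lower() for t in tokens[lo:hi] if t.lower() != node)
    return counts

text = "He stayed true to his word and remained a loyal and faithful friend".split()
print(collocates(text, "true").most_common(5))
```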

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 302
416 A Survey and Analysis on Inflammatory Pain Detection and Standard Protocol Selection Using Medical Infrared Thermography from Image Processing View Point

Authors: Mrinal Kanti Bhowmik, Shawli Bardhan Jr., Debotosh Bhattacharjee

Abstract:

Human skin, being at a temperature above absolute zero, emits infrared radiation related to body temperature. Differences in the infrared radiation from the skin surface reflect abnormalities present in the human body. Based on these differences, detecting and forecasting temperature variations of the skin surface is the main objective of using Medical Infrared Thermography (MIT) as a diagnostic tool for pain detection. MIT is a non-invasive imaging technique that records and monitors the temperature distribution of the body by receiving the infrared radiation emitted from the skin and representing it as a thermogram. The intensity of the thermogram measures the inflammation of the skin surface related to pain in the human body. Analysis of thermograms provides automated anomaly detection associated with suspicious pain regions through several image processing steps. The paper presents a rigorous study-based survey of the processing and analysis of thermograms, based on previous works published in the area of infrared thermal imaging for detecting inflammatory pain diseases like arthritis, spondylosis, shoulder impingement, etc. The study also explores the performance of thermogram processing together with thermogram acquisition protocols, thermography camera specifications, and the types of pain detected by thermography, in a summarized tabular format that provides a clear structural vision of past works. The major contribution of the paper is a new thermogram acquisition standard for inflammatory pain detection in the human body, introduced to enhance the performance rate. The FLIR T650sc infrared camera, with high sensitivity and resolution, is adopted to increase the accuracy of thermogram acquisition and analysis. The survey of previous research highlights that intensity-distribution-based comparison of comparable and symmetric regions of interest, together with their statistical analysis, gives adequate results in identifying and detecting physiological disorders related to inflammatory diseases.
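
The core analysis pattern highlighted by the survey, comparing the intensity distributions of symmetric regions of interest, can be sketched as follows; the ROI layout and the choice of a Welch t-test are illustrative assumptions, as the surveyed works employ a variety of statistical tests.

```python
import numpy as np
from scipy import stats

def compare_symmetric_rois(thermogram, roi_left, roi_right):
    """thermogram: 2D array of temperatures; ROIs given as (row, col, h, w)."""
    r, c, h, w = roi_left
    left = thermogram[r:r + h, c:c + w].ravel()
    r, c, h, w = roi_right
    right = thermogram[r:r + h, c:c + w].ravel()
    # A significant temperature difference between the symmetric regions
    # may indicate inflammation on the warmer side.
    t_stat, p_value = stats.ttest_ind(left, right, equal_var=False)
    return left.mean() - right.mean(), p_value
```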

Keywords: acquisition protocol, inflammatory pain detection, medical infrared thermography (MIT), statistical analysis

Procedia PDF Downloads 328
415 Development of Optimized Eye Mascara Packages with Bioinspired Spiral Methodology

Authors: Daniela Brioschi, Rovilson Mafalda, Silvia Titotto

Abstract:

Nowadays, packaging is considered a fundamental element in the commercialization of products and services. A good package can help attract new customers and increase a product's purchase intent. In this scenario, packaging design emerges as an important tool, since products and the design of their packaging are so interconnected that they are no longer seen as separate elements; packaging design is, in fact, capable of generating desire for a product. The packaging market for cosmetics, especially the makeup market, has also been experiencing an increasing level of sophistication and requirements. Considering that packaging represents an important communication link with the final user and plays a significant role in the sales process, it is of great importance that packages fulfil not only functional requirements but also visual appeal. One of the possibilities for the design of packages, and in this context packages for make-up, is bioinspired design, or biomimicry. Bio-inspired design presents a promising paradigm for innovation in both design and sustainable design, using biological system analogies to develop solutions. It has gained importance as a widely diffused movement in design for environmentally conscious development and is also responsible for several useful and innovative designs. As eye mascara packages are part of the constant evolution of cosmetics design, and traditional packages present the disadvantage of the product drying over time, this project aims to develop a new and innovative package for this product, using a selected bioinspired design methodology during the development process together with suitable computational tools. To guide the development process of the package, the spiral methodology, conceived by The Biomimicry Institute, was chosen; it is a reliable tool, since it is based on traditional design methodologies. The spiral design comprises identification, translation, discovery, abstraction, emulation, and evaluation steps that can be applied iteratively, with the process developing as a spiral. As a support tool for packaging, 3D modelling is carried out in Autodesk Inventor 2018. Although this is ongoing research, first results showed that the spiral design methodology, together with Autodesk Inventor, constitutes a suitable instrument for the bio-inspired design process, and nature has proved to be an amazing and inexhaustible source of inspiration.

Keywords: bio-inspired design, design methodology, packaging, cosmetics

Procedia PDF Downloads 171
414 Detection of Powdery Mildew Disease in Strawberry Using Image Texture and Supervised Classifiers

Authors: Sultan Mahmud, Qamar Zaman, Travis Esau, Young Chang

Abstract:

Strawberry powdery mildew (PM) is a serious disease that has a significant impact on strawberry production. Field scouting is still the major way to find PM disease, which is not only labor intensive but also makes it almost impossible to monitor disease severity. To reduce the loss caused by PM disease and achieve faster automatic detection, this paper proposes an approach based on image texture, classified with support vector machines (SVMs) and k-nearest neighbors (kNNs). The methodology of the proposed study is based on image processing and is composed of five main steps: image acquisition, pre-processing, segmentation, feature extraction, and classification. Two strawberry fields were used in this study. Images were acquired of healthy leaves and of leaves infected with PM (Sphaerotheca macularis) under artificial cloud lighting conditions. Colour thresholding was utilized to segment all images before textural analysis. A colour co-occurrence matrix (CCM) was used for the extraction of textural features. Forty textural features, related to physiological parameters of the leaves, were extracted from the CCM of National Television System Committee (NTSC) luminance and hue, saturation, and intensity (HSI) images. The normalized feature data were utilized for training and validation using the developed classifiers. The classifiers were evaluated with internal, external, and cross-validation, and the best classifier was selected based on performance and accuracy. Experimental results suggested that the SVM classifier showed 98.33%, 85.33%, 87.33%, 93.33% and 95.0% accuracy on internal, external-I, external-II, 4-fold cross- and 5-fold cross-validation, respectively, whereas kNN showed 90.0%, 72.00%, 74.66%, 89.33% and 90.3% classification accuracy, respectively. The outcome of this study demonstrated that SVMs classified PM disease with the highest overall accuracy of 91.86% and a processing time of 1.1211 seconds. Therefore, the overall results conclude that the proposed approach can significantly support accurate and automatic identification and recognition of strawberry PM disease with the SVM classifier.
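
As an illustration of the texture-plus-classifier pipeline, the sketch below extracts grey-level co-occurrence features and cross-validates an SVM. It uses skimage's grey-level co-occurrence matrix in place of the paper's full colour co-occurrence matrix, and the feature subset and data handling are simplified assumptions.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def texture_features(gray):
    """gray: 2D uint8 image, e.g. the intensity channel of an HSI image."""
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

def evaluate(images, labels):
    """images: segmented leaf images; labels: 1 = infected, 0 = healthy."""
    X = np.array([texture_features(img) for img in images])
    clf = SVC(kernel="rbf", gamma="scale")
    return cross_val_score(clf, X, labels, cv=5).mean()  # 5-fold accuracy
```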

Keywords: powdery mildew, image processing, textural analysis, color co-occurrence matrix, support vector machines, k-nearest neighbors

Procedia PDF Downloads 109
413 Evaluating Daylight Performance in an Office Environment in Malaysia, Using Venetian Blind Systems

Authors: Fatemeh Deldarabdolmaleki, Mohamad Fakri Zaky Bin Ja'afar

Abstract:

This paper presents a fenestration analysis studying the balance between utilizing daylight and eliminating disturbing parameters in a private office room with interior venetian blinds, taking into account different slat angles. The mean luminance of the scene and the window, the luminance ratio of the work plane and window, work plane illuminance, and daylight glare probability (DGP) were calculated as functions of the venetian blind design properties. Recently developed software for analyzing high dynamic range images (HDRI, captured by a CCD camera), such as the Radiance-based tools evalglare and hdrscope, helps to investigate luminance-based metrics. An eight-day measurement experiment was conducted to investigate the impact of different venetian blind angles in an office environment under daylight conditions in Serdang, Malaysia. Detailed results for the selected case study showed that artificial lighting is necessary during the morning session for Malaysian buildings with southwest windows, regardless of the venetian blind's slat angle. However, in some afternoon-session conditions, the work plane illuminance exceeds the maximum illuminance of 2000 lx, such as at the 10° and 40° slat angles. Generally, a rising trend in mean window luminance is observed during the day. All conditions have less than 10% of pixels exceeding 2000 cd/m² before 1:00 P.M.; however, 40% of the selected hours have more than 10% of scene pixels above 2000 cd/m² after 1:00 P.M. Surprisingly, in the no-blind condition there is no extreme case of window/task ratio; the extreme cases occur at the 20°, 30°, 40° and 50° slat angles. As expected, the mean window luminance is higher than 2000 cd/m² after 2:00 P.M. in most cases, except the 60° slat angle condition. Regarding daylight glare probability, no DGP value higher than 0.35 was found in this experiment, due to the window's direction, the location of the building, and the studied work plane. Finally, this paper reviews how the different blind angles respond to the metrics suggested by previous standards; conclusions and knowledge gaps are summarized, and next steps for research are suggested. Addressing these gaps is critical for the continued progress of the energy efficiency movement.
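
For reference, the DGP metric used above combines vertical eye illuminance with a glare-source term; the sketch below follows the formulation commonly given in the daylighting literature (Wienold and Christoffersen), with illustrative input values.

```python
import math

def dgp(ev, sources):
    """Daylight glare probability.

    ev      -- vertical eye illuminance [lx]
    sources -- iterable of (Ls, omega, P): source luminance [cd/m^2],
               solid angle [sr], and position index (dimensionless)
    """
    glare = sum(Ls**2 * omega / (ev**1.87 * P**2) for Ls, omega, P in sources)
    return 5.87e-5 * ev + 9.18e-2 * math.log10(1 + glare) + 0.16

# Example: one bright window patch seen slightly off-axis (illustrative values)
print(f"DGP = {dgp(2500, [(4000, 0.1, 1.2)]):.2f}")   # stays below 0.35
```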

Keywords: daylighting, office environment, energy simulation, venetian blind

Procedia PDF Downloads 212
412 Intercultural and Inclusive Teaching Competency Implementation within a Canadian Polytechnic's Academic Model: A Pre- and Post-Assessment Analysis

Authors: Selinda England, Ben Bodnaryk

Abstract:

With an unprecedented increase in provincial immigration and government support for greater numbers of international and culturally diverse learners, a trades/applied-learning-focused polytechnic with four campuses within one Canadian province saw the need for intercultural awareness and an intercultural teaching competence strategy for faculty training. An institution-wide pre-assessment needs survey was conducted in 2018, in which 87% of faculty professed to have little or no training when working with international and/or culturally diverse learners. After researching fellow polytechnics in Canada and finding very little in the way of faculty support for intercultural competence, an institutional project team was created, comprising members from all facets of the polytechnic: Indigenous experts, Academic Chairs, Directors, Human Resource Managers, and international/settlement subject matter experts. The project team was organized to develop and implement a new academic model focused on enriching intercultural competence among faculty. Utilizing a competency-based model, the project team incorporated inclusive terminology into competency indicators and devised a four-phase proposal for implementing intercultural teacher training: a series of workshops focused on the needs of international and culturally diverse learners, including teaching strategies based on current TESOL methodologies; literature and online resources for quick access when planning lessons; faculty assessment examples and models of interculturally proficient instructors; and future job descriptions - all of which promote and encourage the development of specific intercultural skills. Results from a post-assessment survey (to be conducted in Spring 2020) and caveats regarding improvements and next steps will be shared. The project team believes its intercultural and inclusive teaching competency-based model is one of the first institution-wide, faculty-supported initiatives within the Canadian college and polytechnic post-secondary educational environment; it aims to become a leader in both the province and the nation regarding intercultural competency training for trades-, industry-, and business-minded community colleges and applied learning institutions.

Keywords: cultural diversity and education, diversity training, teacher training, teaching and learning

Procedia PDF Downloads 96
411 Object Oriented Classification Based on Feature Extraction Approach for Change Detection in Coastal Ecosystem across Kochi Region

Authors: Mohit Modi, Rajiv Kumar, Manojraj Saxena, G. Ravi Shankar

Abstract:

Change detection of coastal ecosystems plays a vital role in monitoring and managing natural resources along coastal regions. The present study focuses on the decadal change in the Kochi islands, connecting the urban flatland areas and the coastal regions where sand deposits have taken place. With this in view, change detection has been monitored in the Kochi area to apprehend the urban growth and industrialization leading to a decrease in the wetland ecosystem. The region lies between 76°11'19.134"E and 76°25'42.193"E and between 9°52'35.719"N and 10°5'51.575"N on the south-western coast of India. The IRS LISS-IV satellite image has been processed using a rule-based algorithm to classify the LULC and to interpret the changes between 2005 and 2015. The approach takes two steps, i.e., extracting features as a single GIS vector layer using different parametric values and then dissolving them. Multi-resolution segmentation has been carried out at scales ranging from 10 to 30. The different classes, like aquaculture, agricultural land, built-up, wetlands, etc., were extracted using parameters like NDVI, mean layer values, and texture-based features with corresponding threshold values in a rule-set algorithm. The objects obtained in the segmentation process were visualized overlaying the satellite image at a scale of 15. This layer was further segmented using the spectral difference segmentation rule between the objects. The individual class layers were dissolved into the basic segmented layer of the image and were interpreted in a vector-based GIS programme to achieve higher accuracy. The result shows a rapid increase of 40% in industrial area relative to the 2005 industrial area statistics. There is a decrease in wetland area, which has been converted into built-up land. New roads have been constructed, connecting the islands to urban areas as well as to highways. An increase in the coastal region is visible due to sand deposition. The outcome is well supported by quantitative assessments, which will empower a rich understanding of land use land cover change for appropriate policy intervention and further monitoring.
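
A minimal sketch of the NDVI-plus-threshold part of such a rule set is given below; the band arrays and threshold values are illustrative assumptions, and the actual study applies its rules to segmented objects rather than raw pixels.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red band arrays."""
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero

def classify(nir, red):
    """Toy per-pixel rule set; thresholds are illustrative, not the study's."""
    v = ndvi(nir.astype(float), red.astype(float))
    classes = np.empty(v.shape, dtype=object)
    classes[v < 0.0] = "water/wetland"
    classes[(v >= 0.0) & (v < 0.2)] = "built-up/bare"
    classes[(v >= 0.2) & (v < 0.4)] = "mixed/sparse vegetation"
    classes[v >= 0.4] = "agriculture/vegetation"
    return classes
```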

Keywords: land use land cover, multiresolution segmentation, NDVI, object based classification

Procedia PDF Downloads 168
410 The Contribution of Boards to Company Performance via Strategic Management

Authors: Peter Crow

Abstract:

Boards and directors have been the subjects of much scholarly research and public interest over several decades, more so since the succession of high-profile company failures of the early 2000s. An array of research outputs including information, correlations, descriptions, models, hypotheses and theories has been reported. While some of this research has shed light on aspects of the board–performance relationship and on board tasks and behaviours, the nature and characteristics of the supposed board–performance relationship remain undetermined. That satisfactory explanations of how boards influence company performance have yet to emerge is a significant blind spot, for the board is ultimately responsible for company performance, in accordance with the wishes of shareholders. The aim of this paper is to explore corporate governance and board practice through the lens of strategic management, and to take tentative steps towards a new conception of corporate governance. The findings of a recent longitudinal multiple-case study designed to explore the board's involvement in strategic management are reported. Qualitative and quantitative data were collected from two quasi-public large companies in New Zealand, including from first-hand observations of boards in session, semi-structured interviews with chief executives and chairmen, and the inspection of company and board documentation. A synthetic timeline framework was used to collate the financial, board structure, board activity and decision-making data, in order to provide a holistic perspective. Decision sequences were identified, and realist techniques of abduction and retroduction were iteratively applied to analyse the multi-year data set. Using several models previously proposed in the literature as a guide, conjectures were formed, tested and refined; the culmination was a provisional model of how boards can influence performance via strategic management. The model builds on both existing theoretical perspectives and theoretical models proposed in the corporate governance and strategic management literature. This paper seeks to add to the understanding of how boards can make meaningful contributions to value creation via strategic management, and to comment on the qualities of directors, social interactions in boardrooms, and other circumstances within which influence might be possible, given the highly contingent relationship between board activity and business performance outcomes.

Keywords: board practice, case study, corporate governance, strategic management

Procedia PDF Downloads 207
409 Tunnel Convergence Monitoring by Distributed Fiber Optics Embedded into Concrete

Authors: R. Farhoud, G. Hermand, S. Delepine-lesoille

Abstract:

The future French radioactive waste disposal facility, named Cigeo, is designed to store intermediate-level and high-level, long-lived French radioactive waste. Intermediate-level waste cells are tunnel-like, about 400 m in length and 65 m² in cross-section, equipped with several concrete layers, which can be grouted in situ or composed of pre-grouted tunnel elements. The operating space inside the cells, needed to insert or remove waste containers, must be monitored for several decades without any maintenance. To provide the required information, a design was developed and tested in situ in Andra's underground research laboratory (URL), 500 m below the surface. Based on distributed optical fiber sensors (OFS), using Brillouin backscattering for strain and Raman backscattering for temperature interrogation, the design consists of two loops of OFS around the monitored section, at two different radiuses (orthoradial strains), plus longitudinal runs. Strains measured by the distributed OFS cables were compared to classical vibrating wire extensometers (VWE) and platinum probes (Pt). The installation was composed of two cables sensitive to strain and temperature and one sensitive only to temperature. All cables were connected, between the sensitive part and the instruments, to hybrid cables to reduce cost. The connections were made according to two techniques: splicing the fibers in situ after installation, or preparing each fiber with a connector and simply plugging them together in situ. Another challenge was installing the OFS cables along a tunnel built in several segments, without interruption between segments. A first success is the survival rate of the sensors after installation and the quality of the measurements: 100% of the OFS cables intended for long-term monitoring survived installation. A few new configurations were tested with relative success. The measurements obtained are very promising: after three years of data, no difference is observed between the OFS cables and connection methods, and the strains fit well with the VWE and Pt sensors placed at the same locations. Data from the Brillouin instrument, which is sensitive to both strain and temperature, were compensated with data provided by the Raman instrument, which is sensitive only to temperature and housed in a separate fiber. These results provide confidence for the next steps of the qualification process, which consists of testing several data treatment approaches for direct analyses.
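
A hedged sketch of the temperature compensation step is shown below; the strain and temperature coefficients are typical textbook values for standard single-mode fiber, not the calibration constants used at the URL.

```python
C_STRAIN = 0.05   # Brillouin shift per unit strain [MHz/microstrain] (assumed)
C_TEMP = 1.0      # Brillouin shift per degree [MHz/degC] (assumed)

def compensated_strain(brillouin_shift_mhz, raman_delta_t_c):
    """Strain [microstrain] from the Brillouin frequency shift, with the
    temperature contribution removed using the co-located Raman reading."""
    return (brillouin_shift_mhz - C_TEMP * raman_delta_t_c) / C_STRAIN

# Example: a 12 MHz total shift of which 5 degC of warming explains 5 MHz
print(f"{compensated_strain(12.0, 5.0):.0f} microstrain")   # -> 140
```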

Keywords: monitoring, fiber optic, sensor, data treatment

Procedia PDF Downloads 116
408 Reverse Engineering of a Secondary Structure of a Helicopter: A Study Case

Authors: Jose Daniel Giraldo Arias, Camilo Rojas Gomez, David Villegas Delgado, Gullermo Idarraga Alarcon, Juan Meza Meza

Abstract:

Reverse engineering processes are widely used in industry with the main goal of determining the materials and manufacturing processes used to produce a component. Many characterization techniques and computational tools are used to obtain this information. A case study of reverse engineering applied to a secondary sandwich-hybrid structure used in a helicopter is presented. The methodology consists of five main steps, which can be applied to any other similar component: collecting information about the service conditions of the part, disassembly and dimensional characterization, functional characterization, characterization of material properties, and characterization of manufacturing processes. Together these provide the traceability records for the materials and processes of aeronautical products that ensure their airworthiness. A detailed explanation of each step is covered. The criticality and functionality of each part were assessed by analyzing state-of-the-art information and interviews with the technical groups of the helicopter's operators; 3D optical scanning, standard and advanced materials characterization techniques and finite element simulation made it possible to obtain all the characteristics of the materials used in the manufacture of the component. It was found that most of the materials are quite common in the aeronautical industry, including Kevlar, carbon, and glass fibers, aluminum honeycomb core, epoxy resin and epoxy adhesive. The stacking sequence and volumetric fiber fraction are critical for the mechanical behavior; an acid digestion method was used to determine them. This also helped determine the manufacturing technique, which in this case was vacuum bagging. Samples of the material were manufactured and submitted to mechanical and environmental tests. These results were compared with those obtained during reverse engineering, which allowed concluding that the materials and manufacturing processes had been correctly determined. Tooling was designed and built according to the geometry and the requirements of the manufacturing process. The part was manufactured, and the required mechanical and environmental tests were performed. Finally, geometric characterization and non-destructive techniques verified the quality of the part.
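As a hedged illustration of the acid digestion step, the fiber volume fraction follows from weighing the specimen before digestion and the fibers that remain after the matrix is dissolved, in the spirit of ASTM D3171. The densities below are generic handbook values, not those measured in the study, and the function name is ours.

```python
# Assumed typical densities; replace with measured values in practice.
RHO_FIBER = 1.78       # g/cm^3, typical carbon fiber
RHO_COMPOSITE = 1.55   # g/cm^3, typical CFRP laminate

def fiber_volume_fraction(mass_composite_g, mass_fiber_g,
                          rho_fiber=RHO_FIBER,
                          rho_composite=RHO_COMPOSITE):
    """Fibers remain after the matrix is dissolved in acid; their
    weighed mass gives V_f = (m_f/rho_f) / (m_c/rho_c) * 100."""
    return (mass_fiber_g / rho_fiber) / (mass_composite_g / rho_composite) * 100.0

# Example: a 2.00 g specimen leaves 1.30 g of fiber after digestion
print(f"{fiber_volume_fraction(2.00, 1.30):.1f} % fiber by volume")  # ~56.6 %
```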

Keywords: reverse engineering, sandwich-structured composite parts, helicopter, mechanical properties, prototype

Procedia PDF Downloads 395
407 Utilizing Topic Modelling for Assessing Mhealth App’s Risks to Users’ Health before and during the COVID-19 Pandemic

Authors: Pedro Augusto Da Silva E Souza Miranda, Niloofar Jalali, Shweta Mistry

Abstract:

BACKGROUND: Software developers utilize automated solutions to scrape users' reviews and extract meaningful knowledge to identify problems (e.g., bugs, compatibility issues) and possible enhancements (e.g., users' requests) to their solutions. However, most of these solutions do not consider risks to users' health. Recent works have shed light on the importance of including health risk considerations in the development cycle of mHealth apps to prevent harm to their users. PROBLEM: The COVID-19 pandemic in Canada (and the world) is currently forcing physical distancing upon the general population. This new lifestyle has made the usage of mHealth applications more essential than ever, with a projected market forecast of 332 billion dollars by 2025. However, this surge in mHealth usage comes with possible risks to users' health due to mHealth app problems (e.g., a wrong insulin dosage indication due to a UI error). OBJECTIVE: This work aims to raise awareness amongst mHealth developers of the importance of considering risks to users' health within their development lifecycle. Moreover, it also aims to provide mHealth developers with a Proof-of-Concept (POC) solution to understand, process, and identify possible health risks to users of mHealth apps based on users' reviews. METHODS: We conducted a mixed-method study design. We developed a crawler to mine the negative reviews of two sample mHealth apps (My Fitness, Medisafe) from Google Play store users. For each mHealth app, we performed the following steps: • The reviews are divided into two groups: before COVID-19 (reviews submitted before 15 Feb 2019) and during COVID-19 (reviews submitted from 16 Feb 2019 until Dec 2020). • For each period, a Latent Dirichlet Allocation (LDA) topic model is used to identify clusters of reviews with similar topics. • The topics before and during COVID-19 are compared, and significant differences in the frequency and severity of similar topics are identified. RESULTS: We successfully scraped, filtered, processed, and identified health-related topics using both qualitative and quantitative approaches. The results demonstrated the similarity between topics before and during COVID-19.
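A minimal sketch of the described pipeline, not the authors' actual code, might look as follows: reviews are split at the stated cutoff date and one LDA model is fitted per period with scikit-learn. The review records, number of topics and cutoff handling are illustrative placeholders; a real run would use thousands of scraped reviews and more topics.

```python
from datetime import datetime
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical scraped records: (submission date, review text)
reviews = [
    (datetime(2018, 6, 1), "app crashed and lost my medication schedule"),
    (datetime(2018, 9, 3), "sync is broken, reminders never fire"),
    (datetime(2020, 4, 2), "wrong dosage reminder shown after the update"),
    (datetime(2020, 7, 9), "cannot log blood pressure since last version"),
]
CUTOFF = datetime(2019, 2, 16)  # boundary stated in the abstract

periods = {
    "before": [text for date, text in reviews if date < CUTOFF],
    "during": [text for date, text in reviews if date >= CUTOFF],
}
for label, docs in periods.items():
    vec = CountVectorizer(stop_words="english")
    X = vec.fit_transform(docs)
    # n_components=1 only so this toy example runs; real data needs more
    lda = LatentDirichletAllocation(n_components=1, random_state=0).fit(X)
    terms = vec.get_feature_names_out()
    for topic in lda.components_:
        top = [terms[i] for i in topic.argsort()[-5:][::-1]]
        print(label, "->", ", ".join(top))
```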

Keywords: natural language processing (NLP), topic modeling, mHealth, COVID-19, software engineering, telemedicine, health risks

Procedia PDF Downloads 115
406 Capacity Building and Training of Health Personals for Disaster Preparedness in North East India

Authors: U. K. Tamuli

Abstract:

Introduction: North East India is graced with natural beauty but also exposed to natural hazards. The area is prone to major earthquakes, floods, landslides, accidents, terrorist activities, etc. The Academy of Trauma (AOT), an NGO of doctors, conducts training programmes, mock drills and field trials for doctors and paramedics in North East India. The present study evaluates the efficacy of such training in terms of sensitization, awareness, and the delivery of emergency care. In this region, the health care delivery system for disaster management is inadequate, and clear guidelines for mass casualty management are unavailable. AOT has initiated steps to raise awareness of mass casualty management and to improve the emergency health care delivery system. Method: AOT has conducted training programmes on emergency health management, mass casualty management and hospital preparedness for 800 doctors and 1200 paramedics in twenty-two districts of Assam in Northeast India. The training module consists of lectures, hands-on workshops using manikins, mock drills, distribution of manuals, emergency management exercises, and periodic exchanges of experience and debriefings. AOT evaluates the impact of the training through pre- and post-tests of delegates, trainers' evaluations, delegates' satisfaction and confidence levels, and their suggestions. Results: The modules, training sessions, hands-on workshops and mock drills were highly appreciated. There was significant improvement in scores on the post-training tests, and participants' confidence in dealing with emergency medical situations increased. Conclusion: Such training increases medical personnel's awareness of how to handle mass casualties in different situations, and a single session already sensitises the delegates. Repetition of the training, Training-of-Trainers (TOT) programmes, and individual efforts of delegates are extremely important for the sustainability and success of health care delivery services during disasters in developing countries. Further collaboration, assistance, networking, and suggestions from established global agencies in this field would be highly appreciated.

Keywords: capacity building, North East India, non-governmental organization, trauma

Procedia PDF Downloads 268
405 Improving the Detection of Depression in Sri Lanka: Cross-Sectional Study Evaluating the Efficacy of a 2-Question Screen for Depression

Authors: Prasad Urvashi, Wynn Yezarni, Williams Shehan, Ravindran Arun

Abstract:

Introduction: Primary health services are often the first point of contact that patients with mental illness have with the healthcare system. A number of tools have been developed to increase the detection of depression in the context of primary care. However, one challenge amongst many is administering these tools within the limited primary care consultation timeframe. Therefore, short questionnaires that screen for depression and are just as effective as more comprehensive diagnostic tools may help improve detection rates among patients visiting a primary care setting. Objective: To develop a 2-Question Questionnaire (2-QQ) to screen for depression in a suburban primary care clinic in Ragama, Sri Lanka, and to determine its sensitivity and specificity. The purpose is to develop a short, culturally adapted screening tool for depression in order to increase the detection of depression in the Sri Lankan patient population. Methods: This was a cross-sectional study involving two steps. Step one: verbal administration of the 2-QQ to patients by their primary care physician. Step two: completion of the Peradeniya Depression Scale (PDS), a validated diagnostic tool for depression, by the patient after their consultation with the primary care physician. The results from the PDS were then correlated with the results from the 2-QQ for each patient to determine the sensitivity and specificity of the 2-QQ. Results: A score of 1 or above on the 2-QQ was the most sensitive but least specific threshold. Thus, setting the threshold at this level is effective for correctly identifying depressed patients, but also inaccurately captures patients who are not depressed. A score of 6 on the 2-QQ was the most specific but least sensitive threshold. Setting the threshold at this level is effective for correctly identifying patients without depression, but not very effective at capturing patients with depression. Discussion: In the context of primary care, it may be worthwhile setting the 2-QQ screen at a lower threshold for positivity (such as a score of 1 or above), as illustrated in the sketch below. This would generate a high test sensitivity and thus capture the majority of patients who have depression. On the other hand, by setting a low threshold for positivity, patients who do not have depression but score 1 or higher on the 2-QQ will also be falsely identified as testing positive for depression. However, the benefits of identifying patients who present with depression may outweigh the harms of falsely identifying a non-depressed patient. It is our hope that the 2-QQ will serve as a quick primary screen for depression in the primary care setting and as a catalyst to identify and treat individuals with depression.
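A small sketch, using made-up 2-QQ scores and PDS reference labels, shows how the reported trade-off arises as the positivity threshold moves from 1 to 6:

```python
def sens_spec(scores, has_depression, threshold):
    """Classify score >= threshold as screen-positive and compare
    against the reference diagnosis (here, a stand-in for the PDS)."""
    tp = sum(s >= threshold and d for s, d in zip(scores, has_depression))
    fn = sum(s < threshold and d for s, d in zip(scores, has_depression))
    tn = sum(s < threshold and not d for s, d in zip(scores, has_depression))
    fp = sum(s >= threshold and not d for s, d in zip(scores, has_depression))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0, 1, 2, 3, 4, 5, 6, 1, 0, 4]   # hypothetical 2-QQ totals
labels = [False, False, True, True, True, True, True, False, False, True]
for t in (1, 6):
    sens, spec = sens_spec(scores, labels, t)
    print(f"threshold {t}: sensitivity={sens:.2f}, specificity={spec:.2f}")
# threshold 1 captures all depressed patients but misclassifies
# non-depressed ones; threshold 6 does the opposite.
```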

Keywords: depression, primary care, screening tool, Sri Lanka

Procedia PDF Downloads 234
404 The Thoughts and Feelings of 60-72 Month Old Children about School and Teacher

Authors: Ayse Ozturk Samur, Gozde Inal Kiziltepe

Abstract:

No matter what level of education it is, starting school is an exciting process, as it involves new experiences. In this process, the child steps into an environment and institution other than the family, into which he or she was born and in which he or she feels secure. That new environment is different from home; it is a social environment with its own rules, duties and responsibilities to be fulfilled, and new life experiences. Children who have a positive attitude towards school and like school are more enthusiastic and eager to participate in classroom activities. Moreover, a close relationship with the teacher enables the child to have positive emotions and ideas about the teacher and school and helps children adapt to school easily. This study aims to identify children's perceptions of academic competence, attitudes towards school and ideas about their teachers. In line with this aim, a mixed-method design including both qualitative and quantitative data collection was used, with the quantitative data supported by qualitative data. The study group consists of 250 randomly chosen children aged 60-72 months attending a preschool institution in a city center in the western Anatolian region of Turkey. Quantitative data were collected using the Feelings about School (FAS) scale. The scale consists of 12 items and 4 dimensions: school, teacher, mathematics, and literacy. A reliability and validity study of the scale was conducted by the researchers with 318 children aged 60-72 months. For content validity, expert opinions were sought; for construct validity, confirmatory factor analysis was used. Reliability was examined by calculating the internal consistency coefficient (Cronbach's alpha). The analyses showed that the FAS is a valid and reliable instrument for identifying 60-72-month-old children's perceptions of their academic competence, attitudes toward school and ideas about their teachers. For the qualitative dimension of the study, semi-structured interviews were conducted with 30 children aged 60-72 months. The study found that children's perceptions of their academic competence and attitudes towards school were at a medium level, while their ideas about their teachers were high. The semi-structured interviews likewise showed that the children have positive perceptions of school and teacher; thus, the quantitative findings are supported by the qualitative data.
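For reference, the internal consistency coefficient reported in the study can be computed as in the following minimal sketch; the response matrix is invented, and the FAS has 12 items rather than the 3 shown here.

```python
import numpy as np

def cronbach_alpha(item_scores):
    """alpha = k/(k-1) * (1 - sum of item variances / total variance),
    computed over an (n_children, k_items) score matrix."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses of 4 children to 3 items (e.g., on a 1-3 scale)
scores = [[3, 3, 2], [2, 2, 2], [1, 2, 1], [3, 3, 3]]
print(round(cronbach_alpha(scores), 2))  # -> 0.92
```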

Keywords: feelings, preschool education, school, teacher, thoughts

Procedia PDF Downloads 208
403 Structuring Highly Iterative Product Development Projects by Using Agile-Indicators

Authors: Guenther Schuh, Michael Riesener, Frederic Diels

Abstract:

Nowadays, manufacturing companies are faced with the challenge of meeting heterogeneous customer requirements in short product life cycles with a variety of product functions. So far, some of the functional requirements remain unknown until late stages of the product development. A way to handle these uncertainties is the highly iterative product development (HIP) approach. By structuring the development project as a highly iterative process, this method delivers customer-oriented and marketable products. There are first approaches for combined, hybrid models comprising deterministic-normative methods like the Stage-Gate process and empirical-adaptive development methods like SCRUM on a project management level. However, the question of which development scopes are preferably realized with empirical-adaptive and which with deterministic-normative approaches has remained almost unconsidered. In this context, a development scope constitutes a self-contained section of the overall development objective. Therefore, this paper focuses on a methodology that deals with the uncertainty of requirements within the early development stages and the corresponding selection of the most appropriate development approach. For this purpose, internal influencing factors like a company's technological capability, the prototype manufacturability and the potential solution space, as well as external factors like market accuracy, relevance and volatility, are analyzed and combined into an Agile-Indicator. The Agile-Indicator is derived in three steps. First, each internal and external factor is rated in terms of its importance for the overall development task. Secondly, each requirement is evaluated against every internal and external factor with regard to its suitability for empirical-adaptive development. Finally, the weighted sums of the internal and external sides are combined into the Agile-Indicator. Thus, the Agile-Indicator constitutes a company-specific and application-related criterion on which the allocation of empirical-adaptive and deterministic-normative development scopes can be based. In a last step, the indicator is used to cluster development scopes by applying the fuzzy c-means (FCM) clustering algorithm. The FCM method determines sub-clusters within functional clusters based on the empirical-adaptive environmental impact of the Agile-Indicator. By means of the methodology presented in this paper, it is possible to classify requirements subject to market uncertainty into empirical-adaptive or deterministic-normative development scopes.
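A sketch of the derivation and clustering, under invented factor names, weights and ratings (the paper defines the real internal and external factors), could look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 1: importance weights for assumed internal/external factors
weights = {"technology_ability": 0.4, "prototype_mfg": 0.3,    # internal
           "market_volatility": 0.2, "market_accuracy": 0.1}   # external
# Step 2: rating of each requirement per factor (1 = deterministic,
# 5 = empirical-adaptive); random here for 10 requirements
ratings = rng.integers(1, 6, size=(10, len(weights)))
# Step 3: weighted sum per requirement -> Agile-Indicator
agile_indicator = ratings @ np.array(list(weights.values()))

def fcm(x, c=2, m=2.0, iters=100):
    """Minimal fuzzy c-means on a 1-D indicator: alternate center and
    membership updates with fuzzifier m."""
    x = x.reshape(-1, 1)
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(iters):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0).reshape(-1, 1)
        d = np.abs(x - centers.T) + 1e-9          # distances to centers
        p = 2.0 / (m - 1.0)
        u = (1.0 / d ** p) / (1.0 / d ** p).sum(axis=1, keepdims=True)
    return centers.ravel(), u

centers, membership = fcm(agile_indicator)
print("cluster centers:", centers.round(2))
# Requirements closest to the higher center form the empirical-adaptive scopes
print("empirical-adaptive scopes:",
      np.where(membership.argmax(axis=1) == centers.argmax())[0])
```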

Keywords: agile, highly iterative development, agile-indicator, product development

Procedia PDF Downloads 228
402 The Millennium Development Goals and Algerian Economic Policy: Some Evidences

Authors: Abdelkader Guendouz, Fatima Zohra Adel

Abstract:

While both economic and human development are central pillars of its overall policy, the Algerian government has, from the outset, been increasingly engaged in the international effort to reach the so-called Millennium Development Goals (MDGs). A close look at Algerian economic policy reveals several programs targeting both economic and social achievements, including, among others, poverty reduction, improvement of education levels and conditions, and the advancement of women's status and gender equity. These efforts have been channeled through major plans, among them: the PSRE (Plan de Soutien à la Relance Economique), covering 2001 to 2004 and funded with about 7 billion US dollars, which focused on three objectives, namely poverty reduction, job creation, and regional equilibrium with rural revitalization; and the PCSC (Programme Complémentaire de Soutien à la Croissance Economique), covering 2005 to 2009 with initial funding of 114 billion US dollars, which aimed to develop public services and support public investment, especially in social infrastructure. Now, at the end of the MDG agenda, an important question arises: what are the main achievements regarding these MDGs? To answer this question, the present paper examines Algerian economic (and social) policy against the MDG challenges for the period from 2000 to 2010, extending to 2015. The examination focuses on three main targets, namely poverty, education, and health. A statistical assessment of the Algerian economic and social situation shows that almost all MDGs were reached during the period 2000 to 2009, and that progress has since been maintained and improved. Several achievements support this observation. Starting with poverty reduction, the proportion of the population living on less than 1 US dollar per day fell from 8.0% in 2000 to 0.5% in 2009 and 0.3% in 2015. In education, the enrolment ratio of six-year-old children, the most significant index of school attendance, reached about 98% in 2009, against 93% in 1999 and only 43% in 1966. Concluding with health care and related services, the Algerian government has made great strides in providing the population with easy access to this sector; the percentage of assisted deliveries rose from 91.2% in 2000 to 97.2% in 2009.

Keywords: Algerian economic policy, MDGs, poverty, education, health

Procedia PDF Downloads 239
401 Determining the Effective Substance of Cottonseed Extract on the Treatment of Leishmaniasis

Authors: Mehrosadat Mirmohammadi, Sara Taghdisi, Ali Padash, Mohammad Hossein Pazandeh

Abstract:

Gossypol, a yellowish anti-nutritional compound found in cotton plants, exists in various plant parts, including seeds, husks, leaves, and stems. Chemically, gossypol is a potent polyphenolic aldehyde with antioxidant and therapeutic properties. However, its free form can be toxic, posing risks to both humans and animals. Initially, we extracted gossypol from cotton seeds using n-hexane as a solvent (yield: 84.0 ± 4.0%). We also obtained cotton seed and cotton boll extracts via Soxhlet extraction (25:75 hydroalcoholic ratio). These extracts, combined with cornstarch, formed four herbal medicinal formulations. Ethical approval allowed us to investigate their effects on Leishmania-caused skin wounds, comparing them with glucantime (administered locally by ampoule). The herbal formulas outperformed the control group (ethanol only) in wound treatment (p-value < 0.05). The average wound diameter after two months did not significantly differ between the plant extract ointments and topical glucantime. Notably, the cotton boll extract with 1% added gossypol crystals showed the best therapeutic effect. Gossypol was extracted from cotton seeds using n-hexane via Soxhlet extraction, followed by saponification, acidification, and recrystallization steps; FTIR, UV-Vis, and HPLC analyses confirmed the product's identity. Herbal medicines from cotton seeds effectively treated chronic wounds compared to the ethanol-only control group, and wound diameter differed significantly between the extract ointments and glucantime injections. It appears that the large amount of fat in the oil poses many obstacles to extracting gossypol from it. Extraction with our technique showed that extraction from the oil is more efficient, perhaps because, when the oil is prepared by cold pressing, much less of the compound is lost than when extraction is done by Soxhlet. On the other hand, the gossypol in the oil is mostly bound to protein, which protects it until the last stage of the extraction process. Since this compound is very sensitive to light and heat, it was extracted as an acetic acid derivative. The treatment results also showed that the ointment prepared with the extract is more effective and that gossypol is one of its active ingredients. Therefore, gossypol can be extracted from the oil and added back to the gossypol-depleted extract to produce an effective medicine with a defined dose.

Keywords: cottonseed, glucantime, gossypol, leishmaniasis

Procedia PDF Downloads 38
400 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Predictive quality exploits the great potential for saving quality control effort through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes; as a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which this data situation makes more difficult. The implementation of a machine learning application can itself be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science, and, as in any process, the costs of eliminating errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase of whether regression or classification is more suitable. In the context of this work, the initial phase of the CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for inspection-decision classification are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
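A hedged sketch of the comparison, on synthetic data rather than Bosch Rexroth's, contrasts predicting the leakage volume flow directly (regression) with predicting only the pass/fail inspection decision (classification); the feature semantics and leakage limit are placeholders:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor
from sklearn.metrics import accuracy_score, mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 4))   # e.g., test pressures, temperatures
leakage = 2.0 + X @ np.array([0.5, -0.3, 0.2, 0.0]) \
          + rng.normal(scale=0.2, size=500)
LIMIT = 2.5                     # assumed pass/fail threshold
y_class = (leakage > LIMIT).astype(int)  # 1 = reject

X_tr, X_te, l_tr, l_te, c_tr, c_te = train_test_split(
    X, leakage, y_class, random_state=0)

reg = RandomForestRegressor(random_state=0).fit(X_tr, l_tr)
clf = RandomForestClassifier(random_state=0).fit(X_tr, c_tr)

print("leakage MAE (regression):",
      mean_absolute_error(l_te, reg.predict(X_te)))
print("decision accuracy via regression:",
      accuracy_score(c_te, reg.predict(X_te) > LIMIT))
print("decision accuracy via classifier:",
      accuracy_score(c_te, clf.predict(X_te)))
```

Framing the problem as a direct classifier of the inspection decision avoids estimating the full leakage value and, as the abstract reports for the real use case, can yield the more accurate decision.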

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 130