Search results for: corpus based approach
33820 Psychosocial Determinants of School Violent Behavior and the Efficacy of Covert Sensitization in Combination with Systematic Approach Therapy among Male Students in Lagos Metropolis: Implications for Student Counselors
Authors: Fidel O. Okopi, Aminu Kazeem Ibrahim
Abstract:
The study investigated psychosocial determinants (attitudes and self-esteem) of school violent behavior and the efficacy of covert sensitization therapy in combination with systematic approach therapy among male students in the Lagos metropolis. An ex-post facto experimental research design was adopted for the study. The sample consisted of 39 students exhibiting school violent behavior, identified through the School Disciplinary Record Books, and another 39 students without school violent behavior, identified through randomization. The two groups were drawn from four randomly selected public senior secondary schools. The School Violent Behavior Attitudes Scale (SVBAS) and the School Violent Behavior Self-Esteem Scale (SVBSES) were used to collect data for the study. Face and content validity were established, with reliability coefficients of 0.772 for the SVBAS and 0.813 for the SVBSES. The results showed that the attitudes of students with school violent behavior do not differ significantly from those of non-violent students; that the self-esteem of students with school violent behavior differs significantly from that of non-violent students; and that covert sensitization therapy in combination with systematic approach therapy was effective in modifying the self-esteem and attitudes of violent students, as surfaced in the pre-test and post-test analysis of their responses. School counselors in metropolitan areas can modify male students' school violent behaviors that are traced to attitude and self-esteem using covert sensitization therapy in combination with systematic approach therapy.
Keywords: psychosocial determinants, violent behavior, covert sensitization therapy, systematic approach therapy
Procedia PDF Downloads 397
33819 An Energy-Efficient Model of Integrating Telehealth IoT Devices with Fog and Cloud Computing-Based Platform
Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo
Abstract:
The rapid growth of telehealth Internet of Things (IoT) devices has raised concerns about energy consumption and efficient data processing. This paper introduces an energy-efficient model that integrates telehealth IoT devices with a fog and cloud computing-based platform, offering a sustainable and robust solution to overcome these challenges. Our model employs fog computing as a localized data processing layer while leveraging cloud computing for resource-intensive tasks, significantly reducing energy consumption. We incorporate adaptive energy-saving strategies. Simulation analysis validates our approach's effectiveness in enhancing energy efficiency for telehealth IoT systems integrated with localized fog nodes and both private and public cloud infrastructures. Future research will focus on further optimization of the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability in other healthcare and industry sectors.
Keywords: energy-efficient, fog computing, IoT, telehealth
Procedia PDF Downloads 86
33818 Community Arts-Based Learning for Interdisciplinary Pedagogy: Measuring Program Effectiveness Using Design Imperatives for 'a New American University'
Authors: Kevin R. Wilson, Roger Mantie
Abstract:
Community arts-based learning and participatory education are pedagogical techniques that benefit students, curriculum development, and local communities. Using an interpretive approach to examine the significance of this arts-informed research in relation to the eight 'design imperatives' proposed as the new model for measuring quality in scholarship at Arizona State University as 'A New American University', the purpose of this study was to investigate the personal, social, and cultural benefits resulting from student engagement in interdisciplinary community-based projects. Students from a graduate-level music education class at the ASU Tempe campus (n=7) teamed with students from an undergraduate-level community development class at the ASU Downtown Phoenix campus (n=14) to plan, facilitate, and evaluate seven community-based projects in several locations around the Phoenix metro area. Data were collected using photo evidence, student reports, and evaluative measures designed by the students. The effectiveness of each project was measured in terms of its ability to meet the eight design imperatives: 1) leverage place; 2) transform society; 3) value entrepreneurship; 4) conduct use-inspired research; 5) enable student success; 6) fuse intellectual disciplines; 7) be socially embedded; and 8) engage globally. Results indicated that this community arts-based project sufficiently captured the essence of each of these eight imperatives. Implications for how the nature of this interdisciplinary initiative allowed the eight imperatives to manifest are provided, and project success is expounded upon in relation to the utility of each imperative. Discussion is also given to how this type of service-learning project, formatted within the 'New American University' model for measuring quality in academia, can be a beneficial pedagogical tool in higher education.
Keywords: community arts-based learning, participatory education, pedagogy, service learning
Procedia PDF Downloads 401
33817 Improving Search Engine Performance by Removing Indexes to Malicious URLs
Authors: Durga Toshniwal, Lokesh Agrawal
Abstract:
As the web continues to play an increasing role in information exchange and daily activities, computer users have become the target of miscreants who infect hosts with malware or adware for financial gain. Unfortunately, even a single visit to a compromised website enables the attacker to detect vulnerabilities in the user's applications and force the download of a multitude of malware binaries. We provide an approach to effectively scan the Internet for so-called drive-by downloads. Drive-by downloads are the result of URLs that attempt to exploit their visitors and cause malware to be installed and run automatically. To scan the web for malicious pages, the first step is to use a crawler to collect URLs that live on the Internet, and then to apply fast prefiltering techniques to reduce the number of pages that need to be examined by precise, but slower, analysis tools (such as honeyclients or antivirus programs). Although this technique is effective, it requires a substantial amount of resources, mainly because the crawler encounters many legitimate pages that need to be filtered out. In this paper, to characterize the nature of this rising threat, we present the implementation of a web crawler in Python and an approach to search the web more efficiently for pages that are likely to be malicious, filtering out benign pages and passing the remaining pages to an antivirus program for malware detection. Our approach starts from an initial seed of known malicious web pages. Using these seeds, our system generates search engine queries to identify other malicious pages that are similar to the ones in the initial seed. By doing so, it leverages the crawling infrastructure of search engines to retrieve URLs that are much more likely to be malicious than a random page on the web.
The results show that this guided approach identifies malicious web pages more efficiently than random crawling-based approaches.
Keywords: web crawler, malware, seeds, drive-by downloads, security
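The seed-driven pipeline described here, deriving search queries from known-malicious pages, prefiltering the returned URLs, and passing only the survivors to a slow scanner, can be sketched roughly as follows. This is a toy illustration: `search_engine` and `antivirus` are hypothetical stand-ins for a real search API and a honeyclient/antivirus back end, and the keyword prefilter is only a placeholder heuristic.

```python
from collections import Counter

def extract_query_terms(seed_pages, top_k=1):
    # Heuristic: reuse the most frequent tokens of known-malicious pages as search queries.
    counts = Counter(tok for page in seed_pages for tok in page.lower().split())
    return [term for term, _ in counts.most_common(top_k)]

def prefilter(url: str) -> bool:
    # Fast, cheap filter; a real prefilter inspects page content, redirects, obfuscated JS, etc.
    suspicious_markers = ("download", "crack", "keygen", "free-codec")
    return any(marker in url for marker in suspicious_markers)

def guided_scan(seed_pages, search_engine, antivirus, top_k=1):
    # Query the search engine with seed-derived terms, prefilter, then run the slow scanner.
    candidates = [url for q in extract_query_terms(seed_pages, top_k)
                  for url in search_engine(q)]
    return [url for url in candidates if prefilter(url) and antivirus(url)]
```

In the paper's setting, the prefilter would encode features learned from the seed set rather than a fixed keyword list.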
Procedia PDF Downloads 229
33816 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy
Authors: Abdullah Al Mamun, Talal Alkharobi
Abstract:
As data sizes grow, people have become more accustomed to storing large amounts of secret information in cloud storage, and companies routinely need to transfer massive business files from one end to another. We lose privacy if we transmit such data as it is, and repeating this scenario without securing the communication mechanism, i.e., proper encryption, compounds the risk. Although asymmetric-key encryption solves the main problem of symmetric-key encryption, it can only encrypt a limited amount of data, which makes it inapplicable for large data encryption. In this paper, we propose a probable approach, in the spirit of Pretty Good Privacy, to encrypt big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel while protecting business and personal privacy. To justify our method, an experimental dataset from three different platforms is provided. We show that our approach works efficiently and reliably for massive amounts of various data.
Keywords: big data, cloud computing, cryptography, hadoop, public key
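The PGP-style hybrid scheme this abstract relies on, a fast symmetric cipher for the bulk data plus an asymmetric cipher that wraps only the small session key, can be sketched in a few lines. Everything below is a deliberately toy, stdlib-only illustration: the textbook RSA key (n = 3233) and the SHA-256 counter keystream stand in for real RSA and AES/PGP and must not be used for actual security.

```python
import hashlib
import secrets

# Toy textbook RSA (the classic n = 61 * 53 example); wraps the session key byte by byte.
N, E, D = 3233, 17, 2753  # e * d = 1 (mod 3120)

def rsa_wrap(key: bytes):
    return [pow(b, E, N) for b in key]

def rsa_unwrap(wrapped) -> bytes:
    return bytes(pow(c, D, N) for c in wrapped)

# Toy symmetric stream cipher: SHA-256 counter keystream XORed with the data (stand-in for AES).
def keystream_xor(key: bytes, data: bytes) -> bytes:
    out = bytearray()
    for start in range(0, len(data), 32):
        pad = hashlib.sha256(key + start.to_bytes(8, "big")).digest()
        out.extend(b ^ p for b, p in zip(data[start:start + 32], pad))
    return bytes(out)

def hybrid_encrypt(data: bytes):
    session_key = secrets.token_bytes(16)  # fast symmetric key encrypts the bulk data
    return rsa_wrap(session_key), keystream_xor(session_key, data)

def hybrid_decrypt(wrapped, ciphertext: bytes) -> bytes:
    return keystream_xor(rsa_unwrap(wrapped), ciphertext)
```

Only the 16-byte session key is ever pushed through the slow asymmetric cipher, which is exactly how the scheme sidesteps the size limit of asymmetric encryption.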
Procedia PDF Downloads 320
33815 Design-Based Elements to Sustain Participant Activity in Massive Open Online Courses: A Case Study
Authors: C. Zimmermann, E. Lackner, M. Ebner
Abstract:
Massive Open Online Courses (MOOCs) are increasingly popular learning hubs boasting considerable participant numbers, innovative technical features, and a multitude of instructional resources. Still, there is strong evidence that almost all MOOCs suffer from a declining frequency of participant activity and fairly low completion rates. In this paper, we share the lessons learned in implementing several design patterns that have been suggested to foster participant activity. Our conclusions are based on experiences with the 'Dr. Internet' MOOC, which was created as an xMOOC to raise awareness for a more critical approach to online health information: participants had to diagnose medical case studies. There is a growing body of recommendations (based on learning analytics results from earlier xMOOCs) as to how the decline in participant activity can be alleviated. One promising focus in this regard is instructional design patterns, since they have a tremendous influence on the learner's motivation, which in turn is a crucial trigger of learning processes. Since the Medieval Age, storytelling has relied on micro-units and comprehensible narrative structures to keep an audience following the narration. Hence, MOOC participants are not likely to abandon a course or information channel when their curiosity is kept at a continuously high level. Critical aspects that warrant consideration in this regard include a shorter course duration, a narrative structure with suspense peaks (following the storytelling approach), and a course schedule that is diversified and stimulating, yet easy to follow. All of these criteria were observed in the design of the Dr. Internet MOOC: 1) the standard eight-week course duration was shortened to six weeks; 2) all six case studies had a special quiz format and a corresponding resolution video, which was made available in the subsequent week; 3) two of the six case studies were split into serial video sequences presented over the span of two weeks; and 4) the videos were generally scheduled in a less predictable sequence. However, the statistical results from the first run of the MOOC do not indicate any strong influence on the retention rate, so we conclude with some suggestions as to why this might be and what aspects need further consideration.
Keywords: case study, Dr. Internet, experience, MOOCs, design patterns
Procedia PDF Downloads 266
33814 Soil Moisture Control System: A Product Development Approach
Authors: Swapneel U. Naphade, Dushyant A. Patil, Satyabodh M. Kulkarni
Abstract:
In this work, we propose the concept and geometrical design of a soil moisture control system (SMCS) module, following the product development approach to develop an inexpensive, easy-to-use and quick-to-install product targeted at agriculture practitioners. The module delivers water to agricultural land efficiently by sensing the soil moisture and activating the delivery valve. We start by identifying the general needs of the potential customer. Then, based on customer needs, we establish product specifications and identify important measurable quantities to evaluate our product. Keeping the specifications in mind, we develop various conceptual solutions of the product and select the best solution through concept screening and selection matrices. We then develop the product architecture by integrating the systems into the final product. Finally, the geometric design is done using human factors engineering concepts such as heuristic analysis, task analysis, and human error reduction analysis. The human factors analysis reveals the remedies that should be applied while designing the geometry and software components of the product. We find that, for a power-type grip, a grip diameter of 35 mm is ideal in terms of comfort and applied force.
Keywords: agriculture, human factors, product design, soil moisture control
Procedia PDF Downloads 172
33813 Lexical Based Method for Opinion Detection on Tripadvisor Collection
Authors: Faiza Belbachir, Thibault Schienhinski
Abstract:
The massive development of online social networks allows users to post and share their opinions on various topics. With this huge volume of opinion, it is interesting to extract and interpret this information for different domains, e.g., product and service benchmarking, politics, and recommender systems. This is why opinion detection is one of the most important research tasks. It consists of differentiating between opinion data and factual data; the difficulty lies in determining an approach that returns opinionated documents. Generally, two approaches are used for opinion detection: lexicon-based approaches and machine learning-based approaches. In lexicon-based approaches, a dictionary of sentiment words is used, with each word associated with a weight; the opinion score of a document is derived from the occurrence of words from this dictionary. In machine learning approaches, a classifier is usually trained on a set of annotated documents containing sentiment, using features such as word n-grams, part-of-speech tags, and logical forms. The majority of these works rely on document text alone to determine the opinion score, without taking into account whether these texts are really reliable. It is therefore interesting to exploit other information to improve opinion detection. In our work, we develop a new way to compute the opinion score by introducing the notion of a trust score. We determine not only whether documents are opinionated, but also whether these opinions are really trustworthy information in relation to the topics. For that, we use the SentiWordNet lexicon to calculate opinion scores and compute different features about users (number of comments, number of useful comments, average useful reviews) to derive trust scores. We then combine the opinion score and the trust score to obtain a final score. We applied our method to detect trusted opinions in the TripAdvisor collection.
Our experimental results report that combining the opinion score with the trust score improves opinion detection.
Keywords: TripAdvisor, opinion detection, SentiWordNet, trust score
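A minimal sketch of the scoring idea, a lexicon-derived opinion score combined linearly with a user trust score, might look as follows. The tiny inline lexicon stands in for SentiWordNet, and the 50/50 combination weight `alpha` is an assumption, not the weighting used in the study.

```python
# Tiny inline lexicon standing in for SentiWordNet (word -> signed sentiment weight).
LEXICON = {"great": 0.8, "love": 0.7, "terrible": -0.9, "awful": -0.8, "nice": 0.5}

def opinion_score(text: str) -> float:
    # Average absolute sentiment weight per token: higher means more opinionated.
    words = text.lower().split()
    if not words:
        return 0.0
    return sum(abs(LEXICON.get(w, 0.0)) for w in words) / len(words)

def trust_score(n_reviews: int, n_useful: int) -> float:
    # Fraction of the user's reviews marked useful, as a proxy for trustworthiness.
    return n_useful / n_reviews if n_reviews else 0.0

def final_score(text: str, n_reviews: int, n_useful: int, alpha: float = 0.5) -> float:
    # Linear combination of opinion and trust (alpha is an assumed weighting).
    return alpha * opinion_score(text) + (1 - alpha) * trust_score(n_reviews, n_useful)
```

An opinionated review from a user with a strong "useful" track record thus outranks both factual text and opinions from untrusted users.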
Procedia PDF Downloads 199
33812 Conformal Noble Metal High-Entropy Alloy Nanofilms by Atomic Layer Deposition for Enhanced Hydrogen Evolution Reaction/Oxygen Evolution Reaction Electrocatalysis Applications
Authors: Jing Lin, Zou Yiming, Goei Ronn, Li Yun, Amanda Ong Jiamin, Alfred Tok Iing Yoong
Abstract:
High-entropy alloy (HEA) coatings comprise multiple (five or more) principal elements that give superior mechanical, electrical, and thermal properties. However, current synthesis methods for HEA coatings still face huge challenges in facile and controllable preparation, as well as conformal integration, which seriously restricts their potential applications. Herein, we report the controllable synthesis of a conformal quinary HEA coating consisting of noble metals (Rh, Ru, Ir, Pt, and Pd) by using atomic layer deposition (ALD) with a post-annealing approach. This approach realizes low-temperature (below 200 °C), precisely controlled (nanoscale), and conformal synthesis (over complex substrates) of the HEA coating. Furthermore, the resulting quinary HEA coating shows promising potential as a platform for catalysis, exhibiting substantially enhanced electrocatalytic hydrogen evolution reaction (HER) and oxygen evolution reaction (OER) performance as compared to other noble metal-based structures such as single-metal coatings or multi-layered metal composites.
Keywords: high-entropy alloy, thin-film, catalysis, water splitting, atomic layer deposition
Procedia PDF Downloads 126
33811 [Keynote Speech]: Evidence-Based Outcome Effectiveness Longitudinal Study on Three Approaches to Reduce Proactive and Reactive Aggression in Schoolchildren: Group CBT, Moral Education, Bioneurological Intervention
Authors: Annis Lai Chu Fung
Abstract:
Because aggression shows high stability throughout developmental stages and across generations, developing prevention and intervention programmes for aggressive children, and for children at risk of developing aggressive behaviours, should be a top priority for researchers and frontline helping professionals. Although there is a substantial number of anti-bullying programmes, they have yielded disappointingly small effect sizes. This negligible practical significance can be attributed to the overly simplistic categorisation of the individuals involved as bullies or victims. Over the past three decades, the distinction between reactive and proactive aggression has been well established. As children displaying reactively aggressive behaviours have distinct social-information-processing patterns from those showing proactively aggressive behaviours, it is critical to identify the unique needs of the two subtypes when designing an intervention. The onset of reactive aggression and proactive aggression has been observed as early as 4.4 and 6.8 years of age, respectively. Such findings call for differential early intervention targeting these high-risk children. However, to the best of the author's knowledge, the author was the first to establish an evidence-based intervention programme against reactive and proactive aggression. With the largest samples in the world, the author has, over the past 10 years, explored three different approaches and their effectiveness against aggression, quantitatively and qualitatively, with a longitudinal design. The three approaches presented are (a) a cognitive-behavioural approach, (b) moral education, with Chinese martial arts and ethics as the medium, and (c) bioneurological measures (omega-3 supplementation). The studies adopted a multi-informant approach with repeated measures before and after the intervention, plus follow-up assessment. Participants were recruited from primary and secondary schools in Hong Kong.
In the cognitive-behavioural approach, 66 reactive aggressors and 63 proactive aggressors, aged 11 to 17, were identified from 10,096 secondary-school children through a questionnaire and a subsequent structured interview. Participants underwent 10 group sessions specifically designed for each subtype of aggressor. Results revealed significant declines in aggression levels from the baseline to the follow-up assessment after one year. In moral education through the Chinese martial arts, 315 high-risk aggressive children, aged 6 to 12, were selected from 3,511 primary-school children and randomly assigned to one of four types of 10-session intervention group: martial-skills-only, martial-ethics-only, both martial-skills-and-ethics, and physical fitness (placebo). Results showed that only the martial-skills-and-ethics group had a significant reduction in aggression after treatment, and six months after treatment, compared with the placebo group. In the bioneurological approach, 218 children, aged 8 to 17, were randomly assigned to an omega-3 supplement group or a placebo group. Results revealed that, compared with the placebo group, the omega-3 supplement group had significant declines in aggression levels at the 6-month follow-up assessment. All three approaches were effective in reducing proactive and reactive aggression. Traditionally, intervention programmes against aggressive behaviour have often adopted a cognitive and/or behavioural approach. However, the cognitive-behavioural approach for children has recently been challenged for its demanding requirement of cognitive ability, and traditional cognitive interventions may not be as beneficial to an older population as to young children. The present study offers an insightful perspective on aggression reduction measures.
Keywords: intervention, outcome effectiveness, proactive aggression, reactive aggression
Procedia PDF Downloads 222
33810 The Role of Video in Teaching and Learning Pronunciation: A Case Study
Authors: Kafi Razzaq Ahmed
Abstract:
Speaking fluently in a second language requires vocabulary, grammar, and pronunciation skills, and teaching the English language entails teaching pronunciation. In the professional literature, there have been many attempts to integrate technology into improving learners' pronunciation. This technique remains neglected in Kurdish contexts, including Salahaddin University – Erbil. Thus, the main aim of the research is to point out the efficiency of using video materials, for both language teachers and learners, within and beyond classroom learning and teaching environments to enhance students' pronunciation. To collect practical data, a research project has been designed. In subsequent research, a post-test will be administered after each lesson to 100 first-year students in the English Department at Salahaddin University – Erbil. All students will be taught the same material using different methods, one based on video materials and the other based on the traditional approach to teaching pronunciation. Finally, the results of both tests will be analyzed, along with the attitudes of both the teachers and the students toward both lessons, to indicate the impact of using video in the process of teaching and learning pronunciation.
Keywords: video, pronunciation, teaching, learning
Procedia PDF Downloads 108
33809 Model of Production and Marketing Strategies in Alignment with Business Strategy using QFD Approach
Authors: Hamed Saremi, Suzan Taghavy, Shahla Saremi
Abstract:
In today's competitive world, organizations are expected to surpass their competitors and make the most of their resources. The need to improve current performance is therefore felt more than ever, and this requires identifying optimal organizational strategies and considering all strategies simultaneously. In this study, to enhance competitive advantage in line with customer requirements, the House of Quality (QFD) approach is used to align business, production, and marketing strategies, and a zero-one linear programming model is studied. First, to align production and marketing strategies with the business strategy, the independent weights of these strategies are calculated. Then, using the QFD approach, the aligned weights of the optimal strategies in each production and marketing field are obtained. Finally, the aligned marketing strategies are selected for the purpose of allocating budget and specialist human resources to marketing functions, leading to increased competitive advantage and benefit.
Keywords: strategy alignment, house of quality deployment, production strategy, marketing strategy, business strategy
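Once the aligned weights are known, the zero-one selection step reduces to a small binary program. The sketch below brute-forces it for a handful of hypothetical marketing strategies with made-up weights, costs, and budget; a real instance would use the QFD-derived weights and a proper LP/MIP solver.

```python
from itertools import product

# Hypothetical QFD-derived alignment weights and costs (all numbers made up).
strategies = ["digital ads", "trade shows", "loyalty program", "sponsorship"]
weights = [0.35, 0.20, 0.30, 0.15]  # alignment with the business strategy
costs = [40, 55, 30, 25]            # budget units required
BUDGET = 70

def select_strategies():
    # Brute-force zero-one program: maximize total aligned weight within the budget.
    best_value, best_pick = -1.0, None
    for pick in product([0, 1], repeat=len(strategies)):
        cost = sum(c * x for c, x in zip(costs, pick))
        value = sum(w * x for w, x in zip(weights, pick))
        if cost <= BUDGET and value > best_value:
            best_value, best_pick = value, pick
    chosen = [s for s, x in zip(strategies, best_pick) if x]
    return chosen, best_value
```

Enumerating all 0-1 assignments is only feasible for a handful of strategies, but it makes the structure of the selection step explicit.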
Procedia PDF Downloads 435
33808 Historical Memory and Social Representation of Violence in Latin American Cinema: A Cultural Criminology Approach
Authors: Maylen Villamanan Alba
Abstract:
Latin America is marked by its history: conquest, colonialism, and slavery left deep footprints in most Latin American countries. The past century was also scarred by wars, military dictatorships, and political violence, which profoundly influenced Latin American popular culture. Consequently, reminiscences of historical crimes are frequently present in daily life, media, public opinion, and the arts. This legacy is remembered in novels, paintings, songs, and films. In fact, Latin American cinema has a tradition of fiction films that strive for verisimilitude with reality: these films about historical violence are narrated through fictional characters, but their stories are based on real historical contexts. Therefore, cultural criminology has considered film a significant field for understanding social representations of violence related to historical crimes. The aim of the present contribution is to analyze the legacy of the past and historical memory in social representations of violence in Latin American cinema, as a critical approach to historical crimes. This qualitative research is based on content analysis. The sample comprises seven multi-award-winning films from the International Festival of New Latin American Cinema of Havana: Kamchatka, Argentina (2002); Carandiru, Brazil (2003); Enlightened by Fire, Argentina (2005); Post Mortem, Chile (2010); No, Chile (2012); Wakolda, Argentina (2013); and The Clan, Argentina (2015). Cultural criminology highlights that cinema shapes the meanings of social practices such as historical crimes, while critical criminology offers a theoretical framework for interpreting Latin American cinema. This analysis reveals historical conditions deeply associated with power relationships, policy, and inequality issues. As indicated by this theory, violence is characterized as a structural process based on social asymmetries. These social asymmetries cut across social scopes, including institutional and personal dimensions.
Thus, state institutions are depicted through the personal stories of characters involved in human conflicts. Intimacy and social background are linked in personages who perform roles such as soldiers, policemen, professionals, or inmates while at the same time being depicted as human beings with family, gender, racial, ideological, or generational issues. Social representations of violence related to the legacy of the past portray historical crimes perpetrated against Latin American citizens, and they have thereby contributed to political positions, social behaviors, and public opinion. The legacy of these historical crimes suggests a path that should never be taken again: the past is a reminder, a warning, and a historic lesson for Latin American people. Social representations of violence are permeated by historical memory as denunciation under a critical approach.
Keywords: Latin American cinema, historical memory, social representation, violence
Procedia PDF Downloads 147
33807 Mechanical Cortical Bone Characterization with the Finite Element Method Based Inverse Method
Authors: Djamel Remache, Marie Semaan, Cécile Baron, Martine Pithioux, Patrick Chabrand, Jean-Marie Rossi, Jean-Louis Milan
Abstract:
Cortical bone is a complex multi-scale structure. Even though several works have contributed significantly to understanding its mechanical behavior, this behavior remains poorly understood. Nanoindentation is one of the primary testing techniques for the mechanical characterization of bone at small scales. The purpose of this study was to provide new nanoindentation data on bovine cortical bone in different directions and for different bone microstructures (osteonal, interstitial, and laminar bone), and then to identify the anisotropic properties of the samples with a finite element method (FEM) based inverse method. Experimental and numerical results were compared and found to be in good agreement.
Keywords: mechanical behavior of bone, nanoindentation, finite element analysis, inverse optimization approach
Procedia PDF Downloads 336
33806 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements
Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker
Abstract:
Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts; complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require the individualized production of complex-shaped workpieces. Due to variations between the nominal model and the actual geometry, this can lead to changes in operations in computer-aided process planning (CAPP), and these changes must be handled to make CAPP manageable for adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by providing objective criteria for decisions about the adaptive manufacturability of workpieces. Nowadays, such decisions depend on the experience-based knowledge of humans (e.g., process planners) and are therefore subjective, leading to variability in workpiece quality and potential failure in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates the actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining, and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method for standardized processes; especially in applications like the aerospace industry, standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction of algorithms and data exchange, are used for modeling and executing inspection workflows.
Each analysis step of the inspection, such as the positioning of measurement data or the checking of geometrical criteria, is carried out by a function block. One advantage of this approach is the flexibility to design workflows and to adapt algorithms specific to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g., design data. Furthermore, appropriate logics and decision criteria have to be considered for the different product lifecycle phases. For example, tolerances for geometric deviations differ in type and size between new-part production and repair processes. In addition to function blocks, appropriate referencing systems are important: they need to support the exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive, part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, comprising new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides the information needed to decide between different potential adaptive machining processes.
Keywords: adaptive, CAx, function blocks, turbomachinery
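The function-block idea, each inspection step as a standardized unit consuming and producing shared data, can be illustrated with a deliberately simplified sketch. This is not an IEC 61499 implementation, and the alignment step is reduced to a mean offset for brevity:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class FunctionBlock:
    name: str
    run: Callable[[dict], dict]  # each block consumes and enriches a shared data dict

def align(data: dict) -> dict:
    # Toy referencing step: remove a constant offset between measured and nominal points.
    offset = (sum(data["measured"]) / len(data["measured"])
              - sum(data["nominal"]) / len(data["nominal"]))
    data["aligned"] = [m - offset for m in data["measured"]]
    return data

def check_geometry(data: dict) -> dict:
    # Geometrical criterion: maximum deviation must stay within the adaptation tolerance.
    deviations = [abs(a - n) for a, n in zip(data["aligned"], data["nominal"])]
    data["max_deviation"] = max(deviations)
    data["adaptable"] = data["max_deviation"] <= data["tolerance"]
    return data

WORKFLOW = [FunctionBlock("positioning", align),
            FunctionBlock("geometry check", check_geometry)]

def inspect(data: dict) -> dict:
    # Execute the inspection workflow block by block; the dict is the inspection protocol.
    for block in WORKFLOW:
        data = block.run(data)
    return data
```

Swapping in lifecycle-specific tolerance blocks (new-part vs. repair) then only changes the workflow composition, not the individual blocks.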
Procedia PDF Downloads 297
33805 Liquefaction Susceptibility of Tailing Storage Facility-Comparison of National Centre for Earthquake Engineering Research and Finite Element Methods
Authors: Mehdi Ghatei, Masoomeh Lorestani
Abstract:
Upstream Tailings Storage Facilities (TSFs) may experience slope instabilities due to soil liquefaction, especially in regions known to be seismically active. In this study, the liquefaction susceptibility of an upstream-raised TSF in Western Australia was assessed using two different approaches. The first approach assessed liquefaction susceptibility using Cone Penetration Tests with pore pressure measurement (CPTu), as described by the National Centre for Earthquake Engineering Research (NCEER). This assessment was based on four CPTu tests conducted on the perimeter embankment of the TSF. The second approach used the Finite Element (FE) method with an equivalent linear model to predict the undrained cyclic behavior, the pore water pressure and the liquefaction of the materials. The tailings parameters were estimated from the CPTu profiles and from laboratory tests. The cyclic parameters were estimated from the literature, where test results for similar material were available. The results showed good agreement in the liquefaction susceptibility of the tailings material between the NCEER and FE methods with the equivalent linear model. Keywords: liquefaction, CPTu, NCEER, finite element method, equivalent linear model
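As background, the NCEER screening compares a cyclic stress ratio (CSR, the earthquake demand) against a cyclic resistance ratio (CRR, the capacity correlated from CPTu data). A minimal sketch of the Seed-Idriss simplified CSR and the resulting factor of safety follows, with purely hypothetical layer values; the CPTu-based CRR correlation itself is not reproduced here.

```python
def stress_reduction(z):
    """Depth reduction coefficient r_d from the NCEER workshop equations."""
    if z <= 9.15:
        return 1.0 - 0.00765 * z
    return 1.174 - 0.0267 * z  # valid to roughly 23 m

def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, z):
    """Seed-Idriss simplified CSR: the cyclic demand imposed by the earthquake."""
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * stress_reduction(z)

# Hypothetical tailings layer: 6 m deep, PGA of 0.15 g, total/effective
# vertical stresses in kPa, CRR assumed from a CPTu correlation (not shown).
csr = cyclic_stress_ratio(a_max_g=0.15, sigma_v=110.0, sigma_v_eff=60.0, z=6.0)
fs = 0.12 / csr  # factor of safety = CRR / CSR; FS < 1 flags liquefaction
print(round(csr, 3), round(fs, 2))
```

A layer-by-layer run of this calculation along each CPTu profile is what yields the susceptibility classification compared against the FE results.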
Procedia PDF Downloads 272
33804 Close-Reading Works of Art and the Ideal of Naïveté: Elements of an Anti-Cartesian Approach to Humanistic Liberal Education
Authors: Peter Hajnal
Abstract:
The need to combine serious training in disciplinary/scholarly approaches to problems of general significance with an educational experience that engages students with these very same problems on a personal level is one of the key challenges facing modern liberal education in the West. The typical approach to synthesizing these two goals, one highly abstract, the other elusively practical, proceeds by invoking ideals traditionally associated with the Enlightenment and 19th-century “humanism”. These ideals are in turn rooted in an approach to reality codified by Cartesianism and the rise of modern science. Articulating this connection of the modern humanist tradition with Cartesianism allows us to demonstrate how the central problem of modern liberal education is rooted in the strict separation of knowledge and personal experience inherent in the dualism of Descartes. The question about the shape of contemporary liberal education is, therefore, the same as asking whether an anti-Cartesian version of liberal education is possible at all. Although the formulation of a general answer to this question is a tall order (whether in abstract or practical terms), and might take different forms (nota bene in Eastern and Western contexts), a key inspiration may be provided by a certain shift of attitude towards the Cartesian conception of the relationship of knowledge and experience required by discussion-based close-reading of works of visual art. Taking the work of Stanley Cavell as its central inspiration, my paper argues that the shift of attitude in question is best described as a form of “second naïveté”, and that it provides a useful model for conceptualizing in more concrete terms the appeal for such a “second naïveté” expressed in recent writings on the role of various disciplines in organizing learning by philosophers of such diverse backgrounds and interests as Hilary Putnam and Bruno Latour.
The adoption of naïveté so identified as an educational ideal may be seen as a key instrument in thinking of the educational context as itself a medium of synthesis of the contemplative and the practical. Moreover, it is helpful in overcoming the bad dilemma of ideological vs. conservative approaches to liberal education, as well as in correcting a certain commonly held false view of the historical roots of liberal education in the Renaissance, which turns out to offer much more of a sui generis approach to practice rather than represent a mere precursor to the Cartesian conception. Keywords: liberal arts, philosophy, education, Descartes, naivete
Procedia PDF Downloads 191
33803 Numerical Computation of Generalized Rosenau Regularized Long-Wave Equation via B-Spline Over Butcher’s Fifth Order Runge-Kutta Approach
Authors: Guesh Simretab Gebremedhin, Saumya Rajan Jena
Abstract:
In this work, a septic B-spline scheme has been used to simplify the process of obtaining an approximate solution of the generalized Rosenau-regularized long-wave equation (GR-RLWE) with initial boundary conditions. The resulting system of first-order ODEs is solved with Butcher’s fifth-order Runge-Kutta (BFRK) approach, without using finite difference techniques for discretizing the time-dependent variables at each time level. No transformation or linearization technique is employed to tackle the nonlinearity of the equation. Two test problems have been selected for numerical justification and comparison with other researchers on the basis of efficiency, accuracy, and the two invariants Mᵢ (mass) and Eᵢ (energy) of motion, which are used to test the conservative properties of the proposed scheme. Keywords: septic B-spline scheme, Butcher's fifth order Runge-Kutta approach, error norms, generalized Rosenau-RLW equation
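The BFRK time-stepping belongs to the family of explicit Runge-Kutta methods, which are fully determined by a Butcher tableau (A, b, c). A generic sketch of one tableau-driven step is shown below, illustrated with the classical fourth-order tableau rather than the paper's fifth-order, six-stage coefficients, which are not reproduced here.

```python
def rk_step(f, t, y, h, A, b, c):
    """One explicit Runge-Kutta step for y' = f(t, y), defined by tableau (A, b, c)."""
    k = []
    for i in range(len(b)):
        # Stage value: y plus the weighted, already-computed slopes.
        yi = y + h * sum(A[i][j] * k[j] for j in range(i))
        k.append(f(t + c[i] * h, yi))
    return y + h * sum(bi * ki for bi, ki in zip(b, k))

# Classical RK4 tableau as illustration (a fifth-order method needs six stages).
A = [[0, 0, 0, 0], [0.5, 0, 0, 0], [0, 0.5, 0, 0], [0, 0, 1, 0]]
b = [1/6, 1/3, 1/3, 1/6]
c = [0, 0.5, 0.5, 1]

# Integrate y' = y from t = 0 to t = 1; the exact answer is e ≈ 2.7182818.
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk_step(lambda t, y: y, t, y, h, A, b, c)
    t += h
print(y)
```

In the paper's setting, f would be the vector field produced by the septic B-spline semi-discretization of the GR-RLWE rather than this scalar test equation.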
Procedia PDF Downloads 66
33802 Microfluidized Fiber Based Oleogels for Encapsulation of Lycopene
Authors: Behic Mert
Abstract:
This study reports a facile approach to structuring soft solids from microfluidized lycopene-rich plant material and oil. First, carotenoid-rich plant material (pumpkin in this study) was processed with a high-pressure microfluidizer to release lycopene molecules; then an emulsion was formed by mixing the processed plant material and oil. While the lipid-soluble carotenoid molecules were allowed to dissolve in the oil phase of the emulsion, the fiber of the plant material provided the network required for emulsion stabilization. Additional hydrocolloids (gelatin, xanthan, and pectin) up to 0.5% were also used to reinforce the emulsion stability, and their impact on final product properties was evaluated via rheological, textural and oxidation studies. Finally, water was removed from the emulsion phase by drying in a tray dryer at 40°C for 36 hours, and subsequent shearing resulted in soft solid (oleogel) structures. The microstructure of these systems was revealed by cryo-scanning electron microscopy. The effects of hydrocolloids on total lycopene and surface lycopene contents were also evaluated. Surface lycopene was lowest in the gelatin-containing oleogels and highest in the pectin-containing oleogels. This study outlines a novel emulsion-based structuring method that can be used to encapsulate lycopene without the need for a separate extraction step. Keywords: lycopene, encapsulation, fiber, oleogel
Procedia PDF Downloads 266
33801 Optimal Design of a PV/Diesel Hybrid System for Decentralized Areas through Economic Criteria
Authors: David B. Tsuanyo, Didier Aussel, Yao Azoumah, Pierre Neveu
Abstract:
An innovative concept called “Flexy-Energy” is being developed at 2iE. This concept aims to produce electricity at lower cost by smartly mixing the different available energy sources in accordance with the load profile of the region. Given the high solar irradiation and the fact that Diesel generators are massively used in sub-Saharan rural areas, PV/Diesel hybrid systems could be a good application of this concept and a good solution to electrify this region, provided they are reliable, cost-effective and economically attractive to investors. The presentation of the developed approach is the aim of this paper. The PV/Diesel hybrid system is designed to produce electricity and/or heat from a coupling between Diesel gensets and PV panels without battery storage, while ensuring the substitution of gasoil by bio-fuels available in the area where the system will be installed. The optimal design of this system is based on its technical performance; the Life Cycle Cost (LCC) and the Levelized Cost of Energy (LCOE) are developed and used as economic criteria. The Net Present Value (NPV), the internal rate of return (IRR) and the discounted payback (DPB) are also evaluated according to dual electricity pricing (in sunny and unsunny hours). The PV/Diesel hybrid system obtained is compared to standalone Diesel gensets. The approach carried out in this paper has been applied to Siby village in Mali (latitude 12°23'N, longitude 8°20'W) with 295 kWh as daily demand. This approach provides the optimal physical characteristics (size and number of components) and the dynamic characteristics in real time (number of Diesel generators running, their load rates, fuel specific consumptions, and PV penetration rate) of the system. The system obtained is slightly cost-effective but could be improved with optimized tariffing strategies. Keywords: investments criteria, optimization, PV hybrid, sizing, rural electrification
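The economic criteria can be sketched directly from their textbook definitions. The cash-flow figures below are hypothetical round numbers, not those of the Siby case study.

```python
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the (negative) initial investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-9):
    """Internal rate of return by bisection (assumes one sign change in [lo, hi])."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def discounted_payback(rate, cashflows):
    """First year in which cumulative discounted cash flow turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cashflows):
        total += cf / (1 + rate) ** t
        if total >= 0:
            return t
    return None  # never recovered within the horizon

# Hypothetical project: 100 k$ investment, 25 k$/year net revenue for 8 years.
flows = [-100_000] + [25_000] * 8
print(round(npv(0.08, flows)), round(irr(flows), 4), discounted_payback(0.08, flows))
```

The LCOE would follow the same discounting pattern, dividing discounted lifetime costs by discounted lifetime energy delivered.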
Procedia PDF Downloads 441
33800 Merging Sequence Diagrams Based Slicing
Authors: Bouras Zine Eddine, Talai Abdelouaheb
Abstract:
The need to merge software artifacts seems inherent to modern software development. Distributing development over several teams and breaking tasks into smaller, more manageable pieces are effective means of dealing with this kind of complexity. In each case, the separately developed artifacts need to be assembled as efficiently as possible into a consistent whole in which the parts still function as described. Moreover, the earlier changes are introduced into the life cycle, the easier they are for designers to manage. Interaction-based specifications such as UML sequence diagrams have been found effective in this regard. As a result, sequence diagrams can be used not only for capturing system behaviors but also for merging changes in order to create a new version. The objective of this paper is to suggest a new approach to the problem of software merging at the level of sequence diagrams by using the concept of dependence analysis, which captures, formally, all mappings and differences between elements of sequence diagrams and serves as a key concept for creating a new version of a sequence diagram. Keywords: system behaviors, sequence diagram merging, dependence analysis, sequence diagram slicing
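As a toy illustration only (not the authors' dependence-analysis algorithm), a sequence diagram can be reduced to its ordered list of messages, and two independently edited versions can then be merged against a common base by replaying the detected insertions and deletions. The diagram content is invented, and conflict handling is deliberately omitted.

```python
from difflib import SequenceMatcher

def changes(base, version):
    """Per-base-index edit script: deleted base indices and anchored insertions."""
    deleted, inserted = set(), {}
    for tag, i1, i2, j1, j2 in SequenceMatcher(a=base, b=version,
                                               autojunk=False).get_opcodes():
        if tag in ("delete", "replace"):
            deleted.update(range(i1, i2))
        if tag in ("insert", "replace"):
            inserted.setdefault(i1, []).extend(version[j1:j2])
    return deleted, inserted

def merge(base, ours, theirs):
    """Naive three-way merge of two diagram versions (no conflict handling)."""
    d1, ins1 = changes(base, ours)
    d2, ins2 = changes(base, theirs)
    merged = []
    for i in range(len(base) + 1):
        merged += ins1.get(i, []) + ins2.get(i, [])  # splice insertions before base[i]
        if i < len(base) and i not in d1 and i not in d2:
            merged.append(base[i])  # keep messages deleted by neither side
    return merged

# A diagram as an ordered list of (sender, receiver, label) messages.
base = [("User", "ATM", "insertCard"), ("ATM", "Bank", "verifyPIN"),
        ("Bank", "ATM", "ok")]
ours = base[:2] + [("ATM", "Log", "record")] + base[2:]   # one team adds logging
theirs = base + [("ATM", "User", "ejectCard")]            # another appends a step
merged = merge(base, ours, theirs)
print(merged)
```

A real dependence analysis would additionally track control structures and data dependencies between messages rather than treating the diagram as a flat sequence.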
Procedia PDF Downloads 340
33799 An Approach to Addressing Homelessness in Hong Kong: Life Story Approach
Authors: Tak Mau Simon Chan, Ying Chuen Lance Chan
Abstract:
Homelessness has been a popular and controversial subject of debate in Hong Kong, a city which is densely populated and well-known for very expensive housing. The constitution of the homeless as threats to the community and to environmental hygiene is ambiguous and debatable in the Hong Kong context. Aside from the tangible services delivered, the lack of an intervention model has been the critical research gap thus far. The life story approach (LSA), with its unique humanistic orientation, has been well applied in recent decades to depict the needs of various target groups, but not the homeless. It is argued that the LSA, which has been employed by health professionals in the landscape of dementia and in health and social care settings, can be used as a reference in the local Chinese context through indigenization. This study therefore captures the viewpoints of service providers and users by constructing an indigenous intervention model that refers to the LSA in serving the chronically homeless. Drawing on 13 social workers and 27 homeless individuals in 8 focus groups, and on 12 homeless individuals who participated in individual in-depth interviews, a framework of the LSA for homeless people is proposed. Through thematic analysis, three main themes of their life stories were generated, namely, family, negative experiences and identity transformation. The three domains form a solid framework that can be applied not only to the homeless but also to other disadvantaged groups in the Chinese context. Based on the three domains of family, negative experiences and identity transformation, the model is applied in the daily practices of social workers who help the homeless. The domain of family encompasses familial relationships from the past to the present to the speculated future, with ten sub-themes. The domain of negative experiences includes seven sub-themes, with reference to the deviant behavior committed.
The last domain, identity transformation, incorporates the awareness and redefining of one’s identity and comprises a total of seven sub-themes. The first two domains are important components of personal histories, while the third is a more unknown, exploratory and yet-to-be-redefined territory with a more positive and constructive orientation towards developing one’s identity and life meaning. The longitudinal temporal dimension of moving from past to present to future enriches the meaning-making process, facilitates the integration of life experiences and maintains a more hopeful dialogue. The model is tested and its effectiveness measured using qualitative and quantitative methods to affirm the extent to which it is relevant to the local context. First, it provides a clear guideline for social workers, who can use the approach as a reference source. Secondly, the framework acts as a new means of intervention to address problem-saturated stories and the intangible needs of the homeless. Thirdly, the model extends the application beyond health-related issues. Last but not least, the model is highly relevant to the local indigenous context. Keywords: homeless, indigenous intervention, life story approach, social work practice
Procedia PDF Downloads 296
33798 A Framework for Defining Innovation Districts: A Case Study of 22@ Barcelona
Authors: Arnault Morisson
Abstract:
Innovation districts are being implemented as urban regeneration strategies in cities as diverse as Barcelona (Spain), Boston (Massachusetts), Chattanooga (Tennessee), Detroit (Michigan), Medellin (Colombia), and Montréal (Canada). Little, however, is known about the concept. This paper aims to provide a framework to define innovation districts. The research methodology is based on a qualitative approach using 22@ Barcelona as a case study. 22@ Barcelona was the first innovation district ever created and has been a model for the innovation districts of Medellin (Colombia) and Boston (Massachusetts), among others. Innovation districts based on the 22@ Barcelona model can be defined as top-down urban innovation ecosystems designed around four multilayered and multidimensional models of innovation: urban planning, productive, collaborative, and creative, all coordinated under strong leadership, with the ultimate objective of accelerating the innovation process and competitiveness of a locality. Innovation districts aim to respond to a new economic paradigm in which economic production flows back to cities. Keywords: innovation ecosystem, governance, technology park, urban planning, urban policy, urban regeneration
Procedia PDF Downloads 372
33797 A Network Approach to Analyzing Financial Markets
Authors: Yusuf Seedat
Abstract:
The necessity to understand global financial markets has increased following the unfortunate spread of the recent financial crisis around the world. Financial markets are considered to be complex systems consisting of highly volatile movements whose indexes fluctuate without any clear pattern. Analytic methods for stock prices have been proposed in which financial markets are modeled using common network analysis tools and methods. It has been found that two key components of social network analysis are relevant to modeling financial markets, allowing us to make accurate predictions of stock prices within the financial market. Financial markets have a number of interacting components, leading to complex behavioral patterns. This paper describes a social network approach to analyzing financial markets as a viable way of studying how complex stock markets function. We also look at how social network analysis techniques and metrics are used to gain an understanding of the evolution of financial markets, as well as how community detection can be used to qualify and quantify influence within a network. Keywords: network analysis, social networks, financial markets, stocks, nodes, edges, complex networks
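A minimal sketch of this style of analysis: build a network whose nodes are stocks and whose edges link strongly correlated return series, then detect communities. Connected components stand in here for a real community-detection algorithm (e.g. modularity-based), and the return series are invented.

```python
import math
from itertools import combinations

def correlation(x, y):
    """Pearson correlation of two return series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def correlation_network(returns, threshold=0.8):
    """Stocks are nodes; an edge links pairs whose returns co-move strongly."""
    return {(a, b) for a, b in combinations(sorted(returns), 2)
            if correlation(returns[a], returns[b]) >= threshold}

def communities(nodes, edges):
    """Connected components via union-find, a crude stand-in for community detection."""
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]  # path halving
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)
    groups = {}
    for n in parent:
        groups.setdefault(find(n), set()).add(n)
    return list(groups.values())

# Hypothetical daily returns: the two banks move together, the miner does not.
returns = {"BankA": [0.01, -0.02, 0.03, 0.01],
           "BankB": [0.012, -0.018, 0.028, 0.009],
           "MinerC": [-0.03, 0.01, -0.01, 0.04]}
edges = correlation_network(returns, threshold=0.8)
comms = communities(returns.keys(), edges)
print(comms)
```

On real data one would use log-returns, a longer window, and a principled threshold or minimum spanning tree instead of the fixed cutoff used here.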
Procedia PDF Downloads 191
33796 Analysis of Complex Business Negotiations: Contributions from Agency-Theory
Authors: Jan Van Uden
Abstract:
The paper reviews classical agency-theory and its contributions to the analysis of complex business negotiations, and gives an approach for modifying the basic agency-model in order to examine the negotiation-specific dimensions of agency-problems. By illustrating fundamental potentials for the modification of agency-theory in the context of business negotiations, the paper highlights recent empirical research that investigates agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiations would be based on a two-level approach. First, the modification of the basic agency-model in order to illustrate the organizational context of business negotiations (i.e., multi-agent issues, common-agencies, multi-period models and the concept of bounded rationality). Second, the application of the modified agency-model to complex business negotiations to identify agency-problems and related areas of risk in the negotiation process. The paper is placed on the first level of analysis – the modification. The method builds on the one hand on insights from behavior decision research (BRD) and on the other hand on findings from agency-theory as normative directives for the modification of the basic model. Through neoclassical assumptions concerning the fundamental aspects of agency-relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences and conflicts of interest), agency-theory helps to draw solutions for stated worst-case scenarios taken from the daily negotiation routine. As agency-theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape business negotiation complexity.
The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective. Negotiation teams require a multi-agent approach, since decision-makers acting as superior-agents are often part of the team. The diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency-theory and varies greatly across certain forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. As a result, horizontal dynamics within the negotiation team, which play an important role in negotiation success, are not considered in the investigation of agency-problems. Also, the trade-off between short-term relationships within the negotiation sphere and the long-term relationships of the corporate sphere calls for a multi-period perspective that takes into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality is closely related to findings from BRD in order to assess the impact of negotiation behavior on the underlying principal-agent relationships. As empirical findings show, the disclosure and reservation of information to the agent affect his negotiation behavior as well as final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior-agents or principals, which calls for a bilateral risk approach to agency-relations. Keywords: business negotiations, agency-theory, negotiation analysis, interteam negotiations
Procedia PDF Downloads 139
33795 Application of Lattice Boltzmann Method to Different Boundary Conditions in a Two Dimensional Enclosure
Authors: Jean Yves Trepanier, Sami Ammar, Sagnik Banik
Abstract:
The Lattice Boltzmann Method is advantageous for simulating complex boundary conditions and solving for fluid flow parameters through streaming and collision processes. This paper includes the study of three different test cases in a confined domain using the Lattice Boltzmann model. 1. An SRT (Single Relaxation Time) approach in the Lattice Boltzmann model is used to simulate lid-driven cavity flow for different Reynolds numbers (100, 400 and 1000) with a domain aspect ratio of 1, i.e., a square cavity. A moment-based boundary condition is used for more accurate results. 2. A thermal lattice BGK (Bhatnagar-Gross-Krook) model is developed for Rayleigh-Bénard convection for two test cases - horizontal and vertical temperature differences - considered separately for a Boussinesq incompressible fluid. The Rayleigh number is varied for both test cases (10^3 ≤ Ra ≤ 10^6), keeping the Prandtl number at 0.71. A stability criterion with a precise forcing scheme is used for a greater level of accuracy. 3. The phase change problem governed by the heat-conduction equation is studied using the enthalpy-based Lattice Boltzmann model with a single iteration for each time step, thus reducing the computational time. A double distribution function approach with a D2Q9 (density) model and a D2Q5 (temperature) model is used for two different test cases - conduction-dominated melting and convection-dominated melting. The solidification process is also simulated using the enthalpy-based method with a single distribution function using the D2Q5 model to provide a better understanding of the heat transport phenomenon. The domain for the test cases has an aspect ratio of 2, with some exceptions for a square cavity. An approximate velocity scale is chosen to ensure that the simulations are within the incompressible regime. Different parameters like velocities, temperature, Nusselt number, etc. are calculated for a comparative study with the existing works of literature.
The simulated results demonstrate excellent agreement with the existing benchmark solutions within an error limit of ±0.05, which indicates the viability of this method for complex fluid flow problems. Keywords: BGK, Nusselt, Prandtl, Rayleigh, SRT
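The core collide-and-stream cycle of a D2Q9 BGK model can be sketched compactly. This minimal pure-Python version (periodic boundaries, no forcing, no thermal coupling, grid size and relaxation time chosen arbitrarily) only illustrates the update structure; the check at the end confirms mass conservation, one of the scheme's built-in properties.

```python
# D2Q9 lattice: discrete velocities, weights, and BGK relaxation time tau.
E = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1), (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4/9] + [1/9] * 4 + [1/36] * 4
TAU = 0.8

def equilibrium(rho, ux, uy):
    """Second-order equilibrium distribution for each of the 9 directions."""
    usq = ux * ux + uy * uy
    feq = []
    for (ex, ey), w in zip(E, W):
        eu = ex * ux + ey * uy
        feq.append(w * rho * (1 + 3 * eu + 4.5 * eu * eu - 1.5 * usq))
    return feq

def step(f, nx, ny):
    """One collide-and-stream update on a periodic nx-by-ny grid."""
    # Collision (BGK): relax each cell toward its local equilibrium.
    post = [[None] * ny for _ in range(nx)]
    for x in range(nx):
        for y in range(ny):
            rho = sum(f[x][y])
            ux = sum(fi * ex for fi, (ex, ey) in zip(f[x][y], E)) / rho
            uy = sum(fi * ey for fi, (ex, ey) in zip(f[x][y], E)) / rho
            feq = equilibrium(rho, ux, uy)
            post[x][y] = [fi + (fe - fi) / TAU for fi, fe in zip(f[x][y], feq)]
    # Streaming: advect each population along its lattice velocity.
    fnew = [[[0.0] * 9 for _ in range(ny)] for _ in range(nx)]
    for x in range(nx):
        for y in range(ny):
            for i, (ex, ey) in enumerate(E):
                fnew[(x + ex) % nx][(y + ey) % ny][i] = post[x][y][i]
    return fnew

# Fluid at rest with a small density bump; total mass must stay constant.
nx = ny = 8
f = [[equilibrium(1.0 + (0.1 if (x, y) == (4, 4) else 0.0), 0.0, 0.0)
      for y in range(ny)] for x in range(nx)]
mass0 = sum(sum(f[x][y]) for x in range(nx) for y in range(ny))
for _ in range(20):
    f = step(f, nx, ny)
mass = sum(sum(f[x][y]) for x in range(nx) for y in range(ny))
print(abs(mass - mass0) < 1e-9)  # collide-and-stream conserves mass
```

The paper's cases add what this sketch omits: moment-based wall boundaries for the cavity, a Boussinesq forcing term for convection, and a coupled D2Q5 temperature or enthalpy distribution for the phase-change problems.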
Procedia PDF Downloads 128
33794 A Deterministic Large Deviation Model Based on Complex N-Body Systems
Authors: David C. Ni
Abstract:
In previous efforts, we constructed N-body systems via an extended Blaschke product (EBP), which represents a non-temporal and nonlinear extension of the Lorentz transformation. In this construction, we rely on only two parameters, the nonlinear degree and the relative momentum, to characterize the systems. We further explored root computation via iteration with an algorithm extended from the Jenkins-Traub method. The solution sets demonstrate a form of σ + i[-t, t], where σ and t are real numbers and [-t, t] exhibits various canonical distributions. In this paper, we correlate the convergent sets in the original domain with the solution sets, which demonstrate large-deviation distributions in the codomain. We proceed to compare our approach with established principles such as the Donsker-Varadhan and Wentzell-Freidlin theories. The deterministic model based on this construction allows us to explore applications in the areas of finance and statistical mechanics. Keywords: nonlinear Lorentz transformation, Blaschke equation, iteration solutions, root computation, large deviation distribution, deterministic model
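As a simpler stand-in for the extended Jenkins-Traub iteration (whose details are not reproduced here), the Durand-Kerner method illustrates what iterative, simultaneous root computation for a polynomial looks like; it is not the authors' algorithm.

```python
def durand_kerner(coeffs, iters=200):
    """Simultaneously iterate all roots of the monic polynomial
    z**n + coeffs[0]*z**(n-1) + ... + coeffs[-1] (Weierstrass / Durand-Kerner)."""
    n = len(coeffs)
    def p(z):
        return z ** n + sum(c * z ** (n - 1 - k) for k, c in enumerate(coeffs))
    roots = [(0.4 + 0.9j) ** k for k in range(n)]  # standard distinct start values
    for _ in range(iters):
        new = []
        for i, zi in enumerate(roots):
            denom = 1 + 0j
            for j, zj in enumerate(roots):
                if i != j:
                    denom *= zi - zj  # repels approximations from one another
            new.append(zi - p(zi) / denom)
        roots = new
    return roots

# z**3 - 1: the three cube roots of unity, all on the unit circle.
roots = durand_kerner([0.0, 0.0, -1.0])
print(sorted(round(abs(r), 6) for r in roots))  # all moduli converge to 1
```

The method converges to all roots at once, which matches the paper's need to examine whole solution sets rather than single roots.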
Procedia PDF Downloads 393
33793 Exploring Transitions between Communal- and Market-Based Knowledge Sharing
Authors: Benbya Hind, Belbaly Nassim
Abstract:
Markets and communities are often cast as alternative forms of knowledge sharing, but an open question is how and why people dynamically transition between them. To study these transitions, we designed a technology that allows geographically distributed participants to either buy knowledge (using virtual points) or request it for free. We use a data-driven, inductive approach, studying 550 members in over 5,000 interactions over nine months. Because the technology offered participants a choice between market and community forms, we can document both the individual and collective transitions that emerge as people cycle between these forms. Our inductive analysis revealed that uncertainties endemic to knowledge sharing were the impetus for these transitions. Communities evoke uncertainties about knowledge sharing’s costs and benefits, which markets resolve by quantifying explicit prices. However, if people manipulate markets, they create uncertainties about the validity of those prices, allowing communities to reemerge to establish certainty via identity-based validation. Keywords: knowledge sharing, communities, information technology design, transitions, markets
Procedia PDF Downloads 180
33792 Monomial Form Approach to Rectangular Surface Modeling
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Geometric modeling plays an important role in the construction and manufacturing of curves, surfaces and solids. Its algorithms are critically important not only in the automobile, ship and aircraft manufacturing business but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction. Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications
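In the monomial form, a rectangular patch is S(u, v) = Σᵢ Σⱼ C[i][j] uⁱ vʲ, where C is the coefficient matrix obtained from the chosen basis (Said-Ball, Wang-Ball, etc.). A minimal evaluation sketch using Horner's rule in both parameters follows; the basis-conversion matrices themselves are not shown, and the example coefficients are invented.

```python
def eval_monomial_patch(C, u, v):
    """Evaluate S(u, v) = sum_i sum_j C[i][j] * u**i * v**j with Horner's rule.
    C[i][j] is a coefficient point given as a tuple of coordinates."""
    def horner(coeffs, t):
        acc = 0.0
        for c in reversed(coeffs):
            acc = acc * t + c
        return acc
    dim = len(C[0][0])
    # Per coordinate: collapse the v-direction along each row, then the u-direction.
    return tuple(horner([horner([pt[d] for pt in row], v) for row in C], u)
                 for d in range(dim))

# Coefficient matrix of the bilinear patch S(u, v) = (u, v, u*v).
C = [[(0, 0, 0), (0, 1, 0)],
     [(1, 0, 0), (0, 0, 1)]]
print(eval_monomial_patch(C, 0.5, 0.25))  # (0.5, 0.25, 0.125)
```

Once a patch is in this form, derivatives, degree elevation and degree reduction all reduce to simple manipulations of the coefficient matrix C.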
Procedia PDF Downloads 146
33791 Developing Digital Twins of Steel Hull Processes
Authors: V. Ložar, N. Hadžić, T. Opetuk, R. Keser
Abstract:
The development of digital twins strongly depends on efficient algorithms and their capability to mirror real-life processes. Nowadays, such efforts are required to establish the factories of the future, which face the new demands of custom-made production. The ship hull processes face these challenges too. Therefore, it is important to implement design and evaluation approaches based on production system engineering. In this study, the recently developed finite state method is employed to describe the steel hull process as a platform for the implementation of digital twinning technology. The application is justified by comparing the finite state method with the analytical approach. The method is employed to rebuild a model of a real shipyard ship hull process using a combination of serial and splitting lines. Key performance indicators such as the production rate, work in process, and the probabilities of starvation and blockage are calculated and compared to the corresponding results obtained through a simulation approach using the software tool Enterprise Dynamics. This study confirms that the finite state method is a suitable tool for digital twinning applications. The conclusion highlights the advantages and disadvantages of the methods employed in this context. Keywords: digital twin, finite state method, production system engineering, shipyard
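The key performance indicators named above can also be estimated by simulation, which is the benchmark the finite state method is compared against. A minimal sketch with two Bernoulli machines and one intermediate buffer follows; the reliability parameters and buffer size are hypothetical, not the shipyard data.

```python
import random

def simulate_line(p1, p2, buffer_cap, steps, seed=0):
    """Two Bernoulli machines with an intermediate buffer: M1 -> B -> M2.
    Returns production rate, average WIP, and the M1-blockage / M2-starvation
    frequencies (a simulation stand-in for the analytical finite state method)."""
    rng = random.Random(seed)
    wip = produced = blocked = starved = wip_sum = 0
    for _ in range(steps):
        m1_up = rng.random() < p1
        m2_up = rng.random() < p2
        # M2 first: it pulls a part from the buffer if one is available.
        if m2_up and wip > 0:
            wip -= 1
            produced += 1
        elif m2_up:
            starved += 1  # machine up but buffer empty
        # M1: it pushes a part into the buffer unless the buffer is full.
        if m1_up and wip < buffer_cap:
            wip += 1
        elif m1_up:
            blocked += 1  # machine up but buffer full
        wip_sum += wip
    return produced / steps, wip_sum / steps, blocked / steps, starved / steps

rate, avg_wip, p_block, p_starve = simulate_line(0.9, 0.85, buffer_cap=3,
                                                 steps=100_000)
print(round(rate, 3), round(avg_wip, 2))
```

The finite state method computes the same quantities analytically from the line's Markov chain, avoiding both the run length and the statistical noise of simulation.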
Procedia PDF Downloads 99