Search results for: time perspective
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20324

14174 Disaster Victim Identification: A Social Science Perspective

Authors: Victor Toom

Abstract:

Although it is never possible to anticipate the full range of difficulties after a catastrophe, efforts to identify victims of mass casualty events have become institutionalized and standardized with the aim of effectively and efficiently addressing the many challenges and contingencies. Such ‘disaster victim identification’ (DVI) practices depend on the forensic sciences, are subject to national legislation, and rely on technical and organizational protocols to mitigate the many complexities in the wake of catastrophe. Apart from the technological, legal and bureaucratic elements constituting a DVI operation, victims’ families and their emotions are also part and parcel of any effort to identify casualties of mass human fatality incidents. Forensic experts, for example, require antemortem information from relatives to make identification possible, and an identified body or body part is repatriated to kin. Relatives are thus key stakeholders in DVI operations. Much has been achieved in past years in addressing victims’ families’ concerns and emotions. Yet how families are dealt with by experts and authorities is still considered a difficult topic. Due to sensitivities and the empathic interaction required with families on the one hand, and the rationalized DVI effort on the other, there is still scope for improving communication, providing information and meaningfully including relatives in the DVI effort. This paper aims to bridge the standardized world of DVI efforts and families’ experienced realities and makes suggestions to further improve DVI efforts through the inclusion of victims’ families. Based on qualitative interviews, the paper narrates the involvement and experiences of, inter alia, DVI practitioners, victims’ families, advocates and clergy in the wake of the 1995 Srebrenica genocide, which killed approximately 8,000 men, and the 9/11 attacks in New York City, which killed 2,750.
The paper shows that there are several models for including victims’ families in a DVI operation, and it argues for a model in which victims’ families become partners in DVI operations.

Keywords: disaster victim identification (DVI), victims’ families, social science (qualitative), 9/11 attacks, Srebrenica genocide

Procedia PDF Downloads 224
14173 Black Masculinity, Media Stereotyping and Its Influence on Policing in the United States: A Functionalist Perspective

Authors: Jack Santiago Monell

Abstract:

In America, misrepresentations of black males have been perpetuated throughout the history of popular culture. Because of these narratives, various communities have developed biases and stereotypes about what black male masculinity represents and, more importantly, about how to respond to it. The researcher explored the perspectives of police officers in three states: Maryland, Pennsylvania, and North Carolina. Given the nature of police-community relations and the national attention to high-profile cases, having officers provide context on how black males are viewed from their lens was critical, while expanding on theoretical explanations of attitudes towards police confrontations. One objective was to identify specific themes relevant to why police officers may view African American males differently and hence respond more aggressively. The transcripts were analyzed in NVivo 11, and four themes emerged: appearance, acting suspicious/troublesome behavior, upbringing about black males, and excessive force. The data conveyed that continuous stereotyping of African American men ultimately results in excessive use of force or pervasive shootings, whether the men are armed or unarmed. African American males are consistently targeted because of their racial makeup and appearance over any other probable circumstances. As long as racial bias and stereotypical practices continue in policing, African American males will endlessly be unjustly targeted and, at times, the victims of violent encounters with police officers in the United States.

Keywords: African American males, police perceptions, masculinity, popular culture

Procedia PDF Downloads 106
14172 A Historical Analysis of the Concept of Equivalence from Different Theoretical Perspectives in Translation Studies

Authors: Amenador Kate Benedicta, Wang Zhiwei

Abstract:

Since the latter part of the 20th century, the notion of equivalence has remained a central and critical concept in the development of translation theory. After decades of argument over word-for-word and free translation methods, scholars attempting to develop more systematic and efficient translation theories began to focus on fundamental translation concepts such as equivalence. Although the concept of equivalence has piqued the interest of many scholars, its definition, scope, and applicability have sparked contentious arguments within the discipline. As a result, several distinct theories and explanations of the concept of equivalence have been put forward over the last half-century. This study explores and discusses the evolution of this critical concept in translation studies through a bibliometric investigation of print and digital books and articles, analyzing different scholars' key contributions to, and the limitations of, equivalence from various theoretical perspectives. In analyzing them, emphasis is placed on the innovations that each theory has brought to the comprehension of equivalence. To achieve its aim, the article begins by discussing the contributions of linguistically motivated theories to the notion of equivalence in translation, followed by functionalist-oriented contributions, before moving on to more recent advancements in translation studies on the concept. Because equivalence is such a broad notion, it is impossible to discuss each researcher in depth; as a result, the most well-known names and their equivalence theories are compared and contrasted in this research. The study emphasizes the developmental progression in our comprehension of the equivalence concept and equivalent effect. It concludes that the various theoretical perspectives' contributions to the notion of equivalence complement and make up for the limitations of one another.
The study also highlights how troublesome the concept of equivalence can become in identifying the nature of translation, and how central and unavoidable the concept is in every act of translation, despite its limitations. The significance of the study lies in its synthesis of the contributions and limitations of the various theories offered by scholars on the notion of equivalence, lending literature to both students and scholars in the field and providing insight into future theoretical development.

Keywords: equivalence, functionalist translation theories, linguistic translation approaches, translation theories, Skopos

Procedia PDF Downloads 105
14171 Pro-Grow Business Partnerships: Unlocking the Potential of Indonesian SMEs with a Resource-Advantage Theory of Competition Approach

Authors: Kesi Widjajanti

Abstract:

To develop the growth of small and medium enterprises (SMEs), it is important to unlock potential resources that can improve their performance. Business partnerships (BP) are currently an interesting strategic topic for expanding markets and maximizing financial and marketing performance. However, business partnerships have not yet played a significant role among small and medium companies in the creative industry's batik craft sector in Indonesia. This study is rooted in the resource-advantage theory of competition (RAToC), which emphasizes that a company's resource advantage can stem from organizational and relational resources. On the basis of this theory, SMEs can optimize the allocation of relational resources and organizational goals, improve operational efficiency, and gain a strategic advantage in the market. Companies that actualize organizational and relational resources better than other market players can use them to increase their superior performance. This study explores key elements of the RAToC perspective and shows how business partnerships have the potential to drive SMEs' growth. By aligning visions and organizational resources, sharing knowledge, and leveraging complementary relational resources, SMEs can increase their competitiveness, enter new markets, and achieve superior performance. The theoretical contribution of RAToC in small companies lies in the role of pro-grow business partnership strength as an important antecedent of improved SME performance. The benefits (scenarios) of a business partnership for growing together are directed at optimizing resources that can create additional value for customers so that they can outperform competitors.
Furthermore, the managerial implication for SMEs wishing to unlock their resource potential is to encourage pro-grow business partnerships, which have specific characteristics: they can absorb experience/knowledge capacity and utilize this knowledge to develop "together" business ventures.

Keywords: pro-grow business partnership, performance, SMEs, resource-advantage theory of competition, creative industry, batik handicraft, Indonesia

Procedia PDF Downloads 67
14170 Preparation of Pegylated Interferon Alpha-2b with High Antiviral Activity Using a Linear 20 kDa Polyethylene Glycol Derivative

Authors: Ehab El-Dabaa, Omnia Ali, Mohamed Abd El-Hady, Ahmed Osman

Abstract:

Recombinant human interferon alpha 2 (rhIFN-α2) is FDA-approved for the treatment of some viral and malignant diseases. Approved pegylated rhIFN-α2 drugs have markedly improved pharmacokinetics, pharmacodynamics and therapeutic efficiency compared to the native protein. In this work, we studied the pegylation of purified, properly refolded rhIFN-α2b using linear 20 kDa PEG-NHS (polyethylene glycol N-hydroxysuccinimidyl ester) to prepare pegylated rhIFN-α2b with high stability and activity. The effects of different parameters, such as final rhIFN-α2b concentration, pH, rhIFN-α2b/PEG molar ratio and reaction time, on pegylation efficiency (the percentage of monopegylated rhIFN-α2b) were studied in small-scale (100 µl) pegylation reaction trials. Analysis of the percentages of the different reaction components (mono-, di- and polypegylated rhIFN-α2b and unpegylated rhIFN-α2b) indicated that 2 h is the optimum time to complete the reaction. Pegylation efficiency increased to 57.9% at pH 8 by reducing the protein concentration to 1 mg/ml and the rhIFN-α2b/PEG ratio to 1:2. Using a larger-scale pegylation reaction (65% pegylation efficiency), an ion exchange chromatography method was optimized to prepare and purify the monopegylated rhIFN-α2b at high purity (96%). The prepared monopegylated rhIFN-α2b had an apparent molecular weight of approximately 65 kDa and high in vitro antiviral activity (2.1x10⁷ ± 0.8x10⁷ IU/mg). Although it retained approximately 8.4% of the antiviral activity of the unpegylated rhIFN-α2b, its activity is high compared to other pegylated rhIFN-α2 preparations developed using a similar approach or a higher-molecular-weight branched PEG.

Keywords: antiviral activity, rhIFN-α2b, pegylation, pegylation efficiency

Procedia PDF Downloads 170
14169 Safeguarding Product Quality through Pre-Qualification of Material Manufacturers: A Ship and Offshore Classification Society's Perspective

Authors: Sastry Y. Kandukuri, Isak Andersen

Abstract:

Despite recent advances in the manufacturing sector, quality issues remain a frequent occurrence and can result in fatal accidents, equipment downtime, and loss of life. Adequate quality is of high importance in high-risk industries such as sea-going vessels and offshore installations, in which third-party quality assurance and product control play an essential role in ensuring the manufacturing quality of critical components. Classification societies play a vital role in mitigating risk in these industries by making sure that all stakeholders, i.e., manufacturers, builders, and end users, are provided with adequate rules and standards that effectively ensure components are produced at a high level of quality, based on the area of application and the risk of failure. Quality issues have also been linked to the lack of competence or negligence of stakeholders in the supply value chain. However, continued action and regulatory reform through the modernization of rules and requirements have provided additional tools for purchasers and manufacturers to confront these issues. Included among these tools are updated ‘approval of manufacturer’ class programs aimed at developing and implementing a set of standardized manufacturing quality metrics for use by the manufacturer and verified by the classification society. The establishment and collection of the manufacturing and testing requirements described in these programs could provide various stakeholders, from industry to vessel owners, with greater insight into the state of quality at a given manufacturing facility, and allow stakeholders to better anticipate and address quality issues while simultaneously reducing unnecessary failures that are costly to the industry. This publication introduces, explains and discusses the critical manufacturing and testing requirements set in a leading classification society's approval-of-manufacturer regime, its rationale, and some case studies.

Keywords: classification society, manufacturing, materials processing, materials testing, quality control

Procedia PDF Downloads 343
14168 Non-Invasive Techniques for Management of Carious Primary Dentition Using Silver Diamine Fluoride and Moringa Extract as a Modification of the Hall Technique

Authors: Rasha F. Sharaf

Abstract:

The treatment of dental caries in young children is considered a great challenge for all dentists, especially with uncooperative children. Recently, non-invasive techniques have been highlighted, as they alleviate the need for local anesthesia and other painful procedures during the management of carious teeth and, at the same time, increase the success rate of treatment. Silver diamine fluoride (SDF) is one of the most effective cariostatic materials; it arrests the progression of carious lesions and aids in remineralizing demineralized tooth structure. Both fluoride and silver ions have a proven antibacterial action and aid in the precipitation of an insoluble layer that prevents further decay. Moringa has likewise proved to have an effective antibacterial action against different types of bacteria; it can therefore be used as a non-invasive technique for the management of caries in children. One important theory for caries control is to deprive the cariogenic bacteria of nutrients, causing their starvation and death. This can be achieved by applying a stainless steel crown to primary molars with carious lesions not involving the pulp, a technique known as the Hall technique. The success rate of the Hall technique can be increased by arresting the carious lesion using either SDF or Moringa and gaining the benefit of their antibacterial action. Multiple clinical cases with 1-year follow-up will be presented, comparing different treatment options and using various materials and techniques for the non-invasive, non-painful management of carious primary teeth.

Keywords: SDF, hall technique, carious primary teeth, moringa extract

Procedia PDF Downloads 87
14167 Food Safety and Quality Assurance and Skills Development among Farmers in Georgia

Authors: Kakha Nadiardze, Nana Phirosmanashvili

Abstract:

The goal of this paper is to present the problem of farmers' lack of information on food safety. Global food supply chains are becoming more and more diverse, making traceability systems much harder to implement across different food markets. We present our work analyzing key developments in the Georgian food market, from regulatory controls to administrative procedures to traceability technologies. Food safety and quality assurance are among the most problematic issues in Georgia: as food trade networks become more complex, food businesses are under increasing pressure to ensure that their products are safe and authentic. The principle of traceability from farm to table must be top-of-mind for all food manufacturers, farmers and retailers. Following last year's E. coli outbreak, as well as more recent cases of food mislabeling, the development of food traceability systems is essential for food businesses if they are to present a credible brand image. Alongside this are the ever-developing technologies in food traceability networks, which manufacturers and retailers need to be aware of if they are to keep up with food safety regulations and avoid recalls. How to examine best practice in food management is the main question in protecting a company brand through safe and authenticated food. We are working with our farmers, food safety experts and technology developers throughout the food supply chain. We provide periodic food analyses for heavy metals, pesticide residues and other pollutants, and we disseminate information among farmers on how the latest food safety regulations will impact the methods they use to identify risks in their products.

Keywords: food safety, GMO, LMO, E. coli, quality

Procedia PDF Downloads 498
14166 Parents’ Evaluation of the Services Offered to Their Children with Autism in UAE Centres

Authors: Mohammad Ali Fteiha, Ghanem Al Bustami

Abstract:

The study aimed to assess how parents of children with autism evaluate the services provided by special care centres in the United Arab Emirates, in terms of quality and comprehensiveness, and to examine the impact of factors related to the diagnosis, the place of service provision, the efficiency of service procedures, and the child's age. To achieve the objective of the study, the researchers used a Parents' Satisfaction Scale and a Parents' Evaluation of Services Effectiveness report, both of which showed acceptable levels of validity and reliability. The sample included 300 families of children with autism receiving educational, rehabilitation, treatment and support services in both governmental and private centers in the United Arab Emirates. An ANOVA test was conducted in SPSS to analyze the collected data. The results indicated significant differences in parents' assessments of the services attributable to the place of service, the nature of the diagnosis, and the child's age at the time of the study, as well as statistically significant differences attributable to age at first diagnosis. The results also showed a positive evaluation of the services as meeting international standards, and of the quality of the services provided by autism centers in the United Arab Emirates, especially governmental centers. At the same time, the results revealed many needs and problems faced by parents that lack appropriate solutions. Recommendations were made based on these results.

Keywords: autism, evaluation, diagnosis, parents, autism programs, supportive services, government centers, private centers

Procedia PDF Downloads 550
14165 An Inventory Management Model to Manage the Stock Level for Irregular Demand Items

Authors: Riccardo Patriarca, Giulio Di Gravio, Francesco Costantino, Massimo Tronci

Abstract:

An accurate inventory management policy plays a crucial role in several high-availability sectors. In these sectors, due to the high cost of spares and backorders, an (S-1, S) replenishment policy is necessary for high-availability items. The policy triggers the shipment of a substitute item any time the inventory level decreases by one. This policy can be modelled following the Multi-Echelon Technique for Recoverable Item Control (METRIC). METRIC is a system-based technique for defining the optimum stock level in a multi-echelon network, adopting measures in line with the decision-maker's perspective. METRIC defines an availability-cost function combining inventory costs and required service levels, taking as inputs data about the demand trend, the supply and maintenance characteristics of the network, and the budget/availability constraints. The traditional METRIC relies on the hypothesis that a Poisson distribution represents the demand distribution well for items with a low failure rate. In this research, however, we explore the effects of using a Poisson distribution to model the demand for low-failure-rate items characterized by an irregular demand trend. This characteristic of demand is not included in the traditional METRIC formulation, leading to the need to revise it. Using the CV (Coefficient of Variation) and ADI (Average inter-Demand Interval) classification, we define the inherent flaws of the Poisson-based METRIC for irregular-demand items and propose an innovative ad hoc distribution that better fits irregular demands. This distribution allows defining proper stock levels to reduce stocking and backorder costs due to the high irregularity of the demand trend. A case study in the aviation domain clarifies the benefits of this innovative METRIC approach.
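As a sketch of the CV/ADI demand classification the abstract refers to, the snippet below applies the commonly used Syntetos-Boylan cutoffs (ADI ≈ 1.32, squared CV ≈ 0.49) to label a demand series as smooth, intermittent, erratic or lumpy; the example series and the exact cutoffs are illustrative assumptions, not the paper's data or formulation.

```python
import numpy as np

def classify_demand(demand, adi_cut=1.32, cv2_cut=0.49):
    """Classify a periodic demand series using ADI (average
    inter-demand interval) and the squared coefficient of
    variation (CV^2) of the non-zero demand sizes."""
    demand = np.asarray(demand, dtype=float)
    nonzero = demand[demand > 0]
    if len(nonzero) == 0:
        return "no demand"
    # ADI: average number of periods per non-zero demand occurrence
    adi = len(demand) / len(nonzero)
    # CV^2 of the non-zero demand sizes
    cv2 = (nonzero.std() / nonzero.mean()) ** 2
    if adi < adi_cut and cv2 < cv2_cut:
        return "smooth"          # regular timing, regular size
    if adi >= adi_cut and cv2 < cv2_cut:
        return "intermittent"    # sporadic timing, regular size
    if adi < adi_cut and cv2 >= cv2_cut:
        return "erratic"         # regular timing, variable size
    return "lumpy"               # sporadic timing, variable size

# A sparse, highly variable series falls in the "lumpy" quadrant
print(classify_demand([0, 0, 5, 0, 0, 0, 40, 0, 1, 0, 0, 0]))
```

Items landing in the intermittent or lumpy quadrants are exactly those for which the Poisson assumption of the traditional METRIC becomes questionable.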

Keywords: METRIC, inventory management, irregular demand, spare parts

Procedia PDF Downloads 337
14164 Comprehensive Feature Extraction for Optimized Condition Assessment of Fuel Pumps

Authors: Ugochukwu Ejike Akpudo, Jang-Wook Hur

Abstract:

The increasing demand for improved productivity, maintainability, and reliability has prompted rapidly growing research on the emerging condition-based maintenance concept: prognostics and health management (PHM). Varieties of fuel pumps serve critical functions in several hydraulic systems; hence, their failure can have daunting effects on productivity, safety, etc. The need for condition monitoring and assessment of these pumps cannot be overemphasized, and this has led to an upsurge of research on standard feature extraction techniques for the optimized condition assessment of fuel pumps. By extracting time-based, frequency-based, and the more robust time-frequency-based features from these vibration signals, a more comprehensive feature assessment (and selection) can be achieved for a more accurate and reliable condition assessment of these pumps. With the aid of dimensionality-reduction algorithms like locally linear embedding (LLE), we propose a method for the comprehensive condition assessment of electromagnetic fuel pumps (EMFPs). Results show that LLE, as a comprehensive feature extraction technique, yields better feature fusion/dimensionality reduction results for the condition assessment of EMFPs than the use of single features. Also, unlike other feature fusion techniques, its capabilities as a fault classification technique were explored, and the results show an acceptable accuracy level using standard performance metrics for evaluation.
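As an illustration of the kind of feature fusion described above, the following minimal sketch projects a stacked feature matrix onto a low-dimensional embedding using scikit-learn's LocallyLinearEmbedding; the synthetic feature matrix and parameter choices are assumptions for demonstration, not the authors' pump dataset or pipeline.

```python
import numpy as np
from sklearn.manifold import LocallyLinearEmbedding

rng = np.random.default_rng(0)
# Stand-in feature matrix: each row stacks time-, frequency- and
# time-frequency-based features from one vibration window.
features = rng.normal(size=(200, 20))  # 200 windows, 20 features

# Fuse the 20-dimensional feature vectors into 2 LLE components
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
fused = lle.fit_transform(features)
print(fused.shape)
```

The fused low-dimensional coordinates can then be fed to a downstream classifier or regression model for fault diagnosis, which is the role the abstract attributes to the LLE-based features.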

Keywords: electromagnetic fuel pumps, comprehensive feature extraction, condition assessment, locally linear embedding, feature fusion

Procedia PDF Downloads 110
14163 Evaluating Perceived Usability of ProxTalker App Using Arabic Standard Usability Scale: A Student's Perspective

Authors: S. AlBustan, B. AlGhannam

Abstract:

This oral presentation discusses a proposal for a study that evaluates the usability of an evidence-based application named ProxTalker App. The significance of this study is to inform the administration and faculty of the Department of Communication Sciences and Disorders (CDS), College of Life Sciences, Kuwait University, as to whether the app is a suitable tool for CDS students. A case study will be used involving a sample of CDS students taking practicum and internship courses during the academic year 2018/2019, following a process used by a previous study. The process of calculating SUS is well documented and will be followed. ProxTalker App is an alternative and augmentative communication tool that speech-language pathologists (SLPs) can use to customize boards for their clients. SLPs can customize different boards using this app for various activities; for example, a board can be created to improve and support receptive and expressive language. Using technology to support therapy can help SLPs integrate ProxTalker App into their clients' therapy. Supporting tools, games and motivation are some advantages of incorporating apps during therapy sessions. A quantitative methodology will be used, utilizing a standard tool adapted to the Arabic language to accommodate native Arabic speakers: the Arabic Standard Usability Scale (A-SUS) questionnaire, an adaptation of the System Usability Scale (SUS). Standard usability questionnaires are reliable and valid, and their process is properly documented. This study builds upon the development of A-SUS, a psychometrically evaluated questionnaire that targets native Arabic speakers. The usability results will give a preliminary indication of whether the ProxTalker App under investigation is appropriate to be integrated within the practicum and internship curriculum of CDS.
The results of this study will inform the CDS department whether this specific app is an appropriate tool for our specific students within our environment, because usability depends on the product, the environment, and the users.
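For readers unfamiliar with the well-documented SUS scoring the abstract mentions (which A-SUS inherits), it can be sketched as follows: each of the ten items is answered on a 1-5 scale, odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is scaled to 0-100. The example responses below are hypothetical.

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 responses.
    Odd-numbered items contribute (response - 1); even-numbered items
    contribute (5 - response); the sum is multiplied by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses on a 1-5 scale")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)
                for i, r in enumerate(responses))  # i=0 is item 1
    return total * 2.5

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best case: 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Scores are typically interpreted against published benchmarks (a common rule of thumb places average usability near 68), which is how a study such as this one can judge whether an app is "appropriate" for its users.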

Keywords: A-SUS, communication disorders practicum, evidence based app, Standard Usability Scale

Procedia PDF Downloads 140
14162 A Picture Is Worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels

Authors: Tal Remez, Or Litany, Alex Bronstein

Abstract:

The pursuit of smaller pixel sizes at ever increasing resolution in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has been recently embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, producing a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty, but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data fitting term with a sparse synthesis prior. We also show an efficient hardware-friendly real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
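As a toy illustration of the Poisson statistics involved (not the authors' sparse-prior reconstruction algorithm), the sketch below simulates one-bit pixels with a threshold of one photon and recovers the light intensity for a single output pixel with the closed-form ML estimate: if a fraction k/N of the N jots fire, the estimate is -ln(1 - k/N). All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

lam_true = 0.7        # mean photons per jot (hypothetical)
n_jots = 100_000      # binary jots oversampling one output pixel

# Each jot fires iff it absorbs >= 1 photon; under Poisson
# statistics this happens with probability 1 - exp(-lam).
photons = rng.poisson(lam_true, size=n_jots)
fired = photons >= 1  # the one-bit measurements

# Closed-form ML estimate of the intensity from the firing fraction
k = fired.sum()
lam_hat = -np.log(1 - k / n_jots)
print(f"true intensity {lam_true}, ML estimate {lam_hat:.3f}")
```

This per-pixel estimator is exactly the regime where extreme quantization makes standard image-processing assumptions break down; the paper's contribution is to regularize such ML data-fitting with a sparse synthesis prior and approximate the resulting inverse operator efficiently.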

Keywords: binary pixels, maximum likelihood, neural networks, sparse coding

Procedia PDF Downloads 191
14161 Seismic Performance of Concrete Moment Resisting Frames in Western Canada

Authors: Ali Naghshineh, Ashutosh Bagchi

Abstract:

Performance-based seismic design concepts are increasingly being adopted in various jurisdictions. While the National Building Code of Canada (NBCC) is not fully performance-based, it provides some features of a performance-based code, such as displacement control and objective-based solutions. Performance evaluation is an important part of performance-based design. In this paper, the seismic performance of a set of code-designed 4-, 8- and 12-story moment-resisting concrete frames located in Victoria, BC, in western Canada, is studied at different hazard levels, namely SLE (Service Level Event), DLE (Design Level Event) and MCE (Maximum Considered Event). The seismic performance of these buildings has been evaluated based on FEMA 356 and ATC 72 procedures and nonlinear time history analysis. Pushover analysis has been used to investigate the different performance levels of these buildings and adjust their design based on the corresponding target displacements. Since pushover analysis ignores higher-mode effects, nonlinear dynamic time history analysis using a set of ground motion records has been performed. Different types of ground motion records, such as crustal and subduction earthquake records, have been used in the dynamic analysis to determine their effects. Results obtained from pushover analysis on inter-story drift, displacement, shear and overturning moment are compared to those from the dynamic analysis.

Keywords: seismic performance, performance-based design, concrete moment resisting frame, crustal earthquakes, subduction earthquakes

Procedia PDF Downloads 256
14160 Language Shapes Thought: An Experimental Study on English and Mandarin Native Speakers' Sequencing of Size

Authors: Hsi Wei

Abstract:

Does the language we speak affect the way we think? This question has long been discussed from different aspects. In this article, the issue is examined with an experiment on how speakers of different languages tend to sequence the size of everyday objects. An essential difference between English and Mandarin usage is the order in which we sequence the size of places or objects. In English, when describing the location of something, we may say, for example, ‘The pen is inside the trashcan next to the tree at the park.’ In Mandarin, however, we would say, ‘The pen is at the park next to the tree inside the trashcan.’ Clearly, English generally sequences from small to big, while Mandarin does the opposite. An experiment was therefore conducted to test whether this difference between the languages affects speakers' ability to perform the two kinds of sequencing. There were two groups of subjects: one consisted of native English speakers, the other of native Mandarin speakers. In the experiment, nouns were shown to the subjects in groups of three in their native languages. Before seeing the nouns, subjects first received an instruction: ‘big to small’, ‘small to big’, or ‘repeat’. They then had to sequence the following group of nouns as instructed, or simply repeat them. After completing each sequencing or repetition in their minds, they pushed a button as a reaction. The repetition condition was designed to measure a person's baseline reading time. The results showed that native English speakers reacted more quickly to the ‘small to big’ sequencing; native Mandarin speakers, on the other hand, reacted more quickly to the ‘big to small’ sequencing. To conclude, this study may be of importance as support for linguistic relativism: the language we speak does shape the way we think.

Keywords: language, linguistic relativism, size, sequencing

Procedia PDF Downloads 276
14159 The Strategic Entering Time of a Commerce Platform

Authors: Chia-li Wang

Abstract:

The surge of service and commerce platforms, such as e-commerce and internet-of-things platforms, has rapidly changed our lives. How to avoid congestion and get the job done on the platform is now a common problem that many people encounter every day, and it requires platform users to decide when to enter the platform. To that end, we investigate the strategic entering time for a simple platform containing random numbers of buyers and sellers of some item. Upon a trade, the buyer and the seller gain respective profits, yet they pay the cost of waiting on the platform. To maximize their expected payoffs from trading, both buyers and sellers can choose their entering times. This creates an interesting and practical framework of a game that is played among buyers, among sellers, and between the two. That is, a strategy employed by a player is played not only against players of its own type but also as a response to those of the other type, so a strategy profile is composed of the strategies of buyers and sellers. The players' best response, the Nash equilibrium (NE) strategy profile, is derived from a pair of differential equations, which, in turn, are used to establish its existence and uniqueness. More importantly, its structure sheds valuable insight into how the entering strategy of one side (buyers or sellers) is affected by the entering behavior of the other side. These results provide a base for the study of dynamic pricing under stochastic demand-supply imbalances. Finally, comparisons between the social welfares (the sum of the payoffs of the individual participants) obtained by the optimal strategy and by the NE strategy are conducted to show the efficiency loss relative to the socially optimal solution. That should help to manage the platform better.

Keywords: double-sided queue, non-cooperative game, nash equilibrium, price of anarchy

Procedia PDF Downloads 77
14158 A Rotating Facility with a High Temporal and Spatial Resolution Particle Image Velocimetry System to Investigate the Turbulent Boundary Layer Flow

Authors: Ruquan You, Haiwang Li, Zhi Tao

Abstract:

A time-resolved particle image velocimetry (PIV) system is developed to investigate boundary layer flow under the effects of the rotating Coriolis force and the buoyancy force. This time-resolved PIV system consists of a 10 W continuous laser diode and a high-speed camera. The laser diode provides a light sheet less than 1 mm thick, and the high-speed camera captures 6400 frames per second at 1024×1024 pixels. The laser and the camera are both mounted on the rotating facility, which has a radius of 1 meter and rotates at up to 500 revolutions per minute, so the boundary layer flow velocity can be measured directly at rotating conditions in the rotating channel with and without ribs. To investigate the effect of the buoyancy force, transparent heater glasses are used to provide a constant thermal heat flux; density differences are thereby generated near the channel wall, and the buoyancy force can be simulated while the channel is rotating. Owing to the high temporal and spatial resolution of the system, proper orthogonal decomposition (POD) can be applied to analyze the characteristics of the turbulent boundary layer flow at rotating conditions. With this rotating facility and PIV system, the velocity profile, Reynolds shear stress, spatial and temporal correlations, and the POD modes of the turbulent boundary layer flow can be discussed.
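
The proper orthogonal decomposition mentioned above is commonly computed from the PIV snapshot matrix via a singular value decomposition; a minimal sketch, with a randomly generated stand-in for the measured velocity fields:

```python
import numpy as np

# Hypothetical snapshot matrix: each of the 50 columns is one instantaneous
# velocity field (here 200 spatial points), i.e. one PIV frame.
rng = np.random.default_rng(0)
n_points, n_snapshots = 200, 50
snapshots = rng.standard_normal((n_points, n_snapshots))

# Subtract the temporal mean so the modes describe fluctuations only
fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)

# POD via singular value decomposition: the columns of U are the spatial modes,
# and the squared singular values give the energy content of each mode.
U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
energy = s**2 / np.sum(s**2)

print("energy captured by first 5 modes:", energy[:5].sum())
```

In a real turbulent boundary layer the leading modes capture a large energy fraction, which is what makes the decomposition useful for identifying coherent structures.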

Keywords: rotating facility, PIV, boundary layer flow, spatial and temporal resolution

Procedia PDF Downloads 173
14157 Velocity Profiles of Vowel Perception by Javanese and Sundanese English Language Learners

Authors: Arum Perwitasari

Abstract:

Learning L2 sounds is influenced by the first language (L1) sound system. The current study examines how listeners with different L1 vowel systems perceive L2 sounds. The fact that English has a larger vowel inventory than Javanese and Sundanese might cause problems for Javanese and Sundanese English language learners perceiving English sounds. To reveal L2 sound perception over time, we measured the mouse trajectories produced by the hand movements of Javanese and Sundanese language learners, speakers of two local languages of Indonesia. Do Javanese and Sundanese listeners show higher velocity than English listeners when they perceive English vowels that are similar or new relative to their L1 system? The study aims to map the patterns of real-time processing through the accompanying hand movements, revealing any uncertainty in making selections. The results showed that Javanese listeners exhibited significantly slower velocity values than English listeners for the similar vowels /I, ɛ, ʊ/ in the 826–1200 ms post-stimulus window. Unlike the Javanese, the Sundanese listeners showed slow velocity values for all the similar vowels except /ʊ/. For the perception of the new vowels /i:, æ, ɜ:, ʌ, ɑː, u:, ɔ:/, Javanese listeners showed slower velocity in making the lexical decision. In contrast, Sundanese listeners showed slow velocity only for the vowels /ɜ:, ɔ:, æ, I/, indicating that these vowels are hard to perceive. Our results fit well with second language models of how the L1 vowel system influences L2 sound perception.
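
A velocity profile of the kind analyzed here can be derived from sampled mouse trajectories by finite differences; the sketch below uses a synthetic trajectory, since the study’s data are not given:

```python
import numpy as np

# Hypothetical mouse trajectory sampled at 100 Hz: time (s) and position (px).
t = np.linspace(0.0, 1.2, 121)   # 1.2 s of movement, dt = 0.01 s
x = 300 * t**2                   # accelerating toward the response target
y = 50 * t

# Instantaneous speed = norm of the finite-difference displacement / dt
dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
speed = np.sqrt(dx**2 + dy**2) / dt   # pixels per second

# Average speed inside a post-stimulus window, e.g. 0.826-1.2 s
window = (t[1:] >= 0.826) & (t[1:] <= 1.2)
print("mean speed in window:", speed[window].mean())
```

Comparing such window-averaged speeds between listener groups is the kind of contrast the study reports (slower values indicating more hesitant selections).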

Keywords: velocity profiles, EFL learners, speech perception, experimental linguistics

Procedia PDF Downloads 210
14156 Context Detection in Spreadsheets Based on Automatically Inferred Table Schema

Authors: Alexander Wachtel, Michael T. Franzen, Walter F. Tichy

Abstract:

Programming requires years of training. With natural language and end-user development methods, programming could become available to everyone, enabling end users to program their own devices and extend the functionality of an existing system without any knowledge of programming languages. In this paper, we describe an Interactive Spreadsheet Processing Module (ISPM), a natural language interface to spreadsheets that allows users to address ranges within the spreadsheet based on an inferred table schema. Using the ISPM, end users are able to search for values in the schema of the table and to address the data in spreadsheets implicitly. Furthermore, it enables them to select and sort the spreadsheet data using natural language. ISPM uses a machine learning technique to automatically infer areas within a spreadsheet, including different kinds of headers and data ranges. Since ranges can be identified from natural language queries, end users can query the data using natural language. During the evaluation, 12 undergraduate students were asked to perform operations (sum, sort, group and select) using the system and also using Excel without the ISPM interface, and the time taken for task completion was compared across the two systems. Only for the selection task did users take less time in Excel than in ISPM, since they directly selected the cells using the mouse. By bringing natural language to end-user software engineering, ISPM aims to overcome the present bottleneck of relying on professional developers.
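
The schema inference described above can be sketched, in a drastically simplified form, as detecting the first fully non-numeric row as the header and resolving column queries against it (a toy illustration, not the ISPM’s machine learning approach):

```python
def _is_number(cell):
    """Return True if the cell parses as a number."""
    try:
        float(cell)
        return True
    except ValueError:
        return False

def infer_header(rows):
    """Treat the first row that is entirely non-numeric as the header row."""
    for i, row in enumerate(rows):
        if all(not _is_number(cell) for cell in row):
            return i, row
    raise ValueError("no header row found")

def sum_column(rows, column_name):
    """Answer a query like 'sum of Price' against the inferred schema."""
    header_idx, header = infer_header(rows)
    col = header.index(column_name)
    return sum(float(row[col]) for row in rows[header_idx + 1:])

# Hypothetical spreadsheet content as a list of rows
sheet = [
    ["Item", "Price", "Qty"],
    ["pen", "1.50", "3"],
    ["book", "12.00", "1"],
]
print(sum_column(sheet, "Price"))
```

Once the header row and data range are known, a natural language query only needs to map its noun ("Price") onto a schema entry before the operation can be executed.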

Keywords: natural language processing, natural language interfaces, human computer interaction, end user development, dialog systems, data recognition, spreadsheet

Procedia PDF Downloads 296
14155 Distinct Patterns of Resilience Identified Using Smartphone Mobile Experience Sampling Method (M-ESM) and a Dual Model of Mental Health

Authors: Hussain-Abdulah Arjmand, Nikki S. Rickard

Abstract:

The response to stress can be highly heterogeneous and may be influenced by methodological factors. The integrity of the data is optimized by measuring both positive and negative affective responses to an event, by measuring responses in real time as close to the stressful event as possible, and by utilizing data collection methods that do not interfere with naturalistic behaviours. The aim of the current study was to explore short-term prototypical responses to major stressor events on outcome measures encompassing both positive and negative indicators of psychological functioning. A novel mobile experience sampling methodology (m-ESM) was utilized to monitor affective responses to stressors in real time. A smartphone mental health app (‘Moodprism’), which prompts users daily to report both their positive and negative mood as well as whether any significant event had occurred in the past 24 hours, was developed for this purpose. A sample of 142 participants was recruited as part of the promotion of this app. Participants’ daily reported experience of stressor events, levels of depressive symptoms, and positive affect were collected across a 30-day period as they used the app. For each participant, major stressor events were identified based on the subjective severity of the event as rated by the user. Depression and positive affect ratings were extracted for the three days following the event. Responses to the event were scaled relative to the participant’s general reactivity across the remainder of the 30-day period. Participants were first clustered into groups based on initial reactivity and subsequent recovery following a stressor event. This revealed distinct patterns of responding in depressive symptomatology and positive affect. Participants were then grouped based on their allocations to clusters in each outcome variable. A highly individualised manner of responding to stressor events, in symptoms of depression and levels of positive affect, was observed.
A complete description of the novel profiles identified will be presented at the conference. These findings suggest that real-time measurement of both positive and negative functioning after stressors yields a more complex set of responses than previously observed with retrospective reporting. The use of smartphone technology to measure individualized responding also proved to yield significant insight.
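
The scaling of event responses against a participant’s general reactivity can be sketched as z-scores against the non-event baseline; the daily ratings below are hypothetical:

```python
from statistics import mean, pstdev

# Hypothetical daily mood ratings (0-10) over 30 days for one participant;
# a major stressor event is reported on day 10 (index 9).
ratings = [6, 7, 6, 7, 6, 7, 6, 6, 7, 2, 3, 5, 6, 7, 6, 7, 6, 6, 7, 6,
           7, 6, 6, 7, 6, 7, 6, 7, 6, 7]
event_day = 9

# Baseline = mean and spread of all days outside the event window
# (the event day plus the three following days).
window = set(range(event_day, event_day + 4))
baseline = [r for i, r in enumerate(ratings) if i not in window]
mu, sigma = mean(baseline), pstdev(baseline)

# Reactivity: how far the event-day rating departs from baseline, in SD units.
# Recovery: z-scores of the three following days, expected to drift back to ~0.
reactivity = (ratings[event_day] - mu) / sigma
recovery = [(ratings[event_day + k] - mu) / sigma for k in (1, 2, 3)]

print(f"reactivity={reactivity:.2f}")
print("recovery:", [f"{z:.2f}" for z in recovery])
```

Clustering participants on such (reactivity, recovery) features is one way to recover the distinct resilience patterns the abstract describes.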

Keywords: depression, experience sampling methodology, positive functioning, resilience

Procedia PDF Downloads 232
14154 The Use of Artificial Intelligence in Digital Forensics and Incident Response in a Constrained Environment

Authors: Dipo Dunsin, Mohamed C. Ghanem, Karim Ouazzane

Abstract:

Digital investigators often have a hard time spotting evidence in digital information, and it has become hard to determine which source of proof relates to a specific investigation. A growing concern is that the various processes, technologies, and specific procedures used in digital investigation are not keeping up with criminal developments, and criminals are taking advantage of these weaknesses to commit further crimes. In digital forensics investigations, artificial intelligence (AI) is invaluable in identifying crime; it has been observed that AI-based algorithms are highly effective in detecting risks, preventing criminal activity, and forecasting illegal activity. The goal of digital forensics and digital investigation is to provide objective data and conduct an assessment that will assist in developing a plausible theory that can be presented as evidence in court; researchers and other authorities have used such data as evidence in court to convict a person. This research paper aims to develop a multiagent framework for digital investigations using specific intelligent software agents (ISA). The agents communicate to jointly address particular tasks and keep the same objectives in mind during each task. The rules and knowledge contained within each agent depend on the investigation type. A criminal investigation is classified quickly and efficiently using the case-based reasoning (CBR) technique. The MADIK framework is implemented using the Java Agent Development Framework in Eclipse, with a Postgres repository and a rule engine for agent reasoning. The proposed framework was tested using the Lone Wolf image files and datasets. Experiments were conducted using various sets of ISA and VMs, and there was a significant reduction in the time taken for the Hash Set Agent to execute.
As a result of loading the agents, 5 percent of the time was lost, as the File Path Agent prescribed deleting 1,510 while the Timeline Agent found multiple executable files. In comparison, the integrity check carried out on the Lone Wolf image file using a digital forensic toolkit took approximately 48 minutes (2,880 s), whereas the MADIK framework accomplished this in 16 minutes (960 s). The framework is integrated with Python, allowing further integration of other digital forensic tools, such as AccessData Forensic Toolkit (FTK), Wireshark, Volatility, and Scapy.
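
A Hash Set Agent of the kind timed above typically matches file hashes against a set of known values; a minimal, hypothetical sketch (not the MADIK implementation):

```python
import hashlib
import os
import tempfile

def sha256_of(path, chunk_size=65536):
    """Hash one file in chunks so large evidence images do not exhaust memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def hash_set_agent(root, known_hashes):
    """Walk an evidence directory, flagging files whose hash is in the known set."""
    hits = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if sha256_of(path) in known_hashes:
                hits.append(path)
    return hits

# Demonstration with a hypothetical evidence directory
with tempfile.TemporaryDirectory() as root:
    with open(os.path.join(root, "a.bin"), "wb") as f:
        f.write(b"contraband")
    with open(os.path.join(root, "b.bin"), "wb") as f:
        f.write(b"harmless")
    known = {hashlib.sha256(b"contraband").hexdigest()}
    flagged = hash_set_agent(root, known)
    print(len(flagged), "file(s) flagged")
```

In a multiagent framework, such an agent would report its hits back to the coordinating agents rather than print them.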

Keywords: artificial intelligence, computer science, criminal investigation, digital forensics

Procedia PDF Downloads 199
14153 Analysis of the Advent of Multinational Corporations in Developing Countries: Case Study of Nike Factories Expansion in Vietnam

Authors: Khue Do Phan

Abstract:

Nike has been confronted by the press over the harsh working conditions, underpayment, highly labor-intensive demands on its manufacturing workers, and hiring of underage workers in Vietnam, Nike's largest production center. To analyze this topic critically from an international relations perspective, dependency theory will be used to criticize the exploitation of the resources of developing countries by developed countries, while the theory of economic liberalism will be used to support the notion that private property, the free market, and capitalism generally benefit both developing and developed countries. Workers are mentally, physically, and sexually abused in the factories. In addition, their working conditions involve improper training, a lack of safety equipment, and exposure to chemicals (glues and paints); their average wage is below the minimum wage in their country; and they have to work around 60 hours or more a week. Nike itself says that conditions are regulated often to make sure the workers get a voice to claim their labor rights and a safe working environment. The monitors come to analyze the factories but, in the end, talk to the employers, who are the ones directly abusing the employees. Health benefits are rarely granted to the employees; they are forced to pay their bills first and are reimbursed by the company later. They would also get in trouble for using the bathroom, taking a lunch break, or taking sick days off, because doing so would decrease their hours of work, leading to an even lower wage and an angry employer. With the press criticizing Nike’s lack of respect for human rights and labor rights, Nike has been working on policy making and implementation to deal with the abuses.
Due to its large chains and the great number of outsourcing host countries, the changes that Nike wishes or attempts to make have neither taken effect quickly nor spread to all the countries that host its outsourcing factories.

Keywords: dependency theory, economic liberalism, human rights, outsource

Procedia PDF Downloads 324
14152 Automatic and Highly Precise Modeling for System Optimization

Authors: Stephanie Chen, Mitja Echim, Christof Büskens

Abstract:

Mathematical models are formulated to describe and predict the behavior of a system, and parameter identification is used to adapt the coefficients of the underlying laws of science. For complex systems this approach can be incomplete, and hence imprecise, and moreover too slow to be computed efficiently. Such models may therefore not be applicable to the numerical optimization of real systems, since these techniques require numerous evaluations of the models. Moreover, not all quantities necessary for the identification might be available, so the model must be adapted manually. Therefore, an approach is described that generates models overcoming the aforementioned limitations by focusing not on physical laws but on measured (sensor) data of real systems. The approach is more general, since it generates models for any system detached from the scientific background. Additionally, it can be used in a broader sense, since it is able to automatically identify correlations in the data. The method can be classified as a multivariate data regression analysis. In contrast to many other data regression methods, this variant is also able to identify correlations involving products of variables, not only single variables, which enables a far more precise representation of causal correlations. The basis and explanation of this method come from an analytical background: the series expansion. Another advantage of this technique is the possibility of adapting the generated models in real time during operation. In this way, system changes due to aging, wear, or perturbations from the environment can be taken into account, which is indispensable for realistic scenarios. Since these data-driven models can be evaluated very efficiently and with high precision, they can be used in mathematical optimization algorithms that minimize a cost function, e.g. time, energy consumption, operational costs, or a mixture of them, subject to additional constraints. The proposed method has been successfully tested in several complex applications with strong industrial requirements. The generated models were able to simulate the given systems with an error in precision of less than one percent, and the automatic identification of correlations discovered previously unknown relationships. In summary, the approach described above is able to efficiently compute highly precise, real-time-adaptive, data-based models in different fields of industry. Combined with an effective mathematical optimization algorithm such as WORHP (We Optimize Really Huge Problems), complex systems can now be represented by a high-precision model and optimized according to the user's wishes. The proposed methods will be illustrated with different examples.
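
The distinguishing feature named above, regression over products of variables, can be sketched with an ordinary least-squares fit on an extended basis; the data and coefficients below are illustrative only:

```python
import numpy as np

# Hypothetical sensor data: the true system obeys y = 3*x1 + 2*x1*x2,
# a correlation that regression on single variables alone would miss.
rng = np.random.default_rng(1)
x1 = rng.uniform(-1, 1, 100)
x2 = rng.uniform(-1, 1, 100)
y = 3 * x1 + 2 * x1 * x2

# Regression basis in the spirit of a series expansion:
# a constant, the single variables, and products of variables.
basis = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)

print("identified coefficients [1, x1, x2, x1*x2]:", np.round(coeffs, 3))
```

Because the fitted model is just a weighted sum of basis terms, it can be evaluated extremely cheaply inside an optimization loop, which is the property the approach exploits.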

Keywords: adaptive modeling, automatic identification of correlations, data based modeling, optimization

Procedia PDF Downloads 395
14151 A Case Study on Post-Occupancy Evaluation of User Satisfaction in Higher Educational Buildings

Authors: Yuanhong Zhao, Qingping Yang, Andrew Fox, Tao Zhang

Abstract:

Post-occupancy evaluation (POE) is a systematic approach to assessing actual building performance after a building has been occupied for some time. In this paper, a structured POE assessment was conducted using the building use survey (BUS) methodology in two higher educational buildings in the United Kingdom. The study aims to help close the building performance gap, provide optimized building operation suggestions, and improve occupants’ satisfaction levels. The questionnaire survey investigated the influences of environmental factors on user satisfaction in terms of overall building design, thermal comfort, perceived control, and indoor environment quality (noise, lighting, and ventilation), along with non-environmental factors such as background information about age, sex, time in the building, and workgroup size. The results indicate that the occupant satisfaction levels with overall building design, indoor environment quality, and thermal comfort in summer and winter in both buildings were lower than the benchmark data. The feedback from this POE assessment has been reported to the building management team to allow managers to develop high-performance building operation plans. Finally, this research provides improvement suggestions for the building operation system to narrow the performance gap and improve user satisfaction with the work experience and productivity levels.

Keywords: building performance assessment systems, higher educational buildings, post-occupancy evaluation, user satisfaction

Procedia PDF Downloads 146
14150 Effect of Architecture and Operating Conditions of Vehicle on Bulb Lifetime in Automotive

Authors: Hatice Özbek, Caner Çil, Ahmet Rodoplu

Abstract:

Automotive lighting is a leading function in the configuration of vehicle architecture. Headlights and taillights, among the external lighting functions, are in particular among the structures that determine the stylistic character of the vehicle. At the same time, the fact that lighting functions are related to many other functions brings difficulties in design. Customers expect maximum quality from the vehicle. Under these circumstances, it is necessary to produce designs that keep the performance of bulbs, which have limited working lives, at the highest level. In this study, the factors that influence the working lives of filament lamps were examined, and by determining their relations with electrical, dynamic, and static variables, bulb explosions that could otherwise occur sooner than anticipated were prevented while the vehicle was still in the design phase. The filaments of the bulbs used in the front lighting of the vehicle, in particular, deform in a shorter time due to the high voltage requirement. In addition, the rear lighting lamps vibrate as the tailgate opens and closes, exposing the filaments to high stress. In this study, the findings that cause bulb explosions were evaluated. Among the most important findings are: 1. the effect of the structure of the cables running to the vehicle's lighting functions and of the voltage values; 2. the effect of vibration on the bulb throughout the life of the vehicle; 3. the effect of the loads transmitted to the bulb while the vehicle doors are opened and closed. At the end of the study, maximum bulb lifetimes were achieved through optimum changes made to the vehicle architecture based on the findings obtained.

Keywords: vehicle architecture, automotive lighting functions, filament lamps, bulb lifetime

Procedia PDF Downloads 147
14149 Detection and Distribution Pattern of Prevalent Genotypes of Hepatitis C in a Tertiary Care Hospital of Western India

Authors: Upasana Bhumbla

Abstract:

Background: Hepatitis C virus (HCV) is a major cause of chronic hepatitis, which can further lead to cirrhosis of the liver and hepatocellular carcinoma; worldwide, the burden of hepatitis C infection has become a serious threat to the human race. HCV has population-specific genotypes that provide valuable epidemiological and therapeutic information, and genotyping and assessment of viral load in HCV patients are important for planning therapeutic strategies. The aim of this work is to study the changing trends in the prevalence and genotypic distribution of hepatitis C virus in a tertiary care hospital in Western India. Methods: In this retrospective study, blood samples were collected and tested for anti-HCV antibodies by ELISA in the Department of Microbiology. In seropositive hepatitis C patients, quantification of HCV-RNA was done by real-time PCR, and in HCV-RNA-positive samples genotyping was conducted. Results: A total of 114 patients seropositive for anti-HCV were recruited in the study, of whom 79 (69.29%) were HCV-RNA positive. Of these positive samples, 54 were further subjected to genotype determination using real-time PCR. Genotype was not detected in 24 samples due to low viral load; 30 samples were positive for genotype. Conclusion: Knowledge of the genotype is crucial for the management of HCV infection and prediction of prognosis. Patients infected with HCV genotypes 1 and 4 have to receive interferon and ribavirin for 48 weeks; patients with these genotypes show a poor sustained viral response when tested 24 weeks after completion of therapy. On the contrary, patients infected with HCV genotypes 2 and 3 are reported to have a better response to therapy.

Keywords: hepatocellular, genotype, ribavarin, seropositive

Procedia PDF Downloads 123
14148 Effect of Thermal Energy on Inorganic Coagulation for the Treatment of Industrial Wastewater

Authors: Abhishek Singh, Rajlakshmi Barman, Tanmay Shah

Abstract:

Coagulation is considered one of the predominant water treatment processes, improving the cost-effectiveness of wastewater treatment. The purpose of this experiment on thermal coagulation is to increase the efficiency and the rate of the reaction. The process uses renewable sources of energy and an improved, time-minimized method intended to help alleviate water scarcity in regions on the brink of depletion. This paper covers the various effects of temperature on the standard coagulation treatment of wastewater and their effect on water quality. In addition, the coagulation is carried out with an admixture of bottom/fly ash that acts as an adsorbent and removes most of the minor and macro particles by means of adsorption, which not only helps to reduce the environmental burden of fly ash but also brings economic benefit. The method of sand filtration is also incorporated into the process; the sand filter is an environmentally friendly wastewater treatment method that is relatively simple and inexpensive. The experimental results obtained in this study satisfied the existing parameters and were found satisfactory. The initial turbidity of the wastewater was 162 NTU, and its initial temperature was 27 °C. The temperature variation of the entire process was 50–80 °C, and the concentration of alum in the wastewater ranged from 60 mg/L to 320 mg/L. The turbidity range after treatment was 8.31–28.1 NTU, with pH varying from 7.73 to 8.29. The effective time taken was 10 minutes for thermal mixing and sedimentation. The results indicate that the presence of thermal energy affects the coagulation treatment process. The influence of thermal energy on turbidity is assessed along with renewable energy sources and the increased rate of reaction of the treatment process.

Keywords: adsorbent, sand filter, temperature, thermal coagulation

Procedia PDF Downloads 316
14147 A Reduced Ablation Model for Laser Cutting and Laser Drilling

Authors: Torsten Hermanns, Thoufik Al Khawli, Wolfgang Schulz

Abstract:

In laser cutting, as well as in long-pulsed laser drilling of metals, it can be demonstrated that the ablation shape that forms (the shape of the cut faces in cutting, or the hole shape in drilling) approaches a so-called asymptotic shape, such that it changes only slightly, or not at all, with further irradiation. These findings are already known from the ultrashort pulse (USP) ablation of dielectric and semiconducting materials. The explanation for the occurrence of an asymptotic shape in laser cutting and long-pulse drilling of metals is identified, and its underlying mechanism is numerically implemented, tested, and clearly confirmed by comparison with experimental data. In detail, there is now a model that allows the simulation of the temporal (pulse-resolved) evolution of the hole shape in laser drilling, as well as of the final (asymptotic) shape of the cut faces in laser cutting. This simulation requires far fewer resources, such that it can even run on common desktop PCs or laptops. Individual parameters can be adjusted using sliders; the simulation result appears in an adjacent window and changes in real time. This is made possible by an application-specific reduction of the underlying ablation model. Because this reduction dramatically decreases the complexity of calculation, it produces a result much more quickly, which means that the simulation can be carried out directly at the laser machine. Time-intensive experiments can be reduced and set-up processes completed much faster. The high speed of simulation also opens up a range of entirely different options, such as metamodeling. Suitable for complex applications with many parameters, metamodeling involves generating high-dimensional data sets containing the parameters and several evaluation criteria for process and product quality. These sets can then be used to create individual process maps that show the dependency on individual parameter pairs.
This advanced simulation makes it possible to find global and local extreme values through mathematical manipulation; such simultaneous optimization of multiple parameters is scarcely possible by experimental means, which means that new manufacturing methods such as self-optimization can be executed much faster. The software's potential does not stop there, however; time-intensive calculations exist in many areas of industry. In laser welding or laser additive manufacturing, for example, the simulation of thermally induced residual stresses still uses up considerable computing capacity or is not even possible. Transferring the principle of reduced models promises substantial savings there, too.
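
A reduced model used as a metamodel can be sketched as table lookup plus interpolation over a precomputed parameter grid; the response function below is a hypothetical stand-in for the full ablation simulation:

```python
import numpy as np

def expensive_model(power, speed):
    """Stand-in for a full ablation simulation (hypothetical response surface)."""
    return np.sin(power) * np.exp(-0.5 * speed) + 0.1 * power * speed

# Sample the parameter space once, offline
powers = np.linspace(0.0, 3.0, 31)
speeds = np.linspace(0.0, 2.0, 21)
P, S = np.meshgrid(powers, speeds, indexing="ij")
table = expensive_model(P, S)

def metamodel(power, speed):
    """Bilinear interpolation in the precomputed table: cheap to evaluate."""
    i = np.clip(np.searchsorted(powers, power) - 1, 0, len(powers) - 2)
    j = np.clip(np.searchsorted(speeds, speed) - 1, 0, len(speeds) - 2)
    tp = (power - powers[i]) / (powers[i + 1] - powers[i])
    ts = (speed - speeds[j]) / (speeds[j + 1] - speeds[j])
    return ((1 - tp) * (1 - ts) * table[i, j] + tp * (1 - ts) * table[i + 1, j]
            + (1 - tp) * ts * table[i, j + 1] + tp * ts * table[i + 1, j + 1])

print("metamodel(1.23, 0.77) =", metamodel(1.23, 0.77))
```

Because each evaluation is only a table lookup and a few multiplications, such a surrogate can drive an interactive slider interface or feed an optimizer with thousands of cheap evaluations.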

Keywords: asymptotic ablation shape, interactive process simulation, laser drilling, laser cutting, metamodeling, reduced modeling

Procedia PDF Downloads 208
14146 Miracle Fruit Application in Sour Beverages: Effect of Different Concentrations on the Temporal Sensory Profile and Overall Liking

Authors: Jéssica F. Rodrigues, Amanda C. Andrade, Sabrina C. Bastos, Sandra B. Coelho, Ana Carla M. Pinheiro

Abstract:

Currently, there is great demand for natural sweeteners due to the harmful effects of high sugar and artificial sweetener consumption on health. Miracle fruit, which is known for its unique ability to modify sour taste into sweet taste, has been shown to be a good alternative sweetener. However, it has a high production cost, so it is important to optimize the lowest contents to be used. Thus, the aim of this study was to assess the effect of different miracle fruit contents on the temporal sensory profile (Time-Intensity, TI, and Temporal Dominance of Sensations, TDS) and overall liking of lemonade, to determine the best content to use as a natural sweetener in sour beverages. The TI and TDS results showed that concentrations of 150 mg, 300 mg, and 600 mg miracle fruit were effective in reducing the perceived acidity and promoting the sweet perception in lemonade; furthermore, the 300 mg and 600 mg concentrations produced similar profiles. Through the acceptance test, the 300 mg miracle fruit concentration was shown to be an efficient substitute for sucrose and sucralose in lemonade, since they had similar hedonic values between ‘I liked it slightly’ and ‘I liked it moderately’. Therefore, 300 mg miracle fruit is an adequate content to be used as a natural sweetener for lemonade. The results of this work will help the food industry efficiently apply a new natural sweetener, miracle fruit extract, in sour beverages, reducing costs and providing a product that meets consumer desires.

Keywords: acceptance, natural sweetener, temporal dominance of sensations, time-intensity

Procedia PDF Downloads 240
14145 Cognitive Behaviour Drama: Playful Method to Address Fears in Children on the Higher-End of the Autism Spectrum

Authors: H. Karnezi, K. Tierney

Abstract:

Childhood fears that persist over time and interfere with children’s normal functioning may have detrimental effects on their social and emotional development. Cognitive behavior therapy (CBT) is considered highly effective in treating fears and anxieties. However, given that many childhood fears are based on fantasy, the applicability of CBT may be hindered by cognitive immaturity; a lack of motivation to engage in therapy is another commonly encountered obstacle. The purpose of this study was to introduce and evaluate a more developmentally appropriate intervention model, specifically designed to give phobic children the motivation to overcome their fears. To this end, principles and techniques from cognitive and behavior therapies are incorporated into the ‘Drama in Education’ model. The Cognitive Behaviour Drama (CBD) method uses the phobic children’s creativity to involve them in the therapeutic process. The children are invited to engage in exciting fictional scenarios tailored around their strengths and special interests. Once their commitment to the drama is established, a problem that they will feel motivated to solve is introduced; to resolve it, the children have to overcome a number of obstacles, culminating in an in vivo confrontation with the feared stimulus. The study examined the application of the CBD model in three single cases, and the results in all three cases showed complete elimination of all fear-related symptoms. These preliminary results justify further evaluation of the Cognitive Behaviour Drama model, which is time- and cost-effective and ensures the client’s immediate engagement in the therapeutic process.

Keywords: phobias, autism, intervention, drama

Procedia PDF Downloads 118