Search results for: discourse processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4785

3975 Typologies of Democratic Innovation Proposals - A Critical Literature Review

Authors: Kristof Lukas Heidemann

Abstract:

In response to the present-day worldwide regression in the prevalence and vitality of contemporary democratic systems, proponents of democracy have made several proposals to reverse this global trajectory through constitutional law reforms, creating the democratic innovations discourse. This critical review analyzes the different typologies that have been put forward to systematize the suggested democratic innovations and argues that these typologies all either omit some existing proposals or include overlapping types. The review therefore endorses possible adaptations to the more comprehensive typologies and gives recommendations for further research.

Keywords: citizen participation, constitutional law, deliberative democracy, democracy, democratic innovations, law and legislation, law reform, literature review

Procedia PDF Downloads 5
3974 Modal FDTD Method for Wave Propagation Modeling Customized for Parallel Computing

Authors: H. Samadiyeh, R. Khajavi

Abstract:

A new FD-based procedure, the modal finite difference method (MFDM), is proposed for seismic wave propagation modeling, in which the simulation is carried out in the modal space. The method employs eigenvalues of a characteristic matrix formed by appropriate time-space FD stencils. Since MFD runs for different modes are totally independent of each other, MFDM can easily be parallelized while the parallel algorithm remains considerably simple. There is no need for any domain-decomposition procedure or inter-core data exchange. More importantly, processing of less-significant modes can be skipped, which allows the procedure to be adjusted to the level of accuracy needed. Thus, in addition to considerable ease of parallel programming, computation and storage costs are significantly reduced. The efficiency of the method is demonstrated by several numerical examples.
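
A minimal sketch of the embarrassingly parallel structure described in the abstract: each modal run is independent, so modes can be distributed across processes with no domain decomposition or inter-core data exchange, and less-significant modes can simply be dropped. The update rule, eigenvalues, and parameters below are illustrative assumptions, not the authors' MFDM stencil.

```python
# Hypothetical illustration of independent modal runs processed in parallel.
import numpy as np
from multiprocessing import Pool

def propagate_mode(args):
    """Advance a single mode independently (placeholder update rule)."""
    eigenvalue, initial_amplitude, n_steps, dt = args
    amplitude = initial_amplitude
    history = np.empty(n_steps)
    for step in range(n_steps):
        # Placeholder explicit update; the real method uses eigenvalues of a
        # characteristic matrix built from time-space FD stencils.
        amplitude *= (1.0 - dt * eigenvalue)
        history[step] = amplitude
    return history

if __name__ == "__main__":
    eigenvalues = np.linspace(0.1, 5.0, 64)      # one entry per mode (made up)
    retained = eigenvalues[:32]                  # skip less-significant modes
    tasks = [(lam, 1.0, 1000, 1e-3) for lam in retained]
    with Pool() as pool:
        modal_histories = pool.map(propagate_mode, tasks)  # fully independent runs
    field_history = np.sum(modal_histories, axis=0)        # superpose retained modes
```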

Keywords: Finite Difference Method, Graphics Processing Unit (GPU), Message Passing Interface (MPI), Modal, Wave propagation

Procedia PDF Downloads 296
3973 Processing Studies and Challenges Faced in Development of High-Pressure Titanium Alloy Cryogenic Gas Bottles

Authors: Bhanu Pant, Sanjay H. Upadhyay

Abstract:

Frequently, the upper stage of high-performance launch vehicles utilizes cryogenic tank-submerged pressurization gas bottles with high volume-to-weight efficiency to achieve a direct gain in the satellite payload. Titanium alloys, owing to their high specific strength coupled with excellent compatibility with various fluids, are the materials of choice for these applications. Amongst the titanium alloys, two alloys are suitable for cryogenic applications, namely Ti6Al4V-ELI and Ti5Al2.5Sn-ELI. The two-phase alpha-beta alloy Ti6Al4V-ELI is usable down to the LOX temperature of 90 K, while the single-phase alpha alloy Ti5Al2.5Sn-ELI can be used down to the LHe temperature of 4 K. High-pressure gas bottles submerged in LH2 (20 K) can store more gas than bottles of the same volume submerged in LOX (90 K). Thus, the use of these alpha alloy gas bottles stored at 20 K gives a distinct advantage, since fewer gas bottles are needed to store the same amount of high-pressure gas, which in turn leads to a one-to-one gain in the satellite payload. A cost advantage to the tune of 15,000 $/kg of weight saved in the upper stages, and thereby a corresponding satellite payload gain, is expected from this change. However, the processing of alpha Ti5Al2.5Sn-ELI alloy gas bottles poses challenges due to the lower forgeability of the alloy and the mode of qualification for the critical, severe application environment. The present paper describes the processing and the challenges/solutions during the development of these advanced gas bottles for LH2 (20 K) applications.

Keywords: titanium alloys, cryogenic gas bottles, alpha titanium alloy, alpha-beta titanium alloy

Procedia PDF Downloads 57
3972 The Effect of Fly Ash in Dewatering of Marble Processing Wastewaters

Authors: H. A. Taner, V. Önen

Abstract:

In thermal power plants established to meet energy needs, lignite with low calorific value and high ash content is used. Burning these coals results in wastes such as fly ash, slag and flue gas, which constitute a significant economic and environmental problem. However, fly ash can be put to use in various sectors. In this study, the effectiveness of fly ash for suspended solid removal from marble processing wastewater containing a high concentration of suspended solids was examined. Experiments were carried out for two different suspensions, marble and travertine. In the experiments, FeCl3, Al2(SO4)3 and the anionic polymer A130 were also used for comparison with fly ash. Coagulant/flocculant type and dosage, mixing time and speed, and pH were the experimental parameters. Performance was assessed from the change in interface height during sedimentation and the resultant turbidity values of the treated water. The highest sedimentation efficiency was achieved with the anionic flocculant. However, it was determined that fly ash can be used instead of FeCl3 and Al2(SO4)3 in the travertine plant as a coagulant.

Keywords: dewatering, flocculant, fly ash, marble plant wastewater

Procedia PDF Downloads 152
3971 Multivariate Analysis of Spectroscopic Data for Agriculture Applications

Authors: Asmaa M. Hussein, Amr Wassal, Ahmed Farouk Al-Sadek, A. F. Abd El-Rahman

Abstract:

In this study, a multivariate analysis of potato spectroscopic data is presented to detect the presence or absence of brown rot disease. Near-Infrared (NIR) spectroscopy (1,350-2,500 nm) combined with multivariate analysis was used as a rapid, non-destructive technique for the detection of brown rot disease in potatoes. Spectral measurements were performed on 565 samples, which were chosen randomly at the infection site on the potato slice. In this study, 254 infected and 311 uninfected (brown rot-free) samples were analyzed using different advanced statistical analysis techniques. The discrimination performance of different multivariate analysis techniques, including classification, pre-processing, and dimension reduction, was compared. Applying a random forest classifier with different pre-processing techniques to the raw spectra gave the best performance, as a total classification accuracy of 98.7% was achieved in discriminating infected potatoes from controls.
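
A hedged sketch of the kind of pipeline described above: pre-processing of raw NIR spectra followed by a random forest classifier. The synthetic spectra, pre-processing choice (standard scaling), and hyperparameters are illustrative assumptions standing in for the study's own data and settings; scikit-learn is assumed.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score

# Synthetic spectra standing in for the 565 NIR measurements (1,350-2,500 nm);
# infected samples get a small absorbance shift so the example is learnable.
rng = np.random.default_rng(0)
n_wavelengths = 200
y = np.array([1] * 254 + [0] * 311)            # 254 infected, 311 uninfected
X = rng.normal(size=(565, n_wavelengths))
X[y == 1, 50:80] += 0.5                         # hypothetical disease signature

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

model = make_pipeline(StandardScaler(),
                      RandomForestClassifier(n_estimators=500, random_state=0))
model.fit(X_train, y_train)
print("classification accuracy:", accuracy_score(y_test, model.predict(X_test)))
```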

Keywords: Brown rot disease, NIR spectroscopy, potato, random forest

Procedia PDF Downloads 190
3970 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages available on bigmemory showed that Rhipe was faster than the other packages owing to parallel processing, with the number of map tasks increasing as the size of the data increases. We also compared the computing speeds of the pseudo-distributed and fully-distributed modes for configuring a Hadoop cluster. The results showed that the fully-distributed mode was faster than the pseudo-distributed mode, and that the computing speed of the fully-distributed mode increased with the number of data nodes.
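
The following is not Rhipe/R code; it is a Python sketch of the map-reduce idea underlying parallel multiple regression: each "map task" computes the partial products X'X and X'y on its own block of rows, and the "reduce" step sums the partials and solves the normal equations. Block counts and data are made up for the example.

```python
import numpy as np
from multiprocessing import Pool

def map_task(block):
    """Partial sufficient statistics for one data block."""
    X, y = block
    return X.T @ X, X.T @ y

def reduce_step(partials):
    """Combine partial products and solve for the regression coefficients."""
    XtX = sum(p[0] for p in partials)
    Xty = sum(p[1] for p in partials)
    return np.linalg.solve(XtX, Xty)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    beta_true = np.array([2.0, -1.0, 0.5])
    blocks = []
    for _ in range(8):                                   # 8 "map tasks"
        X = np.column_stack([np.ones(10_000), rng.normal(size=(10_000, 2))])
        y = X @ beta_true + rng.normal(scale=0.1, size=10_000)
        blocks.append((X, y))
    with Pool() as pool:
        beta_hat = reduce_step(pool.map(map_task, blocks))
    print(beta_hat)   # close to beta_true
```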

Keywords: big data, Hadoop, Parallel regression analysis, R, Rhipe

Procedia PDF Downloads 497
3969 Temporal Progression of Episodic Memory as Function of Encoding Condition and Age: Further Investigation of Action Memory in School-Aged Children

Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf

Abstract:

Studies of adults' episodic memory have found that enacted encoding not only improves recall performance but also leads to faster retrieval during the recall period. The current study focused on exploring the temporal progression of different encoding conditions in younger and older school children. 204 students from two age groups, 8 and 14 years, participated in this study. During the study phase, we studied action encoding in two forms: participants performed the phrases by themselves (SPT) or observed the performance of the experimenter (EPT), which were compared with verbal encoding, in which participants listened to verbal action phrases (VT). At the test phase, we used immediate and delayed free recall tests. We observed significant differences in memory performance as a function of age group and encoding condition in both immediate and delayed free recall tests. Moreover, the temporal progression of recall was faster in older children compared with younger ones. The interaction of age group and encoding condition was significant only in delayed recall, showing that younger children performed better in EPT whereas older children outperformed in SPT. It is proposed that the enactment effect in the form of SPT enhances item-specific processing, whereas EPT improves relational information processing, and that these differential processes are responsible for the results obtained in younger and older children. The role of memory strategies and information processing methods in younger and older children was considered in this study. Moreover, the temporal progression of recall was faster in action encoding in the form of SPT and EPT compared with verbal encoding in both immediate and delayed free recall, and the size of the enactment effect increased constantly throughout the recall period. The results of the present study provide further evidence that action memory is explained with an emphasis on the notion of information processing and strategic views. These results also reveal the temporal progression of recall as a new dimension of episodic memory in children.

Keywords: action memory, enactment effect, episodic memory, school-aged children, temporal progression

Procedia PDF Downloads 273
3968 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control

Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay

Abstract:

In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality in an organization are achieved. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small intestine processing of a chicken dressing plant through the use of Statistical Process Control (SPC). Since the operation does not employ a standard procedure and does not have an established standard time, the process, assessed from the observed times of the overall small intestine processing operation using an X-Bar R control chart, was found to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected through the use of the Westinghouse Rating System. Instead of utilizing the traditional motion and time study, the researchers used the X-Bar R control chart to determine the process average that is used for establishing the standard time. The observed times of the normal operator were noted and plotted on the X-Bar R control chart. Out-of-control points that were due to assignable causes were removed, and the process average, or the average time in which the normal operator conducted the process, which was then in control and free from any outliers, was obtained. The process average was then used in determining the standard time of small intestine processing. As a recommendation, the researchers suggest the implementation of the established standard time, which is in consonance with the standard procedure adopted from the normal operator. With that recommendation, the whole operation is expected to achieve a 45.54% increase in productivity.
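
A minimal sketch of the X-Bar R control chart computation described above, using the standard constants for a subgroup size of 5 (A2 = 0.577, D3 = 0, D4 = 2.114). The observed cycle times below are hypothetical; the out-of-control flags correspond to the assignable-cause points that are removed before recomputing the process average.

```python
import numpy as np

A2, D3, D4 = 0.577, 0.0, 2.114             # control chart constants for n = 5
samples = np.array([                        # subgroups of 5 observed times (min)
    [2.1, 2.3, 2.0, 2.2, 2.4],
    [2.2, 2.1, 2.5, 2.3, 2.2],
    [2.0, 2.4, 2.2, 2.1, 2.3],
])

xbar = samples.mean(axis=1)                       # subgroup means
r = samples.max(axis=1) - samples.min(axis=1)     # subgroup ranges

xbar_bar, r_bar = xbar.mean(), r.mean()           # process average and average range
ucl_x, lcl_x = xbar_bar + A2 * r_bar, xbar_bar - A2 * r_bar
ucl_r, lcl_r = D4 * r_bar, D3 * r_bar

# Points outside the limits are assignable-cause candidates; after removing
# them, the recomputed xbar_bar serves as the process average used for the
# standard-time calculation.
out_of_control = (xbar > ucl_x) | (xbar < lcl_x) | (r > ucl_r) | (r < lcl_r)
print(xbar_bar, (lcl_x, ucl_x), (lcl_r, ucl_r), out_of_control)
```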

Keywords: motion and time study, process controlling, statistical process control, X-Bar R Control chart

Procedia PDF Downloads 217
3967 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications

Authors: K. P. Sandesh, M. H. Suman

Abstract:

Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither of the documents. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
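
A hedged sketch of a per-feature similarity that distinguishes the three cases listed above (feature present in both documents, in exactly one, or in neither). The particular scoring scheme and the sample vocabulary are illustrative assumptions; the abstract does not give the authors' exact formula.

```python
def feature_similarity(x, y):
    """Similarity contribution of one feature given its weights in two documents."""
    if x > 0 and y > 0:                      # case 1: present in both
        return 1.0 - abs(x - y) / (x + y)    # closer weights -> higher similarity
    if x > 0 or y > 0:                       # case 2: present in only one
        return 0.0
    return 0.5                               # case 3: absent from both (weak evidence)

def document_similarity(doc_a, doc_b, vocabulary):
    """Average the per-feature contributions over the shared vocabulary."""
    scores = [feature_similarity(doc_a.get(t, 0.0), doc_b.get(t, 0.0)) for t in vocabulary]
    return sum(scores) / len(scores)

vocab = ["loan", "deposit", "tumor", "scan"]
d1 = {"loan": 3.0, "deposit": 1.0}
d2 = {"loan": 2.0, "scan": 1.0}
print(document_similarity(d1, d2, vocab))
```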

Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms

Procedia PDF Downloads 518
3966 A Control Model for the Dismantling of Industrial Plants

Authors: Florian Mach, Eric Hund, Malte Stonis

Abstract:

The dismantling of disused industrial facilities such as nuclear power plants or refineries is an enormous challenge for the planning and control of the logistic processes. Existing control models do not meet the requirements for a proper dismantling of industrial plants. Therefore, the paper presents an approach for the control of dismantling and post-processing processes (e.g. decontamination) in plant decommissioning. In contrast to existing approaches, the dismantling sequence and depth are selected depending on the capacity utilization of required post-processing processes by also considering individual characteristics of respective dismantling tasks (e.g. decontamination success rate, uncertainties regarding the process times). The results can be used in the dismantling of industrial plants (e.g. nuclear power plants) to reduce dismantling time and costs by avoiding bottlenecks such as capacity constraints.

Keywords: dismantling management, logistics planning and control models, nuclear power plant dismantling, reverse logistics

Procedia PDF Downloads 304
3965 Administrators' Information Management Capacity and Decision-Making Effectiveness on Staff Promotion in the Teaching Service Commissions in South – West, Nigeria

Authors: Olatunji Sabitu Alimi

Abstract:

This study investigated the extent to which administrators' information storage, retrieval and processing capacities influence decisions on staff promotion in the Teaching Service Commissions (TESCOMs) in South-West Nigeria. One research question and two research hypotheses were formulated and tested at the 0.05 level of significance. The study used a descriptive survey research design. One hundred (100) staff on salary grade level 09 constituted the sample. Multi-stage, stratified and simple random sampling techniques were used to select 100 staff from the TESCOMs in South-West Nigeria. Two questionnaires titled Administrators' Information Storage, Retrieval and Processing Capacities (AISRPC) and Staff Promotion Effectiveness (SPE) were used for data collection. The inventory was validated and subjected to test-retest, and a reliability coefficient of r = 0.79 was obtained. The data were collected and analyzed using the Pearson Product Moment Correlation coefficient and simple percentages. The study found that administrators at the TESCOMs stored their information in files, hard copies, soft copies, the open registry and departmentally in varying degrees, while they also processed information manually and electronically for decision making. In addition, there is a significant relationship between administrators' information storage and retrieval capacities in the TESCOMs in South-West Nigeria (r cal = 0.598 > r table = 0.195). Furthermore, administrators' information processing capacity and staff promotion effectiveness were found to be significantly related (r cal = 0.209 > r table = 0.195 at the 0.05 level of significance). The study recommended that training, seminars and workshops should be organized for administrators on information management, while educational organizations should provide Information and Communication Technology (ICT) equipment for the administrators in the TESCOMs. The staff of the TESCOMs should be promoted after having satisfied the promotion criteria, such as spending the required number of years on a grade level, a clean record of service and the existence of a vacancy.

Keywords: information processing capacity, staff promotion effectiveness, teaching service commission, Nigeria

Procedia PDF Downloads 533
3964 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopting the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to design mid-size enterprise software by using the Waterfall model, one of the SDLC (Software Development Life Cycle) models, in a cost-effective way. To fulfill the research objectives, in this study, we developed mid-sized enterprise software named "BSK Management System" that assists enterprise software clients with information resource management and with performing complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Picture, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 438
3963 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics

Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh

Abstract:

In the last decade, there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and an increase in the processing of finely disseminated minerals and complex orebodies. These ores have presented new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. Therefore, the correct design of a grinding circuit is important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient and, therefore, vertical stirred grinding mills are becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work presents a proposed methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill. The breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill based on the results obtained from the Bond ball mill. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
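
A minimal sketch of the batch-grinding population balance model underlying the approach above: dm_i/dt = -S_i m_i + sum over j<i of b_ij S_j m_j, where S_j are selection (breakage rate) parameters and b_ij the breakage distribution. The parameter values and the simple explicit time stepping are illustrative assumptions, not fitted values from the study.

```python
import numpy as np

def batch_pbm(m0, S, b, t_end, dt=1e-3):
    """Evolve mass fractions m (coarsest class first) over grinding time t_end."""
    m = m0.copy()
    n = len(m)
    for _ in range(int(t_end / dt)):
        dm = -S * m                                     # mass broken out of each class
        for i in range(1, n):
            dm[i] += np.dot(b[i, :i], S[:i] * m[:i])    # mass broken into class i
        m = m + dt * dm
    return m

n_classes = 5
m0 = np.array([1.0, 0.0, 0.0, 0.0, 0.0])       # all feed in the coarsest class
S = np.array([0.8, 0.5, 0.3, 0.1, 0.0])        # 1/min; finest class does not break
b = np.zeros((n_classes, n_classes))
for j in range(n_classes - 1):
    b[j + 1:, j] = 1.0 / (n_classes - 1 - j)   # broken mass spread evenly below class j
print(batch_pbm(m0, S, b, t_end=10.0))          # product size distribution after 10 min
```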

Keywords: bond ball mill, population balance model, product size distribution, vertical stirred mill

Procedia PDF Downloads 292
3962 Challenging Weak Central Coherence: An Exploration of Neurological Evidence from Visual Processing and Linguistic Studies in Autism Spectrum Disorder

Authors: Jessica Scher Lisa, Eric Shyman

Abstract:

Autism spectrum disorder (ASD) is a neuro-developmental disorder that is characterized by persistent deficits in social communication and social interaction (i.e., deficits in social-emotional reciprocity, nonverbal communicative behaviors, and establishing/maintaining social relationships), as well as by the presence of repetitive behaviors and perseverative areas of interest (i.e., stereotyped or repetitive motor movements, use of objects, or speech, rigidity, restricted interests, and hypo- or hyperreactivity to sensory input or unusual interest in sensory aspects of the environment). Additionally, diagnoses of ASD require the presentation of symptoms in the early developmental period, marked impairments in adaptive functioning, and a lack of explanation by general intellectual impairment or global developmental delay (although these conditions may be co-occurring). Over the past several decades, many theories have been developed in an effort to explain the root cause of ASD in terms of atypical central cognitive processes. The field of neuroscience is increasingly finding structural and functional differences between autistic and neurotypical individuals using neuro-imaging technology. One main area this research has focused upon is visuospatial processing, with specific attention to the notion of 'weak central coherence' (WCC). This paper offers an analysis of findings from selected studies in order to explore research that challenges the 'deficit' characterization of the weak central coherence theory, proposing instead a 'superiority' characterization of strong local coherence. The weak central coherence theory has long been both supported and refuted in the ASD literature and has most recently been increasingly challenged by advances in neuroscience. The selected studies lend evidence to the notion of amplified localized perception rather than deficient global perception. In other words, WCC may represent superiority in 'local processing' rather than a deficit in global processing. Additionally, the right hemisphere and the specific area of the extrastriate appear to be key in both the visual and lexicosemantic processes. Overactivity in the striate region seems to suggest inaccuracy in semantic language, which lends support to the link between the striate region and the atypical organization of the lexicosemantic system in ASD.

Keywords: autism spectrum disorder, neurology, visual processing, weak coherence

Procedia PDF Downloads 127
3961 A Critical Genre Analysis of Negative Parts in CSR Reports

Authors: Shuai Liu

Abstract:

In corporate social responsibility (CSR) reporting, companies are expected to present both the positive and negative aspects of the social and environmental impacts of their performance. This study investigates how companies listed in the Fortune 500 respond to this challenge by analyzing the representations of the negative parts, especially safety performance. It was found that, at the level of genre analysis, the negative parts presented 3 major moves and 11 steps, and in terms of the interdiscursivity analysis they were made up of three dominant discourses. The study calls for greater focus on the internal and external analysis of the negative aspects of companies' self-disclosure.

Keywords: CSR reports, negative parts, critical genre analysis, interdiscursivity

Procedia PDF Downloads 428
3960 A Network of Nouns and Their Features: A Neurocomputational Study

Authors: Skiker Kaoutar, Mounir Maouene

Abstract:

Neuroimaging studies indicate that a large fronto-parieto-temporal network supports nouns and their features, with some areas storing semantic knowledge (visual, auditory, olfactory, gustatory, …), other areas storing lexical representations, and other areas being implicated in general semantic processing. However, it is not well understood how this fronto-parieto-temporal network can be modulated by different semantic tasks and different semantic relations between nouns. In this study, we combine a behavioral semantic network, functional MRI studies involving object-related nouns and brain network studies to explain how different semantic tasks and different semantic relations between nouns can modulate the activity within the brain network of nouns and their features. We first describe how nouns and their features form a large-scale brain network. To this end, we examine the connectivities between areas recruited during the processing of nouns to determine which configurations of interacting areas are possible. We can thus identify whether, for example, brain areas that store semantic knowledge communicate via functional/structural links with areas that store lexical representations. Second, we examine how this network is modulated by different semantic tasks involving nouns and, finally, we examine how category-specific activation may result from the semantic relations among nouns. The results indicate that the brain network of nouns and their features is highly modulated and flexible under different semantic tasks and semantic relations. In the end, this study can be used as a guide to help neuroscientists interpret the pattern of fMRI activations detected in the semantic processing of nouns. Specifically, this study can help to interpret the category-specific activations observed extensively in a large number of neuroimaging studies and clinical studies.

Keywords: nouns, features, network, category specificity

Procedia PDF Downloads 521
3959 Phonological Processing and Its Role in Pseudo-Word Decoding in Children Learning to Read Kannada Language between 5.6 to 8.6 Years

Authors: Vangmayee. V. Subban, Somashekara H. S, Shwetha Prabhu, Jayashree S. Bhat

Abstract:

Introduction and Need: Phonological processing is critical in learning to read alphabetic and non-alphabetic languages. However, its role in learning to read Kannada, an alphasyllabary, is equivocal. The literature has focused on the developmental role of phonological awareness in reading. To the best of the authors' knowledge, the role of phonological memory and phonological naming has not been addressed in the alphasyllabary Kannada language. Therefore, there is a need to evaluate the comprehensive role of phonological processing skills in Kannada on word decoding skills during the early years of schooling. Aim and Objectives: The present study aimed to explore phonological processing abilities and their role in learning to decode pseudowords in children learning to read the Kannada language during the initial years of formal schooling, between 5.6 and 8.6 years. Method: In this cross-sectional study, 60 typically developing Kannada-speaking children, 20 each from Grade I, Grade II, and Grade III, in the age ranges of 5.6 to 6.6 years, 6.7 to 7.6 years and 7.7 to 8.6 years respectively, were selected from Kannada medium schools. Phonological processing abilities were assessed using an assessment tool specifically developed to address the objectives of the present research. The assessment tool was content validated by subject experts and had good inter- and intra-subject reliability. Phonological awareness was assessed at the syllable level using syllable segmentation, blending, and syllable stripping at the initial, medial and final positions. Phonological memory was assessed using a pseudoword repetition task, and phonological naming was assessed using rapid automatized naming of objects. Both the phonological awareness and phonological memory measures were scored for the accuracy of the response, whereas Rapid Automatized Naming (RAN) was scored for total naming speed. Results: Comparison of the mean scores using one-way ANOVA revealed a significant difference (p ≤ 0.05) between the groups on all the measures of phonological awareness, pseudoword repetition, rapid automatized naming, and pseudoword reading. Subsequent post-hoc grade-wise comparison using the Bonferroni test revealed significant differences (p ≤ 0.05) between each of the grades for all the tasks except (p ≥ 0.05) syllable blending, syllable stripping, and pseudoword repetition between Grade II and Grade III. The Pearson correlations revealed highly significant positive correlations (p = 0.000) between all the variables except phonological naming, which had significant negative correlations. However, the correlation coefficients were higher for the phonological awareness measures compared to the others. Hence, phonological awareness was chosen as the first independent variable to enter the hierarchical regression equation, followed by rapid automatized naming and, finally, pseudoword repetition. The regression analysis revealed syllable awareness as the single most significant predictor of pseudoword reading, explaining a unique variance of 74%, and there was no significant change in R² when RAN and pseudoword repetition were added subsequently to the regression equation. Conclusion: The present study concluded that syllable awareness matures completely by Grade II, whereas phonological memory and phonological naming continue to develop beyond Grade III. Amongst phonological processing skills, phonological awareness, especially syllable awareness, is more crucial for word decoding than phonological memory and naming during the initial years of schooling.
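
A hedged sketch of the hierarchical regression described above: syllable awareness enters first, then rapid automatized naming (RAN), then pseudoword repetition, and the change in R-squared is inspected at each step. The data frame, variable names, and values below are hypothetical; statsmodels is assumed.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic scores standing in for the study's measures (hypothetical values).
rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    "syllable_awareness": rng.normal(20, 4, n),
    "ran_speed": rng.normal(45, 8, n),
    "pseudoword_repetition": rng.normal(12, 3, n),
})
df["pseudoword_reading"] = (0.8 * df["syllable_awareness"]
                            - 0.05 * df["ran_speed"] + rng.normal(0, 2, n))

# Hierarchical (sequential) entry of predictors.
models = [
    smf.ols("pseudoword_reading ~ syllable_awareness", data=df).fit(),
    smf.ols("pseudoword_reading ~ syllable_awareness + ran_speed", data=df).fit(),
    smf.ols("pseudoword_reading ~ syllable_awareness + ran_speed + pseudoword_repetition",
            data=df).fit(),
]
prev_r2 = 0.0
for model in models:
    print(f"R^2 = {model.rsquared:.3f}, change in R^2 = {model.rsquared - prev_r2:.3f}")
    prev_r2 = model.rsquared
```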

Keywords: phonological awareness, phonological memory, phonological naming, phonological processing, pseudo-word decoding

Procedia PDF Downloads 175
3958 Laser Irradiated GeSn Photodetector for Improved Infrared Photodetection

Authors: Patrik Scajev, Pavels Onufrijevs, Algirdas Mekys, Tadas Malinauskas, Dominykas Augulis, Liudvikas Subacius, Kuo-Chih Lee, Jevgenijs Kaupuzs, Arturs Medvids, Hung Hsiang Cheng

Abstract:

In this study, we focused on the optoelectronic properties of photodiodes prepared using 200 nm thick Ge₀.₉₅Sn₀.₀₅ epitaxial layers on a Ge/n-Si substrate with aluminum contacts. Photodiodes were formed on non-irradiated and Nd:YAG laser-irradiated Ge₀.₉₅Sn₀.₀₅ layers. The samples were irradiated by a pulsed Nd:YAG laser with 136.7-462.6 MW/cm² intensity. The photodiodes were characterized using short laser pulses with wavelengths in the 2.0-2.6 μm range. The laser-irradiated diode was found to be more sensitive in the long-wavelength range due to laser-induced redistribution of Sn atoms providing the formation of a graded bandgap structure. Sub-millisecond photocurrent relaxation in the diodes revealed their suitability for image sensors. Our findings open the perspective of improving the photo-sensitivity of GeSn alloys in the mid-infrared by pulsed laser processing.

Keywords: GeSn, laser processing, photodetector, infrared

Procedia PDF Downloads 153
3957 Foucault and Governmentality: International Organizations and State Power

Authors: Sara Dragisic

Abstract:

Using the theoretical analysis of the birth of biopolitics that Foucault performed through the history of liberalism and neoliberalism, in this paper we will try to show how, precisely through problematizing the role of international institutions, the model of governance differs from previous ways of objectifying the body and life. Are the state and its mechanisms still a Leviathan to fight against, or can they even be the driver of resistance against the proponents of modern governance and biopolitical power? Do paradigmatic examples of biopolitics still appear through sovereignty and (international) law, or is it precisely this sphere that shows a significant dose of incompetence and powerlessness in relation not only to the economic sphere (Foucault's critique of neoliberalism) but also to the new politics of freedom? Have the struggle for freedom and human rights, as well as the war on terrorism, opened a new spectrum of biopolitical processes, which are manifested precisely through new international institutions and humanitarian discourse? We will try to answer these questions in the following way. On the one hand, we will show that the views of authors such as Agamben and Hardt and Negri, for whom the state and sovereignty are seen as enemies to be defeated or overcome, fail to see how such attempts could translate into the politicization of life, as is done in many examples through the doctrine of liberal interventionism and humanitarianism. On the other hand, we will point out that it is precisely the humanitarian discourse and the defense of the right to intervention that can be the incentive and basis for the politicization of the category of life and lead to the selective application of human rights. Zizek's example of the killing of United Nations workers and doctors in a village during the Vietnam War, who were targeted even before police or soldiers precisely because they were seen as a powerful instrument of American imperialism (as they were sincerely trying to help the population), will be the focus of this part of the analysis. We will ask whether such an interpretation is a kind of liquidation of the extreme left of the political (Laclau), or whether it can at least partly explain the need to review the functioning of international organizations, ranging from those dealing with humanitarian aid (and humanitarian military interventions) to those dealing with the protection and security of the population, primarily from growing terrorism. Based on the above examples, we will also explain how the discourse of terrorism itself plays a dual role: it can appear as a tool of liberal biopolitics, although, more superficially, it mostly appears as an enemy that wants to destroy the liberal system and its values. This brings us to the basic problem that this paper will tackle: do the mechanisms of the institutional struggle for human rights and freedoms, which are often seen as opposed to the security mechanisms of the state, serve the governance of citizens in such a way that the latter themselves participate in producing biopolitical governmental practices? Is freedom today "nothing but the correlative development of apparatuses of security" (Foucault)? Or we can continue this line of Foucault's argumentation and enhance the interpretation with the important question of what precisely today reflects the change in the rationality of governance in which society is transformed from a passive object into a subject of its own production.
Finally, in order to understand the skills of biopolitical governance in modern civil society, it is necessary to pay attention to the status of international organizations, which seem to have become a significant place for the implementation of global governance. In this sense, the power of sovereignty can turn out to be an insufficiently strong power of security policy, which can go hand in hand with freedom policies, through neoliberal governmental techniques.

Keywords: neoliberalism, Foucault, sovereignty, biopolitics, international organizations, NGOs, Agamben, Hardt&Negri, Zizek, security, state power

Procedia PDF Downloads 206
3956 System Identification of Timber Masonry Walls Using Shaking Table Test

Authors: Timir Baran Roy, Luis Guerreiro, Ashutosh Bagchi

Abstract:

Dynamic studies are important for the design, repair and rehabilitation of structures. They have played an important role in the characterization of the behavior of structures such as bridges, dams, and high-rise buildings. There has been substantial development in this area over the last few decades, especially in the field of dynamic identification techniques for structural systems. Frequency Domain Decomposition (FDD) and Time Domain Decomposition are the most commonly used methods to identify modal parameters such as natural frequency, modal damping, and mode shape. The focus of the present research is to study the dynamic characteristics of typical timber masonry walls commonly used in Portugal. For that purpose, multi-storey structural prototypes of such walls have been tested on a seismic shake table at the National Laboratory for Civil Engineering, Portugal (LNEC). Signal processing has been performed on the output response, which was collected from the shaking table experiment on the prototype using accelerometers. In the present work, signal processing of the output response, based on the input response, has been done in two ways: FDD and Stochastic Subspace Identification (SSI). In order to estimate the values of the modal parameters, algorithms for FDD are formulated, and parametric functions for the SSI are computed. Finally, the estimated values from both methods are compared to assess the accuracy of the two techniques.
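
A hedged sketch of the core FDD step applied to shaking-table accelerations: estimate the cross-spectral density (CSD) matrix of all output channels, take an SVD at each frequency line, and pick peaks of the first singular value as candidate natural frequencies; the associated singular vectors approximate the mode shapes. The channel count, sampling rate, synthetic signal, and peak-picking threshold are illustrative assumptions; scipy is assumed.

```python
import numpy as np
from scipy.signal import csd, find_peaks

def fdd(acc, fs, nperseg=1024):
    """acc: (n_channels, n_samples) accelerations; returns peak freqs, s1 spectrum, shapes."""
    n_ch = acc.shape[0]
    freqs, G = None, None
    for i in range(n_ch):
        for j in range(n_ch):
            f, Gij = csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
            if G is None:
                freqs = f
                G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
            G[:, i, j] = Gij
    s1 = np.empty(len(freqs))
    shapes = np.empty((len(freqs), n_ch), dtype=complex)
    for k in range(len(freqs)):
        U, S, _ = np.linalg.svd(G[k])
        s1[k] = S[0]                      # first singular value spectrum
        shapes[k] = U[:, 0]               # candidate mode shape at this line
    peaks, _ = find_peaks(s1, prominence=s1.max() * 0.05)
    return freqs[peaks], s1, shapes[peaks]

# Synthetic 3-channel response standing in for the measured accelerations:
fs = 200.0
t = np.arange(0, 60, 1 / fs)
acc = np.vstack([np.sin(2 * np.pi * 3.2 * t + p) for p in (0.0, 0.4, 0.9)])
acc += 0.1 * np.random.default_rng(0).normal(size=acc.shape)
print(fdd(acc, fs)[0])                    # should show a peak near 3.2 Hz
```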

Keywords: frequency domain decomposition (fdd), modal parameters, signal processing, stochastic subspace identification (ssi), time domain decomposition

Procedia PDF Downloads 264
3955 Studying the Spatial Aspects of Visual Attention Processing in Global Precedence Paradigm

Authors: Shreya Borthakur, Aastha Vartak

Abstract:

This behavioral experiment aimed to investigate the global precedence phenomenon in a South Asian sample and its correlation with mobile screen time. The global precedence effect refers to the tendency to process overall structure before attending to specific details. Participants completed attention tasks involving global and local stimuli with varying consistencies. The results showed a tendency towards local precedence, but no significant differences in reaction times were found between consistency levels or attention conditions. However, the correlation analysis revealed that participants with higher screen time exhibited a stronger negative correlation with local attention, suggesting that excessive screen usage may impact perceptual organization. Further research is needed to explore this relationship and understand the influence of screen time on cognitive processing.

Keywords: global precedence, visual attention, perceptual organization, screen time, cognition

Procedia PDF Downloads 68
3954 The Influence of Machine Tool Composite Stiffness on Surface Waviness When the Processing Posture Is Constantly Switching

Authors: Song Zhiyong, Zhao Bo, Du Li, Wang Wei

Abstract:

Aircraft structures generally have complex surfaces. Because of the constantly switching postures of the motion axes, a five-axis CNC machine’s composite stiffness changes during CNC machining. This gives rise to different amplitudes of vibration in the processing system, which further leads to different effects on surface waviness. In order to provide a solution for this problem, we take the CNC machining of the “S” shape test specimen as the object; by calculating the five-axis CNC machine’s composite stiffness and establishing a vibration model, we analyze the mechanism by which vibration amplitude influences surface waviness. Surface quality measurement experiments were carried out to verify the validity and accuracy of the theoretical analysis. The research results of this paper provide a theoretical basis for surface waviness control.

Keywords: five axis CNC machine, “S” shape test specimen, composite stiffness, surface waviness

Procedia PDF Downloads 390
3953 The Divergent Discourse of Political Islam: A Comparative Study of Indonesia and Pakistan

Authors: Sohaib Khaliq

Abstract:

This paper pursues a systematic analysis of the broad range of theories and studies relevant to Islam and democracy, in general and as they have been developed from and applied to the Indonesian and Pakistani cases. The analysis finds that an Islamic society’s potential to assimilate democratic political institutions is contingent on either an unconstrained 'political participation' or its ability to 'reinterpret' religious text. Drawing on a comparison of Indonesia and Pakistan, the present study favors a route that passes through the religious gates of theoretical reinterpretation. In doing so, the study brings Muslim reformation theory into focus by clarifying the mechanism by which reformation takes place.

Keywords: Islam, democratization, political Islam, reformation

Procedia PDF Downloads 390
3952 Effects of Safety Intervention Program towards Behaviors among Rubber Wood Processing Workers Using Theory of Planned Behavior

Authors: Junjira Mahaboon, Anongnard Boonpak, Nattakarn Worrasan, Busma Kama, Mujalin Saikliang, Siripor Dankachatarn

Abstract:

Rubber wood processing is one of the most important industries in southern Thailand. The process involves several safety hazards, for example unsafe wood cutting machine guarding, wood dust, noise, and heavy lifting. However, workers’ occupational health and safety measures to promote safe behaviors are still limited. This quasi-experimental research was conducted to determine the factors affecting workers’ safety behaviors, using the theory of planned behavior, after implementing a job safety intervention program. The purposes were to (1) determine the factors affecting workers’ behaviors and (2) evaluate the effectiveness of the intervention program. The study sample was 66 workers from a rubber wood processing factory. Factors in the Theory of Planned Behavior (TPB) model were measured before and after the intervention. The factors of the TPB included attitude towards the behavior, subjective norm, perceived behavioral control, intention, and behavior. Firstly, a Job Safety Analysis (JSA) was conducted and Safety Standard Operation Procedures (SSOPs) were established. A questionnaire was also used to collect workers’ characteristics and the TPB factors. Then, a job safety intervention program to promote workers’ behavior according to the SSOPs was implemented over a four-month period. The program included SSOP training, personal protective equipment use, and a safety promotional campaign. After that, the TPB factors were collected again. Paired-sample t-tests and independent t-tests were used to analyze the data. The results revealed that attitude towards the behavior and intention increased significantly after the intervention at p<0.05. These factors also significantly determined the workers’ safety behavior according to the SSOPs at p<0.05. However, subjective norm and perceived behavioral control were neither significantly changed nor related to safety behaviors. In conclusion, attitude towards the behavior and workers’ intention should be promoted to encourage workers’ safety behaviors. The SSOP intervention program, e.g. short meetings, safety training, and promotional campaigns, should be continuously implemented on a routine basis to improve workers’ behavior.
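
A minimal sketch of the pre/post comparison mentioned above: a paired-sample t-test on each Theory of Planned Behavior factor measured before and after the intervention. The scores below are synthetic placeholders, not the study's data; scipy is assumed.

```python
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(1)
n_workers = 66
before = {"attitude": rng.normal(3.5, 0.5, n_workers),    # hypothetical 5-point scale scores
          "intention": rng.normal(3.4, 0.5, n_workers)}
after = {k: v + rng.normal(0.3, 0.4, n_workers) for k, v in before.items()}

for factor in before:
    t_stat, p_value = ttest_rel(after[factor], before[factor])   # paired comparison
    print(f"{factor}: t = {t_stat:.2f}, p = {p_value:.4f}")
```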

Keywords: job safety analysis, rubber wood processing workers, safety standard operation procedure, theory of planned behavior

Procedia PDF Downloads 193
3951 Reasons for the Selection of Information-Processing Framework and the Philosophy of Mind as a General Account for an Error Analysis and Explanation on Mathematics

Authors: Michael Lousis

Abstract:

This research study is concerned with learners' errors in Arithmetic and Algebra. The data resulted from a broader international comparative research program called the Kassel Project. However, its conceptualisation differed from and contrasted with that of the main program, which was mostly based on socio-demographic data. The way in which the research study was conducted was not dependent on the researcher's discretion, but was absolutely dictated by the nature of the problem under investigation. This is because the phenomenon of learners' mathematical errors is due neither to the intentions of learners, nor to institutional processes, rules and norms, nor to the educators' intentions and goals, but rather to the way certain information is presented to learners and how their cognitive apparatus processes this information. Several approaches to the study of learners' errors have been developed since the beginning of the 20th century, encompassing different belief systems. These approaches were based on behaviourist theory, on the Piagetian-constructivist research framework, on the perspective that followed the philosophy of science, and on the information-processing paradigm. The researcher of the present study was forced to disclose the learners' course of thinking that led them to specific observable actions resulting in particular errors in specific problems, rather than analysing scripts with the students' thoughts presented in written form. This, in turn, entailed that the choice of methods would have to be appropriate and conducive to seeing and realising the learners' errors from the perspective of the participants in the investigation. This particular fact determined important decisions concerning the selection of an appropriate framework for analysing the mathematical errors and giving explanations. Thus, the belief systems concerning the behaviourist, Piagetian-constructivist, and philosophy of science perspectives were rejected, and the information-processing paradigm in conjunction with the philosophy of mind was adopted as a general account for the elaboration of the data. This paper explains why these decisions were appropriate and beneficial for conducting the present study and for the establishment of the ensuing thesis. Additionally, it explains why the adoption of the information-processing paradigm in conjunction with the philosophy of mind gives a sound and legitimate basis for the development of future studies concerning mathematical error analysis.

Keywords: advantages-disadvantages of theoretical prospects, behavioral prospect, critical evaluation of theoretical prospects, error analysis, information-processing paradigm, opting for the appropriate approach, philosophy of science prospect, Piagetian-constructivist research frameworks, review of research in mathematical errors

Procedia PDF Downloads 190
3950 Jordan Curves in the Digital Plane with Respect to the Connectednesses given by Certain Adjacency Graphs

Authors: Josef Slapal

Abstract:

Digital images are approximations of real ones and, therefore, to be able to study them, we need the digital plane Z² to be equipped with a convenient structure that behaves analogously to the Euclidean topology on the real plane. In particular, it is required that such a structure allows for a digital analogue of the Jordan curve theorem. We introduce certain adjacency graphs on the digital plane and prove a digital Jordan curve theorem for them, thus showing that the graphs provide convenient structures on Z² for the study and processing of digital images. Further convenient structures, including the well-known Khalimsky and Marcus-Wyse adjacency graphs, may be obtained as quotients of the graphs introduced. Since digital Jordan curves represent borders of objects in digital images, the adjacency graphs discussed may be used as background structures on the digital plane for solving problems of digital image processing that are closely related to borders, such as border detection, contour filling, pattern recognition, thinning, etc.
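
An illustrative sketch (not the paper's specific graphs) of adjacency structures on the digital plane Z²: the classical 4- and 8-adjacency neighbourhoods and a breadth-first connectedness test with respect to a chosen adjacency. This is only the standard background the introduced graphs generalize.

```python
from collections import deque

def neighbours_4(p):
    """Edge-sharing neighbours of a pixel under 4-adjacency."""
    x, y = p
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def neighbours_8(p):
    """Edge- or corner-sharing neighbours under 8-adjacency."""
    x, y = p
    return [(x + dx, y + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1) if (dx, dy) != (0, 0)]

def is_connected(points, neighbours):
    """True if the point set is connected with respect to the given adjacency."""
    points = set(points)
    if not points:
        return True
    start = next(iter(points))
    seen, queue = {start}, deque([start])
    while queue:
        p = queue.popleft()
        for q in neighbours(p):
            if q in points and q not in seen:
                seen.add(q)
                queue.append(q)
    return seen == points

# A diagonal pair of pixels is 8-connected but not 4-connected:
diagonal = [(0, 0), (1, 1)]
print(is_connected(diagonal, neighbours_4), is_connected(diagonal, neighbours_8))
```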

Keywords: digital plane, adjacency graph, Jordan curve, quotient adjacency

Procedia PDF Downloads 379
3949 Avoidance and Selectivity in the Acquisition of Arabic as a Second/Foreign Language

Authors: Abeer Heider

Abstract:

This paper explores and classifies the different kinds of avoidance that students commonly practice in the acquisition of Arabic as a second/foreign language, and suggests specific strategies to help students lessen their avoidance tendencies in the hope of streamlining the learning process. Students most commonly use avoidance strategies in grammar and word choice. These different types of strategies have different implications and naturally require different approaches. Thus the question remains as to the most effective way to help students improve their Arabic, and how teachers can efficiently utilize these techniques. It is hoped that this research will contribute to understanding the role of avoidance in the field of second language acquisition in general, and as a type of input. Yet some researchers also note that similarity between L1 and L2 may be problematic as well, since the learner may doubt that such similarity indeed exists and consequently avoid the identical constructions or elements (Jordens, 1977; Kellermann, 1977, 1978, 1986). In an effort to resolve this issue, a case study is being conducted. The present case study attempts to provide a broader analysis of what is acquired than is usually the case, analyzing the learners' accomplishments in terms of the three-part framework of the components of communicative competence suggested by Michael Canale: grammatical competence, sociolinguistic competence and discourse competence. The subjects of this study are 15 students in their 22nd year who came to study Arabic at Qatar University of Cairo. The 15 students are at the advanced level. They were at a complete intermediate level in Arabic when they arrived in Qatar for the first time. The study used a discourse analytic method to examine how the first language affects students' production and output in the second language, and how and when students use avoidance methods in their learning. The study will be conducted through Fall 2015 by analyzing audio recordings made throughout the entire semester. The recordings will comprise around 30 clips. The students are using supplementary listening and speaking materials. The group will be tested at the end of the term to assess any measurable difference between the techniques. Questionnaires will be administered to teachers and students before and after the semester to assess any change in attitude toward avoidance and selectivity methods. Responses to these questionnaires are analyzed and discussed to assess the relative merits of the aforementioned avoidance and selectivity strategies. Implications and recommendations for teacher training are proposed.

Keywords: the second language acquisition, learning languages, selectivity, avoidance

Procedia PDF Downloads 277
3948 Comparison of Yb and Tm-Fiber Laser Cutting Processes of Fiber Reinforced Plastics

Authors: Oktay Celenk, Ugur Karanfil, Iskender Demir, Samir Lamrini, Jorg Neumann, Arif Demir

Abstract:

Due to their favourable material characteristics, fiber reinforced plastics are amongst the main topics of all current lightweight construction megatrends. Especially in transportation, in trends ranging from aeronautics over the automotive industry to naval transportation (yachts, cruise liners), the expected economic and environmental impact is huge. In naval transportation, components like yacht bodies, antenna masts, and decorative structures like deck lamps, lighthouses and pool areas represent cheap and robust solutions. Commercially available laser tools like carbon dioxide (CO₂) gas lasers, frequency-tripled solid state UV lasers, and Neodymium-YAG (Nd:YAG) lasers can be used. These tools have emission wavelengths of 10 µm, 0.355 µm, and 1.064 µm, respectively. The scientific goal is first of all the generation of a parameter matrix for laser processing of each material used with a Tm-fiber laser system (wavelength 2 µm). These parameters are the heat affected zone, process gas pressure, work piece feed velocity, intensity, irradiation time, etc. The results are compared with results obtained with well-known material processing lasers, such as Yb-fiber lasers (wavelength 1 µm). Compared to the CO₂ laser, the Tm laser offers essential advantages for future laser processes like cutting, welding, ablating for repair and drilling in composite part manufacturing (components of cruise liners, marine pipelines). Some of these are the possibility of beam delivery in a standard fused silica fiber, which enables hand-guided processing, eye safety resulting from the wavelength, and excellent beam quality and brilliance due to the fiber nature. One more feature that is economically important for boat, automotive and military manufacturing projects is that the wavelength of 2 µm is highly absorbed by the plastic matrix and thus enables its selective removal for repair procedures.

Keywords: Thulium (Tm) fiber laser, laser processing of fiber-reinforced plastics (FRP), composite, heat affected zone

Procedia PDF Downloads 193
3947 Inviscid Steady Flow Simulation Around a Wing Configuration Using MB_CNS

Authors: Muhammad Umar Kiani, Muhammad Shahbaz, Hassan Akbar

Abstract:

Simulation of a high-speed, inviscid, steady, ideal air flow around a 2D/axisymmetric body was carried out using the mb_cns code. mb_cns is a program for the time-integration of the Navier-Stokes equations for two-dimensional compressible flows on a multiple-block structured mesh. The flow geometry may be either planar or axisymmetric, and multiply-connected domains can be modeled by patching together several blocks. The main simulation code is accompanied by a set of pre- and post-processing programs. The pre-processing programs scriptit and mb_prep start with a short script describing the geometry, initial flow state and boundary conditions and produce a discretized version of the initial flow state. The main flow simulation program (or solver, as it is sometimes called) is mb_cns. It takes the files prepared by scriptit and mb_prep, integrates the discrete form of the gas flow equations in time and writes the evolved flow data to a set of output files. This output data may consist of the flow state (over the whole domain) at a number of instants in time. After integration in time, the post-processing programs mb_post and mb_cont can be used to reformat the flow state data and produce GIF or postscript plots of flow quantities such as pressure, temperature and Mach number. The current problem is an example of supersonic inviscid flow. The flow domain for the current problem (a strake-configuration wing) is discretized by a structured grid, and a finite-volume approach is used to discretize the conservation equations. The flow field is recorded as cell-average values at cell centers, and explicit time stepping is used to update the conserved quantities. MUSCL-type interpolation and one of three flux calculation methods (Riemann solver, AUSMDV flux splitting and the Equilibrium Flux Method, EFM) are used to calculate inviscid fluxes across cell faces.
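
A much-reduced 1D illustration of the finite-volume structure described above: cell-average conserved quantities, numerical fluxes evaluated at cell faces, and explicit time stepping. A first-order Rusanov (local Lax-Friedrichs) flux is used here instead of mb_cns's MUSCL reconstruction with Riemann/AUSMDV/EFM fluxes, so this is only a sketch of the scheme's structure, not the solver itself.

```python
import numpy as np

GAMMA = 1.4

def euler_flux(U):
    """Physical flux of the 1D Euler equations for conserved state U = (rho, rho*u, E)."""
    rho, mom, E = U
    u = mom / rho
    p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u])

def rusanov(UL, UR):
    """Numerical flux across one face from the left/right cell averages."""
    def wave_speed(U):
        rho, mom, E = U
        u = mom / rho
        p = (GAMMA - 1.0) * (E - 0.5 * rho * u**2)
        return abs(u) + np.sqrt(GAMMA * p / rho)
    smax = max(wave_speed(UL), wave_speed(UR))
    return 0.5 * (euler_flux(UL) + euler_flux(UR)) - 0.5 * smax * (UR - UL)

def step(U, dx, dt):
    """One explicit update of all cell averages (transmissive boundaries)."""
    Upad = np.concatenate([U[:, :1], U, U[:, -1:]], axis=1)
    F = np.array([rusanov(Upad[:, i], Upad[:, i + 1])
                  for i in range(Upad.shape[1] - 1)]).T
    return U - dt / dx * (F[:, 1:] - F[:, :-1])

# Sod-type shock tube as a quick check of the update loop:
n = 200
dx = 1.0 / n
rho = np.where(np.arange(n) < n // 2, 1.0, 0.125)
p = np.where(np.arange(n) < n // 2, 1.0, 0.1)
U = np.vstack([rho, np.zeros(n), p / (GAMMA - 1.0)])
for _ in range(120):
    U = step(U, dx, dt=0.001)
print(U[0, ::20])   # density profile samples
```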

Keywords: steady flow simulation, processing programs, simulation code, inviscid flux

Procedia PDF Downloads 429
3946 Importance of Ethics in Cloud Security

Authors: Pallavi Malhotra

Abstract:

This paper examines the importance of ethics in cloud computing. In the modern society, cloud computing is offering individuals and businesses an unlimited space for storing and processing data or information. Most of the data and information stored in the cloud by various users such as banks, doctors, architects, engineers, lawyers, consulting firms, and financial institutions among others require a high level of confidentiality and safeguard. Cloud computing offers centralized storage and processing of data, and this has immensely contributed to the growth of businesses and improved sharing of information over the internet. However, the accessibility and management of data and servers by a third party raise concerns regarding the privacy of clients’ information and the possible manipulations of the data by third parties. This document suggests the approaches various stakeholders should take to address various ethical issues involving cloud-computing services. Ethical education and training is key to all stakeholders involved in the handling of data and information stored or being processed in the cloud.

Keywords: IT ethics, cloud computing technology, cloud privacy and security, ethical education

Procedia PDF Downloads 325