Search results for: lexical complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1833

873 Discrimination Against Popular Religiosity in the Dominican Republic

Authors: Sterlyn Poueriet Gil

Abstract:

Over the years, the construction of cultural identity in the Dominican Republic has been subject to multiple unfavorable conditions fostered by the elites. These conditions have prevented any lasting pattern from taking hold, leaving instead an inconsistency and a diversity of elements that begin as theses and antitheses and give form to the complexity of what we can call "the culture of a people that tries to reinvent itself in each social-historical moment". Religion is one of the cultural elements that does not escape the will of the elites. The country has multiple religious groups that, in one way or another, represent what the people are: their ancestral customs, their philosophy, and even their strengths as groups within a certain social environment. However, these groups have always been marginalized and discriminated against by the country's official religion and its respective denominations. The objective of this research was to verify to what extent interreligious discrimination is real, moving from assumption to scientific evidence through research techniques such as surveys, fieldwork, and qualitative analysis of the collected evidence. The evidence shows that the supremacy of the dominant religion condemns these rites and, in many cases, the practitioners themselves. In many communities, freedom of worship is reserved for traditional groups; there are cases in the country where manifestations of rites such as the "gagá" and the "prillé" have been prohibited as diabolical and primitive practices. This amounts to denying the roots of a people marked by poverty and social conflict yet firm in the will to be.

Keywords: cultural identity, freedom of worship, gagá, popular demonstrations

Procedia PDF Downloads 165
872 A Multidimensional Exploration of Narcissistic Personality Disorder Through Psycholinguistic Analysis and Neuroscientific Correlates

Authors: Dalia Elleuch

Abstract:

Narcissistic Personality Disorder (NPD) is a personality disorder marked by inflated self-importance, heightened sensitivity to criticism, a lack of empathy, a preoccupation with appearance over substance, and features such as arrogance, grandiosity, a constant need for admiration, a tendency to exploit others, and an inclination to demand special treatment out of a sense of excessive entitlement (APA, 2013). This interdisciplinary study examines NPD through a systematic synthesis of psycholinguistic analysis and neuroscientific correlates. The cognitive and emotional dimensions of NPD reveal linguistic patterns, including grandiosity, entitlement, and manipulative communication. Neuroscientific investigations reveal structural brain differences and alterations in functional connectivity, further explaining the neural underpinnings of the social cognition deficits observed in individuals with NPD. Genetic predispositions and neurotransmitter imbalances add a further layer of complexity to the understanding of NPD. By synthesizing psycholinguistic analysis and neuroscientific correlates, the study underscores the necessity of linguistic intervention in diagnosing and treating NPD, offers a comprehensive understanding of its cognitive, emotional, and neural dimensions, and paves the way for future practical, theoretical, and pedagogical approaches to address the complexities of this personality disorder.

Keywords: Narcissistic Personality Disorder (NPD), psycholinguistic analysis, neuroscientific correlates, interpersonal dysfunction, cognitive empathy

Procedia PDF Downloads 41
871 Modeling the Performance of Natural Sand-Bentonite Barriers after Infiltration with Polar and Non-Polar Hydrocarbon Leachates

Authors: Altayeb Qasem, Mousa Bani Baker, Amani Nawafleh

Abstract:

The complexity of the sand-bentonite liner barrier system calls for an adequate model that reflects the conditions that depend on the barrier materials and the characteristics of the permeates, which lead to changes in hydraulic conductivity when liners are infiltrated with polar, non-polar, miscible, and immiscible liquids. This paper is dedicated to developing a model for evaluating hydraulic conductivity in the form of a simple indicator of the compatibility of the liner versus the leachate. The model is based on two liner compositions (95% sand : 5% bentonite and 90% sand : 10% bentonite), two pressures (40 kPa and 100 kPa), and three leachates: water, ethanol, and biofuel. Two characteristics of the leachates were used: the viscosity of the permeate and its octanol-water partitioning coefficient (Kow). Three characteristics of the liner mixtures that impact the hydraulic conductivity of the liner system were evaluated: the initial bentonite content (%), the free swelling index, and the shrinkage limit of the initial liner mixture. Engineers can use this modest tool to predict potential liner failure in sand-bentonite barriers.

Keywords: liner performance, sand-bentonite barriers, viscosity, free swelling index, shrinkage limit, octanol-water partitioning coefficient, hydraulic conductivity, theoretical modeling

Procedia PDF Downloads 394
870 Using T-Splines to Model Point Clouds from Terrestrial Laser Scanner

Authors: G. Kermarrec, J. Hartmann

Abstract:

Spline surfaces are a major representation of freeform surfaces in the computer-aided graphics industry and were recently introduced in the field of geodesy for processing point clouds from terrestrial laser scanners (TLS). Surface fitting consists of approximating a trustworthy mathematical surface to a large 3D point cloud. Standard B-spline surfaces lack local refinement due to their tensor-product construction. The consequence is oscillating geometry, particularly in the transition from low- to high-curvature parts, for scattered point clouds with missing data. The recently introduced T-splines are a more economical alternative in terms of the number of parameters needed to handle point clouds with a huge number of observations. As long as the partition of unity is guaranteed, their computational complexity is low, and they are flexible. T-splines are implemented in a commercial package called Rhino, a 3D modeler widely used in computer-aided design to create and animate NURBS objects. We have applied T-spline surface fitting to terrestrial laser scanner point clouds from a bridge under load and from a sheet pile wall with noisy observations. We highlight their potential for modelling details with high trustworthiness, paving the way for further applications in deformation analysis.
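For illustration, the following Python sketch fits a standard tensor-product B-spline surface, the baseline the abstract contrasts with T-splines, to a synthetic point cloud using SciPy; the T-spline fitting itself was carried out in Rhino and is not reproduced here. The synthetic data, spline degrees, and smoothing factor are illustrative assumptions, not the paper's settings.

```python
# Baseline illustration: tensor-product B-spline surface fit to a noisy scattered
# point cloud with SciPy. This is the standard construction the abstract contrasts
# with T-splines; it offers no local refinement where the scan has gaps.
import numpy as np
from scipy import interpolate

rng = np.random.default_rng(0)
noise = 0.002

# Synthetic stand-in for a small TLS patch (hypothetical data, not the bridge scan).
x = rng.uniform(0.0, 1.0, 500)
y = rng.uniform(0.0, 1.0, 500)
z = 0.1 * np.sin(2 * np.pi * x) * np.cos(np.pi * y) + rng.normal(0.0, noise, x.size)

# Least-squares smoothing fit; s is set near m * sigma^2, the usual starting guess.
tck = interpolate.bisplrep(x, y, z, kx=3, ky=3, s=x.size * noise**2)

# Evaluate the fitted surface back at the scattered points and report the residual.
z_fit = np.array([interpolate.bisplev(xi, yi, tck) for xi, yi in zip(x, y)])
print("RMS residual:", np.sqrt(np.mean((z_fit - z) ** 2)))
```

In practice the smoothing factor would be tuned to the noise level of the TLS data, and the lack of local refinement becomes visible precisely where the scan has missing regions.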

Keywords: deformation analysis, surface modelling, terrestrial laser scanner, T-splines

Procedia PDF Downloads 125
869 Offender Rehabilitation: The Middle Way of Maimonides to Mental and Social Health

Authors: Liron Hoch

Abstract:

Traditional religious and spiritual texts offer a surprising wealth of relevant theoretical and practical knowledge about human behavior. This wellspring may contribute significantly to expanding our current body of knowledge in the social sciences and criminology in particular. In Jewish religious texts, specifically by Maimonides, we can find profound analyses of human traits and guidelines for a normative way of life. Among other things, modern criminological literature attempts to link certain character traits and divergent behaviors. Using the hermeneutic phenomenological approach, we analyzed the writings of Maimonides, mainly Laws of Human Dispositions, in order to understand Moses ben Maimon's (1138–1204) view of character traits. The analysis yielded four themes: (1) Human personality between nature and nurture; (2) The complexity of human personality, imbalance and criminality; (3) Extremism as a way to achieve balance; and (4) The Middle Way, flexibility and common sense. These themes can serve therapeutic purposes, as well as inform a rehabilitation model. Grounded in a theoretical rationale about the nature of humans, this model is designed to direct individuals to balance their traits by self-reflection and constant practice of the Middle Way. The proposal we will present is that implementing this model may promote normative behavior and thus contribute to rehabilitating offenders.

Keywords: rehabilitation, traits, offenders, Maimonides, middle way

Procedia PDF Downloads 53
868 Artificial Bee Colony Optimization for SNR Maximization through Relay Selection in Underlay Cognitive Radio Networks

Authors: Babar Sultan, Kiran Sultan, Waseem Khan, Ijaz Mansoor Qureshi

Abstract:

In this paper, a novel idea for the performance enhancement of the secondary network is proposed for Underlay Cognitive Radio Networks (CRNs). In underlay CRNs, primary users (PUs) impose strict interference constraints on the secondary users (SUs). The proposed scheme is based on Artificial Bee Colony (ABC) optimization for relay selection and power allocation, addressing this primary challenge of underlay CRNs. ABC is a simple, population-based optimization algorithm that attains the global optimum by combining local search (employed and onlooker bees) with global search (scout bees). The proposed two-phase relay selection and power allocation algorithm aims to maximize the signal-to-noise ratio (SNR) at the destination while operating in underlay mode. The proposed algorithm has low computational complexity, and its performance is verified through simulation results for different numbers of potential relays, different interference threshold levels, and different transmit power thresholds for the selected relays.
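As a rough illustration of the optimizer itself (not the paper's relay-selection objective), here is a minimal generic ABC loop in Python on a toy sphere function; the population size, trial limit, and bounds are arbitrary placeholders, and a real application would replace f() with the destination SNR under the interference and power constraints.

```python
# Generic Artificial Bee Colony (ABC) minimizer on a toy sphere function.
# The paper's actual objective (destination SNR under primary-user interference
# and relay transmit-power constraints) is not reproduced; f() is a placeholder.
import numpy as np

rng = np.random.default_rng(42)

def f(x):
    return float(np.sum(x ** 2))          # toy objective to minimize

dim, n_sources, limit, iters = 4, 20, 30, 200
lo, hi = -5.0, 5.0

sources = rng.uniform(lo, hi, (n_sources, dim))   # food sources = candidate solutions
costs = np.array([f(s) for s in sources])
trials = np.zeros(n_sources, dtype=int)

def neighbor(i):
    """Local search move used by both employed and onlooker bees."""
    k = rng.choice([j for j in range(n_sources) if j != i])
    d = rng.integers(dim)
    v = sources[i].copy()
    v[d] += rng.uniform(-1.0, 1.0) * (sources[i, d] - sources[k, d])
    return np.clip(v, lo, hi)

def greedy_update(i, v):
    c = f(v)
    if c < costs[i]:
        sources[i], costs[i], trials[i] = v, c, 0
    else:
        trials[i] += 1

for _ in range(iters):
    for i in range(n_sources):             # employed-bee phase
        greedy_update(i, neighbor(i))
    fitness = 1.0 / (1.0 + costs)          # higher fitness for lower cost
    probs = fitness / fitness.sum()
    for _ in range(n_sources):             # onlooker-bee phase
        i = rng.choice(n_sources, p=probs)
        greedy_update(i, neighbor(i))
    for i in range(n_sources):             # scout-bee phase
        if trials[i] > limit:
            sources[i] = rng.uniform(lo, hi, dim)
            costs[i], trials[i] = f(sources[i]), 0

print("best cost found:", costs.min())
```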

Keywords: artificial bee colony, underlay spectrum sharing, cognitive radio networks, amplify-and-forward

Procedia PDF Downloads 561
867 Transcription Skills and Written Composition in Chinese

Authors: Pui-sze Yeung, Connie Suk-han Ho, David Wai-ock Chan, Kevin Kien-hoa Chung

Abstract:

Background: Recent findings have shown that transcription skills play a unique and significant role in Chinese word reading, spelling (i.e., word dictation), and written composition development. The interrelationships among the component skills of transcription, word reading, word spelling, and written composition in Chinese have rarely been examined in the literature. Is the contribution of the component skills of transcription to Chinese written composition mediated by word-level skills (i.e., word reading and spelling)? Methods: The participants in the study were 249 Chinese children in Grade 1, Grade 3, and Grade 5 in Hong Kong. They were administered measures of general reasoning ability, orthographic knowledge, stroke sequence knowledge, word spelling, handwriting fluency, word reading, and Chinese narrative writing. Orthographic knowledge was assessed by a task modeled after the lexical decision subtest of the Hong Kong Test of Specific Learning Difficulties in Reading and Writing (HKT-SpLD). Stroke sequence knowledge, i.e., the participants' performance in producing legitimate stroke sequences, was measured by a stroke sequence knowledge task. Handwriting fluency was assessed by a task modeled after the Chinese Handwriting Speed Test. The stimuli of the word spelling task consisted of fourteen two-character Chinese words, and the stimuli of the word reading task consisted of 120 two-character Chinese words. A narrative writing task was used to assess the participants' text writing skills. Results: Analysis of covariance showed significant between-grade differences in word reading, word spelling, handwriting fluency, and written composition. Preliminary hierarchical multiple regression analyses showed that orthographic knowledge, word spelling, and handwriting fluency were unique predictors of Chinese written composition even after controlling for age, IQ, and word reading. The interaction effects between grade and each of these three skills (orthographic knowledge, word spelling, and handwriting fluency) were not significant. Path analysis showed that orthographic knowledge contributed to written composition both directly and indirectly through word spelling, while handwriting fluency contributed to written composition directly and indirectly through both word reading and spelling. Stroke sequence knowledge contributed to written composition only indirectly, through word spelling. Conclusions: The preliminary hierarchical regression results were consistent with previous findings on the significant role of transcription skills in the development of Chinese word reading, spelling, and written composition. The fact that orthographic knowledge contributed both directly and indirectly to written composition through word reading and spelling may reflect the impact of the script-sound-meaning convergence of Chinese characters on the composing process. The significant contribution of word spelling and handwriting fluency to Chinese written composition across the elementary grades highlights the difficulty of attaining automaticity of transcription skills in Chinese, which limits the working memory resources available for other composing processes.

Keywords: orthographic knowledge, transcription skills, word reading, writing

Procedia PDF Downloads 406
866 Enhancing Patch Time Series Transformer with Wavelet Transform for Improved Stock Prediction

Authors: Cheng-yu Hsieh, Bo Zhang, Ahmed Hambaba

Abstract:

Stock market prediction has long been an area of interest for both expert analysts and investors, driven by its complexity and the noisy, volatile conditions it operates under. This research examines the efficacy of combining the Patch Time Series Transformer (PatchTST) with wavelet transforms, specifically focusing on Haar and Daubechies wavelets, in forecasting the adjusted closing price of the S&P 500 index for the following day. By comparing the performance of the augmented PatchTST models with traditional predictive models such as Recurrent Neural Networks (RNNs), Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformers, this study highlights significant enhancements in prediction accuracy. The integration of the Daubechies wavelet with PatchTST notably excels, surpassing other configurations and conventional models in terms of Mean Absolute Error (MAE) and Mean Squared Error (MSE). The success of the PatchTST model paired with Daubechies wavelet is attributed to its superior capability in extracting detailed signal information and eliminating irrelevant noise, thus proving to be an effective approach for financial time series forecasting.
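To make the preprocessing step concrete, here is a small Python sketch (using PyWavelets) of Daubechies wavelet denoising applied to a price series before it would be windowed and fed to a forecaster such as PatchTST. The synthetic series, decomposition level, and thresholding rule are illustrative assumptions rather than the paper's exact configuration.

```python
# Minimal sketch: Daubechies wavelet denoising of a price series prior to modeling.
# The synthetic random-walk series stands in for the S&P 500 adjusted close.
import numpy as np
import pywt

rng = np.random.default_rng(1)
prices = np.cumsum(rng.normal(0, 1, 512)) + 4000.0   # placeholder price series

coeffs = pywt.wavedec(prices, wavelet="db4", level=3)  # Daubechies-4 decomposition

# Soft-threshold the detail coefficients (universal threshold), keep the approximation.
sigma = np.median(np.abs(coeffs[-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(len(prices)))
denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]

smooth = pywt.waverec(denoised, wavelet="db4")[: len(prices)]
# `smooth` (or the per-level coefficients) would then form the model's input windows.
print("max abs change from denoising:", np.max(np.abs(smooth - prices)))
```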

Keywords: deep learning, financial forecasting, stock market prediction, patch time series transformer, wavelet transform

Procedia PDF Downloads 25
865 Online Handwritten Character Recognition for South Indian Scripts Using Support Vector Machines

Authors: Steffy Maria Joseph, Abdu Rahiman V, Abdul Hameed K. M.

Abstract:

Online handwritten character recognition is a challenging field in artificial intelligence. The classification success rate of current techniques decreases when the dataset involves similarity and complexity in stroke styles, the number of strokes, and variations in stroke characteristics. Malayalam is a complex South Indian language spoken by about 35 million people, especially in Kerala and the Lakshadweep islands. In this paper, we consider significant feature extraction for the similar stroke styles of Malayalam. The extracted feature set is also suitable for the recognition of other handwritten South Indian languages such as Tamil, Telugu, and Kannada. A classification scheme based on support vector machines (SVM) is proposed to improve the accuracy of classification and recognition of online Malayalam handwritten characters. SVM classifiers are well suited to real-world applications. The contribution of various features towards recognition accuracy is analysed, and performance for different SVM kernels is also studied. A graphical user interface has been developed for reading and displaying the characters. Different writing styles were collected for each of the 44 alphabets. Various features were extracted and used for classification after preprocessing of the input data samples. The highest recognition accuracy of 97% was obtained experimentally with the best feature combination and a polynomial kernel in SVM.
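As an illustration of the classification stage only, the sketch below trains an SVM with a polynomial kernel in scikit-learn; the study itself used MATLAB and Malayalam-specific stroke features, so the random feature matrix, feature dimension, and hyperparameters here are placeholders (with random data the printed score will be near chance).

```python
# Illustrative sketch of the SVM stage: polynomial-kernel SVM on pre-extracted
# stroke features for 44 character classes. All data below are placeholders.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_classes, n_per_class, n_features = 44, 40, 32      # 44 alphabets; feature size assumed
X = rng.normal(size=(n_classes * n_per_class, n_features))
y = np.repeat(np.arange(n_classes), n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="poly", degree=3, C=10.0))
clf.fit(X_tr, y_tr)
print("test accuracy (random placeholder features):", clf.score(X_te, y_te))
```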

Keywords: SVM, MATLAB, Malayalam, South Indian scripts, online handwritten character recognition

Procedia PDF Downloads 559
864 Banking and Accounting Analysis Researches Effect on Environment and Income

Authors: Gerges Samaan Henin Abdalla

Abstract:

Ultra-secure methods of banking services, such as online banking, have been introduced to customers. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions by using the Internet as a distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, such clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is this time consuming, but it is also a repeated activity with a certain frequency. To solve this problem, the concept of account aggregation was introduced as a solution. Account consolidation in e-banking, as a form of electronic banking, appears to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. The article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.

Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development

Procedia PDF Downloads 26
863 Improving Decision-Making in Multi-Project Environments within Organizational Information Systems Using Blockchain Technology

Authors: Seyed Hossein Iranmanesh, Hassan Nouri, Seyed Reza Iranmanesh

Abstract:

In the dynamic and complex landscape of today's business, organizations often face challenges in impactful decision-making across multi-project settings. To efficiently allocate resources, coordinate tasks, and optimize project outcomes, establishing robust decision-making processes is essential. Furthermore, the increasing importance of information systems and their integration within organizational workflows introduces an additional layer of complexity. This research proposes the use of blockchain technology as a suitable solution to enhance decision-making in multi-project environments, particularly within the realm of information systems. The conceptual framework in this study comprises four independent variables and one dependent variable. The identified independent variables are: the blockchain layer in integrated systems, the quality of generated information, user satisfaction with integrated systems, and utilization of integrated systems. The project's performance, considered as the dependent variable and moderated by organizational policies and procedures, reflects the impact of blockchain technology adoption on organizational effectiveness. The results highlight the significant influence of blockchain implementation on organizational performance.

Keywords: multi-project environments, decision support systems, information systems, blockchain technology, decentralized systems

Procedia PDF Downloads 38
862 Explicit Numerical Approximations for a Pricing Weather Derivatives Model

Authors: Clarinda V. Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard (plain vanilla) products or structured (exotic) products. The underlying asset, in this case, is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the weather derivative structure exposes the weakness of the Black-Scholes framework. Therefore, under the risk-neutral probability measure, the option price of a weather contract can be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulation or implicit finite difference schemes combined with semi-Lagrangian methods. This paper proposes two explicit methods, namely, first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.
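As a purely generic illustration of an explicit update on a 2D problem that is hyperbolic in one direction and parabolic in the other, the Python sketch below advances a toy convection-diffusion field with first-order upwind differences in x and second-order central differences in y. The paper's actual weather-derivative equation, its Lax-Wendroff variant, its coefficients, and the financially derived boundary conditions are not reproduced; all values here are assumptions.

```python
# Toy explicit time-stepping: upwind in the hyperbolic (convection) direction,
# central second-order in the parabolic (diffusion) direction.
import numpy as np

nx, ny = 60, 60
dx, dy, dt = 1.0 / nx, 1.0 / ny, 1e-4
a, D = 1.0, 0.5                      # assumed convection speed and diffusion coefficient

u = np.zeros((nx, ny))
u[nx // 3 : nx // 2, ny // 3 : ny // 2] = 1.0   # hypothetical initial condition

for _ in range(200):
    un = u.copy()
    # first-order upwind in x (backward difference, valid since a > 0 here)
    conv = a * (un[1:-1, 1:-1] - un[:-2, 1:-1]) / dx
    # second-order central difference in y for the diffusion term
    diff = D * (un[1:-1, 2:] - 2 * un[1:-1, 1:-1] + un[1:-1, :-2]) / dy**2
    u[1:-1, 1:-1] = un[1:-1, 1:-1] + dt * (diff - conv)
    # boundaries are simply held fixed here; the paper instead imposes conditions
    # derived from the financial interpretation of the weather contract.

print("max value after 200 steps:", u.max())
```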

Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives

Procedia PDF Downloads 75
861 Phonological Encoding and Working Memory in Kannada Speaking Adults Who Stutter

Authors: Nirmal Sugathan, Santosh Maruthy

Abstract:

Background: A considerable number of studies have evidenced that phonological encoding (PE) and working memory (WM) skills operate differently in adults who stutter (AWS). To tap these skills, several paradigms have been employed, such as phonological priming, phoneme monitoring, and nonword repetition tasks. This study, however, utilizes a word jumble paradigm to assess both PE and WM using different modalities, which may give a better understanding of phonological processing deficits in AWS. Aim: The present study investigated PE and WM abilities in conjunction with lexical access in AWS using jumbled words. The study also aimed at investigating the effect of an increase in cognitive load on phonological processing in AWS by comparing speech reaction time (SRT) and accuracy scores across syllable lengths. Method: Participants were 11 AWS (age range 19-26) and 11 adults who do not stutter (AWNS) (age range 19-26) matched for age, gender, and handedness. Stimuli: Ninety 3-, 4-, and 5-syllable jumbled words (JWs) (n=30 per syllable length category) constructed from Kannada words served as stimuli for the jumbled word paradigm. To generate the JWs, the syllables of the real words were randomly transposed. Procedure: To assess PE, the JWs were presented visually using DMDX software, and for the WM task, the JWs were presented auditorily through headphones. The participants were asked to silently manipulate the jumbled words to form a Kannada real word and then respond verbally. The responses for both tasks were audio recorded using the record function in DMDX software, and the recorded responses were analyzed using PRAAT software to calculate the SRT. Results: SRT: Mann-Whitney test results demonstrated that AWS performed significantly slower on both tasks (p < 0.001), as indicated by increased SRT. Also, AWS presented with increased SRT on both tasks in all syllable length conditions (p < 0.001). Effect of syllable length: A Wilcoxon signed-rank test revealed that, on the task assessing PE, the SRT for 4-syllable JWs was significantly higher in both AWS (Z = -2.93, p = .003) and AWNS (Z = -2.41, p = .003) when compared to 3-syllable words. However, the differences between 4- and 5-syllable words were not significant. Task accuracy: The accuracy scores were calculated for the three syllable length conditions for both the PE and WM tasks and were compared across groups using the Mann-Whitney test. The results indicated that the accuracy scores of AWS were significantly below those of AWNS in all three syllable conditions for both tasks (p < 0.001). Conclusion: The above findings suggest that PE and WM skills are compromised in AWS, as indicated by increased SRT. Also, AWS were progressively less accurate in descrambling JWs of increasing syllable length; this may be interpreted to mean that, rather than being a uniform deficiency, PE and WM deficits emerge when the cognitive load is increased. AWNS exhibited increased SRT but increased accuracy for JWs of longer syllable length, whereas AWS did not benefit from the increased reaction time; thus, AWS were compromised on both SRT and accuracy when solving JWs of longer syllable length.

Keywords: adults who stutter, phonological ability, working memory, encoding, jumbled words

Procedia PDF Downloads 221
860 Aqueous Two Phase Extraction of Jonesia denitrificans Xylanase 6 in PEG 1000/Phosphate System

Authors: Nawel Boucherba, Azzedine Bettache, Abdelaziz Messis, Francis Duchiron, Said Benallaoua

Abstract:

The impetus for research in the field of bioseparation has been sparked by the difficulty and complexity of the downstream processing of biological products. Indeed, 50% to 90% of the production cost of a typical biological product resides in the purification strategy. There is a need for efficient and economical large-scale bioseparation techniques that achieve high purity and high recovery while maintaining the biological activity of the molecule. One such purification technique that meets these criteria involves the partitioning of biomolecules between the two immiscible phases of an aqueous two-phase system (ATPS). The production of xylanases was carried out in 500 ml of a liquid medium based on birchwood xylan. In each ATPS, PEG 1000 was added to a mixture of dipotassium phosphate, sodium chloride, and the culture medium inoculated with the strain Jonesia denitrificans, and the mixture was adjusted to different pH values. The concentration of PEG 1000 was varied from 8 to 16% and the NaCl percentage from 2 to 4%, while the other parameters were kept constant. The results showed that the best ATPS for the purification of xylanases is composed of 8.33% PEG 1000, 13.14% K2HPO4, and 1.62% NaCl at pH 7. We obtained a yield of 96.62%, a partition coefficient of 86.66, and a purification factor of 2.9. The zymogram showed that the activity is mainly detected in the top phase.

Keywords: Jonesia denitrificans BN13, xylanase, aqueous two phases system, zymogram

Procedia PDF Downloads 385
859 Music Reading Expertise Facilitates Implicit Statistical Learning of Sentence Structures in a Novel Language: Evidence from Eye Movement Behavior

Authors: Sara T. K. Li, Belinda H. J. Chung, Jeffery C. N. Yip, Janet H. Hsiao

Abstract:

Music notation and text reading both involve statistical learning of music or linguistic structures. However, it remains unclear how music reading expertise influences text reading behavior. The present study examined this issue through an eye-tracking study. Chinese-English bilingual musicians and non-musicians read English sentences, Chinese sentences, musical phrases, and sentences in Tibetan, a language novel to the participants, with their eye movements recorded. Each set of stimuli consisted of two conditions in terms of structural regularity: syntactically correct and syntactically incorrect musical phrases/sentences. They then completed a sentence comprehension (for syntactically correct sentences) or a musical segment/word recognition task afterwards to test their comprehension/recognition abilities. The results showed that in reading musical phrases, as compared with non-musicians, musicians had a higher accuracy in the recognition task, and had shorter reading time, fewer fixations, and shorter fixation duration when reading syntactically correct (i.e., in diatonic key) than incorrect (i.e., in non-diatonic key/atonal) musical phrases. This result reflects their expertise in music reading. Interestingly, in reading Tibetan sentences, which were novel to both participant groups, while non-musicians did not show any behavior differences between reading syntactically correct or incorrect Tibetan sentences, musicians showed a shorter reading time and had marginally fewer fixations when reading syntactically correct sentences than syntactically incorrect ones. However, none of the musicians reported discovering any structural regularities in the Tibetan stimuli after the experiment when being asked explicitly, suggesting that they may have implicitly acquired the structural regularities in Tibetan sentences. This group difference was not observed when they read English or Chinese sentences. This result suggests that music reading expertise facilitates reading texts in a novel language (i.e., Tibetan), but not in languages that the readers are already familiar with (i.e., English and Chinese). This phenomenon may be due to the similarities between reading music notations and reading texts in a novel language, as in both cases the stimuli follow particular statistical structures but do not involve semantic or lexical processing. Thus, musicians may transfer their statistical learning skills stemming from music notation reading experience to implicitly discover structures of sentences in a novel language. This speculation is consistent with a recent finding showing that music reading expertise modulates the processing of English nonwords (i.e., words that do not follow morphological or orthographic rules) but not pseudo- or real words. These results suggest that the modulation of music reading expertise on language processing depends on the similarities in the cognitive processes involved. It also has important implications for the benefits of music education on language and cognitive development.

Keywords: eye movement behavior, eye-tracking, music reading expertise, sentence reading, structural regularity, visual processing

Procedia PDF Downloads 361
858 Woman, House, Identity: The Study of the Role of House in Constructing the Contemporary Dong Minority Woman’s Identity

Authors: Sze Wai Veera Fung, Peter W. Ferretto

Abstract:

Similar to most ethnic groups in China, men of the Dong minority hold the primary position in policymaking, moral authority, social values, and the control of the property. As the spatial embodiment of the patriarchal ideals, the house plays a significant role in producing and reproducing the distinctive gender status within the Dong society. Nevertheless, Dong women do not see their home as a cage of confinement, nor do they see themselves as a victim of oppression. For these women with reference to their productive identity, a house is a dwelling place with manifold meanings, including a proof of identity, an economic instrument, and a public resource operating on the community level. This paper examines the role of the house as a central site for identity construction and maintenance for the southern dialect Dong minority women in Hunan, China. Drawing on recent interviews with the Dong women, this study argues that women as productive individuals have a strong influence on the form of their house and the immediate environment, regardless of the male-dominated social construct of the Dong society. The aim of this study is not to produce a definitive relationship between women, house, and identity. Rather, it seeks to offer an alternative lens into the complexity and diversity of gender dynamics operating in and beyond the boundary of the house in the context of contemporary rural China.

Keywords: conception of home, Dong minority, house, rural China, woman’s identity

Procedia PDF Downloads 116
857 A Simple Computational Method for the Gravitational and Seismic Soil-Structure-Interaction between New and Existent Buildings Sites

Authors: Nicolae Daniel Stoica, Ion Mierlus Mazilu

Abstract:

This work is a numerical study that addresses the issue of designing new buildings in the 3D vicinity of existing buildings. With today's continuous development and congestion of urban centers, there is a big question about the influence of new buildings on an already existing neighboring site. Thus, in this study, we tried to focus on how existing buildings may be affected by newly constructed buildings and on how far this influence really decreases. The problem of modeling the influence of interaction between buildings is not simple anywhere in the world, and neither in Romania. Unfortunately, designers most often do not perform calculations that can determine how close to reality these 3D influences are, neither with simplified methods nor with more advanced ones. In most of the literature, making a "shield" (piles or molded diaphragm walls) is considered absolutely sufficient to stop the influence between buildings, and so the soil under the structure is often ignored in the calculation models. The main causes for which the soil is neglected in the analysis are related to the complexity of modeling the interaction between soil and structure. In this paper, based on a new simple but efficient methodology, we tried to determine, for a number of study cases, the influence of soil-structure interaction on the behavior of structures, that is, the influence of a new building on an existing one. The study covers the additional subsidence that may occur during the execution of the new works and after their completion. It also highlights the internal force diagrams and deflections in the soil for both the original case and the final stage. This is necessary to see to what extent the new building is expected to impact the existing areas.

Keywords: soil, structure, interaction, piles, earthquakes

Procedia PDF Downloads 278
856 Proposal of Optimality Evaluation for Quantum Secure Communication Protocols by Taking the Average of the Main Protocol Parameters: Efficiency, Security and Practicality

Authors: Georgi Bebrov, Rozalina Dimova

Abstract:

In the field of quantum secure communication, there is no evaluation that characterizes quantum secure communication (QSC) protocols in a complete, general manner. The current paper addresses the lack of such an evaluation for QSC protocols by introducing an optimality evaluation, expressed as the average over the three main parameters of QSC protocols: efficiency, security, and practicality. For the efficiency evaluation, the common expression of this parameter is used, which incorporates all the classical and quantum resources (bits and qubits) utilized for transferring a certain amount of information (bits) in a secure manner. Using a criteria-based approach (whether or not certain criteria are met), an expression for the practicality evaluation is presented, which accounts for the complexity of the practical realization of a QSC protocol. Based on the error rates induced by the common quantum attacks (measurement-and-resend, intercept-and-resend, probe, and entanglement-swapping attacks), the security evaluation for a QSC protocol is proposed as the minimum function taken over the error rates of the mentioned quantum attacks. For the sake of clarity, an example is presented to show how the optimality is calculated.
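A minimal sketch of the proposed score follows directly from the abstract: the optimality is the average of efficiency, security, and practicality, with security taken as the minimum over the error rates induced by the listed attacks. The numeric values below are made-up placeholders, not figures from the paper.

```python
# Optimality evaluation sketch: average of efficiency, security, and practicality,
# where security = min over the error rates the common quantum attacks induce.
def optimality(efficiency, practicality, attack_error_rates):
    security = min(attack_error_rates.values())
    return (efficiency + security + practicality) / 3.0

example_attacks = {                 # hypothetical induced error rates for one protocol
    "measurement_and_resend": 0.25,
    "intercept_and_resend": 0.25,
    "probe_attack": 0.17,
    "entanglement_swapping": 0.50,
}
print(optimality(efficiency=0.4, practicality=0.8, attack_error_rates=example_attacks))
```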

Keywords: quantum cryptography, quantum secure communication, quantum secure direct communication security, quantum secure direct communication efficiency, quantum secure direct communication practicality

Procedia PDF Downloads 166
855 Infilling Strategies for Surrogate Model Based Multi-disciplinary Analysis and Applications to Velocity Prediction Programs

Authors: Malo Pocheau-Lesteven, Olivier Le Maître

Abstract:

Engineering and optimisation of complex systems is often achieved through multi-disciplinary analysis, where each subsystem is modeled and interacts with other subsystems to model the complete system. The coherence of the outputs of the different sub-systems is achieved through the use of compatibility constraints, which enforce the coupling between the subsystems. Due to the complexity of some sub-systems and the computational cost of evaluating their respective models, it is often necessary to build surrogate models of these subsystems to allow their repeated evaluation at a relatively low computational cost. In this paper, Gaussian processes are used, as their probabilistic nature is leveraged to evaluate the likelihood of satisfying the compatibility constraints. This paper presents infilling strategies to build accurate surrogate models of the subsystems in areas where they are likely to meet the compatibility constraints. It is shown that these infilling strategies can reduce the computational cost of building surrogate models for a given level of accuracy. An application of these methods to velocity prediction programs used in offshore racing naval architecture further demonstrates the methods' applicability in a real engineering context. Some examples of the application of uncertainty quantification to the field of naval architecture are also presented.
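The sketch below illustrates one possible reading of the idea with scikit-learn: a Gaussian-process surrogate is fit to a subsystem output, and its predictive mean and standard deviation are used to score candidate points by the probability that the coupling (compatibility) constraint is satisfied within a tolerance. The subsystem model, coupling target, tolerance, and kernel are placeholders, not the paper's formulation.

```python
# GP surrogate with a feasibility-probability infill criterion (illustrative only).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_subsystem(x):             # stand-in for a costly discipline model
    return np.sin(3 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = expensive_subsystem(X_train).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, y_train)

# Infill criterion: probability that the surrogate output lies within +/- tol of
# the coupling value required by the other subsystems (here a fixed target).
X_cand = np.linspace(0.0, 2.0, 200).reshape(-1, 1)
mu, std = gp.predict(X_cand, return_std=True)
std = np.maximum(std, 1e-9)              # guard against zero predictive variance
target, tol = 0.8, 0.05
p_feasible = norm.cdf((target + tol - mu) / std) - norm.cdf((target - tol - mu) / std)

x_next = X_cand[np.argmax(p_feasible)]   # evaluate the true subsystem here, then refit
print("next infill point:", x_next)
```

The loop of evaluating the true model at x_next, appending it to the training set, and refitting is what concentrates accuracy in the region where the compatibility constraint is likely to be met.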

Keywords: infilling strategy, gaussian process, multi disciplinary analysis, velocity prediction program

Procedia PDF Downloads 142
854 DesignChain: Automated Design of Products Featuring a Large Number of Variants

Authors: Lars Rödel, Jonas Krebs, Gregor Müller

Abstract:

The growing price pressure due to the increasing number of global suppliers, the growing individualization of products and ever-shorter delivery times are upcoming challenges in the industry. In this context, Mass Personalization stands for the individualized production of customer products in batch size 1 at the price of standardized products. The possibilities of digitalization and automation of technical order processing open up the opportunity for companies to significantly reduce their cost of complexity and lead times and thus enhance their competitiveness. Many companies already use a range of CAx tools and configuration solutions today. Often, the expert knowledge of employees is hidden in "knowledge silos" and is rarely networked across processes. DesignChain describes the automated digital process from the recording of individual customer requirements, through design and technical preparation, to production. Configurators offer the possibility of mapping variant-rich products within the Design Chain. This transformation of customer requirements into product features makes it possible to generate even complex CAD models, such as those for large-scale plants, on a rule-based basis. With the aid of an automated CAx chain, production-relevant documents are thus transferred digitally to production. This process, which can be fully automated, allows variants to always be generated on the basis of current version statuses.

Keywords: automation, design, CAD, CAx

Procedia PDF Downloads 62
853 An Approach of Node Model TCnNet: Trellis Coded Nanonetworks on Graphene Composite Substrate

Authors: Diogo Ferreira Lima Filho, José Roberto Amazonas

Abstract:

Nanotechnology opens the door to new paradigms and introduces a variety of novel tools enabling a plethora of potential applications in the biomedical, industrial, environmental, and military fields. This work proposes an integrated node model by applying the concepts of TCNet to networks of nanodevices, where the nodes are cooperatively interconnected with a low-complexity Mealy Machine (MM) topology, integrating in the same electronic system the modules necessary for independent operation in wireless sensor networks (WSNs): rectennas (RF-to-DC power converters), code generators based on a Finite State Machine (FSM), a trellis decoder, and on-chip transmit/receive, with autonomy in terms of energy sources through the energy harvesting technique. This approach considers the use of a Graphene Composite Substrate (GCS) for the integrated electronic circuits, meeting the following characteristics: mechanical flexibility, miniaturization, and optical transparency, besides being ecological. In addition, graphene consists of a layer of carbon atoms arranged in a honeycomb crystal lattice, which has attracted the attention of the scientific community due to its unique electrical characteristics.

Keywords: composite substrate, energy harvesting, finite state machine, graphene, nanotechnology, rectennas, wireless sensor networks

Procedia PDF Downloads 89
852 Transitioning Teacher Identity during COVID-19: An Australian Early Childhood Education Perspective

Authors: J. Jebunnesa, Y. Budd, T. Mason

Abstract:

COVID-19 changed the pedagogical expectations of early childhood education as many teachers across Australia had to quickly adapt to new teaching practices such as remote teaching. An important factor in the successful implementation of any new teaching and learning approach is teacher preparation, however, due to the pandemic, the transformation to remote teaching was immediate. A timely question to be asked is how early childhood teachers managed the transition from face-to-face teaching to remote teaching and what was learned through this time. This study explores the experiences of early childhood educators in Australia during COVID-19 lockdowns. Data were collected from an online survey conducted through the official Facebook forum of “Early Childhood Education and Care Australia,” and a constructivist grounded theory methodology was used to analyse the data. Initial research results suggest changing expectations of teachers’ roles and responsibilities during the lockdown, with a significant category related to transitioning teacher identities emerging. The concept of transitioning represents the shift from the role of early childhood educator to educational innovator, essential worker, social worker, and health officer. The findings illustrate the complexity of early childhood educators’ roles during the pandemic.

Keywords: changing role of teachers, constructivist grounded theory, lessons learned, teaching during COVID-19

Procedia PDF Downloads 82
851 Compressed Sensing of Fetal Electrocardiogram Signals Based on Joint Block Multi-Orthogonal Least Squares Algorithm

Authors: Xiang Jianhong, Wang Cong, Wang Linyu

Abstract:

With the rise of medical IoT technologies, wireless body area networks (WBANs) can collect fetal electrocardiogram (FECG) signals to support telemedicine analysis. A compressed sensing (CS)-based WBAN system can avoid sampling a large amount of redundant information and reduce the complexity and computing time of data processing, but existing algorithms have poor signal compression and reconstruction performance. In this paper, a joint block multi-orthogonal least squares (JBMOLS) algorithm is proposed. We apply the FECG signal to a joint block sparse model (JBSM), and a comparative study of sparsifying transforms and measurement matrices is carried out. An FECG signal compression and transmission mode based on the Rbio5.5 wavelet, a Bernoulli measurement matrix, and the JBMOLS algorithm is proposed to improve the compression and reconstruction performance of FECG signals in CS-based WBANs. Experimental results show that the compression ratio (CR) required for accurate reconstruction in this transmission mode is increased by nearly 10%, and the runtime is reduced by about 30%.
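For readers unfamiliar with the CS recovery step, the sketch below shows plain orthogonal matching pursuit (OMP) reconstructing a synthetic sparse signal from Bernoulli measurements with scikit-learn. This is a standard baseline only; the paper's JBMOLS algorithm, the block-sparse model, and the Rbio5.5 wavelet sparsifying basis for real FECG data are not reproduced, and the sizes below are arbitrary.

```python
# Baseline CS recovery: OMP from Bernoulli measurements of a synthetic sparse signal.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(3)
n, m, k = 256, 96, 10                      # signal length, measurements, sparsity

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)    # synthetic sparse signal

Phi = rng.choice([-1.0, 1.0], size=(m, n)) / np.sqrt(m)     # Bernoulli measurement matrix
y = Phi @ x                                                  # compressed measurements

omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(Phi, y)
x_hat = omp.coef_

print("relative reconstruction error:",
      np.linalg.norm(x_hat - x) / np.linalg.norm(x))
```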

Keywords: telemedicine, fetal ECG, compressed sensing, joint sparse reconstruction, block sparse signal

Procedia PDF Downloads 112
850 Linguistic Cyberbullying, a Legislative Approach

Authors: Simona Maria Ignat

Abstract:

Bullying online has been an increasingly studied topic in recent years, with different approaches applied: psychological, linguistic, or computational. To the best of our knowledge, a definition and a set of characteristics of the phenomenon, agreed internationally as a common framework, are still lacking. Thus, the objectives of this paper are the identification of bullying utterances on Twitter and of their algorithms. This research paper focuses on the identification of words or groups of words, categorized as "utterances", with bullying effect, on the Twitter platform, extracted according to a set of legislative criteria. This set is the result of analysis followed by synthesis of legal documents on (online) bullying from the United States of America, the European Union, and Ireland. The outcome is a linguistic corpus with approximately 10,000 entries. The methods applied to the first objective were the following. Discourse analysis was applied to identify keywords with bullying effect in texts retrieved through the Google search engine (Images). Transcription and anonymization were applied to texts grouped in CL1 (Corpus Linguistics 1). The keyword search method and the legislative criteria were used for identifying bullying utterances on Twitter. Texts with at least 30 occurrences on Twitter were grouped; they form the second corpus, Bullying Utterances from Twitter (CL2). The entries were identified by applying the legislative criteria on the principle of the BoW (bag-of-words) method, here used as a method of extracting words or groups of words with the same meaning in any context. The methods applied for reaching the second objective were the conversion of parts of speech to alphabetical and numerical symbols and the writing of the bullying utterances as algorithms. The converted form of the parts of speech was chosen on the criterion of relevance within the bullying message. An inductive reasoning approach was applied in sampling and identifying the algorithms. The results are groups with interchangeable elements. The outcomes convey two aspects of bullying: the form and the content, or meaning. The form conveys the intentional intimidation of somebody, expressed at the level of texts by grammatical and lexical marks. This outcome has applicability in forensic linguistics for establishing the intentionality of an action. Another outcome related to form is a complex of graphemic variations essential in detecting harmful texts online. This research enriches the lexicon already known on the topic. The second aspect, the content, revealed topics such as threat, harassment, assault, or suicide. These are subcategories of a broader harmful content which is a constant concern for task forces and legislators at national and international levels. These topic outcomes of the dataset are a valuable source for detection. The analysis of content revealed algorithms and lexicons which could be applied to other harmful content. A third outcome related to content concerns stylistics, which is a rich source for discourse analysis of social media platforms. In conclusion, this linguistic corpus is structured on legislative criteria and could be used in various fields.
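As a small illustration of the bag-of-words step mentioned above, the following Python sketch counts word and bigram occurrences across short texts with scikit-learn and keeps terms occurring in at least two texts; the example texts are invented stand-ins, not entries from the study's corpus, and the "at least 30 occurrences" criterion from the paper is scaled down here.

```python
# Bag-of-words counting over short texts (illustrative placeholder data only).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer

texts = [
    "nobody wants you here",            # hypothetical utterances, not corpus entries
    "nobody wants you at school",
    "great job on the project",
]
vectorizer = CountVectorizer(ngram_range=(1, 2), lowercase=True)
bow = vectorizer.fit_transform(texts)

# Terms appearing in at least two texts could be grouped as candidate utterances.
terms = vectorizer.get_feature_names_out()
doc_counts = np.asarray((bow > 0).sum(axis=0)).ravel()
print([t for t, c in zip(terms, doc_counts) if c >= 2])
```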

Keywords: corpus linguistics, cyberbullying, legislation, natural language processing, twitter

Procedia PDF Downloads 68
849 Development of Tourism Infrastructure and Cultural Heritage: Case of Gobustan Preserve

Authors: Rufat Nuriyev

Abstract:

Located in the eastern part of the Republic of Azerbaijan on the western shore of the Caspian Sea, the Gobustan National Reserve was inscribed on the World Heritage List as the Gobustan Rock Art Cultural Landscape in 2007. Gobustan is an outstanding rock art landscape where over 6,000 rock engravings, dating from the end of the Upper Paleolithic up to the Middle Ages, have been found and registered. As a rock art center, Gobustan seeks to stimulate public awareness and disseminate knowledge of prehistoric art to enrich educational, cultural, and artistic communities regionally, nationally, and internationally. Following the Decree of the President of the Republic of Azerbaijan and the corresponding "Action Plan", the planned actions began to be realized, and some of them were implemented ahead of the stipulated date. To attract visitors and improve service quality in the museum-reserve, various activities are organized. The building of a new museum center at the foot of Beyukdash Mountain was completed in 2011. The main aims of the new museum building and exhibition were to provide a better understanding of the importance of this monument for the local community, Azerbaijani culture, and the world. In the Petroglyph Museum at Gobustan, digital and traditional media are closely integrated to reveal the complexity of the historical, cultural, and artistic meaning of the prehistoric rock carvings of Gobustan. Alongside electronic devices, visitors get the opportunity for direct contact with artifacts and ancient rock carvings.

Keywords: Azerbaijan, Gobustan, rock art, museum

Procedia PDF Downloads 285
848 Remotely Sensed Data Fusion to Extract Vegetation Cover in the Cultural Park of Tassili, South of Algeria

Authors: Y. Fekir, K. Mederbal, M. A. Hammadouche, D. Anteur

Abstract:

The cultural park of the Tassili, occupying a large area of Algeria, is characterized by a rich vegetative biodiversity to be preserved and managed in both time and space. Managing such a large area (the case of Tassili) is complex and requires large amounts of data, most of which are spatially localized (DEM, satellite images, socio-economic information, etc.), and for which the use of conventional and traditional methods is quite difficult. Remote sensing, given its efficiency in environmental applications, has become an indispensable solution for this kind of study. Multispectral imaging sensors have been very useful in the last decade for a range of remote sensing applications. They can aid in several domains, such as the detection and identification of diverse surface targets, topographical details, and geological features. In this work, we try to extract vegetative areas using fusion techniques between data acquired from the sensor on board the Earth Observing 1 (EO-1) satellite and the Landsat ETM+ and TM sensors. We used images acquired over the Oasis of Djanet in the National Park of Tassili in the south of Algeria. Fusion techniques were applied to the obtained image to extract the vegetative fraction of the different land use classes. We compare the obtained results in vegetation end-member extraction with vegetation indices calculated from both Hyperion and other multispectral sensors.
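As a small illustration of the vegetation-index side of the comparison, here is a Python sketch computing NDVI from red and near-infrared reflectance bands; the tiny arrays stand in for actual Landsat ETM+/TM or Hyperion-derived bands, and the 0.3 threshold for "vegetated" pixels is an arbitrary example value.

```python
# Minimal sketch: NDVI = (NIR - Red) / (NIR + Red) on small placeholder bands.
# Real Landsat ETM+/TM or Hyperion bands would be read from GeoTIFFs instead.
import numpy as np

red = np.array([[0.12, 0.30], [0.08, 0.25]])   # hypothetical red reflectance
nir = np.array([[0.45, 0.32], [0.50, 0.27]])   # hypothetical near-infrared reflectance

ndvi = (nir - red) / (nir + red + 1e-12)        # small epsilon avoids division by zero
veg_mask = ndvi > 0.3                           # example threshold for vegetated pixels

print(np.round(ndvi, 2))
print("vegetated fraction:", veg_mask.mean())
```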

Keywords: Landsat ETM+, EO1, data fusion, vegetation, Tassili, Algeria

Procedia PDF Downloads 416
847 A Mechanical Diagnosis Method Based on Vibration Fault Signal down-Sampling and the Improved One-Dimensional Convolutional Neural Network

Authors: Bowei Yuan, Shi Li, Liuyang Song, Huaqing Wang, Lingli Cui

Abstract:

Convolutional neural networks (CNN) have received extensive attention in the field of fault diagnosis. Many fault diagnosis methods use CNN for fault type identification. However, when the amount of raw data collected by sensors is massive, the neural network needs to perform a time-consuming classification task. In this paper, a mechanical fault diagnosis method based on vibration signal down-sampling and the improved one-dimensional convolutional neural network is proposed. Through the robust principal component analysis, the low-rank feature matrix of a large amount of raw data can be separated, and then down-sampling is realized to reduce the subsequent calculation amount. In the improved one-dimensional CNN, a smaller convolution kernel is used to reduce the number of parameters and computational complexity, and regularization is introduced before the fully connected layer to prevent overfitting. In addition, the multi-connected layers can better generalize classification results without cumbersome parameter adjustments. The effectiveness of the method is verified by monitoring the signal of the centrifugal pump test bench, and the average test accuracy is above 98%. When compared with the traditional deep belief network (DBN) and support vector machine (SVM) methods, this method has better performance.
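The sketch below illustrates, in PyTorch, a 1D CNN in the spirit of the description above: small convolution kernels, dropout regularization before the fully connected layers, and more than one connected layer. The layer sizes, segment length, and number of fault classes are assumptions for illustration, and the robust-PCA down-sampling stage is not shown.

```python
# Illustrative small-kernel 1D CNN for vibration-segment classification (placeholder sizes).
import torch
import torch.nn as nn

class Small1DCNN(nn.Module):
    def __init__(self, n_classes=4, in_len=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool1d(4),
        )
        flat = 32 * (in_len // 16)               # length shrinks by 4 * 4 from the poolings
        self.classifier = nn.Sequential(
            nn.Dropout(p=0.5),                   # regularization before the FC layers
            nn.Linear(flat, 64), nn.ReLU(),
            nn.Linear(64, n_classes),
        )

    def forward(self, x):                        # x: (batch, 1, in_len) vibration segments
        z = self.features(x)
        return self.classifier(z.flatten(1))

model = Small1DCNN()
dummy = torch.randn(8, 1, 1024)                  # e.g. down-sampled vibration windows
print(model(dummy).shape)                        # -> torch.Size([8, 4])
```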

Keywords: fault diagnosis, vibration signal down-sampling, 1D-CNN

Procedia PDF Downloads 114
846 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.

Keywords: exam length, psychometric criteria, synthetic experimental designs, test length

Procedia PDF Downloads 260
845 The Ontological Memory in Bergson as a Conceptual Tool for the Analysis of the Digital Conjuncture

Authors: Douglas Rossi Ramos

Abstract:

The current digital conjuncture, called by some authors the 'Internet of Things' (IoT), 'Web 2.0', or even 'Web 3.0', consists of a network that encompasses any communication of objects and entities, such as data, information, technologies, and people. At this juncture, especially characterized by an "object socialization", communication can no longer be represented as a simple informational flow of messages from a sender, crossing a channel or medium, and reaching a receiver. The idea of communication must, therefore, be thought of more broadly, in a way that makes it possible to analyze the communicative process arising from interactions between humans and nonhumans. To think about this complexity, a communicative process that encompasses both humans and other beings or entities that communicate (objects and things), it is necessary to constitute a new epistemology of communication and to rethink concepts and notions commonly attributed to humans, such as 'memory'. This research aims to contribute to this epistemological constitution through a discussion of the notion of memory according to the complex ontology of Henri Bergson. Among the results, the notion of memory in Bergson presents itself as a conceptual tool for the analysis of posthumanism and the anthropomorphic conjuncture of the new digital advent; there also emerged the need to think about an ontological memory, analyzed as a being in itself (the being-in-itself of memory), as a strategy for understanding the forms of interaction and communication that constitute the new digital conjuncture, in which communicating beings or entities tend to interact with each other. Rethinking the idea of communication beyond the dimension of transmission in informative sequences paves the way for an ecological perspective on the condition of digital dwelling.

Keywords: communication, digital, Henri Bergson, memory

Procedia PDF Downloads 141
844 Iterative Dynamic Programming for 4D Flight Trajectory Optimization

Authors: Kawser Ahmed, K. Bousson, Milca F. Coelho

Abstract:

4D flight trajectory optimization is one of the key ingredients for improving flight efficiency and enhancing air traffic capacity in current air traffic management (ATM). The present paper explores iterative dynamic programming (IDP) as a potential numerical optimization method for 4D flight trajectory optimization. IDP is an iterative version of the dynamic programming (DP) method. Due to its numerical framework, DP is very suitable for dealing with nonlinear discrete dynamic systems. The 4D waypoint representation of the flight trajectory is similar to a discretization by a grid system; thus DP is a natural method for 4D flight trajectory optimization. However, the computational time and space complexity demanded by DP are enormous due to the immense number of grid points required to find the optimum, which prevents the use of DP in many practical high-dimension problems. On the other hand, IDP has shown potential to deal successfully with high-dimension optimal control problems even with a small number of grid points at each stage, which reduces the computational effort compared with the traditional DP approach. Although IDP has been applied successfully to chemical engineering problems, it is yet to be validated on 4D flight trajectory optimization problems. In this paper, IDP is successfully used to generate a minimum-length 4D optimal trajectory that avoids any obstacle in its path, such as a no-fly zone or residential areas when flying at low altitude to reduce noise pollution.
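To make the grid-based DP idea concrete, the Python sketch below runs one dynamic-programming pass (uniform-cost search, which realizes the principle of optimality on a grid) to find a shortest path around a "no-fly" block in 2D. IDP would repeat such passes on a grid contracted around the current best trajectory; the full 4D state, aircraft dynamics, and the paper's grid settings are omitted, and the grid size, obstacle, and endpoints here are arbitrary placeholders.

```python
# One DP pass on a coarse 2D grid: shortest obstacle-avoiding path (toy example).
import heapq

N = 20
blocked = {(x, y) for x in range(8, 12) for y in range(5, 15)}   # hypothetical no-fly zone
start, goal = (0, 10), (19, 10)

def neighbors(p):
    x, y = p
    for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (1, -1), (-1, 1), (-1, -1)]:
        q = (x + dx, y + dy)
        if 0 <= q[0] < N and 0 <= q[1] < N and q not in blocked:
            yield q, (dx * dx + dy * dy) ** 0.5

dist, heap, prev = {start: 0.0}, [(0.0, start)], {}
while heap:
    d, p = heapq.heappop(heap)
    if p == goal:
        break
    if d > dist.get(p, float("inf")):
        continue
    for q, w in neighbors(p):
        nd = d + w
        if nd < dist.get(q, float("inf")):
            dist[q], prev[q] = nd, p
            heapq.heappush(heap, (nd, q))

path, p = [goal], goal
while p != start:                      # backtrack the optimal waypoint sequence
    p = prev[p]
    path.append(p)
print("path length:", round(dist[goal], 2), "waypoints:", len(path))
```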

Keywords: 4D waypoint navigation, iterative dynamic programming, obstacle avoidance, trajectory optimization

Procedia PDF Downloads 142