Search results for: syntactical simplification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 89

29 Power, Pluralism, and History: Norms in International Societies

Authors: Nicole Cervenka

Abstract:

On the question of norms in international politics, scholars are divided over whether norms are a tool for power politics or a genuine reflection of an emergent international society. The line is drawn between rationalism and idealism, but this dialectical relationship needs to be broken down if we hope to come to a comprehensive understanding of how norms play out in international society. The concept of an elusive international society is a simplification of a more pluralistic, cosmopolitan, and diverse collection of international societies. The English School effectively overcomes realist-idealist dichotomies and provides a pluralistic, comprehensive explanation and description of international societies through its application to two distinct areas: human rights as well as security and war. We argue that international norms have always been present in human rights, war, and international security, forming international societies that can be complementary or oppositional, beneficial or problematic. Power politics are present, but they can only be regarded as partially explanatory of the role of norms in international politics, which must also include history, international law, the media, NGOs, and others to fully represent the normative influences in international societies. A side-by-side comparison of international norms of war/security and human rights shows how much international societies converge. Although World War II was a turning point in terms of international law, these forces of international society have deeper historical roots. Norms of human rights and war/security are often norms of restraint, guiding appropriate treatment of individuals. This can at times give primacy to the individual over the sovereign state. However, state power politics and hegemony are still intact. It cannot be said that there is an emergent international society—international societies are part of broader historical backdrops. Furthermore, states and, more generally, power politics, are important components in international societies, but international norms are far from mere tools of power politics. They define a more diverse, complicated, and ever-present conception of international societies.

Keywords: English school, international societies, norms, pluralism

Procedia PDF Downloads 359
28 Integrated Dynamic Analysis of Semi-Submersible Flap Type Concept

Authors: M. Rafiur Rahman, M. Mezbah Uddin, Mohammad Irfan Uddin, M. Moinul Islam

Abstract:

With the rapid development of the offshore renewable energy industry, research activities regarding harnessing power from offshore wind and wave energy are increasing day by day. Integration of wind turbines and wave energy converters into one combined semi-submersible platform might be a cost-effective and beneficial option. In this paper, the coupled integrated dynamic analysis in the time domain (TD) of a simplified semi-submersible flap type concept (SFC) is accomplished via the state-of-the-art numerical code referred to as Simo-Riflex-Aerodyn (SRA). This concept is a combined platform consisting of a semi-submersible floater supporting a 5 MW horizontal axis wind turbine (WT) and three elliptical shaped flap type wave energy converters (WECs) on three pontoons. The main focus is to validate the numerical model of the SFC with experimental results and perform the frequency domain (FD) and TD response analysis. The numerical analysis is performed using potential flow theory for hydrodynamics and blade element momentum (BEM) theory for aerodynamics. A variety of environmental conditions encompassing the functional and survival conditions for short-term sea states (1-hour simulations) are tested to evaluate the sustainability of the SFC. The numerical analysis is performed at full scale. Finally, the time domain analysis of heave, pitch, and surge motions is performed numerically using SRA and compared with the experimental results. Due to the simplification of the model, there are some discrepancies, which are discussed in brief.

Keywords: coupled integrated dynamic analysis, SFC, time domain analysis, wave energy converters

Procedia PDF Downloads 192
27 Study of Efficiency of Flying Animal Using Computational Simulation

Authors: Ratih Julistina, M. Agoes Moelyadi

Abstract:

Innovation in aviation technology has evolved rapidly over time in pursuit of the most favorable utilization, usually denoted by an efficiency parameter. Nature has always been a source of inspiration, and in this field many researchers have focused on studying the behavior of flying animals, particularly birds, to comprehend the fundamentals. Experimental testing has already been conducted by several researchers to calculate efficiency by placing the object in a wind tunnel. Hence, computational simulation is needed to confirm the results and give more visualization, based on the solution of the Reynolds-averaged Navier-Stokes equations for the unsteady case in time-dependent viscous flow. Models were created by simplifying the real bird as a rigid body: a hawk, which has a low aspect ratio, and a swift, with a high aspect ratio; a multi-grid structured mesh was subsequently generated to capture and calculate the aerodynamic behavior and characteristics. A sinusoidal function is used to mimic the downstroke and upstroke motion of bird flight, which produces both lift and thrust. Simulations are carried out for a range of flapping frequencies around each bird's actual frequency, 1 Hz, 2.87 Hz, and 5 Hz for the hawk and 5 Hz, 8.9 Hz, and 13 Hz for the swift, to investigate how the frequency affects the efficiency of aerodynamic force production. The results in different flight conditions are also compared with respect to the morphology of each bird. The simulations show that the higher the flapping frequency, the greater the aerodynamic coefficients obtained; the efficiency of thrust production, on the other hand, does not follow the same trend. The results are analyzed from velocity and pressure contours and the mesh movement to observe the behavior.
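
As a minimal sketch of the sinusoidal flapping kinematics described above: the stroke angle is assumed to vary as a pure sine at the frequencies quoted for the hawk, and the 30° amplitude is a hypothetical placeholder (the abstract does not report amplitudes or the exact kinematic law fed to the mesh-motion routine).

```python
import numpy as np

def flapping_angle(t, frequency_hz, amplitude_deg=30.0):
    """Sinusoidal flapping (stroke) angle over time.

    amplitude_deg is a hypothetical stroke amplitude; the study does not report
    the value actually used.
    """
    return amplitude_deg * np.sin(2.0 * np.pi * frequency_hz * t)

t = np.linspace(0.0, 1.0, 1000)          # one second of motion
for f in (1.0, 2.87, 5.0):               # hawk frequencies quoted in the abstract
    theta = flapping_angle(t, f)
    # The time derivative is the stroke rate that would drive the mesh motion.
    theta_dot = np.gradient(theta, t)
    print(f"{f:4.2f} Hz: peak stroke rate ≈ {np.max(np.abs(theta_dot)):.1f} deg/s")
```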

Keywords: characteristics of aerodynamic, efficiency, flapping frequency, flapping wing, unsteady simulation

Procedia PDF Downloads 215
26 Applying the Information System to Enhance the Management of Perioperative Nursing

Authors: Ya-Yi Yen

Abstract:

The operating room is a high-risk, high-complexity, and high-cost medical environment. In addition to assuring patient safety, the operating room should strive for efficient and safe medical quality for surgical patients at high risk, the elderly, and children. If operating room nursing staff carry out pre-operative visits prior to surgery, the patient's anxiety and complications are expected to be alleviated, and hospitalization days may also be shortened. Purpose: to apply the information system to enhance pre-operative visiting, case tracking, and effectiveness recording. Method: (I) The information system was applied to screen cases by integrating the operation schedule and linking severe surgery codes, in order to shorten the time needed to track cases for operative visits. Through this improvement, the time required decreased to 1.5 minutes per day from 20 minutes per day, and nursing staff satisfaction with the case tracking and visiting procedure increased to 86% from 54%. (II) The electronic establishment of the operative visiting record enhanced the integrity of the record. The integrity rate rose to 92% from 66%, while nursing staff satisfaction with the visiting record increased to 82% from 61.3%. Since information technology continues to evolve, its application is helpful for the integration of nursing information, the simplification of processes, and the saving of man-hours. This article introduces the application of information systems to simplify processes and improve the effectiveness of operative visiting and tracking, including the saving of time, the improvement of the record integrity rate, and the improvement of nursing staff satisfaction.

Keywords: effectiveness, information system, perioperative nursing, pre-operative visiting

Procedia PDF Downloads 118
25 Evaluation of Model-Based Code Generation for Embedded Systems–Mature Approach for Development in Evolution

Authors: Nikolay P. Brayanov, Anna V. Stoynova

Abstract:

The model-based development approach is gaining more support and acceptance. Its higher abstraction level simplifies the description of systems, allowing domain experts to do their best work without particular programming knowledge. The different levels of simulation support rapid prototyping, verification, and validation of the product even before it exists physically. Nowadays the model-based approach is beneficial for modelling complex embedded systems as well as for generating code for many different hardware platforms. Moreover, it can be applied in safety-relevant industries like automotive, which brings extra automation to the expensive device certification process and especially to software qualification. Using it, some companies report cost savings and quality improvements, but others claim no major changes or even cost increases. This publication demonstrates the level of maturity and autonomy of the model-based approach for code generation. It is based on a real-life automotive seat heater (ASH) module, developed using The MathWorks, Inc. tools. The model, created with Simulink, Stateflow, and Matlab, is used for automatic generation of C code with Embedded Coder. To prove the maturity of the process, the Code Generation Advisor is used for automatic configuration. All additional configuration parameters are set to auto, when applicable, leaving the generation process to function autonomously. As a result of the investigation, the publication compares the quality of the generated embedded code with that of a manually developed one. The measurements show that, in general, the code generated by the automatic approach is not worse than the manual one. A deeper analysis of the technical parameters enumerates the disadvantages, part of which are identified as topics for our future work.

Keywords: embedded code generation, embedded C code quality, embedded systems, model-based development

Procedia PDF Downloads 218
24 Impact of Map Generalization in Spatial Analysis

Authors: Lin Li, P. G. R. N. I. Pussella

Abstract:

When representing spatial data and their attributes on different types of maps, the scale plays a key role in the process of map generalization. The process consists of two main operators: selection and omission. Once data are selected, they undergo several geometrical changes such as elimination, simplification, smoothing, exaggeration, displacement, aggregation, and size reduction. As a result of these operations at different levels of data, the geometry of spatial features such as length, sinuosity, orientation, perimeter, and area is altered. This is worst in the preparation of small-scale maps, since the cartographer does not have enough space to represent all the features on the map. When GIS users want to analyze a set of spatial data, they typically retrieve a data set and perform the analysis without considering very important characteristics such as the scale, the purpose of the map, and the degree of generalization. Further, GIS users use and compare different maps with different degrees of generalization. Sometimes, GIS users go beyond the scale of the source map using the zoom-in facility and violate the basic cartographic rule that it is not suitable to create a larger-scale map using a smaller-scale map. The main objective of this study is to discuss the effect of map generalization on GIS analysis. Three digital maps at different scales (1:10000, 1:50000, and 1:250000), prepared by the Survey Department of Sri Lanka, the national mapping agency of Sri Lanka, were used. Features common to all three maps were used, and an overlay analysis was carried out by repeating the data in different combinations. Road, river, and land use data sets were used for the study. A simple model, to find the best place for a wildlife park, was used to identify the effects. The results show remarkable effects of the different degrees of generalization: different locations with different geometries were obtained as the outputs of the analysis. The study suggests that reasonable methods are needed to overcome this effect; as a solution, it is recommended to bring all the data sets to a common scale before doing the analysis.
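
A minimal sketch of how generalization alters feature geometry, assuming shapely is available and using Douglas-Peucker simplification as a stand-in for the generalization operators discussed above; the sinuous line below is a hypothetical river centreline, not the Sri Lankan data, and the snippet only illustrates the length change, not the study's overlay model.

```python
import numpy as np
from shapely.geometry import LineString

# Hypothetical sinuous river centreline (coordinates in metres).
t = np.linspace(0, 10_000, 500)
river = LineString(list(zip(t, 800 * np.sin(t / 400.0))))

print(f"original length: {river.length:,.0f} m")
# Larger tolerances mimic stronger generalization at smaller map scales.
for tol in (10, 100, 500):
    simplified = river.simplify(tol, preserve_topology=True)
    change = 100 * (simplified.length - river.length) / river.length
    print(f"tolerance {tol:>3} m -> length {simplified.length:,.0f} m ({change:+.1f}%)")
```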

Keywords: generalization, GIS, scales, spatial analysis

Procedia PDF Downloads 305
23 Linearization of Y-Force Equation of Rigid Body Equation of Motion and Behavior of Fighter Aircraft under Imbalance Weight on Wings during Combat

Authors: Jawad Zakir, Syed Irtiza Ali Shah, Rana Shaharyar, Sidra Mahmood

Abstract:

The Y-force equation comprises the aerodynamic forces, drag and side force, with sideslip angle β, and the weight component along with the coupled roll (φ) and pitch (θ) angles. This research deals with the linearization of the Y-force equation using small disturbance theory, assuming equilibrium flight conditions for the different state variables of the aircraft. By applying the assumptions of small disturbance theory to the non-linear Y-force equation, the linearized lateral rigid-body equation of motion is finally reached; it shows that, in the linearized Y-force equation, the lateral acceleration depends on the different aerodynamic and propulsive forces such as the vertical tail contribution, the change in roll rate (Δp) from equilibrium, the change in yaw rate (Δr) from equilibrium, the change in lateral velocity due to the side force, and the drag and side-force components due to sideslip, and it takes the lateral equation from a coupled rotating frame to a decoupled rotating frame. This paper describes the implementation of this linearized lateral equation for aircraft control systems. Another significant parameter on which the Y-force equation depends is 'c', which shows that any change brought in the weight of the aircraft's wing will cause Δφ and hence a lateral force, i.e. Y_c. This simplification also leads to lateral static and dynamic stability. The linearization of the equations is required because much of the mathematical control system design for aircraft is based on linear equations. This technique is simple and eases the linearization of the rigid-body equations of motion without using any high-speed computers.
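
For reference, a commonly cited textbook form of the linearized side-force equation under small-disturbance assumptions is sketched below, written with per-unit-mass stability derivatives Y_v, Y_p, Y_r and equilibrium forward speed u_0; the exact terms retained in this work, such as the wing-imbalance contribution Y_c, may differ.

```latex
% Standard flight-dynamics textbook form of the linearized lateral (side-force) equation:
\[
\Delta \dot{v} \;=\; Y_v\,\Delta v \;+\; Y_p\,\Delta p \;+\; \left(Y_r - u_0\right)\Delta r
\;+\; g\cos\theta_0\,\Delta\phi \;+\; Y_{\delta_r}\,\Delta\delta_r
\]
```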

Keywords: Y-force linearization, small disturbance theory, side slip, aerodynamic force drag, lateral rigid body equation of motion

Procedia PDF Downloads 460
22 The 'Plain Style' in the Theory and Practice of Project Design: Contributions to the Shaping of an Urban Image on the Waterfront Prior to the 1755 Earthquake

Authors: Armenio Lopes, Carlos Ferreira

Abstract:

In the specific context of the Iberian Union between 1580 and 1640, characteristics emerged in Portuguese architecture that stood out from the main architectural production of the period. Recognised and identified aspects that had begun making their appearance decades before (1521) became significantly more marked during the Hapsburg-Spanish occupation. Distinctive even from the imperialist language of Spain, this trend would endure even after the restoration of independence (1706), continuing through to the start of the age of absolutism. Or perhaps not. This trend, recognised as Plain Style (Kubler), associated with a certain scarcity of resources, involved a certain formal and decorative simplification, as well as a particular set of conventions that would subsequently mark the landscape. This expression could also be seen as a means of asserting a certain spirit of independence as the Iberian Union breathed its last. The image of a simple, bare-bones architecture with purer design lines is associated by various authors, most notably Kubler, with the narratives of modernism, to whose principles it is similar, in a context specific to the period. There is a contrast with some of the exuberance of the baroque or its expression in the Manueline period, in a similar fashion to modernism's responses to nineteenth-century eclecticism. This assertion and practice of simple architecture, drafted from the interpretation of the treatises and highlighting a certain classical inspiration, was to become a benchmark in the theory of architecture, spanning the Baroque and Mannerism, until achieving contemporary recognition for a certain originality and modernity. At a time when the baroque and its scenography became very widespread, it is important also to recognise the role played by plain style architecture in the construction of a rather complex and contradictory waterfront landscape, featuring promises of exuberance alongside more discreet practices.

Keywords: Carlos Mardel, Lisbon's waterfront, plain style, urban image on the waterfront

Procedia PDF Downloads 105
21 Study of a Lean Premixed Combustor: A Thermo Acoustic Analysis

Authors: Minoo Ghasemzadeh, Rouzbeh Riazi, Shidvash Vakilipour, Alireza Ramezani

Abstract:

In this study, the thermo-acoustic oscillations of a lean premixed combustor have been investigated, and a one-dimensional code was developed for this purpose. The linearized equations of motion are solved for perturbations with time dependence e^(iωt). Two flame models were considered in this paper, and the effects of mean flow and boundary conditions were also investigated. After manipulation of the flame heat release equation together with the equations of flow perturbation within the main components of the combustor model (i.e., plenum, premixed duct, and combustion chamber), and by considering proper boundary conditions between the components of the model, a system of eight homogeneous equations is obtained. This simplification, for the main components of the combustor model, is convenient since low-frequency acoustic waves are not affected by bends. Moreover, some elements in the combustor are smaller than the wavelength of the propagated acoustic perturbations. A convection time is also assumed to characterize the time required for the acoustic velocity fluctuations to travel from the point of injection to the location of the flame front in the combustion chamber. The influence of an extended flame model on the acoustic frequencies of the combustor was also investigated, assuming the effect of flame speed, as a function of the equivalence ratio perturbation, on the rate of flame heat release. The above system of equations has an associated eigenvalue equation with complex roots. The sign of the imaginary part of these roots determines whether the disturbances grow or decay, and the real part gives the frequency of the modes. The results show a reasonable agreement between the predicted values of the dominant frequencies in the present model and those calculated in previous related studies.
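
As a sketch of the growth/decay criterion stated above, under the e^(iωt) convention used here:

```latex
% Each perturbation is assumed of the form q'(x,t) = \hat{q}(x)\, e^{i\omega t},
% with complex eigenfrequency \omega = \omega_r + i\,\omega_i:
\[
q'(x,t) = \hat{q}(x)\, e^{i\omega t}
        = \hat{q}(x)\, e^{i\omega_r t}\, e^{-\omega_i t},
\]
% so the mode oscillates at frequency \omega_r / 2\pi and, with this sign convention,
% grows when \omega_i < 0 and decays when \omega_i > 0.
```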

Keywords: combustion instability, dominant frequencies, flame speed, premixed combustor

Procedia PDF Downloads 353
20 Advantages of Neural Network Based Air Data Estimation for Unmanned Aerial Vehicles

Authors: Angelo Lerro, Manuela Battipede, Piero Gili, Alberto Brandl

Abstract:

Redundancy requirements for UAVs (Unmanned Aerial Vehicles) are hard to meet due to the generally restricted amount of available space and allowable weight for the aircraft systems, limiting their exploitation. Essential equipment such as the Air Data, Attitude and Heading Reference System (ADAHRS) requires several external probes to measure significant data such as the angle of attack or the sideslip angle. Previous research focused on the analysis of a patented technology named Smart-ADAHRS (Smart Air Data, Attitude and Heading Reference System) as an alternative method to obtain reliable and accurate estimates of the aerodynamic angles. This solution is based on an innovative sensor fusion algorithm implementing soft computing techniques, and it allows a simplified inertial and air data system to be obtained, reducing external devices. In fact, only one external source of dynamic and static pressure is needed. This paper focuses on the benefits that would be gained by the implementation of this system in UAV applications. A simplification of the entire ADAHRS architecture would reduce the overall cost together with improved safety performance. Smart-ADAHRS has currently reached Technology Readiness Level (TRL) 6. Real flight tests took place on an ultralight aircraft equipped with a suitable Flight Test Instrumentation (FTI). The output of the algorithm using the flight test measurements demonstrates the capability of this fusion algorithm to embed multiple physical and virtual sensors in a single device. Any source of dynamic and static pressure can be integrated with this system, gaining a significant improvement in terms of versatility.
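
A minimal sketch of the kind of neural estimator described above, assuming a plain feed-forward network that maps pressure and inertial measurements to an angle-of-attack estimate; the patented Smart-ADAHRS algorithm and its exact inputs are not public, so the feature set, network size, and synthetic data below are purely illustrative.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical features: dynamic pressure, static pressure, body-axis
# accelerations and angular rates (the kind of signals an ADAHRS already has).
X = rng.normal(size=(5000, 8))
# Hypothetical target: angle of attack in degrees (in practice, flight-test truth data).
alpha = 5.0 * X[:, 0] - 2.0 * X[:, 3] + rng.normal(scale=0.2, size=5000)

model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500, random_state=0)
model.fit(X, alpha)

print("training RMSE:", float(np.sqrt(np.mean((model.predict(X) - alpha) ** 2))))
```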

Keywords: aerodynamic angles, air data system, flight test, neural network, unmanned aerial vehicle, virtual sensor

Procedia PDF Downloads 193
19 Simo-syl: A Computer-Based Tool to Identify Language Fragilities in Italian Pre-Schoolers

Authors: Marinella Majorano, Rachele Ferrari, Tamara Bastianello

Abstract:

Recent technological advances allow innovative, multimedia, screen-based assessment tools to be applied to test children's language and early literacy skills, monitor their growth over the preschool years, and test their readiness for primary school. A computer-based assessment tool offers several advantages with respect to paper-based tools. Firstly, computer-based tools that use games, videos, and audio may be more motivating and engaging for children, especially for those with language difficulties. Secondly, computer-based assessments are generally less time-consuming than traditional paper-based assessments: this makes them less demanding for children and provides clinicians and researchers, but also teachers, with the opportunity to test children multiple times over the same school year and, thus, to monitor their language growth more systematically. Finally, while paper-based tools require offline coding, computer-based tools sometimes provide automatically calculated scores, thus producing less subjective evaluations of the assessed skills and giving immediate feedback. Nonetheless, using computer-based assessment tools to test meta-phonological and language skills in children is not yet common practice in Italy. The present contribution aims to estimate the internal consistency of a computer-based assessment (i.e., the Simo-syl assessment). Sixty-three Italian pre-schoolers aged between 4;10 and 5;9 years were tested at the beginning of the last year of preschool through paper-based standardised tools on their lexical (Peabody Picture Vocabulary Test), morpho-syntactical (Grammar Repetition Test for Children), meta-phonological (Meta-Phonological skills Evaluation test), and phono-articulatory skills (non-word repetition). The same children were tested through the Simo-syl assessment on their phonological and meta-phonological skills (e.g., recognising syllables and vowels and reading syllables and words). The internal consistency of the computer-based tool was acceptable (Cronbach's alpha = .799). Children's scores obtained in the paper-based assessment and scores obtained in each task of the computer-based assessment were correlated. Significant and positive correlations emerged between all the tasks of the computer-based assessment and the scores obtained in the CMF (r = .287 - .311, p < .05) and in the correct sentences in the RCGB (r = .360 - .481, p < .01); the non-word repetition standardised test correlates significantly with the reading tasks only (r = .329 - .350, p < .05). Further tasks should be included in the current version of Simo-syl to allow a comprehensive and multi-dimensional approach when assessing children. However, such a tool represents a good opportunity for teachers to identify language-related problems early, even in the school environment.
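
For readers unfamiliar with the internal-consistency statistic reported above, a minimal sketch of how Cronbach's alpha is computed from a participants-by-items score matrix follows; the random binary scores are placeholders, not the Simo-syl data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (participants x items) score matrix."""
    k = scores.shape[1]                                   # number of items
    item_variances = scores.var(axis=0, ddof=1)
    total_variance = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

rng = np.random.default_rng(1)
fake_scores = rng.integers(0, 2, size=(63, 20))           # 63 children, 20 binary items
print(f"alpha = {cronbach_alpha(fake_scores):.3f}")
```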

Keywords: assessment, computer-based, early identification, language-related skills

Procedia PDF Downloads 148
18 Experiences and Perspectives of Jewish Heritage Conservation and Promotion in Oradea and Timişoara, Western Romania

Authors: Andrea Corsale

Abstract:

The historical and geographical regions of Banat and Crişana in Western Romania have long been characterized by a high degree of ethnic diversity. However, this traditionally complex cultural, linguistic, and religious mosaic has undergone a progressive simplification during the past century due to deportations, emigration, and assimilation, and both regions now have a large Romanian-speaking majority population. This contribution focuses on Jewish heritage in the two largest cities of these regions, Timişoara (Banat) and Oradea (Crişana). The two cities shared some historical events but also went through different experiences, despite their relative geographic proximity. The Jewish community of Timişoara survived the Holocaust basically intact, an almost unique case in Central-Eastern Europe, but largely left the city after the war. In contrast, the Jewish community of Oradea was almost completely deported and killed in Auschwitz, and a renewed post-war community gradually emigrated abroad in the following decades. The two Jewish communities are now very small in size but inherited a vast tangible and intangible heritage (synagogues, cemeteries, community buildings, characteristic architecture, memories, local traditions, and histories), partially restored and recovered in recent years. The author's fieldwork shows that local Jewish stakeholders are aware of the potential of this heritage in terms of cultural and economic benefits, but significant weaknesses and concerns exist, as the small size of these communities and their financial constraints challenge their future role in the eventual promotion and management of this heritage, which is now basically in the hands of non-Jewish public and private stakeholders. Projects, experiences, and views related to Jewish heritage conservation and promotion in these two contexts will be portrayed and analysed in order to contribute to a broader discussion on representations and narratives of minority heritage within cultural tourism development dynamics.

Keywords: Jewish heritage, ethnic minorities, heritage tourism, Romania

Procedia PDF Downloads 68
17 Intermittent Effect of Coupled Thermal and Acoustic Sources on Combustion: A Spatial Perspective

Authors: Pallavi Gajjar, Vinayak Malhotra

Abstract:

Rockets have played a predominant role in spacecraft propulsion. The quintessential aspect of the combustion-related requirements of a rocket engine is the minimization of the surrounding risks and hazards. Over time, it has become imperative to understand the variation of the combustion rate in the presence of external energy source(s). Rocket propulsion represents a special domain of chemical propulsion assisted by high-speed flows in the presence of acoustics and thermal source(s). Jet noise leads to a significant loss of resources, and every year a huge amount of financial aid is spent to prevent it. External heat source(s) induce a high possibility of fire risks and hazards which can sufficiently endanger the operation of a space vehicle. Appreciable work has been done with justifiable simplification and emphasis on the linear variation of external energy source(s), which yields good physical insight but does not cater to accurate predictions. The present work experimentally attempts to understand the correlation between inter-energy conversions and the non-linear placement of external energy source(s). The work is motivated by the need for better fire safety and enhanced combustion. The specific objectives of the work are: a) to interpret the related energy transfer for combustion in the presence of alternate external energy source(s), viz. thermal and acoustic; b) to fundamentally understand the role of key controlling parameters, viz. separation distance, the number of source(s), the selected configurations, and their non-linear variation to resemble real-life cases. An experimental setup was prepared using incense sticks as the potential fuel and paraffin wax candles as the external energy source(s). The acoustics was generated using a frequency generator, and the source(s) were placed at selected locations. Non-equidistant parametric experimentation was carried out, and the effects on regression rate changes were noted. The results are expected to be very helpful in offering a new perspective on futuristic rocket designs and safety.

Keywords: combustion, acoustic energy, external energy sources, regression rate

Procedia PDF Downloads 112
16 A Systematic Review of Pedometer- or Accelerometer-Based Interventions for Increasing Physical Activity in Low Socioeconomic Groups

Authors: Shaun G. Abbott, Rebecca C. Reynolds, James B. Etter, John B. F. de Wit

Abstract:

The benefits of physical activity (PA) on health are well documented. Low socioeconomic status (SES) is associated with poor health, with PA a suggested mediator. Pedometers and accelerometers offer an effective behavior change tool to increase PA levels. While the role of pedometer and accelerometer use in increasing PA is recognized in many populations, little is known about low-SES groups. We are aiming to assess the effectiveness of pedometer- and accelerometer-based interventions for increasing PA step count and improving subsequent health outcomes among low-SES groups of high-income countries. The Medline, Embase, PsycINFO, CENTRAL, and SportDiscus databases were searched to identify articles published before 10 July 2015, using search terms developed from previous systematic reviews. Inclusion criteria are: low-SES participants classified by income, geography, education, occupation, or ethnicity; study duration of a minimum of 4 weeks; an intervention and control group; wearing of an unsealed pedometer or accelerometer to objectively measure PA as step counts per day for the duration of the study. We retrieved 2,142 articles from our database searches after removal of duplicates. Two investigators independently reviewed the titles and abstracts of these articles (50% each), and a combined 20% sample was reviewed to account for inter-assessor variation. We are currently verifying the full texts of 430 articles. Included studies will be critically appraised for risk of bias using guidelines suggested by the Cochrane Public Health Group. Two investigators will extract data concerning the intervention; study design; comparators; steps per day; participants; context; and presence or absence of obesity and/or chronic disease. Heterogeneity amongst studies is anticipated, thus a narrative synthesis of the data will be conducted, with the simplification of selected results into percentage increases from baseline to allow for between-study comparison. Results will be presented at the conference in December if selected.
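
A minimal sketch of the baseline-normalisation step described above (expressing step counts as percentage increases from baseline so studies on different scales can be compared); the per-study figures below are placeholders, not extracted data.

```python
# Hypothetical per-study mean daily step counts (baseline, post-intervention).
studies = {
    "Study A": (4200, 5600),
    "Study B": (6100, 6800),
}
for name, (baseline, post) in studies.items():
    pct_increase = 100.0 * (post - baseline) / baseline
    print(f"{name}: {pct_increase:+.1f}% change from baseline")
```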

Keywords: accelerometer, pedometer, physical activity, socioeconomic, step count

Procedia PDF Downloads 303
15 A Semantic and Concise Structure to Represent Human Actions

Authors: Tobias Strübing, Fatemeh Ziaeetabar

Abstract:

Humans usually manipulate objects with their hands. To represent these actions in a simple and understandable way, we need to use a semantic framework. For this purpose, the Semantic Event Chain (SEC) method has already been presented, which works by considering the touching and non-touching relations between manipulated objects in a scene. This method was improved by a computational model, the so-called enriched Semantic Event Chain (eSEC), which incorporates the information of static (e.g. top, bottom) and dynamic spatial relations (e.g. moving apart, getting closer) between objects in an action scene. This leads to better action prediction as well as the ability to distinguish between more actions. Each eSEC manipulation descriptor is a huge matrix with thirty rows and a massive set of spatial relations between each pair of manipulated objects. The current eSEC framework has so far only been used in the category of manipulation actions, which ultimately involve the two hands. Here, we would like to extend this approach to a whole-body action descriptor and make a conjoint activity representation structure. For this purpose, we need to do a statistical analysis to modify the current eSEC by summarizing it while preserving its features, and introduce a new version called the enhanced eSEC (e2SEC). This summarization can be done from two points of view: 1) reducing the number of rows in an eSEC matrix, 2) shrinking the set of possible semantic spatial relations. To achieve these, we computed the importance of each matrix row in a statistical way, to see whether a particular one can be removed while all manipulations remain distinguishable from each other. On the other hand, we examined which semantic spatial relations can be merged without compromising the unity of the predefined manipulation actions. By performing the above analyses, we built the new e2SEC framework, which has 20% fewer rows, 16.7% fewer static spatial relations, and 11.1% fewer dynamic spatial relations. This simplification, while preserving the salient features of a semantic structure for representing actions, has a tremendous impact on the recognition and prediction of complex actions, as well as on the interactions between humans and robots. It also creates a comprehensive platform to integrate with body limb descriptors and dramatically increases system performance, especially in complex real-time applications such as human-robot interaction prediction.
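
A minimal sketch of the row-reduction check described above, assuming each action is encoded as a small matrix of categorical relation symbols; it simply tests whether dropping a given row keeps all descriptors pairwise distinguishable. The toy descriptors and symbols are invented for illustration (real eSEC matrices have thirty rows).

```python
import numpy as np

# Toy eSEC-like descriptors: rows = relation channels, columns = event columns.
# Symbols stand for categorical spatial relations (e.g. "T" touching, "N" not touching, "A" above).
actions = {
    "push":  np.array([["N", "T", "T"], ["A", "A", "A"], ["N", "N", "T"]]),
    "lift":  np.array([["N", "T", "N"], ["A", "A", "A"], ["N", "T", "T"]]),
    "shake": np.array([["N", "T", "T"], ["A", "N", "A"], ["N", "N", "T"]]),
}

def distinguishable_without_row(descriptors, row):
    """True if all descriptors remain pairwise distinct after removing `row`."""
    reduced = [np.delete(m, row, axis=0).tobytes() for m in descriptors.values()]
    return len(set(reduced)) == len(reduced)

for row in range(3):
    removable = distinguishable_without_row(actions, row)
    print(f"row {row}: {'can' if removable else 'cannot'} be removed")
```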

Keywords: enriched semantic event chain, semantic action representation, spatial relations, statistical analysis

Procedia PDF Downloads 83
14 The Effect of Different Parameters on a Single Invariant Lateral Displacement Distribution to Consider the Higher Modes Effect in a Displacement-Based Pushover Procedure

Authors: Mohamad Amin Amini, Mehdi Poursha

Abstract:

Nonlinear response history analysis (NL-RHA) is a robust analytical tool for estimating the seismic demands of structures responding in the inelastic range. However, because of its conceptual and numerical complications, the nonlinear static procedure (NSP) is being increasingly used as a suitable tool for the seismic performance evaluation of structures. The conventional pushover analysis methods presented in various codes (FEMA 356; Eurocode-8; ATC-40) are limited to first-mode-dominated structures and cannot take the effect of higher modes into consideration. Therefore, over more than a decade, researchers have developed enhanced pushover analysis procedures to take the higher-mode effect into account. The main objective of this study is to propose an enhanced invariant lateral displacement distribution that takes the higher-mode effect into consideration in a displacement-based pushover analysis, whereby a set of laterally applied displacements, rather than forces, is monotonically applied to the structure. For this purpose, the effect of different parameters such as the spectral displacement of the ground motion, the modal participation factor, and the effective modal participating mass ratio on the lateral displacement distribution is investigated to find the best distribution. The major simplification of this procedure is that the effect of higher modes is concentrated into a single invariant lateral load distribution. Therefore, only one pushover analysis is sufficient, without any need to utilize a modal combination rule for combining the responses. The invariant lateral displacement distribution for pushover analysis is then calculated by combining the modal story displacements using the modal combination rules. The seismic demands resulting from the different procedures are compared to those from the more accurate nonlinear response history analysis (NL-RHA) as a benchmark solution. Two structures of different heights, 10- and 20-story special steel moment resisting frames (MRFs), were selected and evaluated. Twenty ground motion records were used to conduct the NL-RHA. The results show that more accurate responses can be obtained, in comparison with the conventional lateral loads, when the enhanced modal lateral displacement distributions are used.
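
A minimal sketch of the combination step described above, assuming an SRSS combination of modal story displacements u_jn = Γ_n φ_jn S_d,n; the mode shapes, participation factors, and spectral displacements below are placeholders, not the frames analysed in the paper, and the actual study may use a different combination rule.

```python
import numpy as np

# Placeholder modal data for a 10-story frame: mode shapes (stories x modes),
# modal participation factors Γ_n, and spectral displacements S_d,n (m).
phi = np.array([[np.sin((j + 1) * (n + 1) * np.pi / 22) for n in range(3)] for j in range(10)])
gamma = np.array([1.30, -0.45, 0.25])          # hypothetical participation factors
sd = np.array([0.20, 0.05, 0.02])              # hypothetical spectral displacements

# Story displacement contribution of each mode, then SRSS combination.
u_modal = phi * gamma * sd                     # shape (10 stories, 3 modes)
u_srss = np.sqrt((u_modal ** 2).sum(axis=1))   # invariant lateral displacement profile

# Normalised profile that would be applied monotonically in a displacement-based pushover.
print(np.round(u_srss / u_srss.max(), 3))
```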

Keywords: displacement-based pushover, enhanced lateral load distribution, higher modes effect, nonlinear response history analysis (NL-RHA)

Procedia PDF Downloads 248
13 Two-Component Biocompatible Material for Reconstruction of Articular Hyaline Cartilage

Authors: Alena O. Stepanova, Vera S. Chernonosova, Tatyana S. Godovikova, Konstantin A. Bulatov, Andrey Y. Patrushev, Pavel P. Laktionov

Abstract:

Trauma and arthrosis, not to mention cartilage destruction in overweight and elderly patients, put hyaline cartilage lesions among the most frequent diseases of the locomotor system. These problems, combined with the low regeneration potential of cartilage, make the regeneration of articular cartilage a high-priority task of tissue engineering. Many types of matrices, procedures for their installation, and autologous chondrocyte implantation protocols have been offered, but certain aspects, including adhesion of the implant to the surrounding cartilage/bone and prevention of ossification and fibrosis, were not resolved. Simplification and acceleration of the procedures resulting in restoration of normal cartilage are also required. We have demonstrated that human chondroblasts can be successfully cultivated on the surface of electrospun scaffolds and produce extracellular matrix components, in contrast to chondroblasts grown in homogeneous hydrogels. To restore cartilage, we propose to use stacks of electrospun scaffolds fixed with a photopolymerized solution prepared from gelatin and chondroitin-4-sulfate, both modified with glycidyl methacrylate, and the non-toxic photoinitiator Darocur 2959. Scaffolds were prepared from nylon 6, poly(lactide-co-glycolide), and their mixtures with modified gelatin. Illumination of chondroblasts in the photopolymerized solution using 365 nm LED light had no effect on cell viability at a compressive strength of the gel of less than 0.12 MPa. Stacks of electrospun scaffolds provide good compressive strength and have the potential for substitution with cartilage when biodegradable scaffolds are used. Vascularization can be prevented by the introduction of biostable scaffolds in the layers contacting the subchondral bone. Studies of two-component materials (2-3 sheets of electrospun scaffold) implanted in the knee joints of rabbits and fixed by photopolymerization demonstrated good crush resistance, biocompatibility, and good adhesion of the implant to the surrounding cartilage. Histological examination of the implants 3 months after implantation demonstrated the absence of any inflammation and signs of replacement of the biodegradable scaffolds with normal cartilage. The possibility of intraoperative population of the implants with autologous cells is being investigated.

Keywords: chondroblasts, electrospun scaffolds, hyaline cartilage, photopolymerized gel

Procedia PDF Downloads 250
12 A Program of Data Analysis on the Possible State of the Antibiotic Resistance in Bangladesh Environment in 2019

Authors: S. D. Kadir

Abstract:

Background: Antibiotics have always been at the centre of the revolution of modern microbiology. Micro-organisms and their pathogenicity, resistant organisms, and the inappropriate use or overuse of various types of antibiotic agents have fuelled multidrug-resistant pathogenic organisms. Our present review mainly focuses on the therapeutic state of antibiotic resistance and the possible roots of the development of antibiotic resistance in Bangladesh in 2019. Methodology: The systematic review progressed through a series of analyses of manuscripts published on Google Scholar, PubMed, and ResearchGate, and collected relevant information from established healthcare and diagnostic centres and their subdivisions all over Bangladesh. Our analysis of the possible extent of antibiotic resistance was based on selected medical reports and on random assays of individual antibiotics in 2019. Results: Five research articles and 50 medical report summaries were reviewed, and around 5 patients were interviewed during the estimation process. We prioritized research articles in which the analysis was performed with the appropriate use of the Kirby-Bauer method. The Kirby-Bauer technique is preferred as it provides greater efficiency, ensures lower performance expenditure, and supplies greater convenience and simplicity in application. In most of the reviewed reports, Clinical and Laboratory Standards Institute guidelines were strictly followed. Most of our reports indicate significant resistance to beta-lactam drugs, specifically to derivatives of penicillins and cephalosporins (rare use of first-generation cephalosporins, overuse of second- and third-generation cephalosporins, and misuse of fourth-generation cephalosporins), which are responsible for almost 67 percent of the bacterial resistance. Moreover, approximately 20 percent of the resistance was due to drug efflux from the bacterial cell involving tetracyclines and sulphonamides and their derivatives. Conclusion: Approximately 90 percent of the antibiotic resistance is due to the use of relative and true broad-spectrum antibiotics. This environment has been created by circumstances in which the excessive use of broad-spectrum antibiotics has led to the disruption of native bacteria and a series of antimicrobial resistances, disturbing the surrounding environment and leading to a state of super-infection.

Keywords: antibiotics, antibiotic resistance, Kirby Bauer method, microbiology

Procedia PDF Downloads 99
11 Spatial Analysis as a Tool to Assess Risk Management in Peru

Authors: Josué Alfredo Tomas Machaca Fajardo, Jhon Elvis Chahua Janampa, Pedro Rau Lavado

Abstract:

A flood vulnerability index was developed for the Piura River watershed in northern Peru using Principal Component Analysis (PCA) to assess flood risk. The official methodology to assess risk from natural hazards in Peru was introduced in 1980 and proved effective for aiding complex decision-making. This method relies in part on decision-makers defining subjective correlations between variables to identify high-risk areas. While risk identification and the ensuing response activities benefit from a qualitative understanding of influences, this method does not take advantage of the advent of national and international data collection efforts, which can supplement our understanding of risk. Furthermore, this method does not take advantage of broadly applied statistical methods such as PCA, which highlight central indicators of vulnerability. Nowadays, information processing is much faster and allows for more objective decision-making tools, such as PCA. The approach presented here develops a tool to improve the current flood risk assessment in this Peruvian basin. Hence, the spatial analysis of the census and other datasets provides a better understanding of the current land occupation and a basin-wide distribution of services and human populations, a necessary step toward ultimately reducing flood risk in Peru. PCA allows the simplification of a large number of variables into a few factors covering the social, economic, physical, and environmental dimensions of vulnerability. There is a correlation between the location of people and the availability of water, mainly found in rivers. For this reason, a comprehensive view of the population's location around the river basin is necessary to establish flood prevention policies. The grouping of 5x5 km gridded areas allows the spatial analysis of flood risk rather than an assessment based on political divisions of the territory. The index was applied to the Peruvian region of Piura, where several flood events occurred in recent years, it being one of the most affected regions during ENSO events in Peru. The analysis evidenced inequalities in access to basic services, such as water, electricity, internet, and sewerage, between rural and urban areas.
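
A minimal sketch of the dimensionality-reduction step described above, assuming standardised vulnerability indicators are available for each 5x5 km grid cell; the indicator matrix and the variance-weighted composite index are placeholders, not the Piura census variables or the study's exact index formula.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Placeholder indicators per 5x5 km grid cell (rows), spanning social, economic,
# physical, and environmental dimensions of vulnerability (columns).
indicators = rng.normal(size=(1200, 12))

X = StandardScaler().fit_transform(indicators)
pca = PCA(n_components=4)
factors = pca.fit_transform(X)                    # a few factors per grid cell

print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 2))
# One simple composite vulnerability index: weight each factor by its explained variance.
index = (factors * pca.explained_variance_ratio_).sum(axis=1)
print("index for first 5 cells:", np.round(index[:5], 2))
```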

Keywords: assess risk, flood risk, indicators of vulnerability, principal component analysis

Procedia PDF Downloads 162
10 Effect of Ease of Doing Business to Economic Growth among Selected Countries in Asia

Authors: Teodorica G. Ani

Abstract:

Economic activity requires an encouraging regulatory environment and effective rules that are transparent and accessible to all. The World Bank has been publishing the annual Doing Business reports since 2004 to investigate the scope and manner of regulations that enhance business activity and those that constrain it. A streamlined business environment supporting the development of competitive small and medium enterprises (SMEs) may expand employment opportunities and improve the living conditions of low-income households. Asia has emerged as one of the most attractive markets in the world. Economies in East Asia and the Pacific were among the most active in making it easier for local firms to do business. The study aimed to describe the ease of doing business and its effect on economic growth among selected economies in Asia for the year 2014. The study covered 29 economies in East Asia, Southeast Asia, South Asia, and Middle Asia. Ease of doing business is measured by the Doing Business indicators (DBI) of the World Bank. The indicators cover ten aspects of the ease of doing business: starting a business, dealing with construction permits, getting electricity, registering property, getting credit, protecting investors, paying taxes, trading across borders, enforcing contracts, and resolving insolvency. In the study, Gross Domestic Product (GDP) was used as the proxy variable for economic growth. A descriptive research design was used. Graphical analysis was used to describe income and doing business among the selected economies. In addition, multiple regression was used to determine the effect of doing business on economic growth. The study presented the income of the selected economies. The graph showed that China has the highest income while the Maldives has the lowest, an observation supported by the literature gathered. The study also presented the status of the ten indicators of doing business among the selected economies. The graphs showed varying trends in how easy it is to start a business, deal with construction permits, and register property. Starting a business is easiest in Singapore, followed by Hong Kong. The study found that the variation in ease of doing business is explained by starting a business, dealing with construction permits, and registering property. Moreover, the regression result implies that a one-day increase in the average number of days it takes to complete a procedure decreases the value of GDP, in general. The research proposed inputs to policy which may increase the awareness of local government units in different economies of the simplification of the policies of the different components used in measuring doing business.
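
A minimal sketch of the regression described above, assuming an ordinary least squares fit of GDP on a few Doing Business indicators expressed in days; the 29-row data frame below is synthetic, not the 2014 dataset, and the indicator set is reduced to the three components the study highlights.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)

# Placeholder cross-section: days to start a business, to deal with construction
# permits, and to register property, plus GDP (the 2014 values are not reproduced here).
df = pd.DataFrame({
    "days_start_business": rng.uniform(2, 60, 29),
    "days_construction_permits": rng.uniform(30, 300, 29),
    "days_register_property": rng.uniform(5, 90, 29),
})
df["gdp"] = (5000 - 10 * df["days_start_business"]
             - 2 * df["days_construction_permits"]
             - 5 * df["days_register_property"]
             + rng.normal(scale=100, size=29))

X = sm.add_constant(df[["days_start_business", "days_construction_permits", "days_register_property"]])
model = sm.OLS(df["gdp"], X).fit()
print(model.summary().tables[1])   # negative coefficients: longer procedures, lower GDP
```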

Keywords: doing business, economic growth, gross domestic product, Asia

Procedia PDF Downloads 355
9 The Debureaucratization Strategy for the Portuguese Health Service through Effective Communication

Authors: Fernando Araujo, Sandra Cardoso, Fátima Fonseca, Sandra Cavaca

Abstract:

A debureaucratization strategy for the Portuguese Health Service was adopted by the Executive Board of the SNS, in close articulation with the Shared Services of the Ministry of Health. Two of the main dimensions focused on sick leave (SL) certificates, which turn primary health care (PHC) units into administrative institutions and limit access for patients. The self-declaration of illness (SDI) project, through the National Health Service Contact Centre (SNS24), began on May 1, 2023, and has already resulted in the issuance of more than 300,000 SDIs without the need to allocate resources from the National Health Service (NHS). This political decision allows each citizen, a maximum of 2 times per year and 3 days each time, if ill, to report their health condition in a dematerialized way on their own responsibility, thereby justifying their absence from work, although under Portuguese law there is no payment of salary for these first three days. Using a digital approach, this is now feasible without the need to go to the PHC unit and occupy its time only to obtain an SL. Through this measure, bureaucracy has been reduced, and the system has been focused on users, improving the lives of citizens and reducing the administrative burden on PHC, which now has more consultation time for users who need it. The second initiative, which began on March 1, 2024, allows SL to be issued in the emergency departments (ED) of public hospitals and in health institutions of the social and private sectors. This project is intended to allow a user who has suffered an acute urgent illness and has been observed in the ED of a public hospital or in a private or social entity to no longer need to go to PHC only to apply for the respective SL. Since March 1, 54,453 SLs have been issued: 242 in private or social sector institutions and 6,918 in public hospitals, of which 134 were in the ED, and 47,292 in PHC. This approach has proven to be technically robust, allows immediate resolution of problems, and differentiates the performance of doctors. However, it is important to continue to ensure the proper functioning of the ED, preventing non-urgent users from going there only to obtain an SL. Thus, in order to make better use of existing resources, this extension of issuance was operationalized in a balanced way, allowing SL to be issued in hospital EDs only to critically ill patients or patients referred by INEM, SNS24, or PHC. In both cases, an intense public campaign was implemented to explain the way it works and the benefits for patients. In satisfaction surveys, more than 95% of patients and doctors were satisfied with the solutions, asking for extensions to other areas. The administrative simplification agenda of the NHS continues its effective development. For the success of this debureaucratization agenda, the key factors are effective communication and the ability to reach patients and health professionals in order to increase health literacy and the correct use of the NHS.

Keywords: debureaucratization strategy, self-declaration of illness, sick leaves, SNS24

Procedia PDF Downloads 25
8 Towards Accurate Velocity Profile Models in Turbulent Open-Channel Flows: Improved Eddy Viscosity Formulation

Authors: W. Meron Mebrahtu, R. Absi

Abstract:

Velocity distribution in turbulent open-channel flows is organized in a complex manner. This is due to the large spatial and temporal variability of fluid motion resulting from the free-surface turbulent flow condition. This phenomenon is complicated further by the complex geometry of channels and the presence of transported solids. Thus, several efforts have been made to understand the phenomenon and obtain accurate mathematical models that are suitable for engineering applications. However, predictions are inaccurate because oversimplified assumptions are involved in modeling this complex phenomenon. Therefore, the aim of this work is to study velocity distribution profiles and obtain simple, more accurate, and predictive mathematical models. Particular focus is placed on acceptable simplifications of the general transport equations and an accurate representation of the eddy viscosity. A wide rectangular open channel seems suitable to begin the study; other assumptions are a smooth wall and sediment-free flow under steady and uniform flow conditions. These assumptions allow examining the effects of the bottom wall and the free surface only, which is a necessary step before dealing with more complex flow scenarios. For this flow condition, two ordinary differential equations for the velocity profile are obtained: one from the Reynolds-averaged Navier-Stokes (RANS) equation and one from the equilibrium between turbulent kinetic energy (TKE) production and dissipation. Different analytic models for eddy viscosity, TKE, and mixing length were then assessed. Computed velocity profiles were compared to experimental data for different flow conditions and to the well-known linear, log, and log-wake laws. Results show that the model based on the RANS equation provides more accurate velocity profiles. In the viscous sublayer and buffer layer, the method based on Prandtl's eddy viscosity model and the Van Driest mixing length gives a more precise result. For the log layer and outer region, a mixing length equation derived from von Karman's similarity hypothesis provides the best agreement with measured data, except near the free surface, where an additional correction based on a damping function for the eddy viscosity is used. This method allows more accurate velocity profiles, with the same value of the damping coefficient valid under different flow conditions. This work will continue by investigating narrow channels, complex geometries, and the effect of solids transported in sewers.
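
A minimal sketch of the near-wall building block behind such velocity-profile computations, assuming the classical constant-stress mixing-length closure with Van Driest damping (κ = 0.41, A+ = 26); this reproduces only the textbook sublayer-to-log-layer profile, not the paper's full RANS-based model or its free-surface damping correction.

```python
import numpy as np

kappa, A_plus = 0.41, 26.0                 # von Karman constant, Van Driest damping constant

def dudy_plus(y_plus):
    """du+/dy+ from the constant-stress mixing-length closure with Van Driest damping."""
    l_plus = kappa * y_plus * (1.0 - np.exp(-y_plus / A_plus))
    return 2.0 / (1.0 + np.sqrt(1.0 + 4.0 * l_plus**2))

# Integrate u+(y+) from the wall with a simple trapezoidal scheme.
y_plus = np.linspace(0.0, 300.0, 3001)
u_plus = np.concatenate(([0.0], np.cumsum(
    0.5 * (dudy_plus(y_plus[1:]) + dudy_plus(y_plus[:-1])) * np.diff(y_plus))))

# Far from the wall the result should approach the log law u+ = (1/kappa) ln(y+) + B, B ≈ 5.
print(f"u+ at y+=100: {np.interp(100, y_plus, u_plus):.2f}  "
      f"(log law: {np.log(100)/kappa + 5.0:.2f})")
```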

Keywords: accuracy, eddy viscosity, sewers, velocity profile

Procedia PDF Downloads 82
7 Through Additive Manufacturing: A New Perspective for the Mass Production of Made in Italy Products

Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola

Abstract:

The recent evolutions in innovation processes and in the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that contemporary life describes define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of additive manufacturing, but also of IoT and AI technologies, continuously puts us in front of new paradigms regarding design as a social activity. From the point of view of application, the totality of these technologies describes a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of some provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrary design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new human-algorithm relationship. The contribution investigates the human-algorithm relationship starting from the state of the art of the Made in Italy model; the best-known fields of application are described, with a focus on specific cases in which the mutual relationship between humans and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could engulf many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values. The trajectory that is described therefore becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact carries a signature that defines it in all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The result that is envisaged indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as a valid integrated tool in close relationship with the design culture.

Keywords: decision making, design heuristics, product design, product design process, design paradigms

Procedia PDF Downloads 83
6 Sinhala Sign Language to Grammatically Correct Sentences Using NLP

Authors: Anjalika Fernando, Banuka Athuraliya

Abstract:

This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage uses a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. Trained on a dataset of SSL gestures, the LSTM model learns to accurately classify these signs and translate them into textual representations, achieving a commendable accuracy rate of 94%. Building upon the successful recognition and translation of SSL signs, the second stage focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactical structure of the generated sentences. Trained on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding correct versions, the NMT model learns to generate coherent and grammatically accurate sentences, achieving an impressive accuracy rate of 98%. The proposed approach offers significant contributions to the fields of SSL translation and grammar correction. By addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign-language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired, and it lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. The development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices, and the current application will be improved for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
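
A minimal sketch of the two-stage pipeline described above is given below, using Keras-style model definitions: an LSTM classifier for sign recognition followed by an LSTM encoder-decoder (NMT) for grammar correction. The layer sizes, sequence length, keypoint dimensionality, and vocabulary sizes are illustrative assumptions, not the configuration reported by the authors.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Stage 1: LSTM classifier mapping a sequence of hand-keypoint frames to a sign label.
# Shapes (30 frames x 126 keypoint coordinates, 50 sign classes) are illustrative assumptions.
def build_sign_recognizer(timesteps=30, features=126, num_signs=50):
    return models.Sequential([
        layers.Input(shape=(timesteps, features)),
        layers.LSTM(128, return_sequences=True),
        layers.LSTM(64),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_signs, activation="softmax"),
    ])

# Stage 2: LSTM encoder-decoder (NMT-style) mapping a raw gloss sequence to a
# grammatically correct sentence; vocabulary sizes and unit counts are placeholders.
def build_grammar_corrector(src_vocab=5000, tgt_vocab=5000, emb=256, units=512):
    enc_in = layers.Input(shape=(None,))
    enc_emb = layers.Embedding(src_vocab, emb)(enc_in)
    _, h, c = layers.LSTM(units, return_state=True)(enc_emb)   # encoder final states

    dec_in = layers.Input(shape=(None,))
    dec_emb = layers.Embedding(tgt_vocab, emb)(dec_in)
    dec_out = layers.LSTM(units, return_sequences=True)(dec_emb, initial_state=[h, c])
    logits = layers.Dense(tgt_vocab, activation="softmax")(dec_out)
    return models.Model([enc_in, dec_in], logits)

recognizer = build_sign_recognizer()
recognizer.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
corrector = build_grammar_corrector()
corrector.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
recognizer.summary()
corrector.summary()
```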

Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT

Procedia PDF Downloads 70
5 3D-Mesh Robust Watermarking Technique for Ownership Protection and Authentication

Authors: Farhan A. Alenizi

Abstract:

Digital watermarking has evolved in recent years as an important means of data authentication and ownership protection. Image and video watermarking are well established in the field of multimedia processing; however, watermarking techniques for 3D objects have emerged for the same purposes, as 3D mesh models are increasingly used in scientific, industrial, and medical applications. Like image watermarking, 3D watermarking can take place in either the spatial or the transform domain. Unlike images and video, where frames have regular structures in both space and time, 3D objects are represented as meshes that are basically irregular samplings of surfaces; moreover, meshes can undergo a large variety of alterations which may be hard to tackle. This makes the watermarking process more challenging. While transform-domain watermarking is preferable for images and videos, it is still difficult to implement on 3D meshes due to the huge number of vertices involved and the complicated topology and geometry, and hence the difficulty of performing the spectral decomposition, even though significant work has been done in the field. Spatial-domain watermarking has attracted significant attention in recent years; it can act either on the topology or on the geometry of the model. Exploiting the statistical characteristics of 3D mesh models, from both geometrical and topological aspects, has proved useful in hiding data; doing so with minimal surface distortion, however, has attracted significant research in the field. A blind 3D mesh watermarking technique is proposed in this research. The watermarking method depends on modifying the vertices' positions with respect to the center of the object. An optimal method will be developed to reduce the errors, minimizing the distortions that the 3D object may experience due to the watermarking process and reducing the computational complexity due to the iterations and other factors. The technique relies on displacing the vertices' locations by modifying the variances of the vertices' norms. Statistical analyses were performed to establish the distributions that best fit each mesh and hence to establish the bin sizes. Several optimizations were introduced concerning mesh local roughness, the statistical distributions of the norms, and the displacements of the mesh centers. To evaluate the algorithm's robustness against common geometry and connectivity attacks, the watermarked objects were subjected to uniform noise, Laplacian smoothing, vertex quantization, simplification, and cropping. Experimental results showed that the approach is robust in terms of both perceptual and quantitative quality, and against both geometry and connectivity attacks. Moreover, the probability of true-positive detection was evaluated against the probability of false-positive detection; to validate the test cases, receiver operating characteristic (ROC) curves were drawn, and they confirm robustness in this respect as well. 3D watermarking is still a young field, but a promising one.
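
The following Python sketch illustrates the general idea of the embedding step described above: vertex distances from the object's center are binned, and the spread of the norms within each bin is nudged to encode watermark bits. It is a simplified, hypothetical illustration of norm-based blind watermarking; the bin construction, the strength parameter, and the absence of roughness-aware optimization are assumptions rather than the authors' algorithm.

```python
import numpy as np

def embed_watermark(vertices, bits, num_bins=None, strength=0.05):
    """Minimal sketch of norm-based blind mesh watermarking: vertex distances to the
    object's center are split into bins and the spread of each bin's norms is nudged
    up or down to encode one bit. No roughness-aware optimization is included."""
    v = np.asarray(vertices, dtype=float)
    center = v.mean(axis=0)
    d = v - center
    norms = np.linalg.norm(d, axis=1)
    num_bins = num_bins or len(bits)
    edges = np.quantile(norms, np.linspace(0, 1, num_bins + 1))   # equal-population bins
    out = v.copy()
    for k, bit in enumerate(bits[:num_bins]):
        mask = (norms >= edges[k]) & (norms <= edges[k + 1])
        if not mask.any():
            continue
        mean_r = norms[mask].mean()
        # bit 1 -> push norms away from the bin mean (larger spread), bit 0 -> pull toward it
        factor = 1.0 + strength if bit else 1.0 - strength
        new_r = mean_r + factor * (norms[mask] - mean_r)
        scale = new_r / np.maximum(norms[mask], 1e-12)
        out[mask] = center + d[mask] * scale[:, None]
    return out

rng = np.random.default_rng(0)
mesh = rng.normal(size=(5000, 3))            # stand-in point cloud for a mesh's vertices
marked = embed_watermark(mesh, bits=[1, 0, 1, 1, 0, 0, 1, 0])
print(np.abs(marked - mesh).max())           # largest per-coordinate distortion introduced
```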

Keywords: watermarking, mesh objects, local roughness, Laplacian smoothing

Procedia PDF Downloads 136
4 Effect of Rolling Shear Modulus and Geometric Make-up on the Out-of-Plane Bending Performance of Cross-Laminated Timber Panel

Authors: Md Tanvir Rahman, Mahbube Subhani, Mahmud Ashraf, Paul Kremer

Abstract:

Cross-laminated timber (CLT) is made from layers of timber boards orthogonally oriented through the thickness, and because of this, CLT can withstand bi-axial bending, in contrast with most other engineered wood products such as laminated veneer lumber (LVL) and glued laminated timber (GLT). Wood is cylindrically anisotropic in nature and is characterized by significantly lower elastic and shear moduli in the planes perpendicular to the fibre direction. It is therefore classified as an orthotropic material and is characterized by nine elastic constants: three elastic moduli in the longitudinal, tangential, and radial directions; three shear moduli in the longitudinal-tangential, longitudinal-radial, and radial-tangential planes; and three Poisson's ratios. For simplification, timber is generally assumed to be transversely isotropic, reducing the number of elastic properties to five, with the longitudinal and radial planes assumed to be planes of symmetry. The validity of this assumption was investigated through numerical modelling of CLT with both orthotropic and transversely isotropic material properties for three softwood species (Norway spruce, Douglas fir, and Radiata pine) and three hardwood species (Victorian ash, Beech wood, and Aspen), subjected to uniformly distributed loading under simply supported boundary conditions. It was concluded that assuming the timber to be transversely isotropic results in a negligible error, in the order of 1 percent. It was also observed that, along with the longitudinal elastic modulus, the ratio of the longitudinal shear modulus (GL) to the rolling shear modulus (GR) has a significant effect on the deflection of CLT panels with a lower span-to-depth ratio. For softwoods such as Norway spruce and Radiata pine, the ratio of GL to GR is reported in the literature to be in the order of 12 to 15. This results in shear flexibility in the transverse layers, leading to increased deflection under out-of-plane loading. The rolling shear modulus of hardwoods has been found to be significantly higher than that of softwoods, with the ratio of longitudinal shear modulus to rolling shear modulus as low as 4. This has led to a significant rise in research into manufacturing CLT entirely from hardwood, as well as from a combination of softwood and hardwood. The beam theories commonly used to analyze the performance of CLT panels under out-of-plane loads are the shear analogy method, the gamma method, and the k-method. The shear analogy method has been found to be the most effective where shear deformation is significant. The effect of the ratio of the longitudinal shear modulus to the rolling shear modulus of the cross-layers on the deflection of CLT under uniformly distributed load, with respect to its length-to-depth ratio, was investigated using the shear analogy method. It was observed that shear deflection is reduced significantly as the ratio of the shear modulus of the longitudinal layer to the rolling shear modulus of the cross-layer decreases. This indicates that there is significant room for improving the bending performance of CLT by developing hybrid CLT from a mix of softwood and hardwood.
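
As an illustration of the shear analogy calculation discussed above, the sketch below computes the effective bending and shear stiffness of a five-layer panel strip and the midspan deflection of a simply supported strip under a uniformly distributed load, then reports how the shear share of the deflection changes with the GL/GR ratio of the cross-layers. The layer thicknesses and material values are indicative placeholders, not the properties used in this study.

```python
import numpy as np

def clt_deflection(span, q, layers, b=1.0):
    """Shear analogy sketch (per unit width): layers is a list of (thickness, E, G)
    tuples ordered through the depth; returns (bending, shear, total) midspan deflection
    of a simply supported panel strip under a uniformly distributed load q."""
    t = np.array([l[0] for l in layers])
    E = np.array([l[1] for l in layers])
    G = np.array([l[2] for l in layers])
    z_top = np.concatenate(([0.0], np.cumsum(t)))
    z_c = 0.5 * (z_top[:-1] + z_top[1:]) - z_top[-1] / 2.0     # layer centroids about mid-depth
    EI = np.sum(E * b * t**3 / 12.0) + np.sum(E * b * t * z_c**2)
    a = abs(z_c[-1] - z_c[0])                                   # distance between outer-layer centroids
    GA = a**2 / (t[0] / (2 * G[0] * b) + np.sum(t[1:-1] / (G[1:-1] * b)) + t[-1] / (2 * G[-1] * b))
    w_b = 5 * q * span**4 / (384 * EI)                          # bending part (simply supported, UDL)
    w_s = q * span**2 / (8 * GA)                                # shear part from the effective GA
    return w_b, w_s, w_b + w_s

E0, GL = 11.0e9, 0.69e9                       # indicative softwood longitudinal-layer properties (Pa)
for ratio in (15, 10, 4):                     # GL / GR of the cross layers
    GR = GL / ratio
    layers = [(0.03, E0, GL), (0.03, 0.37e9, GR), (0.03, E0, GL), (0.03, 0.37e9, GR), (0.03, E0, GL)]
    w_b, w_s, w = clt_deflection(span=3.0, q=5e3, layers=layers)
    print(f"GL/GR = {ratio:>2}: shear share of total deflection = {w_s / w:.1%}")
```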

Keywords: rolling shear modulus, shear deflection, ratio of shear modulus and rolling shear modulus, timber

Procedia PDF Downloads 95
3 Modelling the Art Historical Canon: The Use of Dynamic Computer Models in Deconstructing the Canon

Authors: Laura M. F. Bertens

Abstract:

There is a long tradition of visually representing the art historical canon in schematic overviews and diagrams. This is indicative of the desire for scientific, 'objective' knowledge of the kind (seemingly) produced in the natural sciences. These diagrams will, however, always retain an element of subjectivity, and the modelling methods colour our perception of the represented information. In recent decades, visualisations of art historical data, such as hand-drawn diagrams in textbooks, have been extended to include digital, computational tools. These tools significantly increase modelling strength and functionality. As such, they might be used to deconstruct and amend the very problem caused by traditional visualisations of the canon. In this paper, the use of digital tools for modelling the art historical canon is studied, in order to draw attention to the artificial nature of the static models that art historians are presented with in textbooks and lectures, as well as to explore the potential of digital, dynamic tools in creating new models. To study the way diagrams of the canon mediate the represented information, two modelling methods have been applied to two case studies of existing diagrams. The tree diagram Stammbaum der neudeutschen Kunst (1823) by Ferdinand Olivier has been translated to a social network using the program Visone, and the famous flow chart Cubism and Abstract Art (1936) by Alfred Barr has been translated to an ontological model using the Protégé Ontology Editor. The implications of the modelling decisions have been analysed in an art historical context. The aim of this project has been twofold. On the one hand, the translation process makes explicit the design choices in the original diagrams, which reflect hidden assumptions about the Western canon. Ways of organizing data (for instance, ordering art according to artist) have come to feel natural and neutral, while implicit biases and the historically uneven distribution of power have resulted in the underrepresentation of groups of artists. Over the last decades, scholars from fields such as Feminist Studies, Postcolonial Studies, and Gender Studies have considered this problem and tried to remedy it. The translation presented here adds to this deconstruction by defamiliarizing the traditional models and analysing the process of reconstructing new models, step by step, taking into account theoretical critiques of the canon, such as the feminist perspective discussed by Griselda Pollock, amongst others. On the other hand, the project has served as a pilot study for the use of digital modelling tools in creating dynamic visualisations of the canon for education and museum purposes. Dynamic computer models introduce functionalities that allow new ways of ordering and visualising the artworks in the canon. As such, they could form a powerful tool in the training of new art historians, introducing a broader and more diverse view on the traditional canon. Although modelling will always imply a simplification and therefore a distortion of reality, new modelling techniques can help us get a better sense of the limitations of earlier models and can provide new perspectives on already established knowledge.
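
As a minimal, hypothetical illustration of this kind of translation (the study itself used Visone and the Protégé Ontology Editor), the sketch below encodes a few placeholder canon relations as a directed graph with the networkx library, so that the ordering of the 'canon' can be recomputed dynamically, for instance by centrality, instead of being fixed by a diagram's layout. The nodes and relations are placeholders, not the actual content of Olivier's or Barr's diagrams.

```python
import networkx as nx

# Translate a hierarchical canon diagram into a network model (placeholder content only).
G = nx.DiGraph()
G.add_edge("Movement A", "Movement B", relation="influences", year=1905)
G.add_edge("Movement A", "Movement C", relation="influences", year=1908)
G.add_edge("Movement B", "Movement D", relation="influences", year=1916)
G.add_node("Artist X", role="artist", gender="female")      # attributes make exclusions queryable
G.add_edge("Artist X", "Movement C", relation="member_of")

# Unlike a static tree, the network can be re-ordered and re-weighted dynamically,
# e.g. ranking nodes by in-degree centrality instead of by a fixed canonical hierarchy.
print(sorted(nx.in_degree_centrality(G).items(), key=lambda kv: -kv[1]))
```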

Keywords: canon, ontological modelling, Protege Ontology Editor, social network modelling, Visone

Procedia PDF Downloads 101
2 Typology of Fake News Dissemination Strategies in Social Networks in Social Events

Authors: Mohadese Oghbaee, Borna Firouzi

Abstract:

The emergence of the Internet, and more specifically the formation of social media, has opened the way for new types of content dissemination. In recent years, social media users have shared information, communicated with others, and exchanged opinions on social events in this space. Much of the information published there is suspicious and produced with the intention of deceiving others; such content is often called "fake news". By blurring the boundary between fake news and correct information and misleading public opinion, fake news can endanger the security of countries and deprive audiences of the basic right of free access to real information. Competing governments, opposition elements, profit-seeking individuals, and even rival organizations, aware of this capacity, act to distort and overturn the facts in the virtual space of target countries and communities on a large scale and to steer public opinion towards their goals. This extensive de-truthing of societies' information space has created a wave of harm and worry all over the world. These concerns have opened a new path of research for the timely containment and reduction of the destructive effects of fake news on public opinion. In addition, the expansion of this phenomenon can create serious problems for societies; its impact on events such as the 2016 American elections, Brexit, the 2017 French elections, and the 2019 Indian elections has caused concern and led to the adoption of countermeasures. A simple look at the growth trend of research in "Scopus" shows a steady increase in studies with the keyword "false information", peaking at 524 items in 2020, whereas in 2015 only 30 scientific publications appeared in this field. Considering that one of the capabilities of social media is to provide a context for the dissemination of news and information, both true and false, this article classifies the strategies used to spread fake news in social networks during social events. To achieve this goal, the thematic analysis research method was chosen. First, an extensive library study of global sources was conducted. Then, in-depth interviews were conducted with 18 well-known specialists and experts in the field of news and media in Iran, selected by purposeful sampling. By analyzing the data using thematic analysis, a set of strategies was obtained. The strategies identified so far (the research is in progress) include unrealistically strengthening or weakening the speed and content of an event, stimulating psycho-media movements, targeting emotionally susceptible audiences such as women, teenagers, and young people, fuelling public hatred, labelling reactions to events as legitimate or illegitimate, inciting physical conflict, the simplification of violent protests, and the targeted publication of images and interviews.

Keywords: fake news, social network, social events, thematic analysis

Procedia PDF Downloads 34
1 Reflection of Landscape Agrogenization in the Soil Cover Structure and Profile Morphology: Example of Lithuanian Agroecosystems

Authors: Jonas Volungevicius, Kristina Amaleviciute, Rimantas Vaisvalavicius, Alvyra Slepetiene, Darijus Veteikis

Abstract:

The Lithuanian territory is characterized by a landscape of prevailing moraine hills and clayey lowlands. The largest part of it has endured agrogenization of various degrees, which has caused changes both in the structure of the landscape and the soil cover, transformations of the soil profile, and degradation of natural background soils. These changes negatively influence the geoecological potential of the landscape and soil and weaken the sustainability of agroecosystems. Research objective: to reveal the alterations of catenae and their associated soil profiles induced by landscape agrogenization in Lithuanian moraine hills and clayey lowlands. Methods: soil cover analysis and catena charting were conducted using landscape profiling; soil morphology was described and soil types were identified following WRB 2014. The granulometric composition of the soil profiles was obtained by the laser diffraction method (laser diffractometer Mastersizer 2000). pH was measured in H2O extraction using potentiometric titration; SOC was determined by the Tyurin method modified by Nikitin, measured with a Cary 50 spectrometer (VARIAN) at a 590 nm wavelength using glucose standards. Results: the analysis showed that the decline of forest vegetation and other natural landscape components following the agrogenization of the research area influenced, in different ways but significantly, the structural alterations of the soil cover and the vertical soil profile. The research detected that, due to landscape agrogenization, zone-specific processes are suppressed and inter-zonal processes determined by agrogenic factors intensify in Lithuanian agroecosystems. In forested hills, the historically prevailing territorial complex of Retisols and Histosols is transforming into a territorial complex of Regosols, Deluvial soils, and drained Histosols. The processes taking place are the simplification of the vertical profile structure, intensive rejuvenation of the profile, and the disappearance of the features of zone-specific soil-forming processes (podzolization, lessivage, gley formation). Erosion and deluvial processes manifest more intensively, and weakly accumulated organic material spreads more intensively through the vertical soil profile. The territorial soil complex of Gleyic Luvisols and Gleysols dominating in forested clayey lowlands, when subjected to agrogenization, is transformed into a catena of drained Luvisols and pseudo-Cambisols. The best-expressed changes concern the moisture regime (the morphological features of gley and stagnic properties are in decline), together with alterations of pH and of the distribution and intensity of organic matter accumulation in the profile. A specific anthraquic horizon, uncharacteristic of natural soil formation, is appearing. It is important to note that deep ploughing and other agrotechnical measures destroy the natural vertical differentiation of clay particles in the soil profile, which not only alters the physical qualities of the soil but also encumbers the identification of Luvisols, creating the risk of misidentifying them as Cambisols; the latter have never developed in these ecosystems under the present climatic conditions. Acknowledgements: This work was supported by the National Science Program: The effect of long-term, different-intensity management of resources on the soils of different genesis and on other components of the agro-ecosystems [grant number SIT-9/2015], funded by the Research Council of Lithuania.

Keywords: agroecosystems, landscape agrogenization, luvisols, retisols, transformation of soil profile

Procedia PDF Downloads 231