Search results for: using an Anisotropic Analytical Algorithm (AAA)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5836

916 Raising the Topographic Plan of a Property Located near the Locality of Gircov, Romania

Authors: Carmen Georgeta Dumitrache

Abstract:

Terrestrial measurement science studies the totality of field operations and computations carried out for the purpose of representing the land surface on a plan or map, in a specific cartographic projection and topographic scale. With the development of society, land measurements have evolved, being driven both by a utilitarian goal bound to economic activity and by a scientific purpose related to determining the form and dimensions of the Earth. Measurements in the field, data processing, and the proper representation of planimetry and landform on drawings and maps rely on topographic and geodetic instruments, calculation, and graphical reporting, and therefore require knowledge of theoretical and practical concepts from different areas of science and technology. Using topographic and geodetic instruments properly in practice, to measure angles and distances precisely, requires knowledge of geometric optics, precision mechanics, the strength of materials, and more. Processing the results of field measurements requires calculation methods based on notions of geometry, trigonometry, algebra, mathematical analysis, and computer science. To illustrate these topographic measurements, a survey was established for the raising of a property located near the locality of Gircov, Romania. We determined the total surface of the plan (T30) and of each parcel/plot, and also traced the coordinates of one parcel in the field. The purposes of the planimetric survey were: the exact determination of the bounding surface; the analytical calculation of the surface; comparison of the determined surface with the one registered in the produced documents; drawing up a plan of location and delineation, with neighbouring boundaries and contour distances, highlighting the parcels comprising this property; drawing up a similar plan of location and delineation for one parcel; and tracing in the field the outline points of that parcel. The ultimate goal of this work was to determine and represent the surface, and also to detach a parcel from the total surface, while respecting the surface condition imposed by the beneficiary's property deed.
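
The analytical surface calculation mentioned above is typically done with the Gauss (shoelace) formula applied to the traverse coordinates. A minimal Python sketch, using made-up coordinates rather than the survey's actual station data:

def parcel_area(coords):
    """Analytical (Gauss/shoelace) area of a closed traverse.

    coords: list of (x, y) station coordinates in metres, in traverse order.
    Returns the enclosed surface in square metres.
    """
    n = len(coords)
    s = 0.0
    for i in range(n):
        x1, y1 = coords[i]
        x2, y2 = coords[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# e.g. a rectangular parcel of 40 m x 25 m gives 1000.0 m^2
print(parcel_area([(0, 0), (40, 0), (40, 25), (0, 25)]))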

Keywords: topography, surface, coordinate, modeling

Procedia PDF Downloads 223
915 A Spatial Repetitive Controller Applied to an Aeroelastic Model for Wind Turbines

Authors: Riccardo Fratini, Riccardo Santini, Jacopo Serafini, Massimo Gennaretti, Stefano Panzieri

Abstract:

This paper presents a nonlinear differential model for a three-bladed horizontal axis wind turbine (HAWT) suited for control applications. It is based on an 8-DOF, lumped-parameter structural dynamics model coupled with quasi-steady sectional aerodynamics. In particular, using the Euler-Lagrange equation (energetic variation approach), the authors derive, and successively validate, such a model. For the derivation of the aerodynamic model, Greenberg's theory, an extension of the theory proposed by Theodorsen to the case of thin airfoils undergoing pulsating flows, is used. Specifically, in this work, the authors restrict that theory under the hypothesis of low perturbation reduced frequency k, which causes the lift deficiency function C(k) to be real and equal to 1. Furthermore, the expressions of the aerodynamic loads are obtained using the quasi-steady strip theory (Hodges and Ormiston), as a function of the chordwise and normal components of the relative velocity between flow and airfoil, Ut and Up, their derivatives, and the section angular velocity ε˙. For the validation of the proposed model, the authors carried out open and closed-loop simulations of a 5 MW HAWT, characterized by a radius R = 61.5 m and a mean chord c = 3 m, with a nominal angular velocity Ωn = 1.266 rad/s. The first analysis performed is the steady-state solution, where a uniform wind Vw = 11.4 m/s is considered and a collective pitch angle θ = 0.88° is imposed. During this step, the authors noticed that the proposed model is intrinsically periodic due to the effect of the wind and of the gravitational force. In order to reject this periodic trend in the model dynamics, the authors propose a collective repetitive control algorithm coupled with a PD controller. In particular, when the reference command to be tracked and/or the disturbance to be rejected are periodic signals with a fixed period, repetitive control strategies can be applied due to their high precision, simple implementation, and little performance dependency on system parameters. The functional scheme of a repetitive controller is quite simple and, given a periodic reference command, is composed of a control block Crc(s) usually added to an existing feedback control system. The control block contains a time-delay system e^(-τs) in a positive feedback loop and a low-pass filter q(s). It should be noticed that, while the time-delay term reduces the stability margin, the low-pass filter is added to ensure stability. It is worth noting that, in this work, the authors propose a phase shifting for the controller, and the delay system has been modified as e^(-(T-γk)s), where T is the period of the signal and γk is a phase shift of k samples of the same periodic signal. It should be noticed that the phase-shifting technique is particularly useful in non-minimum-phase systems, such as flexible structures. In fact, using phase shifting, the iterative algorithm can reach convergence also at high frequencies. Notice that, in our case study, the shift of k samples depends both on the rotor angular velocity Ω and on the rotor azimuth angle Ψ: we refer to this controller as a spatial repetitive controller. The collective repetitive controller has also been coupled with a C(s) = PD(s) controller, in order to dampen oscillations of the blades. The performance of the spatial repetitive controller is compared with an industrial PI controller. In particular, starting from a wind speed Vw = 11.4 m/s, the controller is asked to maintain the nominal angular velocity Ωn = 1.266 rad/s after an instantaneous increase of wind speed (Vw = 15 m/s). Then, a purely periodic external disturbance is introduced in order to stress the capabilities of the repetitive controller. The results of the simulations show that, contrary to a simple PI controller, the spatial repetitive-PD controller has the capability to reject both external disturbances and the periodic trend in the model dynamics. Finally, the nominal value of the angular velocity is reached, in accordance with results obtained with commercial software for a turbine of the same type.
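
As a rough illustration of the delay-plus-low-pass structure described above, here is a minimal discrete-time Python sketch of a repetitive term combined with a PD term; the buffer length N, the first-order weight q, and the sample shift are illustrative stand-ins for the continuous-time Crc(s), q(s), and azimuth-dependent γk, not the authors' implementation:

def repetitive_pd_step(e, e_prev, buf, k, kp=1.0, kd=0.1, q=0.95, shift=0):
    """One controller update: repetitive term (delay in positive feedback) + PD.

    e, e_prev : current and previous tracking error
    buf       : list of length N holding one period of the repetitive signal
    k         : current sample index
    shift     : phase shift in samples (stand-in for the azimuth-dependent term)
    """
    N = len(buf)
    u_delayed = buf[(k + shift) % N]      # value stored N - shift samples ago
    u_rc = q * u_delayed + e              # low-pass weight q on the delay loop
    buf[k % N] = u_rc                     # store for the next period
    u_pd = kp * e + kd * (e - e_prev)     # PD term damping blade oscillations
    return u_rc + u_pd

# usage: buf = [0.0] * N, then call once per sample with the measured error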

Keywords: wind turbines, aeroelasticity, repetitive control, periodic systems

Procedia PDF Downloads 219
914 Software Transactional Memory in a Dynamic Programming Language at Virtual Machine Level

Authors: Szu-Kai Hsu, Po-Ching Lin

Abstract:

As more and more multi-core processors emerge, the traditional sequential programming paradigm no longer suffices, yet only a few modern dynamic programming languages can leverage this advantage. Ruby, for example, despite its wide adoption, only includes threads as a simple parallel primitive. The global virtual machine lock of the official Ruby runtime makes it impossible to exploit full parallelism. Though various alternative Ruby implementations do eliminate the global virtual machine lock, they only provide developers with dated locking mechanisms for data synchronization. However, traditional locking mechanisms are error-prone by nature. Software transactional memory (STM) is one of the promising alternatives. This paper introduces a new virtual machine, GobiesVM, to provide a native software transactional memory based solution for dynamic programming languages to exploit parallelism. We also propose a simplified variation of the Transactional Locking II algorithm. The empirical results of our experiments show that support for STM at the virtual machine level enables developers to write straightforward code without compromising parallelism or sacrificing thread safety. Existing source code requires only minimal or even no modification, which allows developers to easily switch their legacy codebase to a parallel environment. The performance evaluations of GobiesVM also indicate that the difference between sequential and parallel execution is significant.
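
To make the Transactional Locking II (TL2) idea mentioned above concrete, the following is a heavily simplified Python sketch of a TL2-style transaction with a global version clock, versioned write locks, and read-set validation at commit; it illustrates the general algorithm only, not GobiesVM's implementation:

import threading

_global_clock = [0]
_clock_lock = threading.Lock()

class TVar:
    """Transactional variable: a value plus a version stamp and a write lock."""
    def __init__(self, value):
        self.value, self.version, self.lock = value, 0, threading.Lock()

class _Abort(Exception):
    pass

def atomically(txn):
    """Run txn(read, write) until it commits (TL2-style optimistic execution)."""
    while True:
        rv = _global_clock[0]                       # read-version snapshot
        read_set, write_set = [], {}

        def read(tvar):
            if tvar in write_set:
                return write_set[tvar]
            value = tvar.value
            if tvar.version > rv or tvar.lock.locked():
                raise _Abort()                      # inconsistent snapshot
            read_set.append(tvar)
            return value

        def write(tvar, value):
            write_set[tvar] = value

        try:
            result = txn(read, write)
            locked = []
            try:
                for tvar in write_set:              # lock the write set
                    if not tvar.lock.acquire(blocking=False):
                        raise _Abort()
                    locked.append(tvar)
                with _clock_lock:                   # obtain the write-version
                    _global_clock[0] += 1
                    wv = _global_clock[0]
                for tvar in read_set:               # validate the read set
                    if tvar.version > rv:
                        raise _Abort()
                for tvar, value in write_set.items():
                    tvar.value, tvar.version = value, wv
                return result
            finally:
                for tvar in locked:
                    tvar.lock.release()
        except _Abort:
            continue                                # conflict: retry

# usage sketch: transfer between two accounts held in TVars
a, b = TVar(100), TVar(0)
atomically(lambda read, write: (write(a, read(a) - 10), write(b, read(b) + 10)))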

Keywords: global interpreter lock, ruby, software transactional memory, virtual machine

Procedia PDF Downloads 246
913 Urban Meetings: Graphic Analysis of the Public Space in a Cultural Building from São Paulo

Authors: Thalita Carvalho Martins de Castro, Núbia Bernardi

Abstract:

Currently, studies show that our cities are portraits of social relations. In the midst of so many segregations, cultural buildings emerge as places that assemble collective activities and expressions. Through theatre, exhibitions, educational workshops, and libraries, architecture approaches human relations and seeks to propose meeting places. The purpose of this research is to deepen the discussion about the contributions of cultural buildings to the use of spaces in the contemporary city, based on the data and measurements collected in the master's research in progress. The graphic analysis of the insertion of contemporary cultural buildings seeks to highlight the social use of space. The urban insertions of contemporary cultural buildings in the city of São Paulo (Brazil) are analyzed to understand the relations between the architectural form and its audience. The collected data describe a dynamic of flows and permanence in the use of these spaces, indicating the contribution of cultural buildings, associated with artistic production, to the dynamics of urban spaces and the social modification of their milieu. Among the case studies, the research in development is based on the registration and graphic analysis of the Praça das Artes building (2012), located in the historical central region of the city, which, after a long period of severe degradation, is currently undergoing redevelopment. The choice of this building was based on four parameters, on both the architectural and the urban scale: urban insertion, local impact, cultural production, and a mix of uses. Two graphic analysis methodologies are applied: one with diagrams accompanied by texts, and another with active analysis for open-space projects using complementary graphic methods, including maps, plans, infographics, perspectives, time-lapse videos, and analytical tables. This research aims to reinforce the debate between methodologies of form-use analysis and visual synthesis applied to cultural buildings, so that new projects can structure public spaces as catalysts for social use, generating improvements in the daily life of their users and in the cities where they are inserted.

Keywords: cultural buildings, design methodologies, graphic analysis, public spaces

Procedia PDF Downloads 275
912 Efficient Estimation for the Cox Proportional Hazards Cure Model

Authors: Khandoker Akib Mohammad

Abstract:

While analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest; these subjects are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modelled with a cure model that depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and latency, respectively. In this paper, we show the asymptotic normality of the profile likelihood estimator via an asymptotic expansion of the profile likelihood and obtain the explicit form of the variance estimator with an implicit function in the profile likelihood. We also show that the efficient score function based on projection theory and the profile likelihood score function are equal. Our contribution in this paper is that we express the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (smcure package) and from the profile likelihood score function (our approach) provide similar and comparable results. The numerical results of our proposed method are also illustrated using the melanoma data from the smcure R package, and we compare the results with the output obtained from the smcure package.
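
For readers unfamiliar with the mixture cure formulation referenced above, the population survival combines the logistic incidence part and the Cox PH latency part; a minimal Python sketch of that function, assuming the baseline survival S0 and the coefficients have already been estimated (e.g., by an EM algorithm as in smcure):

import numpy as np

def population_survival(t, z, x, beta, gamma, S0):
    """S_pop(t | x, z) = 1 - pi(z) + pi(z) * S_u(t | x).

    pi(z)    : probability of being uncured, logistic in z (coefficients beta)
    S_u(t|x) : latency survival of the uncured, Cox PH form S0(t) ** exp(gamma'x)
    S0       : callable baseline survival function (assumed known/estimated)
    """
    pi = 1.0 / (1.0 + np.exp(-np.dot(z, beta)))     # incidence (uncured prob.)
    s_uncured = S0(t) ** np.exp(np.dot(x, gamma))   # latency (Cox PH)
    return 1.0 - pi + pi * s_uncured

# toy usage with an exponential baseline survival
print(population_survival(t=2.0, z=[1.0, 0.5], x=[0.5],
                          beta=[0.2, -0.4], gamma=[0.3],
                          S0=lambda t: np.exp(-0.1 * t)))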

Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood

Procedia PDF Downloads 107
911 Web Proxy Detection via Bipartite Graphs and One-Mode Projections

Authors: Zhipeng Chen, Peng Zhang, Qingyun Liu, Li Guo

Abstract:

With the Internet becoming the dominant channel for business and life, many IPs are increasingly masked using web proxies for illegal purposes such as propagating malware, impersonating phishing pages to steal sensitive data, or redirecting victims to other malicious targets. Moreover, as Internet traffic continues to grow in size and complexity, it has become an increasingly challenging task to detect proxy services due to their dynamic updates and high anonymity. In this paper, we present an approach based on behavioral graph analysis to study the behavior similarity of web proxy users. Specifically, we use bipartite graphs to model host communications from network traffic and build one-mode projections of the bipartite graphs to discover the social-behavior similarity of web proxy users. Based on the similarity matrices of end-users from the derived one-mode projection graphs, we apply a simple yet effective spectral clustering algorithm to discover the inherent behavior clusters of web proxy users. A web proxy URL may vary from time to time, but the users' inherent interests do not. Based on this intuition, using our private tools implemented with WebDriver, we examine whether the top URLs visited by the web proxy users are web proxies. Our experimental results based on real datasets show that the behavior clusters not only reduce the number of URLs to analyze but also provide an effective way to detect web proxies, especially unknown ones.
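
A minimal Python sketch of the graph pipeline described above, with a toy host-by-URL incidence matrix standing in for real traffic (not the authors' data or code); the one-mode projection is obtained by multiplying the incidence matrix by its transpose, and the projection is then fed to spectral clustering as a precomputed affinity:

import numpy as np
from sklearn.cluster import SpectralClustering

# toy host-by-URL bipartite incidence matrix (rows: end hosts, cols: URLs);
# a real system would build this from observed network traffic
B = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 0, 1, 1],
              [0, 0, 1, 1]])

# one-mode projection onto hosts: P[i, j] = number of URLs shared by hosts i, j
P = B @ B.T

# spectral clustering on the (precomputed) host similarity matrix
labels = SpectralClustering(n_clusters=2,
                            affinity="precomputed",
                            random_state=0).fit_predict(P)
print(labels)   # hosts with similar browsing behaviour share a cluster label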

Keywords: bipartite graph, one-mode projection, clustering, web proxy detection

Procedia PDF Downloads 216
910 A Framework Based on Dempster-Shafer Theory of Evidence Algorithm for the Analysis of the TV-Viewers’ Behaviors

Authors: Hamdi Amroun, Yacine Benziani, Mehdi Ammi

Abstract:

In this paper, we propose an approach for detecting the behavior of the viewers of a TV program in a non-controlled environment. The proposed experiment is based on the use of three types of connected objects (a smartphone, a smart watch, and a connected remote control). Twenty-three participants were observed while watching their TV programs during three phases: before, during, and after watching a TV program. Their behaviors were detected using an approach based on the Dempster-Shafer Theory (DST) in two phases. The first phase approximates the mass functions dynamically using an approach based on the correlation coefficient; the second phase computes the approximate mass functions. To approximate the mass functions, two approaches were tested: the first was to divide each feature's data space into cells, each with a specific probability distribution over the behaviors; the probability distributions were computed statistically (estimated by the empirical distribution). The second approach was to predict the TV-viewing behaviors through the use of classification algorithms and to add uncertainty to the prediction based on the uncertainty of the model. Results showed that mixing the fusion rule with the computation of the initial approximate mass functions using a classifier led to overall success rates of 96%, 95%, and 96% for the first, second, and third TV-viewing phases, respectively. The results were also compared to those found in the literature. This study aims to anticipate certain actions in order to maintain the attention of TV viewers towards the proposed TV programs with usual connected objects, taking into account the various uncertainties that can be generated.
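
The DST fusion step mentioned above combines mass functions coming from the different connected objects; here is a minimal Python sketch of Dempster's rule of combination on a toy frame of discernment (the behaviours and masses are invented for illustration, not taken from the study):

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    m1, m2: dicts mapping frozenset hypotheses (focal elements) to masses.
    Returns the combined, conflict-normalised mass function.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    k = 1.0 - conflict
    return {h: v / k for h, v in combined.items()}

# toy frame with behaviours 'W' (watching) and 'D' (distracted)
m_phone = {frozenset('W'): 0.6, frozenset('WD'): 0.4}
m_watch = {frozenset('W'): 0.5, frozenset('D'): 0.3, frozenset('WD'): 0.2}
print(dempster_combine(m_phone, m_watch))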

Keywords: IoT, TV-viewing behaviors identification, automatic classification, unconstrained environment

Procedia PDF Downloads 193
909 Recent Advances in Pulse Width Modulation Techniques and Multilevel Inverters

Authors: Satish Kumar Peddapelli

Abstract:

This paper presents advances in pulse width modulation (PWM) techniques, which refer to a method of carrying information on a train of pulses, the information being encoded in the width of the pulses. Pulse width modulation is used to control the inverter output voltage. This is done by exercising the control within the inverter itself, by adjusting the ON and OFF periods of the inverter. With a fixed DC input voltage, a controllable AC output voltage is obtained. In variable-speed AC motor drives, the AC output voltage is obtained from a constant DC voltage by using an inverter. Recent developments in power electronics and semiconductor technology have led to improvements in power electronic systems. Hence, different circuit configurations, namely multilevel inverters, have become popular and have received considerable interest from researchers. A fast Space-Vector Pulse Width Modulation (SVPWM) method for a five-level inverter is also discussed. In this method, the space vector diagram of the five-level inverter is decomposed into six space vector diagrams of three-level inverters. In turn, each of these six space vector diagrams of the three-level inverter is decomposed into six space vector diagrams of two-level inverters. After decomposition, all the remaining procedures for the three-level SVPWM are carried out as for a conventional two-level inverter. The proposed method reduces the algorithm complexity and the execution time. It can also be applied to multilevel inverters above the five-level. The experimental setup for a three-level diode-clamped inverter is developed using a TMS320LF2407 DSP controller, and the experimental results are analysed.
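
Since the decomposition described above reduces the problem to conventional two-level SVPWM, the core computation is the sector identification and dwell-time calculation for the two adjacent active vectors and the zero vectors. A hedged Python sketch using the standard textbook two-level formulas (the numeric values are illustrative, not the paper's hardware parameters):

import numpy as np

def svpwm_dwell_times(v_ref, theta, v_dc, t_s):
    """Dwell times of a conventional two-level SVPWM switching cycle (a sketch).

    v_ref : magnitude of the reference voltage vector
    theta : reference vector angle in rad (0 .. 2*pi)
    v_dc  : DC link voltage; t_s: switching period
    """
    m = np.sqrt(3) * v_ref / v_dc               # modulation index
    sector = int(theta // (np.pi / 3)) + 1      # sectors 1..6
    alpha = theta - (sector - 1) * np.pi / 3    # angle inside the sector
    t1 = t_s * m * np.sin(np.pi / 3 - alpha)    # adjacent active vector 1
    t2 = t_s * m * np.sin(alpha)                # adjacent active vector 2
    t0 = t_s - t1 - t2                          # zero vectors fill the rest
    return sector, t1, t2, t0

print(svpwm_dwell_times(v_ref=180.0, theta=0.4, v_dc=400.0, t_s=1e-4))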

Keywords: five-level inverter, space vector pulse width modulation, diode clamped inverter, electrical engineering

Procedia PDF Downloads 359
908 Pilot-Assisted Direct-Current Biased Optical Orthogonal Frequency Division Multiplexing Visible Light Communication System

Authors: Ayad A. Abdulkafi, Shahir F. Nawaf, Mohammed K. Hussein, Ibrahim K. Sileh, Fouad A. Abdulkafi

Abstract:

Visible light communication (VLC) is a new approach to optical wireless communication proposed to relieve the congested radio frequency (RF) spectrum. VLC systems are combined with orthogonal frequency division multiplexing (OFDM) to achieve high-rate transmission and high spectral efficiency. In this paper, we investigate pilot-assisted channel estimation for DC-biased optical OFDM (PACE-DCO-OFDM) systems to reduce the effects of distortion on the transmitted signal. Least-squares (LS) and linear minimum mean-squared error (LMMSE) estimators are implemented in MATLAB/Simulink to enhance the bit-error-rate (BER) of PACE-DCO-OFDM. Results show that the DCO-OFDM system based on the PACE scheme achieves better BER performance than the conventional system without pilot-assisted channel estimation. Simulation results show that the proposed PACE-DCO-OFDM based on the LMMSE algorithm can estimate the channel more accurately and achieves better BER performance when compared to the LS-based PACE-DCO-OFDM and the traditional system without PACE. For the same signal-to-noise ratio (SNR) of 25 dB, the achieved BER is about 5×10^-4 for LMMSE-PACE and 4.2×10^-3 for LS-PACE, while it is about 2×10^-1 for the system without the PACE scheme.
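
A minimal numpy sketch of the two pilot-based estimators compared above; the channel autocorrelation matrix and SNR are assumptions supplied by the user, and the textbook LMMSE smoothing form is used rather than the authors' Simulink model:

import numpy as np

def ls_estimate(y_pilot, x_pilot):
    """Least-squares channel estimate at the pilot subcarriers."""
    return y_pilot / x_pilot

def lmmse_estimate(h_ls, r_hh, snr_linear):
    """LMMSE smoothing of the LS estimate (a common textbook form).

    h_ls       : LS estimate vector at the pilot positions
    r_hh       : channel autocorrelation matrix at those positions (assumption)
    snr_linear : average SNR as a linear ratio
    """
    n = r_hh.shape[0]
    w = r_hh @ np.linalg.inv(r_hh + (1.0 / snr_linear) * np.eye(n))
    return w @ h_ls

# usage sketch: h_hat = lmmse_estimate(ls_estimate(y_p, x_p), r_hh, 10 ** (25 / 10))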

Keywords: channel estimation, OFDM, pilot-assist, VLC

Procedia PDF Downloads 147
907 Evaluation of Different Liquid Scintillation Counting Methods for 222Rn Determination in Waters

Authors: Jovana Nikolov, Natasa Todorovic, Ivana Stojkovic

Abstract:

Monitoring of 222Rn in drinking, surface, and ground waters has been performed in connection with geological, hydrogeological, and hydrological surveys and health hazard studies. Liquid scintillation counting (LSC) is often the preferred analytical method for 222Rn measurements in waters because it allows automatic analysis of multiple samples. The LSC method involves mixing water samples with an organic scintillation cocktail, which drives radon diffusion from the aqueous into the organic phase, for which it has a much greater affinity, thereby eliminating the possibility of radon emanation. Two direct LSC methods that assume different sample compositions are presented, optimized, and evaluated in this study. The one-phase method assumes direct mixing of a 10 ml sample with 10 ml of emulsifying cocktail (the Ultima Gold AB scintillation cocktail was used). The two-phase method involves the use of water-immiscible cocktails (High Efficiency Mineral Oil Scintillator, Opti-Fluor O, and Ultima Gold F were used in this study). Calibration samples were prepared with an aqueous 226Ra standard in 20 ml glass vials and counted on the ultra-low background spectrometer Quantulus 1220TM equipped with a PSA (Pulse Shape Analysis) circuit that discriminates alpha/beta spectra. Since the calibration procedure is carried out with a 226Ra standard, which has both alpha and beta progenies, the PSA discriminator is of vital importance for reliable and precise spectra separation. Consequently, the calibration procedure investigated the influence of the PSA discriminator level on the 222Rn detection efficiency, using the 226Ra calibration standard over a wide range of activity concentrations. Evaluation of the presented methods was based on the obtained detection efficiencies and the achieved Minimum Detectable Activity (MDA). The accuracy and precision of the presented methods, as well as the performance of the different scintillation cocktails, were compared using measurements of 226Ra-spiked water samples of known activity and of environmental samples.
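
The MDA quoted above is commonly computed with the Currie formulation from the blank (background) count; a short Python sketch under that assumption (the counting time, efficiency, and sample volume below are invented, not the study's values):

import numpy as np

def mda_currie(background_counts, count_time_s, efficiency, volume_l):
    """Minimum Detectable Activity via the Currie formulation (a hedged sketch).

    background_counts : counts in the blank spectrum window
    count_time_s      : counting time in seconds
    efficiency        : detection efficiency (counts per decay, 0..1)
    volume_l          : sample volume in litres
    Returns MDA in Bq/L.
    """
    ld = 2.71 + 4.65 * np.sqrt(background_counts)   # detection limit in counts
    return ld / (count_time_s * efficiency * volume_l)

print(mda_currie(background_counts=40, count_time_s=3600,
                 efficiency=0.9, volume_l=0.01))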

Keywords: 222Rn in water, Quantulus1220TM, scintillation cocktail, PSA parameter

Procedia PDF Downloads 168
906 Using MALDI-TOF MS to Detect Environmental Microplastics (Polyethylene, Polyethylene Terephthalate, and Polystyrene) within a Simulated Tissue Sample

Authors: Kara J. Coffman-Rea, Karen E. Samonds

Abstract:

Microplastic pollution is an urgent global threat to our planet and to human health. Microplastic particles have been detected in our food, water, and atmosphere, and found in human stool, placenta, and lung tissue. However, most spectrometric microplastic detection methods require chemical digestion, which can alter or destroy microplastic particles and makes it impossible to acquire information about their in-situ distribution. MALDI-TOF MS (matrix-assisted laser desorption/ionization time-of-flight mass spectrometry) is an analytical method using a soft ionization technique that can be used for polymer analysis. This method provides a valuable opportunity to acquire information regarding the in-situ distribution of microplastics while minimizing the destructive element of chemical digestion. In addition, MALDI-TOF MS allows for expanded analysis of the microplastics, including detection of specific additives that may be present within them. MALDI-TOF MS is particularly sensitive to sample preparation and has not yet been used to analyze environmental microplastics within their specific location (e.g., biological tissues, sediment, water). In this study, microplastics were created using polyethylene gloves, polystyrene micro-foam, and polyethylene terephthalate cable sleeving. The plastics were frozen using liquid nitrogen and ground to obtain small fragments. An artificial tissue was created using a cellulose sponge as scaffolding coated with a MaxGel extracellular matrix to simulate human lung tissue. Optimal preparation techniques (e.g., matrix, cationization reagent, solvent, mixing ratio, laser intensity) were first established for each specific polymer type. The artificial tissue sample was subsequently spiked with microplastics, and the specific polymers were detected using MALDI-TOF MS. This study presents a novel method for the detection of environmental polyethylene, polyethylene terephthalate, and polystyrene microplastics within a complex sample. The results of this study provide an effective method that can be used in future microplastics research and can aid in determining the potential threats that microplastics pose to environmental and human health.

Keywords: environmental plastic pollution, MALDI-TOF MS, microplastics, polymer identification

Procedia PDF Downloads 218
905 Case Study: Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models in executing projects and expanding businesses to fit the diverse market. Such extensive integration of subcontractors is becoming an influential factor in the contractor's cash flow management. Accordingly, subcontractors' financial terms are important phenomena and pivotal components for the well-being of the contractor's cash flow. The aim of this research is to study the contractor's cash flow with respect to the owner's and subcontractors' payment management plans, considering variable advance payment, payment frequency, and lag and retention policies. The model is developed to provide contractors with a decision support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor's financing limits and optimize the profit values. The model is built using Microsoft Excel VBA coding, and a genetic algorithm is utilized as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of the overdraft, and maximizing the project net profit. The model is validated on a full-scale project which includes both self-performed and subcontracted work packages. The results show the potential of the model in optimizing the contractor's negative cash flow values and, in the meantime, assisting contractors in selecting suitable subcontractors to achieve the objective function.
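
The optimization loop described above can be sketched generically as a genetic algorithm searching over one subcontracting option per work package; the following Python skeleton only illustrates that loop (the chromosome encoding, operators, and toy fitness are assumptions, and the actual Excel VBA cash-flow model is not reproduced):

import random

def genetic_search(n_packages, n_options, fitness, pop_size=30,
                   generations=100, p_mut=0.1, seed=0):
    """Tiny GA sketch: pick one subcontracting option per work package.

    fitness(chromosome) should return the value to MAXIMISE, e.g. minus the
    largest negative overdraft produced by a user-supplied cash-flow model.
    """
    rng = random.Random(seed)
    pop = [[rng.randrange(n_options) for _ in range(n_packages)]
           for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[:pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_packages)        # one-point crossover
            child = a[:cut] + b[cut:]
            for i in range(n_packages):               # mutation
                if rng.random() < p_mut:
                    child[i] = rng.randrange(n_options)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

# toy usage: 6 packages, 3 candidate options each, dummy fitness function
best = genetic_search(6, 3, fitness=lambda plan: -sum(plan))
print(best)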

Keywords: cash flow optimization, payment plan, procurement management, subcontracting plan

Procedia PDF Downloads 79
904 Brazilian Constitution and the Fundamental Right to Sanitation

Authors: Michely Vargas Delpupo, José Geraldo Romanello Bueno

Abstract:

The right to basic sanitation was elevated to the category of a fundamental right by the Brazilian Constitution of 1988 to protect the ecologically balanced environment, ensuring the social rights to health and adequate housing and guaranteeing the dignity of the human person as a principle of the Brazilian democratic state. Because of its essentiality to the Brazilian population, this article seeks to understand why universal access to basic sanitation is such a difficult goal to achieve in Brazil. The research uses the deductive and analytical method. Given the bibliographic nature of the research, the research techniques were centred on specialized books on the subject, journals, theses and dissertations, laws, relevant case law, and social indicators relating to the theme. The relevance of the topic stems, among other things, from the fact that sanitation services are essential for a dignified life, i.e., everyone is entitled to have the necessary conditions of existence maintained. However, the effectiveness of this right is undermined in society, since Brazil has a huge deficit in sanitation services, thus denying a dignified life to most of the population. It can be seen that the provision of water and sewage services in Brazil is still characterized by a large imbalance, since municipalities with lower population indexes have greater deficiencies in sanitation services. The truth is that the precariousness of water and sewage services in Brazil is still very concentrated in the North and Northeast regions, limiting the effective implementation of Law 11.445/2007 in the country. Therefore, there is an urgent need for positive action by the State in the provision of sanitation services in order to prevent and control disease, improve the quality of life and productivity of individuals, and prevent the contamination of water resources. More than a social and economic necessity, there is even an obligation of the government to implement such services. In this sense, given the current scenario, achieving universal access to basic sanitation faces many hurdles, mainly in the field of properly formulated and implemented public policies: it requires excellent institutional organization, management of services, strategic planning, and social control, in order to provide answers to complex challenges.

Keywords: fundamental rights, health, sanitation, universal access

Procedia PDF Downloads 379
903 Enhanced Model for Risk-Based Assessment of Employee Security with Bring Your Own Device Using Cyber Hygiene

Authors: Saidu I. R., Shittu S. S.

Abstract:

As the trend of personal devices accessing corporate data continues to rise through Bring Your Own Device (BYOD) practices, organizations recognize the potential cost reduction and productivity gains. However, the associated security risks pose a significant threat to these benefits. Often, organizations adopt BYOD environments without fully considering the vulnerabilities introduced by human factors in this context. This study presents an enhanced assessment model that evaluates the security posture of employees in BYOD environments using cyber hygiene principles. The framework assesses users' adherence to best practices and guidelines for maintaining a secure computing environment, employing scales and the Euclidean distance formula. By utilizing this algorithm, the study measures the distance between users' security practices and the organization's optimal security policies. To facilitate user evaluation, a simple and intuitive interface for automated assessment is developed. To validate the effectiveness of the proposed framework, design science research methods are employed, and empirical assessments are conducted using five artifacts to analyze user suitability in BYOD environments. By addressing the human factor vulnerabilities through the assessment of cyber hygiene practices, this study aims to enhance the overall security of BYOD environments and enable organizations to leverage the advantages of this evolving trend while mitigating potential risks.
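
The Euclidean-distance scoring described above can be illustrated in a few lines of Python; the five-practice scale and the "optimal policy" vector below are invented examples, not the study's instrument:

import numpy as np

def hygiene_gap(user_scores, optimal_scores):
    """Euclidean distance between a user's cyber-hygiene scores and the
    organisation's optimal policy scores (per-practice scale, e.g. 1-5)."""
    u = np.asarray(user_scores, dtype=float)
    o = np.asarray(optimal_scores, dtype=float)
    return float(np.sqrt(np.sum((u - o) ** 2)))

# e.g. five practices scored on a 1-5 scale; a smaller distance means a
# security posture closer to the organisation's BYOD policy
print(hygiene_gap([3, 2, 5, 4, 1], [5, 5, 5, 5, 5]))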

Keywords: security, BYOD, vulnerability, risk, cyber hygiene

Procedia PDF Downloads 39
902 Submicron Laser-Induced Dot, Ripple and Wrinkle Structures and Their Applications

Authors: P. Slepicka, N. Slepickova Kasalkova, I. Michaljanicova, O. Nedela, Z. Kolska, V. Svorcik

Abstract:

Polymers exposed to laser or plasma treatment, or modified with different wet methods that enable the introduction of nanoparticles or biologically active species such as amino acids, may find many applications as biocompatible or antibacterial materials or, on the contrary, can be applied to decrease the number of cells on the treated surface, which opens applications in single-cell units. For the experiments, two types of materials were chosen: polyethersulphone (PES) as a representative of non-biodegradable polymers and polyhydroxybutyrate (PHB) as a biodegradable material. Exposure of a solid substrate to a laser well below the ablation threshold can lead to the formation of various surface structures. The ripples have a period roughly comparable to the wavelength of the incident laser radiation, and their dimensions depend on many factors, such as the chemical composition of the polymer substrate, the laser wavelength, and the angle of incidence. Biopolymers, on the other hand, may significantly change their surface roughness and thus influence cell compatibility. The focus was on the surface treatment of PES and PHB by a pulsed KrF excimer laser with a wavelength of 248 nm. The changes in physicochemical properties, surface morphology, surface chemistry, and ablation of the exposed polymers were studied for both PES and PHB. Several analytical methods, including atomic force microscopy, gravimetry, and scanning electron microscopy, were used for the analysis of the treated surface. It was found that the combination of certain input parameters leads not only to the formation of an optimal narrow pattern but also to the combination of a ripple and a wrinkle-like structure, which could be an optimal candidate for cell attachment. The interactions of different types of cells with the laser-exposed surface were studied. It was found that laser treatment is a major factor contributing to the change in wettability/contact angle. The combination of optimal laser energy and pulse number was used for the construction of a surface with an anti-cellular response. With this simple laser treatment, we were able to prepare a biopolymer surface with higher roughness and thus significantly influence the area of growth of different types of cells (U-2 OS cells).

Keywords: cell response, excimer laser, polymer treatment, periodic pattern, surface morphology

Procedia PDF Downloads 207
901 Frontal Oscillatory Activity and Phase–Amplitude Coupling during Chan Meditation

Authors: Arthur C. Tsai, Chii-Shyang Kuo, Vincent S. C. Chien, Michelle Liou, Philip E. Cheng

Abstract:

Meditation enhances mental abilities and is an antidote to anxiety. However, very little is known about the brain mechanisms and cortico-subcortical interactions underlying meditation-induced anxiety relief. In this study, the changes in phase-amplitude coupling (PAC), in which the amplitude of the beta frequency band is modulated in phase with the delta rhythm, were investigated after eight weeks of meditation training. The study hypothesized that, through concentrated but relaxed mental training, the delta-beta coupling in the frontal regions is attenuated. The delta-beta coupling analysis was applied within and between maximally independent component sources returned by the extended infomax independent component analysis (ICA) algorithm applied to the continuous EEG data recorded during meditation. A unique meditative concentration task based on relaxing body and mind was used with a constant level of moderate mental effort, so as to approach an 'emptiness' meditative state. A pre-test/post-test control group design was used in this study. To evaluate the cross-frequency phase-amplitude coupling of the component sources, the modulation index (MI), together with circular phase statistics, was estimated. Our findings reveal that a significant delta-beta decoupling was observed in a set of frontal regions bilaterally. In addition, the beta frequency band of the prefrontal component was amplitude-modulated in phase with the delta rhythm of the medial frontal component.
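
A common way to compute the modulation index mentioned above is the mean-vector-length approach on band-passed signals; the following Python sketch (SciPy filters on a synthetic signal, not the study's EEG or its exact MI formulation) shows the basic delta-phase/beta-amplitude computation:

import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def modulation_index(sig, fs, phase_band=(1, 4), amp_band=(13, 30)):
    """Mean-vector-length modulation index for delta-beta coupling (a sketch)."""
    def bandpass(x, lo, hi):
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        return filtfilt(b, a, x)

    phase = np.angle(hilbert(bandpass(sig, *phase_band)))   # delta phase
    amp = np.abs(hilbert(bandpass(sig, *amp_band)))         # beta amplitude
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

# synthetic stand-in signal (not real EEG), 10 s at 250 Hz
fs = 250.0
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 2 * t) + 0.5 * np.random.randn(t.size)
print(modulation_index(eeg, fs))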

Keywords: phase-amplitude coupling, ICA, meditation, EEG

Procedia PDF Downloads 391
900 Evaluation of Surface Water and Groundwater Quality in Parts of Umunneochi Southeast, Nigeria

Authors: Joshua Chima Chizoba, Wisdom Izuchukwu Uzoma, Elizabeth Ifeyiwa Okoyeh

Abstract:

Water cannot be optimally used and sustained unless its quality is periodically assessed. The study area, Umunneochi and environs, is located in the south-eastern part of Nigeria, stretching from latitudes 5°50′N to 6°00′N and longitudes 7°20′E to 7°30′E. The major geologic formations in the area include the Asu River Group, the Nkporo Shale, and the Ajali Sandstone. The aim of this study is to evaluate the hydrochemical characteristics of surface and groundwater sources in parts of Umunneochi and environs in order to establish the potability of the water sources for drinking, domestic, and irrigation purposes. A total of 15 samples were collected randomly from streams, springs, and wells. The samples were analyzed for physicochemical parameters and heavy metals using handheld digital kits, a photometer, titration methods, and Atomic Absorption Spectrophotometry (AAS), following acceptable standards. The analytical data obtained were interpreted, and the results were compared with World Health Organization (WHO) standards. The pH ranges from 5.81 to 6.07, while SO4²⁻ and Cl⁻ concentrations range from 41.93 mg/l to 142.95 mg/l and from 20.00 mg/l to 111 mg/l respectively; Pb and Zn showed relatively low mean concentrations of 0.14 mg/l and 0.40 mg/l. All values are within WHO permissible limits except pH. About 27% of the samples are moderately hard, which is attributed to the mining activities in the area. The abundance of cations and anions in the area is in the order K⁺ > Na⁺ > Mg²⁺ > Ca²⁺ and SO4²⁻ > Cl⁻ > HCO3⁻ > NO3⁻, respectively. Chloride, bicarbonate, and nitrate are all within the permissible limits, while 13.33% of the samples contain sulphate above the permissible limit. The calculated Water Quality Index (WQI) values are less than 50, indicating excellent water. The predominant water types in the study area are the Na-Cl water type and the mixed Ca-Mg-Cl water type, based on the sample plots on the Piper diagram. The Sodium Adsorption Ratio (SAR) calculations showed excellent water for consumption and good water for irrigation purposes, with low sodium and alkalinity hazards. Government water projects are recommended in the area for sustainable domestic and agricultural water supply to ease the stress of water supply problems.
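
The WQI and SAR figures quoted above come from standard index formulas; a short Python sketch of one common (simplified) weighted-arithmetic WQI form and of the SAR, with invented example concentrations and weights rather than the study's measurements:

import numpy as np

def sodium_adsorption_ratio(na, ca, mg):
    """SAR with ionic concentrations expressed in meq/L."""
    return na / np.sqrt((ca + mg) / 2.0)

def weighted_wqi(conc, standard, weight):
    """Weighted-arithmetic Water Quality Index (one common simplified form).

    conc, standard, weight: measured value, WHO limit, and relative weight
    for each parameter; WQI < 50 is usually read as 'excellent' water.
    """
    conc, standard, weight = map(np.asarray, (conc, standard, weight))
    q = 100.0 * conc / standard            # sub-index per parameter
    w = weight / weight.sum()              # normalised weights
    return float(np.sum(w * q))

print(sodium_adsorption_ratio(na=2.0, ca=1.5, mg=1.0))
print(weighted_wqi(conc=[6.0, 90.0, 50.0], standard=[8.5, 250.0, 250.0],
                   weight=[4, 3, 3]))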

Keywords: groundwater, hydrochemical, physicochemical, water-type, sodium adsorption ratio

Procedia PDF Downloads 98
899 Non-Invasive Pre-Implantation Genetic Assessment Using NGS in IVF Clinical Routine

Authors: Katalin Gombos, Bence Gálik, Krisztina Ildikó Kalács, Krisztina Gödöny, Ákos Várnagy, József Bódis, Attila Gyenesei, Gábor L. Kovács

Abstract:

Although non-invasive pre-implantation genetic testing for aneuploidy (NIPGT-A) is potentially appropriate for assessing the chromosomal ploidy of the embryo, its practical application in routine IVF centres has not started in the absence of a recommendation. We developed a comprehensive workflow for a clinically applicable NIPGT-A strategy based on next-generation sequencing (NGS) technology. We performed MALBAC whole genome amplification and NGS on spent blastocyst culture media of Day 3 embryos fertilized with intra-cytoplasmic sperm injection (ICSI). Spent embryonic culture media of embryos with good morphological quality scores were enrolled in further analysis, with blank culture media as background control. Chromosomal abnormalities were identified by an optimized bioinformatics pipeline applying a copy number variation (CNV) detection algorithm. We demonstrate a comprehensive workflow covering both wet- and dry-lab procedures supporting a clinically applicable NIPGT-A strategy. It can be carried out within 48 h, which is critical for same-cycle blastocyst transfer, and it is also suitable for 'freeze all' and 'elective frozen embryo' strategies. The described integrated approach of non-invasive evaluation of the embryonic DNA content of the culture media can potentially supplement existing pre-implantation genetic screening methods.
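
CNV-based aneuploidy calling of the kind mentioned above usually reduces to comparing normalized, binned read counts against a reference; a heavily simplified Python sketch of that idea (the bin counts, normalization scheme, and threshold are illustrative assumptions, not the study's pipeline):

import numpy as np

def call_aneuploidies(sample_bins, control_bins, threshold=0.3):
    """Flag whole-chromosome gains/losses from binned read counts (a sketch).

    sample_bins, control_bins: dicts {chromosome: numpy array of per-bin reads}.
    Counts are normalised by the total reads of each library, then a
    per-chromosome median log2 ratio beyond +/- threshold is called.
    """
    s_total = sum(v.sum() for v in sample_bins.values())
    c_total = sum(v.sum() for v in control_bins.values())
    calls = {}
    for chrom in sample_bins:
        s = sample_bins[chrom] / s_total
        c = control_bins[chrom] / c_total
        log2r = np.log2((s + 1e-9) / (c + 1e-9))
        m = float(np.median(log2r))
        calls[chrom] = "gain" if m > threshold else "loss" if m < -threshold else "normal"
    return calls

# toy example: chromosome 21 over-represented relative to the reference
print(call_aneuploidies({"chr1": np.full(10, 100), "chr21": np.full(10, 160)},
                        {"chr1": np.full(10, 100), "chr21": np.full(10, 100)}))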

Keywords: next generation sequencing, in vitro fertilization, embryo assessment, non-invasive pre-implantation genetic testing

Procedia PDF Downloads 127
898 Identification and Classification of Fiber-Fortified Semolina by Near-Infrared Spectroscopy (NIR)

Authors: Amanda T. Badaró, Douglas F. Barbin, Sofia T. Garcia, Maria Teresa P. S. Clerici, Amanda R. Ferreira

Abstract:

Food fortification is the intentional addition of a nutrient to a food matrix and has been widely used to overcome a lack of nutrients in the diet or to increase the nutritional value of food. Fortified foods must meet the demands of the population, taking into account their habits and the risks that these foods may pose. Wheat and its by-products, such as semolina, have been strongly indicated for use as food vehicles since they are widely consumed and used in the production of other foods. These products have been strategically used to add nutrients such as fibers. Methods for the analysis and quantification of such components are destructive and require lengthy sample preparation and analysis. Therefore, the industry has searched for faster and less invasive methods, such as near-infrared spectroscopy (NIR). NIR is a rapid and cost-effective method; however, it is based on indirect measurements and yields a large amount of data. Therefore, NIR spectroscopy requires calibration with mathematical and statistical tools (chemometrics) to extract analytical information from the corresponding spectra, such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA). PCA is well suited to NIR, since it can handle many spectra at a time and be used for non-supervised classification. An advantage of PCA, which is also a data reduction technique, is that it reduces the spectra to a smaller number of latent variables for further interpretation. On the other hand, LDA is a supervised method that searches for the canonical variables (CV) with the maximum separation among different categories. In LDA, the first CV is the direction of maximum ratio between inter- and intra-class variances. The present work used a portable near-infrared (NIR) spectrometer for the identification and classification of pure and fiber-fortified semolina samples. Fiber was added to semolina in two different concentrations, and after spectra acquisition the data were used for PCA and LDA to identify and discriminate the samples. The results showed that NIR spectroscopy associated with PCA was very effective in identifying pure and fiber-fortified semolina. Additionally, the classification rates of the samples using LDA were between 78.3% and 95% for calibration and between 75% and 95% for cross-validation. Thus, after multivariate analyses such as PCA and LDA, it was possible to verify that NIR associated with chemometric methods is able to identify and classify the different samples in a fast and non-destructive way.
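
The PCA-then-LDA chain described above can be sketched in a few lines with scikit-learn; the random matrix below merely stands in for real NIR spectra (rows: samples, columns: wavelengths), and the class offsets are invented:

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# toy stand-in for NIR spectra; y encodes pure (0) and two fortified levels (1, 2)
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200)) + np.repeat([0.0, 0.3, 0.6], 20)[:, None]
y = np.repeat([0, 1, 2], 20)

scores = PCA(n_components=5).fit_transform(X)        # unsupervised reduction
lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, scores, y, cv=5)          # supervised classification
print(acc.mean())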

Keywords: Chemometrics, fiber, linear discriminant analysis, near-infrared spectroscopy, principal component analysis, semolina

Procedia PDF Downloads 181
897 An Investigation of the Relationship Between Privacy Crisis, Public Discourse on Privacy, and Key Performance Indicators at Facebook (2004–2021)

Authors: Prajwal Eachempati, Laurent Muzellec, Ashish Kumar Jha

Abstract:

We use Facebook as a case study to investigate the complex relationship between the firm's public discourse (and actions) surrounding data privacy and the performance of a business model based on monetizing users' data. We do so by looking at the evolution of public discourse over time (2004–2021) and relating topics to revenue and stock market evolution, drawing from archival sources such as Zuckerberg's public communications. We use the LDA topic modelling algorithm to reveal 19 topics regrouped into 6 major themes. We first show how, by using persuasive and convincing language that promises better protection of consumer data usage but also emphasizes greater user control over their own data, the privacy issue is being reframed as one of greater user control and responsibility. Second, we aim to understand and put a value on the extent to which privacy disclosures have a potential impact on the financial performance of social media firms. We found a significant relationship between the topics pertaining to privacy and social media/technology, the sentiment score, and stock market prices. Revenue is found to be impacted by topics pertaining to politics and to new product and service innovations, while the number of active users is not impacted by the topics unless the relationship is moderated by external control variables such as Return on Assets and Brand Equity.
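
For readers unfamiliar with the LDA step used above, a compact scikit-learn sketch of topic extraction on a toy corpus (three invented sentences standing in for the archival documents; two topics are used here purely for illustration, not the paper's 19):

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "we give users control over their data and privacy settings",
    "new advertising products drive revenue growth this quarter",
    "regulators question how personal data is shared with partners",
]

vec = CountVectorizer(stop_words="english")
counts = vec.fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

# top words per topic hint at themes such as privacy vs. product/revenue
vocab = vec.get_feature_names_out()
for k, comp in enumerate(lda.components_):
    top = comp.argsort()[-5:][::-1]
    print(k, [vocab[i] for i in top])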

Keywords: public discourses, data protection, social media, privacy, topic modeling, business models, financial performance

Procedia PDF Downloads 58
896 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters is involved, and sometimes the chemical and physical phenomena for mixtures involving polymers are poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws. Therefore, they are useful for modeling complex processes, such as the free-radical polymerization of methyl methacrylate carried out in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number-average molecular weight, and weight-average molecular weight. This process is associated with non-linear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor, and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
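
A minimal scikit-learn comparison in the spirit of the benchmark described above; the synthetic input/output data only stand in for the sampled polymerization data (conversion, molecular weights), and the hyperparameters are defaults rather than the tuned values of the study:

import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# synthetic stand-in for sampled process data (e.g. time, temperature -> conversion)
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(300, 2))
y = np.tanh(3 * X[:, 0]) + 0.2 * X[:, 1] + 0.02 * rng.normal(size=300)

for name, model in [("SVR", SVR(C=10.0)),
                    ("k-NN", KNeighborsRegressor(n_neighbors=5)),
                    ("Random forest", RandomForestRegressor(random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: R^2 = {r2.mean():.3f}")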

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 262
895 The Effect of Different Parameters on a Single Invariant Lateral Displacement Distribution to Consider the Higher Modes Effect in a Displacement-Based Pushover Procedure

Authors: Mohamad Amin Amini, Mehdi Poursha

Abstract:

Nonlinear response history analysis (NL-RHA) is a robust analytical tool for estimating the seismic demands of structures responding in the inelastic range. However, because of its conceptual and numerical complications, the nonlinear static procedure (NSP) is increasingly being used as a suitable tool for the seismic performance evaluation of structures. The conventional pushover analysis methods presented in various codes (FEMA 356; Eurocode 8; ATC-40) are limited to first-mode-dominated structures and cannot take the effect of higher modes into consideration. Therefore, for more than a decade, researchers have developed enhanced pushover analysis procedures to take the effect of higher modes into account. The main objective of this study is to propose an enhanced invariant lateral displacement distribution that accounts for higher-mode effects in a displacement-based pushover analysis, whereby a set of laterally applied displacements, rather than forces, is monotonically applied to the structure. For this purpose, the effect of different parameters such as the spectral displacement of the ground motion, the modal participation factor, and the effective modal participating mass ratio on the lateral displacement distribution is investigated to find the best distribution. The major simplification of this procedure is that the effect of higher modes is concentrated into a single invariant lateral displacement distribution. Therefore, only one pushover analysis is sufficient, without any need to utilize a modal combination rule for combining the responses. The invariant lateral displacement distribution for pushover analysis is calculated by combining the modal story displacements using modal combination rules. The seismic demands resulting from the different procedures are compared to those from the more accurate nonlinear response history analysis (NL-RHA) as a benchmark solution. Two structures of different heights, 10- and 20-story special steel moment-resisting frames (MRFs), were selected and evaluated. Twenty ground motion records were used to conduct the NL-RHA. The results show that more accurate responses can be obtained, in comparison with the conventional lateral loads, when the enhanced modal lateral displacement distributions are used.
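
A small numpy sketch of the combination step described above, forming a single invariant displacement profile from a few modes with the SRSS rule; the mode shapes, participation factors, and spectral displacements below are invented illustration values, not the frames analysed in the study:

import numpy as np

def invariant_displacement_profile(phi, gamma, sd):
    """Single lateral displacement distribution combining several modes (SRSS).

    phi   : (n_storeys, n_modes) mode-shape matrix
    gamma : (n_modes,) modal participation factors
    sd    : (n_modes,) spectral displacements of the ground motion per mode
    """
    u_modes = phi * gamma * sd                       # storey displacement per mode
    return np.sqrt(np.sum(u_modes ** 2, axis=1))     # SRSS over the modes

phi = np.array([[0.3, -0.8], [0.7, -0.2], [1.0, 1.0]])   # toy 3-storey frame
print(invariant_displacement_profile(phi,
                                     gamma=np.array([1.3, 0.4]),
                                     sd=np.array([0.12, 0.03])))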

Keywords: displacement-based pushover, enhanced lateral load distribution, higher modes effect, nonlinear response history analysis (NL-RHA)

Procedia PDF Downloads 243
894 Development of Antioxidant Rich Bakery Products by Applying Lysine and Maillard Reaction Products

Authors: Attila Kiss, Erzsébet Némedi, Zoltán Naár

Abstract:

Due to the rapidly growing number of health-conscious consumers in recent years, more and more people look for products with positive physiological effects that may contribute to the preservation of their health. In response to these demands, the Food Science Research Institute of Budapest develops and introduces to the market new functional foods with guaranteed positive effects that contain bioactive agents. New, efficient technologies are also elaborated in order to preserve the maximum biological effect of the produced foods. The main objective of our work was the development of new functional biscuits fortified with physiologically beneficial ingredients. Bakery products constitute the base of the food pyramid; thus, they may be regarded as the foodstuffs consumed in the largest quantity. In addition to the well-known and certified physiological benefits of lysine as an essential amino acid, a series of antioxidant-type compounds is formed as a consequence of the Maillard reaction that takes place. The progress of the evoked Maillard reaction was studied by applying diverse sugars (glucose, fructose, saccharose, isosugar) and lysine at several temperatures (120-170°C). The duration of the thermal treatment was also varied (10-30 min). The composition and production technologies were tailored in order to reach the maximum possible biological benefit, i.e., the highest antioxidant capacity in the biscuits. Of the examined sugar components, the extent of the Maillard-reaction-driven transformation of glucose was the most pronounced at both applied temperatures. For the precise assessment of the antioxidant activity of the products, the FRAP and DPPH methods were adapted and optimised. To acquire an authentic and extensive mechanism of the occurring transformations, Maillard reaction products were identified, and the relevant reaction pathways were revealed. GC-MS and HPLC-MS techniques were applied for the analysis of the 60 generated MRPs and the characterisation of the actual transformation processes. Three plausible major transformation routes were suggested based on the analytical results and the deduced sequence of possible conversions between lysine and the sugars.

Keywords: Maillard-reaction, lysine, antioxidant activity, GC-MS and HPLC-MS techniques

Procedia PDF Downloads 446
893 Numerical Buckling of Composite Cylindrical Shells under Axial Compression Using Asymmetric Meshing Technique (AMT)

Authors: Zia R. Tahir, P. Mandal

Abstract:

This paper presents the details of a numerical study of the buckling and post-buckling behaviour of a laminated carbon fibre reinforced plastic (CFRP) thin-walled cylindrical shell under axial compression, using an asymmetric meshing technique (AMT) in ABAQUS. AMT is considered a new perturbation method to introduce a disturbance without changing the geometry, boundary conditions, or loading conditions. Asymmetric meshing affects both the predicted buckling load and the buckling mode shapes. A cylindrical shell with lay-up orientation [0°/+45°/-45°/0°], radius-to-thickness ratio (R/t) equal to 265, and length-to-radius ratio (L/R) equal to 1.5 is analysed numerically. A series of numerical simulations (experiments) is carried out with symmetric and asymmetric meshing to study the effect of asymmetric meshing on the predicted buckling behaviour. The asymmetric meshing technique is employed in the axial and circumferential directions separately, using two different methods: first by changing the shell element size and varying the total number of elements, and second by varying the shell element size while keeping the total number of elements constant. The results of the linear analysis (eigenvalue analysis) and the non-linear analysis (Riks analysis) using symmetric meshing agree well with analytical results. The results of the numerical analysis are presented in the form of a non-dimensional load factor, which is the ratio of the buckling load using the asymmetric meshing technique to the buckling load using the symmetric meshing technique. Using AMT, the load factor varies by about 2% for the linear eigenvalue analysis and by about 2% for the non-linear Riks analysis. The behaviour of the load end-shortening curve in pre-buckling is the same for both symmetric and asymmetric meshing, but with asymmetric meshing the curve behaviour in post-buckling becomes considerably more complex. The major conclusions are: different methods of AMT have a small influence on the predicted buckling load and a significant influence on the load-displacement curve behaviour in post-buckling; AMT in the axial direction and AMT in the circumferential direction have different influences on the buckling load and on the load-displacement curve in post-buckling.

Keywords: CFRP composite cylindrical shell, asymmetric meshing technique, primary buckling, secondary buckling, linear eigenvalue analysis, non-linear Riks analysis

Procedia PDF Downloads 326
892 Composite Approach to Extremism and Terrorism Web Content Classification

Authors: Kolade Olawande Owoeye, George Weir

Abstract:

Terrorism and extremism activities on the internet are becoming among the most significant threats to national security because of their potential dangers. In response to this challenge, law enforcement and security authorities are actively implementing comprehensive measures to counter the use of the internet for terrorism. Achieving these measures requires intelligence gathering via the internet, including real-time monitoring of potential websites that are used for recruitment and information dissemination, among other operations, by extremist groups. However, with billions of active webpages, real-time monitoring of all webpages becomes almost impossible. To narrow down the search domain, efficient webpage classification techniques are needed. This research proposes a new approach, termed the SentiPosit-based method, which combines features of the Posit-based method and the SentiStrength-based method for the classification of terrorism and extremism webpages. The experiment was carried out on 7500 webpages obtained through the TENE web crawler by the International Cyber Crime Research Centre (ICCRC). The webpages were manually grouped into three classes, 'pro-extremist', 'anti-extremist', and 'neutral', with 2500 webpages in each category. A supervised learning algorithm is then applied to the classified dataset in order to build the model. The results obtained were compared with an existing classification method using prediction accuracy and runtime. It was observed that our proposed hybrid approach produced better classification accuracy than existing approaches within a reasonable runtime.
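
A generic scikit-learn sketch of the supervised three-class webpage classification step described above; the toy pages and labels are invented stand-ins for the TENE corpus, and TF-IDF plus logistic regression is only a plain baseline, not the SentiPosit feature set:

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# toy labelled pages; the real corpus is the three-class crawled dataset
pages = ["join our cause and fight the enemy",           # pro-extremist
         "report online radicalisation to the police",   # anti-extremist
         "weather forecast for the coming weekend",      # neutral
         ] * 20
labels = ["pro", "anti", "neutral"] * 20

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
print(cross_val_score(clf, pages, labels, cv=5).mean())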

Keywords: sentiposit, classification, extremism, terrorism

Procedia PDF Downloads 247
891 Time and Cost Prediction Models for Language Classification Over a Large Corpus on Spark

Authors: Jairson Barbosa Rodrigues, Paulo Romero Martins Maciel, Germano Crispim Vasconcelos

Abstract:

This paper presents an investigation of the performance impacts of varying five factors (input data size, number of nodes, cores, memory, and disks) when applying a distributed implementation of Naïve Bayes for text classification of a large corpus on the Spark big data processing framework. Problem: the algorithm's performance depends on multiple factors, and knowing beforehand the effects of each factor becomes especially critical as hardware is priced by time slice in cloud environments. Objectives: to explain the functional relationship between the factors and performance, and to develop linear predictor models for time and cost. Methods: the solid statistical principles of Design of Experiments (DoE), particularly the randomized two-level fractional factorial design with replications. This research involved 48 real clusters with different hardware arrangements. The metrics were analyzed using linear models for screening, ranking, and measuring each factor's impact. Results: our findings include prediction models and show some non-intuitive results, such as the small influence of cores and the neutrality of memory and disks with respect to total execution time, and the non-significant impact of input data scale on costs, although it notably impacts the execution time.
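
To illustrate the two-level fractional factorial screening used above, here is a small numpy sketch of a 2^(5-2) design in coded units with a least-squares fit of main effects; the run times are invented toy values, and the factor-to-generator assignment is an assumption made for illustration only:

import numpy as np

# 2^(5-2) fractional factorial in coded units (-1/+1); factors:
# A=data size, B=nodes, C=cores, with generators D=AB (memory), E=AC (disks)
A, B, C = np.meshgrid([-1, 1], [-1, 1], [-1, 1], indexing="ij")
A, B, C = A.ravel(), B.ravel(), C.ravel()
D, E = A * B, A * C
X = np.column_stack([A, B, C, D, E])

t = np.array([620, 905, 380, 640, 600, 880, 360, 615])   # toy run times (s)

M = np.column_stack([np.ones(8), X])                     # intercept + factors
coef, *_ = np.linalg.lstsq(M, t, rcond=None)
for name, b in zip(["mean", "data", "nodes", "cores", "memory", "disks"], coef):
    print(f"{name:7s} coefficient = {b:7.1f} s")          # coded half-effects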

Keywords: big data, design of experiments, distributed machine learning, natural language processing, spark

Procedia PDF Downloads 82
890 Permeable Reactive Pavement for Controlling the Transport of Benzene, Toluene, Ethyl-Benzene, and Xylene (BTEX) Contaminants

Authors: Shengyi Huang, Chenju Liang

Abstract:

Volatile organic compounds such as benzene, toluene, ethyl-benzene, and xylene (BTEX) are common contaminants in the environment, originating, for example, from asphalt concrete or vehicle exhaust emissions. BTEX may invade the subsurface environment via wet and dry atmospheric deposition. Without effective ways of controlling the contaminants' fate and transport, they can extensively harm the natural environment. In the first phase of this study, various adsorbents were screened to select a suitable additive for the porous asphalt mixture. In the second phase, the selected adsorbent was incorporated into the design of porous asphalt concrete (PAC) to produce the permeable reactive pavement (PRP), which was subsequently tested in the third phase for its potential to adsorb aqueous BTEX, as compared to the PAC. The PRP was prepared in the following steps: first, the suitable adsorbent was chosen based on specific surface area analysis, thermogravimetric analysis, adsorption kinetics and isotherms, and thermodynamic analysis; second, the coarse aggregate, fine aggregate, filler, asphalt, and fiber were tested against the regulated specifications (e.g., water adsorption, soundness, viscosity) for preparing the PRP; third, the amount of adsorbent additive in the PRP was determined; fourth, the prepared PAC and PRP were examined for their physical properties (e.g., abrasion loss, drain-down loss, Marshall stability, Marshall flow, dynamic stability). In this comparison, the PRP showed better physical performance than the traditional PAC. Finally, Marshall specimen column tests were conducted to explore the adsorption capacities of the PAC and PRPs. The BTEX adsorption capacities of the PRPs are higher than those obtained from the traditional PAC. In summary, the PRPs showed superior physical performance and adsorption capacities, demonstrating the potential of PRP as a replacement for PAC to better control the transport of non-point source pollutants.
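As an illustration of the isotherm analysis used when screening adsorbents, a minimal sketch of fitting a Langmuir isotherm to batch equilibrium data with SciPy follows; the choice of the Langmuir model, the variable names, and the placeholder data values are assumptions, not measurements from this study.

```python
# Illustrative sketch: fit the Langmuir isotherm q_e = q_max * K_L * C_e / (1 + K_L * C_e)
# to equilibrium data from batch adsorption tests. The data arrays are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, q_max, K_L):
    """Equilibrium uptake q_e (mg/g) as a function of equilibrium concentration Ce (mg/L)."""
    return q_max * K_L * Ce / (1.0 + K_L * Ce)

Ce = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0])   # mg/L, placeholder values
qe = np.array([1.2, 2.1, 3.4, 5.6, 7.1, 8.0])     # mg/g, placeholder values

(q_max, K_L), _ = curve_fit(langmuir, Ce, qe, p0=[10.0, 0.1])
print(f"q_max = {q_max:.2f} mg/g, K_L = {K_L:.3f} L/mg")
```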

Keywords: porous asphalt concrete, volatile organic compounds, permeable reactive pavement, non-point source pollution

Procedia PDF Downloads 167
889 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach

Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed

Abstract:

Sign languages (SL) are the most accomplished forms of gestural communication. Their automatic analysis is therefore a real challenge, closely tied to their lexical and syntactic organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for the visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on a Type-2 Fuzzy HMM (T2FHMM) are presented. The features used as observables in both the training and recognition phases are based on Singular Value Decomposition (SVD). SVD extends eigendecomposition to non-square matrices and is used to reduce multi-attribute hand gesture data to feature vectors; it optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate Type-2 fuzzy operators that permit us to relax the additivity constraint of probability measures. T2FHMMs are therefore able to handle both the random and the fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and achieve better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
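A minimal sketch of SVD-based feature extraction from a hand image is given below; the image size, the number of retained singular values, and the NumPy implementation are assumptions for illustration, not the authors' exact feature pipeline.

```python
# Illustrative sketch: reduce a grayscale hand-gesture image to a compact
# feature vector of its leading singular values, used as an HMM observable.
# Image source, size and the number of retained values (k) are assumptions.
import numpy as np

def svd_features(image: np.ndarray, k: int = 20) -> np.ndarray:
    """Return the k largest singular values of the image, normalized to unit sum."""
    # compute_uv=False returns only the singular values, sorted in descending order
    s = np.linalg.svd(image.astype(float), compute_uv=False)
    s = s[:k]
    return s / s.sum()

# Example with a random placeholder "image"; a real system would pass the
# segmented hand region produced by the hand detection stage.
frame = np.random.rand(64, 64)
observation = svd_features(frame)   # one observation vector per image/frame
```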

Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov Model

Procedia PDF Downloads 427
888 Forced Displacement and Mental Health Problems in Refugees Residing in Quetta for Decades

Authors: Silsila Sherzad, Hazrat Ali Khan, Tabasum Sherzad, Hazratullah, Sanaullah

Abstract:

Objective: To study the prevalence of common mental health disorders among forcibly displaced people and to compare it with the prevalence among host community members. Study design: Analytical study. Place of study: Balochistan Institute of Psychiatry and Behavioral Sciences, Quetta, Balochistan, Pakistan. Methodology: Data from the outpatient department (OPD) were analyzed to enumerate both host community members and refugees. Out of 4120 patients, 354 refugees were identified by their Proof of Registration (POR) cards and 3776 host community members by their Computerized National Identity Cards (CNIC); the data were analyzed for the prevalence of mental health disorders in each group. Results: Among the Afghan refugees presenting to the OPD services of the Balochistan Institute of Psychiatry and Behavioral Sciences, 47% were diagnosed with major depressive disorder with or without psychosis, 19% with generalized anxiety disorder, 5% with bipolar affective disorder, 5% with schizophrenia, 4% with post-traumatic stress disorder, 3% with migraine, 3% with conversion disorder, 2% with obsessive-compulsive disorder, 1% with somatoform disorder, and 10% presented with other psychiatric disorders. In the host community, 21% were diagnosed with major depressive disorder with or without psychosis, 24% with generalized anxiety disorder, 12% with somatoform disorder, 10% with obsessive-compulsive disorder, 8% with migraine, 7% with conversion disorder, 4% with bipolar affective disorder, 3% with schizophrenia, 3% with mental and behavioral disorders due to substance misuse, and the remaining 7% presented with other psychiatric disorders. Conclusion: Mental health disorders are more common among refugees than in the host population, and there is a marked difference in the pattern of disorders between the two groups: some disorders are more prevalent among displaced people than in the host community, while others are less prevalent. This study also highlights that further studies are needed to determine risk and protective factors within the host community.
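As an illustration of how the reported prevalence difference could be tested statistically, a minimal sketch of a chi-square comparison of major depressive disorder frequency between the two groups follows; the counts are reconstructed from the percentages reported above (so they are approximate) and the use of SciPy is an assumption, not part of the study's stated methodology.

```python
# Illustrative sketch: chi-square test comparing the proportion of major
# depressive disorder (MDD) diagnoses among refugees (47% of 354) and host
# community patients (21% of 3776). Counts are reconstructed from the
# reported percentages and are therefore approximate.
from scipy.stats import chi2_contingency

refugees_total, host_total = 354, 3776
refugees_mdd = round(0.47 * refugees_total)   # ~166
host_mdd = round(0.21 * host_total)           # ~793

table = [
    [refugees_mdd, refugees_total - refugees_mdd],
    [host_mdd, host_total - host_mdd],
]
chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, p = {p_value:.3g}")
```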

Keywords: forced displacement, mental health, Afghan refugees, depression

Procedia PDF Downloads 71
887 Women’s Colours in Digital Innovation

Authors: Daniel J. Patricio Jiménez

Abstract:

Digital reality demands new ways of thinking, flexibility in learning, the acquisition of new competencies, visualizing reality through new approaches, generating open spaces, and understanding dimensions in continuous change. We need inclusive growth, where colours are not lacking, where lights do not give a distorted reality, and where science is not a half-truth. This study draws on a documentary and bibliographic collection to provide a reflective and analytical view of current reality; in this context, deductive and inductive methods have been applied to diverse multidisciplinary information sources. Women, today and tomorrow, are a strategic element in science and the arts, which, under the umbrella of sustainability, implies 'meeting current needs without detriment to future generations'. We must build new scenarios that treat 'the feminine and the masculine' as an inseparable whole and encourage cooperative behavior; nothing is exclusive or excluding, and that is where true respect for diversity must be grounded. We are all part of an ecosystem, which we will improve as long as there is a real balance in terms of gender. It is the time of 'the lifting of the veil', that is, the time to uncover the pseudonyms: the women who painted, wrote, investigated, and recorded advances. However, the current reality demands much more; we must remove doors where they are not needed. Mass processing of data, big data, needs to incorporate algorithms from the perspective of 'the feminine', yet most STEM students (science, technology, engineering, and math) are men. Our way of doing science is biased, focused on honors and short-term results to the detriment of sustainability. Historically, the canons of beauty and the ways of looking, perceiving, and feeling depended on the circumstances and interests of each moment, and women had no voice in this. Parallel to science, there is an under-representation of women in the arts: not so much in the universities, but when we look at galleries, museums, art dealers, and the like, the colours impoverish the gaze and once again highlight the gender gap and the silence of the feminine. Art registers sensations by divining the future; science will turn them into reality. The uniqueness of the so-called new normality requires women to be protagonists both in new forms of emotion and thought and in the experimentation and development of new models. This will result in women playing a decisive role in the so-called '5.0 society', in other words, in a more sustainable, more humane world.

Keywords: art, digitalization, gender, science

Procedia PDF Downloads 133