Search results for: task and ego orientation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3185

305 A Comparative Analysis on the Impact of the Prevention and Combating of Hate Crimes and Hate Speech Bill of 2016 on the Rights to Human Dignity, Equality, and Freedom in South Africa

Authors: Tholaine Matadi

Abstract:

South Africa is a democratic country with a historical record of racially-motivated marginalisation and exclusion of the majority. During the apartheid era the country was run through legislation and policies based on racial segregation. The system held a tight clamp on interracial mixing, which forced people to remain in segregated areas. For example, a citizen from the Indian community could not own property in an area allocated to white people. In this way, a great majority of people were denied basic human rights. Now, there is a supreme constitution with an entrenched justiciable Bill of Rights founded on the democratic values of social justice, human dignity, equality and the advancement of human rights and freedoms. The Constitution also enshrines the values of non-racialism and non-sexism. The Constitutional Court has the power to declare unconstitutional any law or conduct considered to be inconsistent with it. Now, more than two decades down the line, despite the abolition of apartheid, there is evidence that South Africa still experiences hate crimes which violate the entrenched right of vulnerable groups not to be discriminated against on the basis of race, sexual orientation, gender, national origin, occupation, or disability. To remedy this mischief, parliament has responded by drafting the Prevention and Combating of Hate Crimes and Hate Speech Bill. The Bill has been disseminated for public comment and suggestions. It is intended to combat hate crimes and hate speech based on sheer prejudice. The other purpose of the Bill is to bring South Africa in line with international human rights instruments against racism, racial discrimination, xenophobia and related expressions of intolerance identified in several international instruments. It is against this backdrop that this paper analyses the impact of the Bill on the rights to human dignity, equality, and freedom. This study is significant because the Bill was highly contested and has generated considerable debate. The study relies on a qualitative evaluative approach based on desktop and library research, drawing on primary and secondary sources. For comparative purposes, the paper compares South Africa with countries such as Australia, Canada, Kenya, Cuba, and the United Kingdom, which have criminalised hate crimes and hate speech. The finding of this study is that, despite the Bill's expressed positive intentions, the draft legislation is problematic for several reasons. The main reason is that it generates considerable controversy, mostly because it is considered to infringe the right to freedom of expression. Though the author suggests that the Bill should not be rejected in its entirety, she notes the brutal psychological effect of hate crimes on their direct victims and emphasises that a legislature can succeed in combating hate crimes only if it provides for them as a separate, stand-alone category of offences. In view of these findings, the study recommends that, since the hate speech clauses have a negative impact on freedom of expression, the Bill be promulgated only subject to the legislature enacting the Prevention and Combating of Hate Crimes Bill as a stand-alone law which criminalises hate crimes.

Keywords: freedom of expression, hate crimes, hate speech, human dignity

Procedia PDF Downloads 173
304 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The changes currently experienced by the world, economic and environmental, have emphasised the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows for a more refined fit of the statistical distribution by truncating the data at a minimum value given by a predefined threshold u. For mathematically approximating the tail of the empirical statistical distribution, the Generalised Pareto distribution is widely used, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull and the Exponential distribution, the latter being a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not universally recognized as the adequate solution for modeling exceedances over a certain threshold u, and references that set the Generalised Pareto distribution as a secondary solution in the case of significant wave data can be identified in the literature. In this framework, the current study tackles the discussion of the application of statistical models to characterize exceedances of wave data. A comparison of the application of the Generalised Pareto, the two-parameter Weibull and the Exponential distribution is presented for different values of the threshold u. Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully: in each particular case, one of the statistical models mentioned fits the data better than the others, and different results are obtained depending on the value of the threshold u. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, due to its growing importance, should be addressed carefully for efficient estimation of very low occurrence events.
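
A minimal illustration of the comparison described above can be sketched with scipy: exceedances over a threshold u are fitted with the Generalised Pareto, the two-parameter Weibull and the Exponential distributions, and the fits are ranked by log-likelihood. This is not the authors' code; the synthetic wave record, the 95th-percentile threshold and the use of log-likelihood as the comparison criterion are assumptions made for the sketch.

```python
# Hedged sketch of the POT comparison: fit three candidate tail models to
# exceedances over a threshold u and compare their log-likelihoods.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
hs = rng.weibull(1.5, 20000) * 2.0      # synthetic significant wave heights (m)

u = np.quantile(hs, 0.95)               # predefined threshold u (assumed)
exc = hs[hs > u] - u                    # exceedances over u

candidates = {
    # fix the location at 0 so all models describe the exceedances themselves
    "Generalised Pareto": (stats.genpareto, stats.genpareto.fit(exc, floc=0)),
    "2-param Weibull":    (stats.weibull_min, stats.weibull_min.fit(exc, floc=0)),
    "Exponential":        (stats.expon, stats.expon.fit(exc, floc=0)),
}

for name, (dist, params) in candidates.items():
    loglik = np.sum(dist.logpdf(exc, *params))
    print(f"{name:>18}: n={exc.size}, log-likelihood={loglik:.1f}")
```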

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 273
303 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection

Authors: S. Delgado, C. Cerrada, R. S. Gómez

Abstract:

This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges of voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times; these repeated voxels incur costly memory operations while adding no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability: written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, quantifying the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods, and aids in understanding how the Gap Detection technique improves results by targeting specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods. The Gap Detection technique fills a critical gap in the voxelization process; by addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
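
The following CPU-side Python sketch illustrates only the basic scan-line idea, not the authors' GLSL compute shader: parallel, equidistant lines are swept across a triangle and the voxels they touch are collected. Unlike the paper's method, which spaces scan-lines so that each voxel is found exactly once and detects gaps explicitly, this naive sketch oversamples and deduplicates with a set; the voxel size and sampling steps are arbitrary assumptions.

```python
# Naive scan-line voxelization sketch: sweep parallel, equidistant lines
# across a triangle's interior and record each intersected voxel once.
import numpy as np

def voxelize_triangle(v0, v1, v2, voxel=0.5):
    v0, v1, v2 = (np.asarray(v, float) for v in (v0, v1, v2))
    voxels = set()                      # dedup: each voxel reported once
    n_lines = max(2, int(np.linalg.norm(v2 - v0) / (0.5 * voxel)) + 1)
    for t in np.linspace(0.0, 1.0, n_lines):
        # endpoints of one scan-line, swept from edge v0-v1 towards v2
        a = v0 + t * (v2 - v0)
        b = v1 + t * (v2 - v1)
        n_steps = max(2, int(np.linalg.norm(b - a) / (0.5 * voxel)) + 1)
        for s in np.linspace(0.0, 1.0, n_steps):
            p = a + s * (b - a)
            voxels.add(tuple(np.floor(p / voxel).astype(int)))
    return voxels

tri = ([0, 0, 0], [4, 0, 1], [2, 3, 2])
print(f"{len(voxelize_triangle(*tri))} surface voxels")
```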

Keywords: voxelization, GPU acceleration, computer graphics, compute shaders

Procedia PDF Downloads 73
302 Simple Finite-Element Procedure for Modeling Crack Propagation in Reinforced Concrete Bridge Deck under Repetitive Moving Truck Wheel Loads

Authors: Rajwanlop Kumpoopong, Sukit Yindeesuk, Pornchai Silarom

Abstract:

Modeling cracks in concrete is complicated by its strain-softening behavior, which requires the use of sophisticated energy criteria of fracture mechanics to assure stable and convergent solutions in finite-element (FE) analysis, particularly for relatively large structures. However, for small-scale structures such as beams and slabs, a simpler approach that relies on retaining some shear stiffness in the cracking plane has been adopted in the literature to model the strain-softening behavior of concrete under monotonically increasing loading. According to the shear-retaining approach, each element is assumed to be an isotropic material prior to cracking of the concrete. Once an element is cracked, the isotropic element is replaced with an orthotropic element in which the new orthotropic stiffness matrix is formulated with respect to the crack orientation, and a shear transfer factor of 0.5 is used parallel to the crack plane. The shear-retaining approach is adopted in this research to model cracks in an RC bridge deck, with some modifications to take into account the effect of repetitive moving truck wheel loads, which cause fatigue cracking of the concrete. The first modification is the introduction of fatigue test data for concrete and reinforcing steel, and of the Palmgren-Miner linear criterion of cumulative damage, into the conventional FE analysis. For a given loading, the number of cycles to failure of each concrete or RC element can be calculated from the fatigue or S-N curves of concrete and reinforcing steel; the elements with the minimum number of cycles to failure are the failed elements. For the elements that do not fail, damage is accumulated according to the Palmgren-Miner linear criterion of cumulative damage. The stiffness of each failed element is modified and the procedure is repeated until the deck slab fails. The total number of load cycles to failure of the deck slab can then be obtained, from which the S-N curve of the deck slab can be simulated. The second modification concerns the shear transfer factor: moving loads cause continuous rubbing of the crack interfaces, which greatly reduces the shear transfer mechanism. It is therefore conservatively assumed in this study that the analysis is conducted with a shear transfer factor of zero for the case of moving loads. A customized FE program has been developed using the MATLAB software to accommodate these modifications. The developed procedure has been validated against the fatigue test of the 1/6.6-scale AASHTO bridge deck under both fixed-point repetitive loading and moving loading presented in the literature, with good agreement both between experimental and simulated S-N curves and between observed and simulated crack patterns. A significant contribution of the developed procedure is a series of S-N relations which can now be simulated at any desired level of cracking, in addition to the experimentally derived S-N relation at failure of the deck slab. This permits the systematic investigation of crack propagation, or deterioration, of RC bridge decks, which appears to be useful information for highway agencies seeking to prolong the life of their bridge decks.
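
The Palmgren-Miner accumulation step described above can be sketched numerically as follows. This is not the authors' MATLAB FE program: the S-N curve constants, element stresses and block size are illustrative assumptions, and the stiffness modification and stress redistribution of the real procedure are only indicated by a comment.

```python
# Hedged sketch of Miner's rule damage accumulation over load blocks.
import numpy as np

def cycles_to_failure(stress_ratio, a=17.0, b=-16.0):
    """Illustrative S-N curve: N = 10**(a + b * S), S = stress / strength."""
    return 10.0 ** (a + b * stress_ratio)

# stress ratio in each element under the current wheel-load position (assumed)
element_stress = np.array([0.55, 0.70, 0.62, 0.48])
damage = np.zeros_like(element_stress)      # accumulated Miner damage D

n_block = 1.0e4                             # cycles applied per load block
for block in range(1, 200):
    N_f = cycles_to_failure(element_stress)
    damage += n_block / N_f                 # Miner's rule: D = sum(n_i / N_i)
    failed = damage >= 1.0
    if failed.any():
        print(f"block {block}: elements {np.flatnonzero(failed)} failed "
              f"after {block * n_block:.0f} cycles")
        # in the FE procedure the failed elements' stiffness would now be
        # modified and stresses redistributed before continuing
        break
```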

Keywords: bridge deck, cracking, deterioration, fatigue, finite-element, moving truck, reinforced concrete

Procedia PDF Downloads 257
301 A Geometric Based Hybrid Approach for Facial Feature Localization

Authors: Priya Saha, Sourav Dey Roy Jr., Debotosh Bhattacharjee, Mita Nasipuri, Barin Kumar De, Mrinal Kanti Bhowmik

Abstract:

Biometric face recognition technology (FRT) has gained a lot of attention due to its extensive variety of applications from both security and non-security perspectives. It has emerged as a secure solution for the identification and verification of personal identity. Although other biometric methods such as fingerprint scans and iris scans are available, FRT has proved an efficient technology owing to its user-friendliness and contactless operation. Accurate facial feature localization plays an important role in many facial analysis applications, including biometrics and emotion recognition, but certain factors make facial feature localization a challenging task. On the human face, expressions arise from the subtle movements of facial muscles and are influenced by internal emotional states. These non-rigid facial movements cause noticeable alterations in the locations of facial landmarks and in their usual shapes, sometimes creating occlusions in facial feature areas and making face recognition a difficult problem. This paper proposes a new hybrid technique for automatic landmark detection in both neutral and expressive frontal and near-frontal face images. The method uses the concepts of thresholding, sequential searching and other image processing techniques to locate the landmark points on the face. In addition, a Graphical User Interface (GUI) based software is designed that can automatically detect 16 landmark points around the eyes, nose and mouth that are most affected by changes in the facial muscles. The proposed system has been tested on the widely used JAFFE and Cohn-Kanade databases, as well as on the DeitY-TU face database, which was created in the Biometrics Laboratory of Tripura University under a research project funded by the Department of Electronics & Information Technology, Govt. of India. The performance of the proposed method has been evaluated in terms of error measure and accuracy: the method achieves detection rates of 98.82% on the JAFFE database, 91.27% on the Cohn-Kanade database and 93.05% on the DeitY-TU database. We have also carried out a comparative study of the proposed method against techniques developed by other researchers. In future work, the located features will serve as a basis for emotion-oriented systems through action unit (AU) detection.
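
The thresholding-and-searching idea can be illustrated with a small OpenCV sketch (not the authors' implementation): dark blobs inside an assumed upper-face region are taken as candidate eye landmarks. The file name, ROI fractions, threshold value and area filter are all assumptions made for the example; OpenCV 4.x is assumed.

```python
# Hedged sketch: threshold an assumed eye region and report blob centroids
# as candidate landmarks. Requires opencv-python and numpy.
import cv2

img = cv2.imread("face.jpg")                         # assumed frontal face image
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
h, w = gray.shape

# sequential search restricted to the upper face, where eyes are expected
top = int(0.2 * h)
roi = gray[top:int(0.5 * h), :]
_, binary = cv2.threshold(roi, 60, 255, cv2.THRESH_BINARY_INV)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
candidates = []
for c in contours:
    if cv2.contourArea(c) < 30:                      # discard small noise blobs
        continue
    m = cv2.moments(c)
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
    candidates.append((int(cx), int(cy) + top))      # back to full-image coords

print("candidate eye-region landmarks:", candidates)
```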

Keywords: biometrics, face recognition, facial landmarks, image processing

Procedia PDF Downloads 412
300 Modeling Spatio-Temporal Variation in Rainfall Using a Hierarchical Bayesian Regression Model

Authors: Sabyasachi Mukhopadhyay, Joseph Ogutu, Gundula Bartzke, Hans-Peter Piepho

Abstract:

Rainfall is a critical component of climate, governing vegetation growth and production, and forage availability and quality for herbivores. However, reliable rainfall measurements are not always available, making it necessary to predict rainfall values for particular locations through time. Predicting rainfall in space and time can be a complex and challenging task, especially where the rain gauge network is sparse and measurements are not recorded consistently for all rain gauges, leading to many missing values. Here, we develop a flexible Bayesian model for predicting rainfall in space and time and apply it to Narok County, situated in southwestern Kenya, using data collected at 23 rain gauges from 1965 to 2015. Narok County encompasses the Maasai Mara ecosystem, the northern-most section of the Mara-Serengeti ecosystem, famous for its diverse and abundant large mammal populations and spectacular migration of enormous herds of wildebeest, zebra and Thomson's gazelle. The model incorporates geographical and meteorological predictor variables, including elevation, distance to Lake Victoria and minimum temperature. We assess the efficiency of the model by comparing it empirically with the established Gaussian process, Kriging, simple linear and Bayesian linear models. We use the model to predict total monthly rainfall and its standard error for all 5 × 5 km grid cells in Narok County. Using the Monte Carlo integration method, we estimate seasonal and annual rainfall and their standard errors for 29 sub-regions in Narok. Finally, we use the predicted rainfall to predict large herbivore biomass in the Maasai Mara ecosystem on a 5 × 5 km grid for both the wet and dry seasons. We show that herbivore biomass increases with rainfall in both seasons. The model can handle data from a sparse network of observations with many missing values and performs at least as well as, or better than, four established and widely used models on the Narok data set. The model produces rainfall predictions consistent with expectation and in good agreement with the blended station and satellite rainfall values. The predictions are precise enough for most practical purposes. The model is very general and applicable to other variables besides rainfall.
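
As a much-simplified sketch of the regression backbone (not the authors' full hierarchical spatio-temporal model), the following fits a conjugate Bayesian linear regression with a known noise variance to synthetic data whose predictors mirror those named in the abstract, and returns a prediction with its standard error for one grid cell. All numbers are invented for the illustration.

```python
# Hedged sketch: conjugate Bayesian linear regression on synthetic rainfall.
import numpy as np

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([
    np.ones(n),
    rng.uniform(1000, 3000, n),     # elevation (m)
    rng.uniform(0, 250, n),         # distance to Lake Victoria (km)
    rng.uniform(8, 18, n),          # minimum temperature (deg C)
])
beta_true = np.array([50.0, 0.02, -0.1, 1.5])
y = X @ beta_true + rng.normal(0, 10, n)            # monthly rainfall (mm)

sigma2 = 100.0                                      # assumed known noise variance
tau2 = 1e4                                          # vague prior: beta ~ N(0, tau2 I)
prec = X.T @ X / sigma2 + np.eye(X.shape[1]) / tau2
cov_post = np.linalg.inv(prec)                      # posterior covariance
mean_post = cov_post @ (X.T @ y) / sigma2           # posterior mean

x_new = np.array([1.0, 1800.0, 120.0, 12.0])        # one 5 x 5 km grid cell
pred = x_new @ mean_post
pred_sd = np.sqrt(x_new @ cov_post @ x_new + sigma2)
print(f"predicted rainfall: {pred:.1f} mm (sd {pred_sd:.1f})")
```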

Keywords: non-stationary covariance function, gaussian process, ungulate biomass, MCMC, maasai mara ecosystem

Procedia PDF Downloads 294
299 Implementing Peer Mediated Interventions with Visual Supports for Social Skills Development in a School-Based Work Setting with Secondary Students with Autism

Authors: Karen Eastman

Abstract:

More youths and young adults with autism spectrum disorder (ASD) have been entering the workforce in recent years. Historically, students with ASD struggle after leaving high school and experience lower rates of employment, with social skills continuing to be the most problematic area of concern. Special education teachers may find it challenging to identify effective combinations of evidence-based practices (EBPs) and supports to best guide these students. One EBP, Peer Mediated Instruction and Intervention (PMII), has been well documented in the literature as effective for younger students with autism, but has not been researched as much with older students and adults, particularly in work settings. A need to combine PMII with other EBPs has been identified as a way to achieve a greater positive impact than any practice alone. A multiple-baseline-across-skills design was used in this research project with two participants in different settings. PMII was combined with Visual Supports, with typical peers being trained in both practices. PMII is an evidence-based practice used to address social concerns by training peers without disabilities in how to provide feedback and support to the student with ASD during social interactions in structured settings. The peers without disabilities were the instructors, while the adults facilitated the social situations and provided support to both the peers and the students with ASD when needed. Because many individuals with ASD learn best with visual input rather than the spoken word alone (verbal directions and feedback), Visual Supports were used in conjunction with PMII. Visual Supports can include written words, pictures, symbols, videos, or objects. In this project, the Visual Supports used were written social scripts, videos, Stop and Think signs, written reminder cards, a school map, and a pictorial task analysis of work tasks. Variables that may have affected intervention outcomes in this project included attendance at school and school-based work settings, for both the students with ASD and the peers without disabilities, and the behaviors and responses of others in the settings. Qualitative data were also collected from observations and from surveys of peers about the process and their role. The data indicated that the students with ASD responded more positively to redirection and support from their peers than from teachers and staff, and showed an increase in positive interactions with others. Those surveyed indicated a positive attitude toward, and response to, the use of peer interventions with visual supports.

Keywords: autism, social skills, vocational training, peer interventions

Procedia PDF Downloads 42
298 Term Creation in Specialized Fields: An Evaluation of Shona Phonetics and Phonology Terminology at Great Zimbabwe University

Authors: Peniah Mabaso-Shamano

Abstract:

This paper evaluates Shona terms that were created to teach Phonetics and Phonology courses at Great Zimbabwe University (GZU). The phonetics and phonology terms discussed in this paper were created using different processes and strategies, such as translation, borrowing, neologising, compounding, transliteration and circumlocution, among many others. Most phonetics and phonology terms are alien to Shona and, as a result, there are no suitable Shona equivalents. The lecturers and students of these courses have the mammoth task of creating terminology for the different modules offered in Shona and other Zimbabwean indigenous languages. Most linguistic reference books are written in English; as such, lecturers and students translate information from English to Shona, a task which is proving too difficult for them. A term creation workshop was held at GZU to try to address the problem of lack of terminology in indigenous languages, at which indigenous language practitioners from different tertiary institutions convened for two days. Due to the 'specialized' nature of phonetics and phonology, it was too difficult to come up with 'proper' indigenous terms. The researcher will consult lecturers at tertiary institutions who teach linguistics courses, as well as linguistics students, to get their views on the created terms. The people consulted will not be those who took part in the term creation workshop held at GZU. The selected participants will be asked to evaluate and back-translate some of the terms; in instances where they feel the terms created are not suitable or user-friendly, they will be asked to suggest other terms. Since the researcher is also a linguistics lecturer, her observations and views will be important. From her experience of using some of the terms in teaching phonetics and phonology courses to undergraduate students, the researcher has noted that most of the terms created have shortcomings, since they are not user-friendly. These shortcomings include terms that are longer than the English originals, as some terms are translated into Shona as whole statements. Most of these terms are neologisms, compound neologisms, transliterations, circumlocutions, and blends. The paper will show that there is an overuse of transliterated terms due to the lack of Shona equivalents for English terms. Most single English words were translated into compound neologisms or phrases after attempts to reduce them to one-word terms failed; in other instances, circumlocution produced terms longer than the original and, as a result, the terms are not user-friendly. The paper will discuss and evaluate the different phonetics and phonology terms created and the different strategies and processes used in creating them.

Keywords: blending, circumlocution, term creation, translation

Procedia PDF Downloads 147
297 An Observational Study Assessing the Baseline Communication Behaviors among Healthcare Professionals in an Inpatient Setting in Singapore

Authors: Pin Yu Chen, Puay Chuan Lee, Yu Jen Loo, Ju Xia Zhang, Deborah Teo, Jack Wei Chieh Tan, Biauw Chi Ong

Abstract:

Background: Synchronous communication, such as telephone calls, remains the standard communication method between nurses and other healthcare professionals in Singapore public hospitals, despite advances in asynchronous technological platforms such as instant messaging. Although miscommunication is one of the most common causes of lapses in patient care, there is a scarcity of research characterizing baseline inter-professional healthcare communications in a hospital setting, owing to logistic difficulties. Objective: This study aims to characterize the frequency and patterns of communication behaviours among healthcare professionals. Methods: The one-week observational study was conducted from Monday through Sunday at the nursing station of a cardiovascular medicine and cardiothoracic surgery inpatient ward at the National Heart Centre Singapore. Subjects were shadowed by two physicians for sixteen hours, covering consecutive morning and afternoon nursing shifts. Communications were logged and characterized by type, duration, caller, and recipient. Results: A total of 1,023 communication events involving the attempted use of the common telephones at the nursing station were logged over the week, corresponding to a frequency of one event every 5.45 minutes (SD 6.98, range 0-56 minutes). Nurses initiated the highest proportion of outbound calls (38.7%) via the nursing station common phone. A total of 179 face-to-face communications (17.5%), 362 inbound calls (35.39%), 481 outbound calls (47.02%), and 1 emergency alert (0.10%) were captured. The average response time for task-oriented communications was 159 minutes (SD 387.6, range 86-231). Approximately 1 in 3 communications captured aimed to clarify patient-related information. The total time spent on synchronous communication events over one week, calculated from total inbound and outbound calls, was estimated at 7 hours. Conclusion: The results of our study show that a significant amount of time is spent on inter-professional healthcare communications via synchronous channels. Integration of patient-related information and use of asynchronous communication channels may help to reduce the redundancy of communications and clarifications. Future studies should explore the use of asynchronous mobile platforms to address the inefficiencies observed in healthcare communications.

Keywords: healthcare communication, healthcare management, nursing, qualitative observational study

Procedia PDF Downloads 210
296 Relativity in Toddlers' Understanding of the Physical World as Key to Misconceptions in the Science Classroom

Authors: Michael Hast

Abstract:

Within their first year, infants can differentiate between objects based on their weight. By at least 5 years of age, children hold consistent weight-related misconceptions about the physical world, such as that heavy things fall faster than lighter ones because of their weight. Such misconceptions are seen as a challenge for science education since they are often highly resistant to change through instruction. Understanding the time point of emergence of such ideas could, therefore, be crucial for early science pedagogy. This paper discusses two studies that jointly address the issue by examining young children's search behaviour in hidden displacement tasks under consideration of relative object weight. In both studies, children were tested with a heavy or a light ball, and they either had information about one of the balls only or about both. In Study 1, 88 toddlers aged 2 to 3½ years watched a ball being dropped into a curved tube and were then allowed to search for the ball in three locations: one straight beneath the tube entrance, one where the curved tube led to, and one that corresponded to neither of the previous outcomes. Success and failure at the task were not affected by the weight of the balls alone in any particular way. However, from around 3 years onwards, relative lightness, gained through tactile experience of both balls beforehand, enhanced search success. Conversely, relative heaviness increased search errors such that children increasingly searched in the location immediately beneath the tube entry, known as the gravity bias. In Study 2, 60 toddlers aged 2, 2½ and 3 years watched a ball roll down a ramp and behind a screen with four doors, with a barrier placed along the ramp after one of the four doors. The toddlers were allowed to open the doors to find the ball. While search accuracy generally increased with age, relative weight did not play a role in 2-year-olds' search behaviour. Relative lightness improved 2½-year-olds' searches. At 3 years, both relative lightness and relative heaviness had a significant impact, with the former improving search accuracy and the latter reducing it. Taken together, both studies suggest that between 2 and 3 years of age, relative object weight is increasingly taken into consideration in navigating naïve physical concepts. In particular, it appears to contribute to the early emergence of misconceptions relating to object weight. This insight from developmental psychology research may have consequences for early science education and related pedagogy towards early conceptual change.

Keywords: conceptual development, early science education, intuitive physics, misconceptions, object weight

Procedia PDF Downloads 190
295 Mechanisms Underlying Comprehension of Visualized Personal Health Information: An Eye Tracking Study

Authors: Da Tao, Mingfu Qin, Wenkai Li, Tieyan Wang

Abstract:

While the use of electronic personal health portals has gained increasing popularity in the healthcare industry, users often experience difficulty in comprehending and correctly responding to personal health information, partly due to inappropriate or poor presentation of the information. The way personal health information is visualized may affect how users perceive and assess it. This study examined the effects of visualization format and visualization mode on the comprehension and perception of personal health information, using eye tracking techniques. A two-factor within-subjects experimental design was employed, in which participants completed a series of personal health information comprehension tasks under two visualization modes (i.e., whether the visualization is static or dynamic) and three visualization formats (i.e., bar graph, instrument-like graph, and text-only format). Data on a set of measures, including comprehension performance, perceptions, and eye movement indicators, were collected during task completion. Repeated-measures analyses of variance (RM-ANOVAs) were used for data analysis. The results showed that while visualization format had no effect on comprehension performance, it significantly affected users' perceptions (such as perceived ease of use and satisfaction): the two graphic visualizations received significantly more favorable subjective evaluations than the text-only format. While visualization mode had no effect on users' perception measures, it significantly affected comprehension performance, in that dynamic visualization significantly reduced users' information search time. Both visualization format and visualization mode had significant main effects on eye movement behaviors, and their interaction effects were also significant. While the bar graph and text formats had similar times to first fixation across dynamic and static visualizations, the instrument-like graph format had a longer time to first fixation for dynamic visualization than for static visualization. The two graphic visualization formats yielded shorter total fixation durations than the text-only format, indicating their ability to improve information comprehension efficiency. The results suggest that dynamic visualization can improve efficiency in comprehending important health information, and that graphic visualization formats were favored by users. The findings shed light on the mechanisms underlying comprehension of visualized personal health information and provide important implications for the optimal design and visualization of personal health information.
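
A hedged sketch of the two-factor within-subjects analysis is shown below, using statsmodels' AnovaRM on synthetic, balanced data; the dependent measure, factor levels and effect sizes are invented and do not reproduce the study's results.

```python
# Hedged sketch: two-factor repeated-measures ANOVA on synthetic data.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(2)
rows = []
for s in range(1, 25):                          # 24 hypothetical participants
    for m in ["static", "dynamic"]:
        for f in ["bar", "instrument", "text"]:
            t = 30 + (5 if m == "static" else 0) + (4 if f == "text" else 0)
            rows.append({"subject": s, "mode": m, "format": f,
                         "search_time": t + rng.normal(0, 3)})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="search_time", subject="subject",
              within=["mode", "format"]).fit()
print(res.anova_table)                          # F and p per effect
```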

Keywords: eye tracking, information comprehension, personal health information, visualization

Procedia PDF Downloads 109
294 Private Coded Computation of Matrix Multiplication

Authors: Malihe Aliasgari, Yousef Nejatbakhsh

Abstract:

The era of Big Data and the immensity of real-life datasets compel computation tasks to be performed in a distributed fashion, where the data are dispersed among many servers that operate in parallel. However, massive parallelization leads to computational bottlenecks due to faulty servers and stragglers. Stragglers are a few slow or delay-prone processors that can bottleneck an entire computation, because one has to wait for all the parallel nodes to finish. The problem of straggling processors has been well studied in the context of distributed computing. Recently, it has been pointed out that, for the important case of linear functions, it is possible to improve over repetition strategies in terms of the tradeoff between performance and latency by carrying out linear precoding of the data prior to processing. The key idea is that, by employing suitable linear codes operating over fractions of the original data, a function may be completed as soon as a sufficient number of processors, depending on the minimum distance of the code, have completed their operations. Matrix-matrix multiplication over practically large data sets faces computational and memory-related difficulties, which means such operations must be carried out on distributed computing platforms. In this work, we study the problem of distributed matrix-matrix multiplication W = XY under storage constraints, i.e., when each server is allowed to store a fixed fraction of each of the matrices X and Y; this operation is a fundamental building block of many fields of science and engineering, such as machine learning, image and signal processing, wireless communication, and optimization. Both non-secure and secure matrix multiplication are studied. We consider the setup in which the identity of the matrix of interest should be kept private from the workers, and we obtain the recovery threshold of the colluding model, that is, the number of workers that need to complete their tasks before the master server can recover the product W. We also consider the problem of secure and private distributed matrix multiplication W = XY in which the matrix X is confidential, while the matrix Y is selected in a private manner from a library of public matrices. We present the best currently known trade-off between communication load and recovery threshold. In other words, we design an achievable PSGPD scheme for any arbitrary privacy level by trivially concatenating a robust PIR scheme for arbitrary colluding workers and private databases with the proposed SGPD code, which provides a smaller computational complexity at the workers.
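
The straggler-tolerance idea behind coded computation can be illustrated with a toy (3,2) MDS code in numpy; this is a generic example, not the paper's private PSGPD scheme. X is split into two row blocks and encoded into three tasks, so that the product W = XY is recoverable from any two finished workers.

```python
# Hedged sketch: straggler-tolerant matrix multiplication with a (3,2) code.
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 5))
Y = rng.standard_normal((5, 3))

X1, X2 = X[:2], X[2:]                    # two row blocks of X
tasks = {0: X1, 1: X2, 2: X1 + X2}       # encoded tasks for three workers

# suppose worker 1 straggles; workers 0 and 2 return first
r0 = tasks[0] @ Y                        # = X1 Y
r2 = tasks[2] @ Y                        # = (X1 + X2) Y

W = np.vstack([r0, r2 - r0])             # decode: X2 Y = r2 - r0
assert np.allclose(W, X @ Y)
print("recovered X @ Y from 2 of 3 workers (recovery threshold = 2)")
```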

Keywords: coded distributed computation, private information retrieval, secret sharing, stragglers

Procedia PDF Downloads 122
293 TARF: Web Toolkit for Annotating RNA-Related Genomic Features

Authors: Jialin Ma, Jia Meng

Abstract:

Genomic features, i.e., genome-based coordinates, are commonly used for the representation of biological features such as genes, RNA transcripts and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches for performing these tasks involve the manipulation of a gene database, conversion from genome-based to transcript-based coordinates, and visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time-consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed a dedicated web app, TARF, a web toolkit for annotating RNA-related genomic features. The TARF web tool provides a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and specified a built-in transcript database, or uploaded a customized gene database in GTF format, the tool fulfills three main functions. First, it adds annotation on gene and RNA transcript components: for every feature provided by the user, its overlap with RNA transcript components is identified, and the information is combined in one table that is available for copying and download; summary statistics about ambiguous belongings are also produced. Second, the tool provides a convenient visualization of the features at the single gene/transcript level: for a selected gene, the tool shows the features with the gene model in a genome-based view, and also maps the features to transcript-based coordinates and shows their distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package: the distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5'UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features, and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.
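
The core annotation step, assigning each genomic feature to the transcript component(s) it overlaps, can be sketched in a few lines of Python. This toy example is not the TARF implementation; the single-transcript "database" and the feature coordinates are made up.

```python
# Hedged sketch: assign BED-like features to overlapping transcript components.
# One transcript on one chromosome, with genome-coordinate component intervals.
transcript = {"5'UTR": (100, 199), "CDS": (200, 799), "3'UTR": (800, 1000)}

features = [  # (name, start, end) in genome coordinates
    ("site_a", 150, 160),
    ("site_b", 790, 820),   # straddles CDS and 3'UTR: an ambiguous belonging
    ("site_c", 950, 960),
]

for name, start, end in features:
    hits = [comp for comp, (s, e) in transcript.items()
            if start <= e and end >= s]          # interval overlap test
    print(f"{name}: {', '.join(hits) or 'intergenic'}")
```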

Keywords: RNA-related genomic features, annotation, visualization, web server

Procedia PDF Downloads 208
292 Time of Week Intensity Estimation from Interval Censored Data with Application to Police Patrol Planning

Authors: Jiahao Tian, Michael D. Porter

Abstract:

Law enforcement agencies are tasked with crime prevention and crime reduction under limited resources, and anticipating where and when crimes might occur is a key element of successful policing strategies. Having an accurate temporal estimate of the crime rate would be valuable in achieving such a goal. However, estimation is usually complicated by the interval-censored nature of crime data: the event time is only known to lie within an interval instead of being observed exactly. This type of data is prevalent in the field of criminology because of the absence of victims for certain types of crime. Despite its importance, research on the temporal analysis of crime has lagged behind the spatial component. Inspired by the success of solving crime-related problems with a statistical approach, we propose a statistical model for the temporal intensity estimation of crime with censored data. We cast the problem of intensity estimation as a Poisson regression, with two special penalty terms added to the likelihood that provide smoothness over the time of day and the day of the week, and derive an EM algorithm to obtain maximum likelihood estimates of the parameters. The approach provides accurate intensity estimates, can uncover day-of-week clusters that share the same intensity patterns, and shows superior performance to the competing model. Our research is in line with the Smart Policing Initiative (SPI) proposed by the Bureau of Justice Assistance (BJA) as an effort to support law enforcement agencies in building evidence-based, data-driven law enforcement tactics; the goal is to identify strategic approaches that are effective in crime prevention and reduction. In our case, we allow agencies to deploy their resources for a relatively short period of time to achieve the maximum level of crime reduction. By analyzing a particular area within a city where data are available, our proposed approach provides not only an accurate estimate of intensities for the time units considered but also a time-varying crime incidence pattern. Both will be helpful in allocating limited resources, either by improving existing patrol plans using the discovered day-of-week clusters or by supporting the deployment of any extra resources available.
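
The E- and M-steps for interval-censored counts can be sketched as follows (without the paper's smoothness penalties or day-of-week clustering): each simulated crime is known only to lie within a window of hours, the E-step spreads it over that window in proportion to the current intensity, and the M-step re-estimates the hourly rates. The simulated intensity curve and censoring widths are assumptions for the illustration.

```python
# Hedged sketch: EM for hourly Poisson intensity with interval-censored times.
import numpy as np

H = 24
rng = np.random.default_rng(4)
hours = np.arange(H)
lam_true = 1.0 + 4.0 * np.exp(-((hours - 21) ** 2) / 8.0)   # evening peak

# simulate 30 days of interval-censored reports: (first, last) possible hour
events = []
for h in hours:
    for _ in range(rng.poisson(lam_true[h] * 30)):
        w = int(rng.integers(0, 8))              # censoring window width (hours)
        events.append((h, min(h + w, H - 1)))

exposure = 30.0                                  # days observed per hour
lam = np.ones(H)
for _ in range(100):
    expected = np.zeros(H)
    for s, e in events:                          # E-step: spread each event over
        p = lam[s:e + 1] / lam[s:e + 1].sum()    # its interval by current lam
        expected[s:e + 1] += p
    lam = np.maximum(expected / exposure, 1e-9)  # M-step: rate = E[count]/time
print(np.round(lam, 2))                          # estimated crimes per hour-day
```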

Keywords: cluster detection, EM algorithm, interval censoring, intensity estimation

Procedia PDF Downloads 66
291 Simultech - Innovative Country-Wide Ultrasound Training Center

Authors: Yael Rieder, Yael Gilboa, S. O. Adva, Efrat Halevi, Ronnie Tepper

Abstract:

Background: Operation of ultrasound equipment is a core skill for many clinical specialties. As part of the training program at Simultech, a simulation center for Ob/Gyn at the Meir Medical Center, Israel, teaching how to operate ultrasound equipment requires dealing with misunderstandings of spatial and 3D orientation, failure of the operator to hold a transducer correctly, and limited ability to evaluate the data on the screen. We have developed a platform intended to endow physicians and sonographers with the clinical and operational skills of obstetric ultrasound. Simultech's simulations focus on medical knowledge, risk management, technology operation and physician-patient communication, and encompass extreme work conditions. Setup: Between eight and ten of the eight hundred and fifty physicians and sonographers of the Clalit health services, from seven hospitals and eight community centers across Israel, participate in individual Ob/Gyn training sessions each week. These include Ob/Gyn specialists, experts, interns, and sonographers. Innovative teaching and training methodologies: The six-hour training program includes: (1) An educational computer program that challenges trainees with medical questions based upon ultrasound pictures and films. (2) Sophisticated hands-on simulators that challenge trainees to practice correct grip of the transducer, elucidate pathology, and practice daily tasks such as biometric measurements and the analysis of sonographic data. (3) Participation in a video-taped simulation focused on physician-patient communication, in which the physician is required to diagnose the clinical condition of a hired actress based on the data she provides and by evaluating the assigned ultrasound films. Giving 'bad news' to the patient may put the physician in a stressful situation that must be properly managed. (4) Feedback at the end of each phase, provided by a designated trainer (not a physician) who is specially qualified by senior Ob/Gyn specialists. (5) A group exercise in which the trainer presents a medico-legal case in order to encourage the participants to use their own experience and knowledge in a productive 'brainstorming' session; medical cases are presented and analyzed by the participants together with the trainer's feedback. Findings: (1) The training methods and content that Simultech provides allow trainees to review their medical and communication skills. (2) Simultech training sessions expose physicians to both basic and new, up-to-date cases, refreshing and expanding the trainee's knowledge. (3) Practicing on advanced simulators enables trainees to understand the sonographic space and to implement the basic principles of ultrasound. (4) Communication simulations were found to be beneficial for trainees who were unaware of their interpersonal skills; the trainer's feedback, supported by the recorded simulation, allows the trainee to draw conclusions about his or her performance. Conclusion: Simultech was found to benefit physicians at all levels of clinical expertise who deal with ultrasound. A break from the daily routine, together with attendance at a neutral educational center, can vastly improve performance and outlook.

Keywords: medical training, simulations, ultrasound, Simultech

Procedia PDF Downloads 280
290 Media Framing of Media Regulators in Ghana: A Content Analysis of Selected News Articles on Four Ghanaian Online Newspapers

Authors: Elizabeth Owusu Asiamah

Abstract:

The Ghanaian news media play a crucial role in shaping people's thinking through the nature of the coverage they give to issues, events and personalities. Since the media do not work in a vacuum but within a broader spectrum, which is society, the stories they cover and the nature of the frames used to narrate those stories go a long way toward influencing how citizens perceive issues in the country. Consequently, the National Media Commission and the National Communications Authority were instituted to monitor and direct the activities of the media, to ensure professionalism that prioritizes society's interest over commercial interest. As the two media regulators go about their routine task of monitoring the operations of the media, they receive coverage from various media outlets (newspapers, radio, television and online). Some believe that the kind of approach the regulators adopt depends on the nature of the coverage the media give them in their reportage. This situation demands an investigation into how the media, regulated by these bodies, represent the regulators in the public eye, and into the issues arising from such coverage. Extant literature indicates that studies on media framing have centered on politics, environmental issues, public health issues, conflict and wars, etc.; however, there appear to be no studies on the media framing of media regulators, especially in the Ghanaian context. Since online newspapers have assumed more mainstream positions in the Ghanaian media and have attracted larger audiences in recent times, this study investigates the nature of coverage given to media regulators by four purposively sampled online newspapers in Ghana. Ninety-six news articles are extracted from the websites of the Daily Graphic, Ghanaian Times, Daily Guide and Chronicle newspapers within a five-year period, to identify the prominence given to stories about the two media regulators and the frames used to narrate stories about them. The data collected are thematically analyzed through the lens of agenda-setting and media-framing theories. The findings of the study revealed that the two regulators were not given much coverage by way of frequency; however, much prominence was given to them in terms of enhancements such as images. The study further disclosed that most of the news articles framed the regulators as weak and incompetent, which is likely to affect how the public views them. The study concludes that, since frames around the regulators' supportiveness toward the media were not emphasized by the online newspapers, the public will not perceive the regulators as playing their roles effectively. There is thus a need for more positive frames in stories about the National Media Commission and the National Communications Authority, to promote a cordial relationship between the two institutions and a good public image.

Keywords: agenda setting, media framing, media regulators, online newspapers

Procedia PDF Downloads 69
289 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which means integrating data from various different sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is accordingly increasing. However, each government has its own procedures for publishing its data, which causes a variety of data set formats, because there are no international standards specifying the formats of data sets from Open Data bases. Due to this variety of formats, we must build a data integration process that is able to put together all kinds of formats. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they need a data scientist to take part in the integration process as a final step. In our case, we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases of environmental indicators in real time. In the same way, other governments have published Open Data sets related to the environment (such as Andalucia or Bilbao). But all of those data sets have different formats, and our solution is able to integrate all of them; furthermore, it allows the user to perform and visualize analyses over the real-time data. Once the integration task is done, all the data from any government have the same format, and the analysis process can be initiated in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open up our software tool, as a second approach we also developed an implementation in the R language as a mature open source technology. R is a really powerful open source programming language that allows us to process and analyze a huge amount of data with high performance. There are also R libraries for building a graphic interface, such as shiny. A performance comparison between both implementations was made, and no significant differences were found. In addition, our work provides an Official Real-Time Integrated Data Set about Environmental Data in Spain to any developer, so that they can build their own applications.
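
The per-source adapter idea can be sketched with pandas: two governments publish the same air-temperature indicator under different column names, delimiters, units and date formats, and one small function per source maps both into a common schema. The column names, formats and values below are invented for the example and do not correspond to the actual Madrid or Bilbao data sets.

```python
# Hedged sketch: harmonize two differently formatted open data sources.
import io
import pandas as pd

madrid_csv = "fecha;estacion;temperatura_c\n2015-03-01;E01;12.4\n"
bilbao_csv = "date,station,temp_f\n01/03/2015,B07,54.3\n"

def from_madrid(text):
    df = pd.read_csv(io.StringIO(text), sep=";")
    return pd.DataFrame({"date": pd.to_datetime(df["fecha"]),
                         "station": df["estacion"],
                         "temp_c": df["temperatura_c"],
                         "source": "madrid"})

def from_bilbao(text):
    df = pd.read_csv(io.StringIO(text))
    return pd.DataFrame({"date": pd.to_datetime(df["date"], dayfirst=True),
                         "station": df["station"],
                         "temp_c": (df["temp_f"] - 32) * 5 / 9,  # unit conversion
                         "source": "bilbao"})

integrated = pd.concat([from_madrid(madrid_csv), from_bilbao(bilbao_csv)],
                       ignore_index=True)
print(integrated)
```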

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 315
288 Optical-Based Lane-Assist System for Rowing Boats

Authors: Stephen Tullis, M. David DiDonato, Hong Sung Park

Abstract:

Rowing boats (shells) are often steered by a small rudder operated by one of the backward-facing rowers; the attention required of that athlete then slightly decreases the power that the athlete can provide. Reducing the steering distraction would therefore increase the overall boat speed. Races are straight 2000 m courses with each boat in a 13.5 m wide lane marked by small (~15 cm) widely-spaced (~10 m) buoys, and the boat trajectory is affected by both cross-currents and winds. An optical buoy recognition and tracking system has been developed that provides the boat's location and orientation with respect to the lane edges. This information is provided to the steering athlete either as a simple overlay on a video display, or fed to a simplified autopilot system that gives steering directions to the athlete or directly controls the rudder. The system is then effectively a 'lane-assist' device, but with small, widely-spaced lane markers viewed from a very shallow angle due to constraints on camera height. The image is captured with a lightweight 1080p webcam, and most of the image analysis is done in OpenCV. The colour RGB image is converted to grayscale using the difference of the red and blue channels, which provides good contrast between the red/yellow buoys and the water, sky and land background, white reflections, and noise. Buoy detection is done by thresholding within a tight mask applied to the image. Robust linear regression on the previously detected buoy locations, using Tukey's biweight estimator, is used to develop the mask; this avoids the false detection of noise such as waves (reflections) and, in particular, buoys in other lanes. The robust regression also provides the current lane edges in the camera frame, which are used to calculate the displacement of the boat from the lane centre (lane location) and its yaw angle. The intersection of the detected lane edges provides a lane vanishing point, and the yaw angle can be calculated simply from the displacement of this vanishing point from the camera axis and the image plane distance. Lane location is simply based on the lateral displacement of the vanishing point from any horizontal cut through the lane edges. The boat lane position and yaw are currently fed to what is essentially a stripped-down marine autopilot system. Currently, only the lane location is used, in a PID controller of a rudder actuator with integrator anti-windup to deal with saturation of the rudder angle. Low Kp and Kd values reduce unnecessarily fast returns to the lane centreline and the response to noise, and limiters can be used to avoid lane departure and disqualification. Yaw is not used as a control input, as cross-winds and currents can require a straight course to be held with considerable yaw or crab angle. Mapping of the controller against the rudder angle's overall effectiveness has not been finalized: very large rudder angles stall and have decreased turning moments, but at less extreme angles the increased rudder drag slows the boat and upsets boat balance. The full system has many features similar to automotive lane-assist systems, but the lane markers, camera positioning, control response and noise add to the challenge.
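
The rudder control loop described above can be sketched as a PID controller with output saturation and conditional-integration anti-windup; the gains, rudder limit and the toy boat kinematics are illustrative assumptions, not the authors' tuned values.

```python
# Hedged sketch: PID on lane displacement with saturation and anti-windup.
import numpy as np

KP, KI, KD = 0.8, 0.05, 0.4          # low Kp/Kd per the abstract (assumed values)
RUDDER_MAX = np.radians(20)          # saturation limit on rudder angle

dt, integ, prev_err = 0.1, 0.0, 0.0
lane_offset, heading = 2.0, 0.0      # start 2 m off the lane centreline

for step in range(600):
    err = -lane_offset               # drive the displacement to zero
    deriv = (err - prev_err) / dt
    rudder = KP * err + KI * integ + KD * deriv
    saturated = np.clip(rudder, -RUDDER_MAX, RUDDER_MAX)
    if saturated == rudder:          # anti-windup: only integrate while
        integ += err * dt            # the rudder is not saturated
    prev_err = err
    # toy kinematics: rudder changes heading, heading changes lane offset
    heading += 0.5 * saturated * dt
    lane_offset += 4.0 * np.sin(heading) * dt   # ~4 m/s boat speed
print(f"final offset: {lane_offset:.3f} m")
```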

Keywords: auto-pilot, lane-assist, marine, optical, rowing

Procedia PDF Downloads 132
287 Downtime Estimation of Building Structures Using Fuzzy Logic

Authors: M. De Iuliis, O. Kammouh, G. P. Cimellaro, S. Tesfamariam

Abstract:

Community resilience has gained significant attention due to recent unexpected natural and man-made disasters. Resilience is the process of maintaining livable conditions in the event of interruptions to normally available services. Estimating the resilience of systems, ranging from individuals to communities, is a formidable task due to the complexity involved in the process. The most challenging parameter involved in resilience assessment is the 'downtime', the time needed for a system to recover its services following a disaster event. Estimating the exact downtime of a system requires many inputs and resources that are not always obtainable. The uncertainties in downtime estimation are usually handled using probabilistic methods, which necessitate acquiring large amounts of historical data; the estimation process also involves ignorance, imprecision, vagueness, and subjective judgment. In this paper, a fuzzy-based approach to estimating the downtime of building structures following earthquake events is proposed. Fuzzy logic can integrate descriptive (linguistic) knowledge and numerical data into the fuzzy system. This ability allows the use of walk-down surveys, which collect data in linguistic or numerical form, and permits a fast and economical estimation of parameters that involve uncertainties. The first step of the method is to determine the building's vulnerability: a rapid visual screening is designed to acquire information about the analyzed building (e.g., year of construction, structural system, site seismicity, etc.). Then, fuzzy logic is implemented using a hierarchical scheme to determine the building damageability, which is the main ingredient in estimating the downtime. Generally, the downtime can be divided into three main components: downtime due to the actual damage (DT1); downtime caused by rational and irrational delays (DT2); and downtime due to utilities disruption (DT3). In this work, DT1 is computed by relating the building damageability results obtained from the visual screening to already-defined component repair times available in the literature. DT2 and DT3 are estimated using the REDi™ Guidelines. The downtime of the building is finally obtained by combining the three components. The proposed method also identifies the downtime corresponding to each of the three recovery states: re-occupancy, functional recovery, and full recovery. Future work is aimed at extending the current methodology to pass from the downtime to the resilience of buildings. This will provide a simple tool that can be used by the authorities for decision making.
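
The fuzzy step from a damageability score to the repair-time component DT1 can be sketched with triangular membership functions and a centroid-style defuzzification; the membership breakpoints and repair times below are illustrative assumptions, not values from the paper or the REDi™ guidelines.

```python
# Hedged sketch: fuzzify damageability and defuzzify into a DT1 estimate.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def downtime_dt1(damageability):
    """Map a damageability score in [0, 1] to repair downtime DT1 (days)."""
    memberships = {
        "slight":   tri(damageability, -0.01, 0.0, 0.35),
        "moderate": tri(damageability, 0.15, 0.5, 0.85),
        "severe":   tri(damageability, 0.65, 1.0, 1.01),
    }
    repair_days = {"slight": 10, "moderate": 120, "severe": 400}  # assumed
    w = sum(memberships.values())
    return sum(memberships[k] * repair_days[k] for k in memberships) / w

for d in (0.2, 0.5, 0.8):
    print(f"damageability {d:.1f} -> DT1 ~ {downtime_dt1(d):.0f} days")
```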

Keywords: resilience, restoration, downtime, community resilience, fuzzy logic, recovery, damage, built environment

Procedia PDF Downloads 160
286 A Critical Analysis of the Creation of Geoparks in Brazil: Challenges and Possibilities

Authors: Isabella Maria Beil

Abstract:

The International Geosciences and Geoparks Programme (IGGP) was officially created in 2015 by the United Nations Educational, Scientific and Cultural Organization (UNESCO) to enhance the protection of geological heritage and fill gaps in the World Heritage Convention. According to UNESCO, a Global Geopark is a unified area where sites and landscapes of international geological significance are managed based on a concept of sustainable development, with tourism seen as a main activity for developing new sources of revenue. Currently (November 2022), UNESCO recognizes 177 Global Geoparks, of which more than 50% are in Europe, 40% in Asia, 6% in Latin America, and the remaining 4% are distributed between Africa and Anglo-Saxon America. This picture shows a very uneven geographical distribution of these areas across the planet. There are currently three Geoparks in Brazil; the first was accepted by the Global Geoparks Network in 2006, and only fifteen years later did two other Brazilian Geoparks obtain the UNESCO title. This paper therefore aims to provide an overview of the current geopark situation in Brazil and to identify the main challenges faced in implementing these areas in the country. To this end, Brazil's history and main characteristics regarding the development of geoparks over the years are briefly presented. Then, the results obtained from interviews with those responsible for each of the current 29 aspiring geoparks in Brazil are presented. Finally, the main challenges related to the implementation of Geoparks in the country are listed. The answers obtained through the interviews revealed conflicts and problems that hinder both the start of a Geopark project and its continuity and implementation. The task of getting multiple social actors, or stakeholders, to engage with the Geopark, one of UNESCO's guidelines, is clearly one of its most complex aspects. Among the main challenges, the difficulty of establishing solid partnerships stands out, which directly reflects divergences between the different social actors and their goals. This difficulty arises for several reasons. One is that the investment in a Geopark project can be high, and investors often expect a short-term financial return. In addition, political support from the public sector is often costly as well, since the possible results and positive influences of a Geopark in a given area will only be experienced during future mandates. These results demonstrate that research on Geoparks goes far beyond the geological perspective linked to their origins and is deeply embedded in political and economic issues.

Keywords: Brazil, geoparks, tourism, UNESCO

Procedia PDF Downloads 90
285 A Cross-Sectional Study on Evaluation of Studies Conducted on Women in Turkey

Authors: Oya Isik, Filiz Yurtal, Kubilay Vursavus, Muge K. Davran, Metehan Celik, Munire Akgul, Olcay Karacan

Abstract:

This study aimed to discuss the causes and problems of women by bringing together different disciplines engaged in women's studies, to share information and experiences about women across disciplines, and to reach task areas and decision-making mechanisms in practice. For this purpose, the proceedings presented at the Second Congress of Women's Studies, held in Adana, Turkey, on 28-30 November 2018, were evaluated. The document analysis model, one of the qualitative research methods, was used to evaluate the congress proceedings. A total of 86 papers were presented at the congress, and the topic distributions of the papers were determined. At the evaluation stage, the papers were classified according to their subjects, and descriptive analyses were carried out. According to the results, 64% of the 86 papers presented at the congress were review-based and 36% were research-based studies. When the distribution of these papers was examined by subject, the biggest share, 34.9% (13 review-based and 17 research-based papers), was devoted to women's issues studied through sociology, psychology, and philosophy. This was followed by economy, employment, organization, and non-governmental organizations with 20.9% (9 review-based and 9 research-based papers), arts and literature with 17.4% (15 review-based papers), and law with 12.8% (11 review-based papers). The smallest shares were in politics with one review-based paper (1.2%), health with two research-based papers (2.3%), history with two review-based papers (2.3%), religion with two review-based and one research-based paper (3.5%), and media-communication with two review-based and two research-based papers (4.7%). In the papers categorized under these main headings, women were examined in terms of gender and gender roles. According to the results, discrimination against women continues, changes in laws are not sufficiently put into practice, women's levels of education and economic independence are insufficient, and violence against women continues to increase. To eliminate these problems and raise society's awareness, it was concluded that scientific studies should be supported, that support policies should be realized jointly for women and men to make women visible in public life, and that no tolerance or mitigation should be granted, for any reason or in any group, in cases of harassment and assault against women. It was also determined that women in Turkey should be in a better position in the social, cultural, psychological, economic, and educational spheres, and that future studies should be carried out to improve women's rights and to create a positive perspective.

Keywords: gender, gender roles, sociology, psychology and philosophy, women studies

Procedia PDF Downloads 146
284 An Integrated Approach to the Carbonate Reservoir Modeling: Case Study of the Eastern Siberia Field

Authors: Yana Snegireva

Abstract:

Carbonate reservoirs are known for their heterogeneity, resulting from various geological processes such as diagenesis and fracturing. These complexities can cause great challenges in understanding fluid flow behavior and predicting the production performance of naturally fractured reservoirs. The investigation of carbonate reservoirs is crucial, as many petroleum reservoirs are naturally fractured, yet it is difficult because of the complexity of their fracture networks, which leads to geological uncertainties that matter at the scale of global petroleum reserves. The key challenges in carbonate reservoir modeling include the accurate representation of fractures and their connectivity, as well as capturing the impact of fractures on fluid flow and production. Traditional reservoir modeling techniques often oversimplify fracture networks, leading to inaccurate predictions. There is therefore a need for a modern approach that can capture the complexities of carbonate reservoirs and provide reliable predictions for effective reservoir management and production optimization. Such an approach is hybrid fracture modeling, combining the discrete fracture network (DFN) method with an implicit fracture network, which offers enhanced accuracy and reliability in characterizing complex fracture systems within these reservoirs. This study focuses on the application of the hybrid method in the Nepsko-Botuobinskaya anticline of the Eastern Siberia field, aiming to demonstrate the appropriateness of this method under these geological conditions. The DFN method is adopted to model the fracture network within the carbonate reservoir: it considers fractures as discrete entities, capturing their geometry, orientation, and connectivity. However, the method has a significant drawback, since the number of fractures in a field can be very high, and main-memory limitations make it very difficult to represent all of them explicitly. By integrating data from image logs (formation micro-imager), core data, and fracture density logs, a DFN model can be constructed to represent the characteristics of the hydraulically relevant fractures. The results obtained from the DFN modeling provide valuable insights into the behavior of the East Siberia field's carbonate reservoir. The DFN model accurately captures the fracture system, allowing for a better understanding of fluid flow pathways, connectivity, and potential production zones. The analysis of simulation results enables the identification of zones of increased fracturing and of optimization opportunities for reservoir development, with the potential application of enhanced oil recovery techniques, which were considered in further simulations on dual porosity and dual permeability models. This approach treats fractures as separate, interconnected flow paths within the reservoir matrix, allowing for the characterization of dual-porosity media. The case study of the East Siberia field demonstrates the effectiveness of the hybrid modeling method in accurately representing fracture systems and predicting reservoir behavior. The findings from this study contribute to improved reservoir management and production optimization in carbonate reservoirs using enhanced and improved oil recovery methods.
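To make the DFN idea concrete, here is a minimal stochastic 2D fracture-network generator in Python; the uniform-centre, power-law-length, and von Mises-orientation assumptions, and all parameter values, are illustrative rather than calibrated to the field data described here.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def generate_dfn(n_fractures, region=1000.0):
    """Stochastic 2D DFN sketch: fracture centres from a uniform
    (Poisson-like) process, trace lengths from a power law, and
    orientations from a von Mises distribution around one dominant
    fracture set. Returns the two endpoints of each trace segment."""
    centres = rng.uniform(0.0, region, size=(n_fractures, 2))
    # Inverse-transform sampling of a Pareto-type length distribution.
    lengths = 10.0 * (1.0 - rng.uniform(size=n_fractures)) ** (-1.0 / 1.5)
    angles = rng.vonmises(mu=np.radians(40.0), kappa=8.0, size=n_fractures)
    half = 0.5 * lengths[:, None] * np.column_stack(
        (np.cos(angles), np.sin(angles)))
    return centres - half, centres + half

p1, p2 = generate_dfn(500)   # 500 fracture traces in a 1 km x 1 km region
```

In a real workflow the counts, length exponent, and orientation sets would be conditioned on the image-log and core data mentioned above, and the explosion in fracture count is exactly what motivates the implicit/hybrid representation.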

Keywords: carbonate reservoir, discrete fracture network, fracture modeling, dual porosity, enhanced oil recovery, implicit fracture model, hybrid fracture model

Procedia PDF Downloads 75
283 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface, or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms show high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem both in healthcare and in other industries involving microorganisms. The massive amount of information, both stated and hidden, in the biofilm literature is growing exponentially, so it is not feasible for researchers and practitioners to manually extract and relate information from different written resources. The current work therefore proposes and discusses the use of text mining techniques for extracting information from a biofilm literature corpus containing 34306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e., free text. We therefore adopted an unsupervised approach, in which no annotated training data is necessary, and developed a system that classifies the text into growth and development, drug effects, radiation effects, classification, and physiology of biofilms. A two-step structure was used: the first step extracts keywords from the biofilm literature using a metathesaurus and standard natural language processing tools such as Rapid Miner_v5.3, and the second step discovers relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied unsupervised learning, the machine learning task of inferring a function that describes hidden structure in 'unlabeled' data, to the extracted datasets to develop classifiers, using WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text into the mentioned sets. The developed classifiers were tested on a large data set of biofilm literature, showing that the proposed unsupervised approach is promising and well suited to semi-automatic labeling of the extracted relations. The entire information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their searches easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
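The authors worked in RapidMiner and R; as a rough Python analogue of the unsupervised pipeline, assuming a TF-IDF keyword representation and k-means clustering (illustrative choices, not the paper's exact tools), the classification step might look like this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Toy stand-ins for abstracts from a biofilm corpus.
documents = [
    "biofilm growth on catheter surfaces under antibiotic stress",
    "radiation effects on extracellular polymeric substance matrix",
    "gene expression during biofilm development and maturation",
    "drug resistance mechanisms in dense bacterial clusters",
]

# Keyword extraction step approximated by TF-IDF weighting.
vectorizer = TfidfVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(documents)

# Unsupervised grouping into thematic classes (growth/development,
# drug effects, radiation effects, ...); the number of clusters is
# illustrative, matching nothing in the paper.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
for label, doc in zip(km.labels_, documents):
    print(label, doc[:50])
```

The cluster labels would then be reviewed by a curator, which is the semi-automatic labeling of relations described above.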

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 170
282 Participatory Cartography for Disaster Reduction in Progreso, Yucatan, Mexico

Authors: Gustavo Cruz-Bello

Abstract:

Progreso is a coastal community in Yucatan, Mexico, highly exposed to floods produced by severe storms and tropical cyclones. A participatory cartography approach was conducted to help reduce flood disasters and to assess social vulnerability within the community. The first step was to engage local authorities in risk management to facilitate the process. Two workshops were conducted. In the first, a poster-size print of a high-spatial-resolution satellite image of the town was used to gather information from the participants: eight women and seven men, among them construction workers, students, government employees, and fishermen, aged between 23 and 58 years. As a first task, participants were asked to locate emblematic places on the image to familiarize themselves with it. They were then asked to locate areas that flood and the buildings they use as refuges, to list the actions they usually take to reduce vulnerability, and to collectively come up with others that might reduce disasters. The spatial information generated at the workshops was digitized and integrated into a GIS environment. A printed version of the map was reviewed by local risk management experts, who validated the feasibility of the proposed actions. In the second workshop, we brought the information back to the community for feedback. Additionally, a survey was applied to one household per block in the community to obtain socioeconomic, prevention, and adaptation data. The information generated from the workshops was contrasted, through t and chi-squared tests, with the survey data to test the hypothesis that poorer or less educated people are less prepared to face floods (more vulnerable) and live near, or amid, a higher presence of floods. Results showed that a great majority of people in the community are aware of the hazard and are prepared to face it. However, there was no consistent relationship between regularly flooded areas and people's average years of education, household services, or house modifications against heavy rains. We can say that the participatory cartography intervention made participants aware of their vulnerability and led them to reflect collectively on actions that can reduce flood disasters. Participants also considered that the final map could be used as a communication and negotiation instrument with NGOs and government authorities. It was not found that poorer and less educated people are located in areas with a higher presence of floods.
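As a sketch of the statistical comparison described above, assuming hypothetical survey data (the variable names and numbers are invented for illustration), the tests could be run in Python with scipy as follows.

```python
import numpy as np
from scipy import stats

# Hypothetical survey arrays: years of education for households inside
# vs. outside regularly flooded blocks (t test), and a contingency table
# of flood exposure vs. house modifications (chi-squared test).
edu_flooded = np.array([6, 9, 9, 12, 6, 9, 11, 8])
edu_dry = np.array([9, 12, 9, 12, 16, 9, 12, 11])

t_stat, t_p = stats.ttest_ind(edu_flooded, edu_dry, equal_var=False)

#                        modified  not modified
contingency = np.array([[14, 21],        # flooded blocks
                        [18, 17]])       # non-flooded blocks
chi2, chi_p, dof, _ = stats.chi2_contingency(contingency)

print(f"t = {t_stat:.2f} (p = {t_p:.3f}); "
      f"chi2 = {chi2:.2f} (p = {chi_p:.3f})")
```

Non-significant p-values from tests of this kind are what support the finding that flood exposure is not consistently related to education or house characteristics.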

Keywords: climate change, floods, Mexico, participatory mapping, social vulnerability

Procedia PDF Downloads 113
281 Transport Properties of Alkali Nitrites

Authors: Y. Mateyshina, A. Ulihin, N. Uvarov

Abstract:

Electrolytes with different types of charge carriers find wide application in devices such as sensors, electrochemical equipment, and batteries. One of the important components ensuring stable functioning of such devices is the electrolyte, which has to combine high conductivity, thermal stability, and a wide electrochemical window. In addition to many of the advantageous characteristics of liquid electrolytes, solid-state electrolytes offer good mechanical stability and a wide working temperature range. The search for new solid electrolyte systems with high conductivity is thus a pressing task of solid-state chemistry. Families of alkali perchlorates and nitrates have been investigated by us earlier, whereas literature data on the transport properties of alkali nitrites are absent. Nevertheless, the alkali nitrites MeNO2 (Me = Li+, Na+, K+, Rb+ and Cs+), except for the lithium salt, have high-temperature phases with a crystal structure of the NaCl type. These high-temperature phases are orientationally disordered, i.e., the non-spherical anions are reoriented over several equivalent directions in the crystal lattice. Pure lithium nitrite LiNO2 is characterized by an ionic conductivity near 10⁻⁴ S/cm at 180°C, is more stable than lithium nitrate, and can be used as a component for the synthesis of composite electrolytes. In this work, composite solid electrolytes in the binary systems LiNO2 - A (A = MgO, γ-Al2O3, Fe2O3, CeO2, SnO2, SiO2) were synthesized and their structural, thermodynamic, and electrical properties investigated. The alkali nitrite was obtained by an exchange reaction from water solutions of barium nitrite and the alkali sulfate. The synthesized salt was characterized by the X-ray powder diffraction technique using a D8 Advance X-ray diffractometer with Cu Kα radiation. Using thermal analysis, the temperatures of dehydration and thermal decomposition of the salt were determined. The conductivity was measured using a two-electrode scheme in a forevacuum (6.7 Pa) with an HP 4284A Precision LCR meter in the frequency range 20 Hz < ν < 1 MHz. The composite electrolytes were synthesized by mixing the preliminarily dehydrated components followed by sintering at 250°C. In the series of alkali metal nitrites from Li+ to Cs+, the conductivity varies non-monotonically with increasing cation radius: the minimum conductivity is observed for KNO2, and with further increase in cation radius the conductivity tends to increase. The work was supported by the Russian Foundation for Basic Research, grant #14-03-31442.

Keywords: conductivity, alkali nitrites, composite electrolytes, transport properties

Procedia PDF Downloads 319
280 Robust Electrical Segmentation for Zone Coherency Delimitation Based on Multiplex Graph Community Detection

Authors: Noureddine Henka, Sami Tazi, Mohamad Assaad

Abstract:

The electrical grid is a highly intricate system designed to transfer electricity from production areas to consumption areas. The Transmission System Operator (TSO) is responsible for ensuring the efficient distribution of electricity and maintaining the grid's safety and quality. However, due to the increasing integration of intermittent renewable energy sources, there is a growing level of uncertainty, which requires a faster, more responsive approach. A potential solution is electrical segmentation: creating coherence zones within which electrical disturbances mainly remain. With coherent electrical zones, it becomes possible to focus solely on a sub-zone, reducing the range of possibilities and aiding in managing uncertainty; this allows faster execution of operational processes and easier learning for supervised machine learning algorithms. Electrical segmentation can serve various applications, such as electrical control, minimizing electrical losses, and ensuring voltage stability. Since the electrical grid can be modeled as a graph, where the vertices represent electrical buses and the edges represent electrical lines, identifying coherent electrical zones can be seen as a clustering task on graphs, generally called community detection. Nevertheless, a critical criterion for the zones is their ability to remain resilient to the electrical evolution of the grid over time. This evolution is due to the constant changes in electricity generation and consumption, which are reflected both in variations of the graph structure and in line flow changes. One approach to creating a resilient segmentation is to design zones that are robust under various circumstances. This problem can be represented through a multiplex graph, where each layer represents a specific situation that may arise on the grid; resilient segmentation can then be achieved by conducting community detection on this multiplex graph. The multiplex graph is composed of multiple graphs, and all layers share the same set of vertices. Our proposal is a model that computes a unified representation by flattening all layers; this unified situation can then be penalized to obtain K connected components representing the robust electrical segmentation clusters. We compare our robust segmentation to a segmentation based on a single reference situation. The robust segmentation proves its relevance by producing clusters with high intra-cluster electrical perturbation and low variance of electrical perturbation, and our experiments show when robust electrical segmentation brings a benefit and in which contexts.
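A rough sketch of the flattening-plus-clustering idea, assuming a simple weight-accumulation rule across layers and an off-the-shelf modularity method (the paper's penalized unified representation yielding K connected components is more elaborate), could look like this in Python with networkx.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def flatten_layers(layers):
    """Flatten a multiplex grid graph: all layers share the same buses,
    and edge weights accumulate across layers so that lines present in
    many situations dominate the unified representation."""
    flat = nx.Graph()
    for layer in layers:
        flat.add_nodes_from(layer.nodes)
        for u, v, data in layer.edges(data=True):
            w = data.get("weight", 1.0)
            if flat.has_edge(u, v):
                flat[u][v]["weight"] += w
            else:
                flat.add_edge(u, v, weight=w)
    return flat

# Toy layers standing in for two grid situations (snapshots).
g1 = nx.cycle_graph(6)
g2 = nx.path_graph(6)
flat = flatten_layers([g1, g2])

# Community detection on the flattened graph yields candidate zones.
zones = greedy_modularity_communities(flat, weight="weight")
print([sorted(z) for z in zones])
```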

Keywords: community detection, electrical segmentation, multiplex graph, power grid

Procedia PDF Downloads 79
279 Associations between Mindfulness, Temporal Discounting, Locus of Control, and Reward-Based Eating in a Sample of Overweight and Obese Adults

Authors: Andrea S. Badillo-Perez, Alexis D. Mitchell, Sara M. Levens

Abstract:

Overeating and obesity have been associated with addictive behavior, primarily due to behaviors like reward-based eating: the tendency to overeat due to factors such as lack of control, preoccupation with food, and lack of satiation. Temporal discounting (TD), the ability to select future rewards over short-term gains, and mindfulness, the process of maintaining present-moment awareness, have been suggested to have significant, differential impacts on health-related behaviors. An individual's health locus of control, the degree to which they feel that they have control over their health, is also known to have an impact on health outcomes. The goal of this study was to investigate the relationship between health locus of control and reward-based eating, as well as the relation between TD and mindfulness, in a sample (N = 126) of overweight or obese participants from a larger health-focused study. Using questionnaires (including the Five Facet Mindfulness Questionnaire (FFMQ), the Reward-Based Eating Drive scale (RED), and the Multidimensional Health Locus of Control scale (MHLOC)), anthropometric measurements, and a computerized TD task, a series of regressions tested the associations between subscales of these measures. Results revealed differences in how the mindfulness subscales are associated with TD measures. Specifically, the 'Observing' (beta = -.203) and 'Describing' (beta = .26) subscales were associated with lower TD rates and a longer subjective devaluation timeframe, respectively. In contrast, the 'Acting with Awareness' subscale was associated with a shorter subjective devaluation timeframe (beta = -.23). These findings suggest that the reflective perspective initiated through the observing and describing components of mindfulness may facilitate delay of gratification, whereas the acting-with-awareness component, which focuses on the present moment, may make delay of gratification more challenging. Results also indicated that a higher degree of reward-based eating was associated with a stronger external health locus of control based on the power of chance (beta = .10), whereas an external locus of control based on the power of others had no significant association with reward-based eating. This finding implies that the belief that health is due to chance is associated with greater reward-based eating behavior, suggesting that interventions focusing on locus of control may be helpful. Overall, the findings demonstrate that weight loss interventions may benefit from health locus of control and mindfulness exercises, but caution should be taken, as the components of mindfulness appear to have different effects on increasing or decreasing delay of gratification.
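The abstract does not state which discounting model was fitted to the TD-task data; as an illustration, a common choice in this literature is Mazur's hyperbolic model, sketched here with hypothetical indifference points (the data and the estimated k are illustrative only).

```python
import numpy as np
from scipy.optimize import curve_fit

def hyperbolic(delay, k):
    """Mazur's hyperbolic discounting: subjective value of a delayed
    reward as a fraction of its nominal amount."""
    return 1.0 / (1.0 + k * delay)

# Hypothetical indifference points from a TD task: the fraction of the
# delayed reward at which immediate and delayed options feel equal.
delays = np.array([1, 7, 30, 90, 180, 365])        # days
indifference = np.array([0.95, 0.85, 0.60, 0.45, 0.35, 0.22])

(k_hat,), _ = curve_fit(hyperbolic, delays, indifference, p0=[0.01])
print(f"estimated discount rate k = {k_hat:.4f} per day")
# Higher k = steeper devaluation of future rewards (more discounting).
```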

Keywords: health locus of control, mindfulness, obesity, reward-based eating, temporal discounting

Procedia PDF Downloads 137
278 Cognitive Control Moderates the Concurrent Effect of Autistic and Schizotypal Traits on Divergent Thinking

Authors: Julie Ramain, Christine Mohr, Ahmad Abu-Akel

Abstract:

Divergent thinking, a cognitive component of creativity, and particularly the ability to generate unique and novel ideas, has been linked to both autistic and schizotypal traits. However, to our knowledge, the concurrent effect of these trait dimensions on divergent thinking has not been investigated. Moreover, it has been suggested that creativity is associated with different types of attention and cognitive control, and consequently with how information is processed in a given context. Intriguingly, and consistent with the diametric model, autistic and schizotypal traits have been associated with contrasting attentional and cognitive control styles: positive schizotypal traits with reactive cognitive control and attentional flexibility, and autistic traits with proactive cognitive control and an increased focus of attention. The current study investigated the relationship between divergent thinking, autistic and schizotypal traits, and cognitive control in a non-clinical sample of 83 individuals (males = 42%; mean age = 22.37, SD = 2.93), sufficient to detect a medium effect size. Divergent thinking was evaluated with an adapted version of the Figural Torrance Test of Creative Thinking. Crucially, since we were interested in testing divergent-thinking productivity across contexts, participants were asked to generate items from basic shapes in four different contexts. The variance of the proportion of unique to total responses across contexts served as a measure of context adaptability, with lower variance indicating increased context adaptability. Cognitive control was estimated with the Behavioral Proactive Index of the AX-CPT task, with higher scores representing the ability to actively maintain goal-relevant information in a sustained, anticipatory manner. Autistic and schizotypal traits were assessed with the Autism Quotient (AQ) and the Community Assessment of Psychic Experiences (CAPE-42). Generalized linear models revealed a three-way interaction of autistic traits, positive schizotypal traits, and proactive cognitive control on context adaptability. Specifically, the concurrent effect of autistic and positive schizotypal traits on increased context adaptability was moderated by the level of proactive control and was only significant when proactive cognitive control was high. Our study reveals that autistic and positive schizotypal traits interactively facilitate the capacity to generate unique ideas across various contexts, but that this effect depends on cognitive control mechanisms indicative of the ability to proactively maintain attention when needed. The current results point to a unique profile of divergent thinkers who can tap both systematic and flexible processing modes within and across contexts. This is particularly intriguing, as such a combination of phenotypes has been proposed to explain the genius of Beethoven, Nash, and Newton.
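As a small illustration of the context-adaptability measure defined above, the following sketch computes the variance of unique-to-total response proportions across the four contexts; the counts are hypothetical.

```python
import numpy as np

def context_adaptability(unique_counts, total_counts):
    """Context adaptability as defined here: the variance, across
    contexts, of the proportion of unique to total responses; lower
    variance indicates more even divergent-thinking productivity."""
    proportions = np.asarray(unique_counts) / np.asarray(total_counts)
    return proportions.var(ddof=0)

# Hypothetical counts for one participant across the four contexts.
unique = [5, 7, 3, 6]
total = [12, 15, 10, 14]
print(context_adaptability(unique, total))
```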

Keywords: autism, schizotypy, creativity, cognitive control

Procedia PDF Downloads 137
277 High-Performance Thin-layer Chromatography (HPTLC) Analysis of Multi-Ingredient Traditional Chinese Medicine Supplement

Authors: Martin Cai, Khadijah B. Hashim, Leng Leo, Edmund F. Tian

Abstract:

Analysis of traditional Chinese medicinal (TCM) supplements has always been a laborious task, particularly for multi-ingredient formulations. Traditionally, herbal extracts are analysed using one or a few marker compounds. In recent years, however, pharmaceutical companies have been introducing health supplements of TCM active ingredients to cater to the needs of consumers in today's fast-paced society. As such, new problems arise in composition identification as well as quality analysis. In most products or supplements formulated with multiple TCM herbs, the chemical composition and nature of each raw material differ greatly from the others in the formulation, so identifying the marker compounds of the various botanicals requires individual analytical processes. Thin-layer chromatography (TLC) is a simple, cost-effective, yet well-regarded method for the analysis of natural products, both as a Pharmacopeia-approved method for the identification and authentication of herbs and as a useful analytical tool for discovering the chemical composition of herbal extracts. Recent technical advances introduced high-performance TLC (HPTLC), where, with the help of automated equipment and improvements in the chromatographic materials, both quality and reproducibility are greatly improved, allowing for highly standardised analysis with greater detail. Here we report an industrial consultancy project with ONI Global Pte Ltd for the analysis of LAC Liver Protector, a TCM formulation aimed at improving liver health. The aim of this study was to identify four key components of the supplement using HPTLC, following protocols derived from Chinese Pharmacopeia standards. By comparing the TLC profiles of the supplement to the extracts of the herbs reported on the label, this project proposes a simple and cost-effective analysis of the presence of the four marker compounds in the multi-ingredient formulation using four different HPTLC methods. With the increasing trend of small and medium-sized enterprises (SMEs) bringing natural products and health supplements to the market, it is crucial that the quality of both raw materials and end products be assured for the protection of consumers. With HPTLC, science can be incorporated to help SMEs with their quality control, thereby ensuring product quality.

Keywords: traditional Chinese medicine supplement, high performance thin layer chromatography, active ingredients, product quality

Procedia PDF Downloads 280
276 Safety Tolerance Zone for Driver-Vehicle-Environment Interactions under Challenging Conditions

Authors: Matjaž Šraml, Marko Renčelj, Tomaž Tollazzi, Chiara Gruden

Abstract:

Road safety is a worldwide issue influenced by numerous, heterogeneous factors. On one side, the driver's state, comprising distraction/inattention, fatigue, drowsiness, extreme emotions, and socio-cultural factors, highly affects road safety. On another, the vehicle's state plays an important role in mitigating (or not) road risk. Finally, the road environment is still one of the main determinants of road safety, defining the complexity of the driving task. At the same time, thanks to technological development, a great deal of detailed data is readily available, creating opportunities for detecting the driver's state, vehicle characteristics, and road conditions and, consequently, for designing ad hoc interventions aimed at improving driver performance, increasing awareness, and mitigating road risks. This is the challenge faced by the i-DREAMS project. i-DREAMS, which stands for smart Driver and Road Environment Assessment and Monitoring System, is a 3-year project funded by the European Union's Horizon 2020 research and innovation programme. It aims to set up a platform to define, develop, test, and validate a 'Safety Tolerance Zone' that prevents drivers from getting too close to the boundaries of unsafe operation by mitigating risks in real time and after the trip. After the definition and development of the Safety Tolerance Zone concept and its concretization in an advanced driver-assistance system (ADAS) platform, the system was first tested for two months in a driving simulator environment in five different countries. Naturalistic driving studies then ran for a 10-month period (comprising a 1-month pilot study, a 3-month baseline study, and 6 months of study implementing interventions). The project team has now approved a common evaluation approach and is assessing the usage and outcomes of the i-DREAMS system, which is yielding positive insights. The i-DREAMS consortium consists of 13 partners: 7 engineering universities and research groups, 4 industry partners, and 2 partners closely linked to transport safety stakeholders (the European Transport Safety Council, ETSC, and POLIS, cities and regions for transport innovation), covering 8 different countries altogether.

Keywords: advanced driver assistant systems, driving simulator, safety tolerance zone, traffic safety

Procedia PDF Downloads 68