Search results for: visual preferences
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2535

45 Snake Locomotion: From Sinusoidal Curves and Periodic Spiral Formations to the Design of a Polymorphic Surface

Authors: Ennios Eros Giogos, Nefeli Katsarou, Giota Mantziorou, Elena Panou, Nikolaos Kourniatis, Socratis Giannoudis

Abstract:

In the context of the postgraduate course Productive Design at the Department of Interior Architecture of the University of West Attica in Athens, under the guidance of Professors Nikolaos Kourniatis and Socratis Giannoudis, kinetic mechanisms with parametric models were examined for their further application in the design of objects. In the first phase, the students studied a motion mechanism chosen from daily experience and analyzed its geometric structure in relation to the geometric transformations involved. In the second phase, the students designed it as a parametric model in the Grasshopper 3D algorithmic editor for Rhino and planned its application in an everyday object. For the project presented here, our team began by studying the movement of living beings, specifically the snake. By studying the snake and the role that the environment plays in its movement, four basic typologies were recognized: serpentine, concertina, sidewinding, and rectilinear locomotion, as well as the snake's ability to perform spiral formations. Most typologies are characterized by ripples, a series of sinusoidal curves. For the application of the snake movement to a polymorphic space divider, the use of a coil-type joint was studied. In Grasshopper, the simulation of the desired motion for the polymorphic surface was tested by applying a coil to a sinusoidal curve and a spiral curve. It was important throughout the process that the points corresponding to the nodes of the real object remain constant in number, as well as the distances between them; the elasticity of the construction had to be achieved through a modular movement of the coil rather than through an elastic element (material) at the nodes. Using a mesh (repeating coil), the whole construction is transformed into a supporting body and combines functionality with aesthetics. The set of elements functions as a vertical spatial network, where each element contributes to its coherence and stability. Depending on the positions of the elements relative to the level of support, different perspectives are created in the visual perception of the adjacent space. For the implementation of the model at scale (1:3, 0.50 m x 2.00 m), the load-bearing structure uses Φ6 mm aluminum rods for the basic pillars and Φ2.50 mm rods for the secondary columns. Filling elements and nodes were made of MDF surfaces. During the design process, four trapezoidal patterns were selected to function as filling elements, and a different engraving facet was applied to each to support their assembly. The nodes have holes through which the rods pass, and their connection point with the patterns has a half-carved recess; the patterns have a corresponding recess. The nodes are of two different types depending on the column that passes through them. The patterns and nodes were designed to be cut and engraved with a laser cutter and attached using glue. The parameters participate in the design as mechanisms that generate complex forms and structures through the repetition of constantly changing versions of the parts that compose the object.
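
The Grasshopper definition itself is not included in the abstract. Purely as a rough Python sketch of the geometric constraint it describes (a constant number of nodes with constant distances between them, the motion absorbed by joint rotation rather than material elasticity), the following places a chain of fixed-length links along a sinusoidal guide curve; all names and values are illustrative.

```python
import numpy as np

def chain_on_sine(amplitude, wavelength, link_len, n_nodes):
    """Place a chain of fixed-length links along a sine curve: the node
    count and the distance between consecutive nodes never change, only
    the joint angles do (a modular movement of the coil, not elasticity)."""
    nodes = [np.array([0.0, 0.0])]
    x = 0.0
    for _ in range(n_nodes - 1):
        # Walk forward until the straight-line distance to the last node
        # equals the link length (simple bisection on the x parameter).
        lo, hi = x, x + 2.0 * link_len
        f = lambda t: np.hypot(t - nodes[-1][0],
                               amplitude * np.sin(2*np.pi*t/wavelength) - nodes[-1][1]) - link_len
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if f(mid) < 0.0:
                lo = mid
            else:
                hi = mid
        x = 0.5 * (lo + hi)
        nodes.append(np.array([x, amplitude * np.sin(2*np.pi*x/wavelength)]))
    return np.array(nodes)

pose = chain_on_sine(amplitude=0.3, wavelength=2.0, link_len=0.2, n_nodes=12)
d = np.linalg.norm(np.diff(pose, axis=0), axis=1)
print("link lengths:", d.round(6))  # all equal to 0.2 by construction
```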

Keywords: polymorphic, locomotion, sinusoidal curves, parametric

Procedia PDF Downloads 105
44 The Assessment of Infiltrated Wastewater on the Efficiency of Recovery Reuse and Irrigation Scheme: North Gaza Emergency Sewage Treatment Project as a Case Study

Authors: Yaser S. Kishawi, Sadi R. Ali

Abstract:

The Gaza Strip (365 km² and 1.8 million inhabitants), part of Palestine, is considered a semi-arid zone that relies solely on the Coastal Aquifer. The Coastal Aquifer is the only source of water, of which only 5-10% is suitable for human use. This barely covers the domestic and agricultural needs of the Gaza Strip. The Palestinian Water Authority strategy is to develop a non-conventional water resource from treated wastewater to cover agricultural requirements and serve the population. A new WWTP project is to replace the old, overloaded Biet Lahia WWTP. The project consists of three parts: phase A (pressure line and infiltration basins, IBs), phase B (a new WWTP), and phase C (a Recovery and Reuse Scheme, RRS, to capture the spreading plume). Currently, only phase A is functioning. Nearly 23 Mm³ of partially treated wastewater have been infiltrated into the aquifer. Phases B and C have witnessed many delays, which forced a reassessment of the original RRS design. An Environmental Management Plan was conducted from July 2013 to June 2014 on 13 existing monitoring wells surrounding the project location, in order to measure the efficiency of the soil aquifer treatment (SAT) system and the spread of the contamination plume in relation to the efficiency of the proposed RRS, along with the proposed locations of the 27 recovery wells that form part of the proposed RRS. The results from the monitored wells were assessed against PWA baseline data and fed into a groundwater model simulating the plume, in order to propose the most suitable response to the delays. The redesign mainly manipulated the pumping rates of the wells, their proposed locations, and their operating schedules (including well groupings). The proposed simulations were examined using Visual MODFLOW v4.2. The results from the monitored wells were assessed according to the locations of the monitoring wells relative to the proposed recovery wells (200 m, 500 m, and 750 m away from the IBs). Near the 500 m line (the first row of proposed recovery wells), an increase in nitrate (from 30 to 70 mg/L) together with a decrease in chloride (from 1500 to below 900 mg/L) was found during the monitoring period, indicating an expansion of the plume to this distance. At this rate, and given the time required to construct the recovery scheme, the RRS under its original design would fail to capture the plume. Many simulations were therefore conducted, leading to three main scenarios. The scenarios manipulated the starting dates, the pumping rates, and the locations of the recovery wells. Plume expansion and path-lines were extracted from the model to monitor how to prevent the expansion towards the nearby municipal wells. It was concluded that location is the most important factor in determining RRS efficiency. Scenario III was adopted and showed effective results even with reduced pumping rates. This scenario proposed adding two additional recovery wells beyond the 750 m line to compensate for the delays and effectively capture the plume. A continuous monitoring program for current and future monitoring wells should be in place to support the proposed scenario and ensure maximum protection.
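
The study itself used Visual MODFLOW v4.2, a commercial package. Purely as an illustration of how a recovery-well scenario (well locations and pumping rates) enters a MODFLOW model, here is a minimal sketch using the open-source FloPy package; the grid, aquifer properties, well positions, and rates below are invented placeholders, not the North Gaza model.

```python
import flopy

# A single-layer test grid standing in for the model domain.
m = flopy.modflow.Modflow("rrs_scenario", exe_name="mf2005")
dis = flopy.modflow.ModflowDis(m, nlay=1, nrow=60, ncol=60,
                               delr=50.0, delc=50.0,   # 50 m cells
                               top=0.0, botm=-40.0,
                               nper=1, perlen=365.0)
bas = flopy.modflow.ModflowBas(m, ibound=1, strt=-5.0)
lpf = flopy.modflow.ModflowLpf(m, hk=20.0)             # hydraulic conductivity, m/d

# Recovery wells: (layer, row, col, rate in m^3/d; negative = extraction).
wells = {0: [
    (0, 30, 20, -500.0),   # first-row well near the 500 m line
    (0, 30, 35, -500.0),
    (0, 30, 50, -350.0),   # extra well beyond the 750 m line (Scenario III idea)
]}
wel = flopy.modflow.ModflowWel(m, stress_period_data=wells)
pcg = flopy.modflow.ModflowPcg(m)
oc = flopy.modflow.ModflowOc(m)
m.write_input()
# m.run_model()  # requires a local MODFLOW-2005 executable
```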

Keywords: soil aquifer treatment, recovery reuse scheme, infiltration basins, North Gaza

Procedia PDF Downloads 204
43 Automatic Content Curation of Visual Heritage

Authors: Delphine Ribes Lemay, Valentine Bernasconi, André Andrade, Lara Défayes, Mathieu Salzmann, Frédéric Kaplan, Nicolas Henchoz

Abstract:

Digitization and preservation of large heritage collections induce high maintenance costs to keep up with technical standards and ensure sustainable access. Creating impactful usage is instrumental to justify the resources for long-term preservation. The Museum für Gestaltung of Zurich holds one of the biggest poster collections in the world, of which 52’000 posters have been digitised. In the process of building a digital installation to valorize the collection, one objective was to develop an algorithm capable of predicting the next poster to show according to the ones already displayed. The work presented here describes the steps to build an algorithm able to automatically create sequences of posters reflecting the associations performed by curators and professional designers. This challenge has similarities with the domain of song playlist algorithms. Recently, artificial intelligence techniques, and more specifically deep-learning algorithms, have been used to facilitate their generation. Promising results were obtained with Recurrent Neural Networks (RNN) trained on manually generated playlists and paired with clusters of features extracted from songs. We applied the same principles to create the proposed algorithm, but for a more challenging medium: posters. First, a convolutional autoencoder was trained to extract features from the posters, using the 52’000 digital posters as a training set. The poster features were then clustered. Next, an RNN learned to predict the next cluster according to the previous ones. The RNN training set was composed of poster sequences extracted from a collection of books from the Museum für Gestaltung of Zurich dedicated to displaying posters. Finally, within the predicted cluster, the poster closest to the previous poster is selected; the mean square distance between poster features is used to compute this proximity. To validate the predictive model, we compared sequences of 15 posters produced by our model to randomly and manually generated sequences. The manual sequences were created by a professional graphic designer. We asked 21 participants working as professional graphic designers to sort the sequences from the one with the strongest graphic line to the one with the weakest, and to motivate their answer with a short description. The sequences produced by the designer were ranked first 60%, second 25%, and third 15% of the time. The sequences produced by our predictive model were ranked first 25%, second 45%, and third 30% of the time. The sequences produced randomly were ranked first 15%, second 29%, and third 55% of the time. Compared to the designer sequences, and as reported by participants, the model and random sequences lacked thematic continuity. According to these results, the proposed model generates better poster sequencing than random sampling, and occasionally it is even able to outperform a professional designer. As a next step, the proposed algorithm should include the possibility to create sequences according to a selected theme. To conclude, this work shows the potential of artificial intelligence techniques to learn from existing content and provide a tool to curate large sets of data, with a permanent renewal of the presented content.
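
The abstract does not include the implementation; the following is a minimal Python sketch of the described pipeline (autoencoder features, clustering, and selection of the nearest poster within a predicted cluster by mean square distance). The image size, feature dimension, cluster count, and the stand-in random features are assumptions, and the RNN stage is only indicated by a comment.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

# 1) Convolutional autoencoder: the encoder output serves as the poster
#    feature vector (training loop omitted; 64x64 RGB input assumed).
class PosterAE(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(), nn.Linear(64 * 16 * 16, feat_dim))
        self.dec = nn.Sequential(
            nn.Linear(feat_dim, 64 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid())
    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

# 2) Cluster the extracted features (random stand-in features here).
feats = np.random.rand(1000, 128).astype(np.float32)   # one row per poster
km = KMeans(n_clusters=20, n_init=10).fit(feats)

# 3) An RNN (not shown) predicts the next cluster id from the previous ones.
# 4) Within the predicted cluster, pick the poster nearest the last shown
#    poster, using the mean square distance between feature vectors.
def next_poster(last_idx, predicted_cluster):
    members = np.where(km.labels_ == predicted_cluster)[0]
    members = members[members != last_idx]
    mse = ((feats[members] - feats[last_idx]) ** 2).mean(axis=1)
    return members[np.argmin(mse)]

print(next_poster(last_idx=0, predicted_cluster=km.predict(feats[:1])[0]))
```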

Keywords: artificial intelligence, digital humanities, serendipity, design research

Procedia PDF Downloads 184
42 Guiding Urban Development in a Traditional Neighbourhood: Case Application of Kolkata

Authors: Nabamita Nath, Sanghamitra Sarkar

Abstract:

Urban development in the traditional neighbourhoods of cities is undergoing a sea change due to the imposition of irregular development patterns on a predominantly inclusive urban fabric. In recent times, traditional neighbourhoods of Kolkata have experienced irregular urban development that has transformed their immediate urban character. The goal is to study and analyse the impact of new urban developments within traditional neighbourhoods of Kolkata and to establish development guidelines that balance the old with the new. Various cities, predominantly in third-world countries, are experiencing similar development patterns in their traditional neighbourhoods. Existing literature surveys of development patterns in such neighbourhoods have established 9 major parameters, viz. edge, movement, node, landmark, size-density, pattern-grain-texture, open spaces, urban spaces, urban form, and views-vistas of the neighbourhood. To evaluate the impact of urban development in traditional neighbourhoods of Kolkata, 3 different areas were selected chronologically based on their settlement patterns. The parameters established through the literature surveys were applied to the selected areas to study and analyse the existing patterns of development. The main sources for this study included extensive on-site surveys, academic archives, census data, organisational records, and informational websites. Applying the established parameters, 5 major conclusions were derived. Firstly, it was found that the pedestrian-friendly neighbourhoods of the city are becoming more car-centric. This has resulted in the loss of interactive and social spaces that defined the cultural heritage of Kolkata. Secondly, the urban pattern, once composed of a dense and compact fabric, is gradually losing its character due to the incorporation of new building typologies. Thirdly, the new building typologies include gated communities with private open spaces, a stark departure from the existing built typology. However, these open spaces have not contributed to the creation of inclusive public places for the community, which are a significant part of such heritage neighbourhood precincts. Fourthly, commercial zones that primarily developed along major access routes have now infiltrated these neighbourhoods. Gated communities do not favour the formation of on-street commercial activities, generating haphazard development patterns. Lastly, individual residential buildings that reflected Indo-Saracenic and Neo-Gothic architectural styles are being converted into multi-storeyed residential apartments. As a result, the axis that created a definite visual identity for a neighbourhood is progressively following an irregular pattern, and the uniformity of the old skyline is gradually becoming inconsistent. The major issue currently is the threat posed by irregular urban development to the heritage zones and buildings of traditional neighbourhoods. The streets, lanes, courtyards, open spaces, and buildings of old neighbourhoods imparted a unique cultural identity to the city that is disappearing with emerging urban development patterns. It is concluded that specific guidelines for urban development should be regulated primarily on the basis of the existing urban form of traditional neighbourhoods. Such neighbourhood development strategies should be formulated for various cities of third-world countries to control irregular developments, thereby balancing heritage and development.

Keywords: heritage, Kolkata, traditional neighbourhood, urban development

Procedia PDF Downloads 179
41 Real and Symbolic in Poetics of Multiplied Screens and Images

Authors: Kristina Horvat Blazinovic

Abstract:

In the context of a work of art, one can speak of the idea-concept-term-intention expressed by the artist through various forms of repetition (external, material, visible repetition). Such repetitions of elements (images in space, or moving visual and sound images in time) suggest a "covert", "latent" ("dressed") repetition, i.e., a "hidden", "latent" term-intention-idea. Repetition of this kind reveals a "deeper truth" that the viewer needs to decode and which is hidden "under" the technical manifestation of the multiplied images. It is not only images, sounds, and screens that are repeated; something else is repeated through them as well, even if, in some cases, it is the very idea of repetition that is repeated. This paper examines serial images and single-channel or multi-channel artworks in the field of video/film art and video installations, which imply the concepts of repetition and multiplication. Moving or static images and screens (as multi-screens) are repeated in time and space. The categories of the real and the symbolic refer in part to the Lacanian registers, i.e., the Imaginary-Symbolic-Real trinity that represents the orders within which human subjectivity is established. Authors such as Bruce Nauman, VALIE EXPORT, Ragnar Kjartansson, Wolf Vostell, Shirin Neshat, Paul Sharits, Harun Farocki, Dalibor Martinis, Andy Warhol, Douglas Gordon, Bill Viola, Frank Gillette and Ira Schneider, and Marina Abramovic problematize, in different ways, the concept and procedures of multiplication-repetition, not in the sense of "copying" or "repetition" of reality or of an original, but as repeated repetitions of the simulacrum. The referenced works of art are often connected by the theme of the traumatic. Repetitions of images and situations are a response to the traumatic (experience); repetition itself is a symptom of trauma. On the other hand, repeating and multiplying traumatic images either produces a new traumatic effect or cancels it. Reflections on repetition as a temporal and spatial phenomenon are developed in the chapters that link philosophical considerations of space, time, and experienced temporality with their manifestation in works of art. The observations about time and the relation of perception and memory follow Henri Bergson and his conception of duration (durée) as "quality of quantity." Video works intended to be displayed as a loop express the idea of infinite duration ("pure time," in Bergson's terms). The loop aspires to be always present, to fixate itself in time; its wholeness is unrecognizable because the intention is to make the effect infinitely cyclic. The reflections on time and space end with considerations of the occurrence and effects of time and space intervals as places and moments "between", points of connection and separation, of continuity and stopping, with reference to the "interval theory" of the Soviet filmmaker Dziga Vertov. The range of possibilities that can be explored in the interval mode is wide. Intervals represent the perception of time and space in the form of pauses, interruptions, and breaks (e.g., emotional, dramatic, or rhythmic); they denote emptiness or silence, distance, proximity, interstitial space, or a gap between various states.

Keywords: video installation, performance, repetition, multi-screen, real and symbolic, loop, video art, interval, video time

Procedia PDF Downloads 173
40 Closing the Gap: Efficient Voxelization with Equidistant Scanlines and Gap Detection

Authors: S. Delgado, C. Cerrada, R. S. Gómez

Abstract:

This research introduces an approach to voxelizing the surfaces of triangular meshes with efficiency and accuracy. Our method leverages parallel equidistant scan-lines and introduces a Gap Detection technique to address the limitations of existing approaches. We present a comprehensive study showcasing the method's effectiveness, scalability, and versatility in different scenarios. Voxelization is a fundamental process in computer graphics and simulations, playing a pivotal role in applications ranging from scientific visualization to virtual reality. Our algorithm focuses on enhancing the voxelization process, especially for complex models and high resolutions. One of the major challenges of voxelization on the Graphics Processing Unit (GPU) is the high cost of discovering the same voxels multiple times: these repeated voxels incur costly memory operations while contributing no useful information. Our scan-line-based method ensures that each voxel is detected exactly once when processing a triangle, enhancing performance without compromising the quality of the voxelization. The heart of our approach lies in the use of parallel, equidistant scan-lines to traverse the interiors of triangles. This minimizes redundant memory operations and avoids revisiting the same voxels, resulting in a significant performance boost. Moreover, our method's computational efficiency is complemented by its simplicity and portability. Written as a single compute shader in the OpenGL Shading Language (GLSL), it is highly adaptable to various rendering pipelines and hardware configurations. To validate our method, we conducted extensive experiments on a diverse set of models from the Stanford repository. Our results demonstrate not only the algorithm's efficiency but also its ability to produce accurate, 26-tunnel-free voxelizations. The Gap Detection technique successfully identifies and addresses gaps, ensuring consistent and visually pleasing voxelized surfaces. Furthermore, we introduce the Slope Consistency Value metric, which quantifies the alignment of each triangle with its primary axis. This metric provides insights into the impact of triangle orientation on scan-line-based voxelization methods. It also aids in understanding how the Gap Detection technique improves results by targeting the specific areas where simple scan-line-based methods might fail. Our research contributes to the field of voxelization by offering a robust and efficient approach that overcomes the limitations of existing methods: the Gap Detection technique fills a critical gap in the voxelization process. By addressing these gaps, our algorithm enhances the visual quality and accuracy of voxelized models, making it valuable for a wide range of applications. In conclusion, "Closing the Gap: Efficient Voxelization with Equidistant Scan-lines and Gap Detection" presents an effective solution to the challenges of voxelization. Our research combines computational efficiency, accuracy, and innovative techniques to elevate the quality of voxelized surfaces. With its adaptable nature and valuable innovations, this technique could have a positive influence on computer graphics and visualization.
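
The paper's implementation is a GLSL compute shader, which the abstract does not reproduce. The following CPU-side Python sketch only illustrates the core idea of sweeping a triangle's interior with parallel scan-lines and emitting each voxel exactly once (here deduplicated with a set); the exact line placement and the Gap Detection step are simplified away.

```python
import numpy as np

def voxelize_triangle(v0, v1, v2, voxel_size):
    """Toy voxelizer: sweep the triangle with parallel scan-lines and emit
    each covered voxel once (deduplicated by a set). The spacing is chosen
    below the voxel size so neighbouring lines cannot skip a voxel row;
    the paper's Gap Detection handles the configurations where plain
    scan-line sampling still leaves holes."""
    v0, v1, v2 = map(np.asarray, (v0, v1, v2))
    step = 0.5 * voxel_size                      # conservative spacing
    voxels = set()
    # Scan-lines run parallel to edge v0->v1, sliding from that edge to v2.
    n_lines = max(2, int(np.ceil(np.linalg.norm(v2 - 0.5*(v0+v1)) / step)) + 1)
    for i in range(n_lines):
        t = i / (n_lines - 1)
        a = v0 + t * (v2 - v0)                   # line start
        b = v1 + t * (v2 - v1)                   # line end
        n_samples = max(2, int(np.ceil(np.linalg.norm(b - a) / step)) + 1)
        for j in range(n_samples):
            p = a + (j / (n_samples - 1)) * (b - a)
            voxels.add(tuple(np.floor(p / voxel_size).astype(int)))
    return voxels

vox = voxelize_triangle([0, 0, 0], [1, 0, 0.2], [0.3, 1, 0.5], voxel_size=0.1)
print(len(vox), "voxels")
```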

Keywords: voxelization, GPU acceleration, computer graphics, compute shaders

Procedia PDF Downloads 72
39 Bacterial Exposure and Microbial Activity in Dental Clinics during Cleaning Procedures

Authors: Atin Adhikari, Sushma Kurella, Pratik Banerjee, Nabanita Mukherjee, Yamini M. Chandana Gollapudi, Bushra Shah

Abstract:

Different sharp instruments, drilling machines, and high-speed rotary instruments are routinely used in dental clinics during dental cleaning. These cleaning procedures therefore release large numbers of oral microorganisms, including bacteria, into clinic air and may pose significant occupational bioaerosol exposure risks for dentists, dental hygienists, patients, and dental clinic employees. The two major goals of this study were to quantify volumetric airborne concentrations of bacteria and to assess overall microbial activity in this type of occupational environment. The study was conducted in several dental clinics of southern Georgia, and 15 dental cleaning procedures were targeted for sampling of airborne bacteria and testing of overall microbial activity in dust settled on clinic floors. For air sampling, a BioStage viable cascade impactor was utilized, comprising an inlet cone, a precision-drilled 400-hole impactor stage, and a base that holds an agar plate (Tryptic soy agar). A high-flow Quick-Take-30 pump connected to this impactor pulls airborne microorganisms at a flow rate of 28.3 L/min through the holes (jets), where they are collected on the agar surface for approximately five minutes. After sampling, the agar plates containing the samples were placed in an ice chest with blue ice, and the plates were incubated at 30±2°C for 24 to 72 h. Colonies were counted and converted to airborne concentrations (CFU/m³) after applying positive-hole correction. The most abundant bacterial colonies (selected by visual screening) were identified by PCR amplicon sequencing of 16S rRNA genes. For understanding overall microbial activity on clinic floors and estimating the general cleanliness of the clinic surfaces during and after dental cleaning procedures, ATP levels were determined in swabbed dust samples collected from 10 cm² floor surfaces. The concentration of ATP may indicate both the cell viability and the metabolic status of settled microorganisms in this situation. An ATP measuring kit was used, which utilized the standard luciferin-luciferase fluorescence reaction, and a luminometer, which quantified ATP levels as relative light units (RLU). Three air and dust samples were collected during each cleaning procedure: at the beginning, during cleaning, and immediately after the procedure was completed (n = 45). Concentrations at the beginning, during, and after dental cleaning procedures were 671±525, 917±1203, and 899±823 CFU/m³, respectively, for airborne bacteria, and 91±101, 243±129, and 139±77 RLU/sample, respectively, for ATP levels. The concentrations of bacteria were significantly higher than in typical indoor residential environments. Although an increasing trend in airborne bacteria was observed during cleaning, the data collected at the three time points were not significantly different (ANOVA: p = 0.38), probably due to the high standard deviations of the data. The ATP levels, however, demonstrated a significant difference (ANOVA: p < 0.05), indicating a significant change in microbial activity on floor surfaces during dental cleaning. The most common bacterial genera identified were, in order of frequency of occurrence: Neisseria sp., Streptococcus sp., Chryseobacterium sp., Paenisporosarcina sp., and Vibrio sp. The study concluded that bacterial exposure in dental clinics could be a notable occupational biohazard and that appropriate respiratory protection for the employees is urgently needed.
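
For readers unfamiliar with the two conversions named above, here is a minimal sketch assuming the standard positive-hole correction for a 400-hole impactor and the stated sampling parameters (28.3 L/min for about five minutes); the colony count in the example is illustrative, not the study's data.

```python
def positive_hole_correction(colonies, holes=400):
    """Estimate the number of impacted particles from a colony count.
    With r occupied holes out of N, the expected particle count is
    N * (1/N + 1/(N-1) + ... + 1/(N-r+1))."""
    return holes * sum(1.0 / (holes - k) for k in range(colonies))

def cfu_per_m3(colonies, minutes=5.0, flow_l_per_min=28.3, holes=400):
    corrected = positive_hole_correction(colonies, holes)
    sampled_m3 = flow_l_per_min * minutes / 1000.0   # litres -> m^3
    return corrected / sampled_m3

# e.g. 90 colonies counted on the plate after a 5-minute sample:
print(round(cfu_per_m3(90)))  # corrected concentration in CFU/m^3
```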

Keywords: bioaerosols, hospital hygiene, indoor air quality, occupational biohazards

Procedia PDF Downloads 311
38 Rectus Sheath Block to Extend the Effectiveness of Post Operative Epidural Analgesia

Authors: Sugam Kale, Arif Uzair Bin Mohammed Roslan, Cindy Lee, Syed Beevee Mohammed Ismail

Abstract:

Preemptive analgesia is an established concept in the modern practice of anaesthesia. To be most effective, it is best instituted earlier than the surgical stimulus and should last beyond the offset of surgically induced pain, till healing is complete. Whereas starting afferent pain blockade with regional anaesthesia is common, its effect often falls short of covering the entire period during which pain impulses make their way to the CNS in the post-operative period. We used a combination of two regional anaesthetic techniques applied sequentially to overcome this handicap. Madam S., a 56-year-old lady, was scheduled for elective surgery for pancreatic cancer. She underwent laparotomy and distal pancreatectomy, splenectomy, bilateral salpingo-oophorectomy, and sigmoid colectomy. The surgery was expected to be extensive, and it was presumed that standard pain relief with opiate PCA and oral analgesics would not be adequate. After pre-operative counselling of the patient about the regional anaesthesia techniques, including epidural catheterization and rectus sheath catheter placement, their benefits, and their potential complications, informed consent was obtained. The epidural catheter was placed awake, and general anaesthesia was then induced. Epidural infusion of local anaesthetics was started prior to the surgical incision and was continued till 60 hours into the postoperative period. Before skin closure, the surgeons inserted commercially available rectus sheath catheters bilaterally along the midline incision used for the laparotomy. At 46 hours post-op, local anaesthetic infusion via these catheters was started as a bridge while the epidural infusion rate was tapered off. The epidural catheter was removed at 75 hours. Elastomeric pumps were used to provide the local anaesthetic infusion, with the ability to vary infusion rates. The acute pain service followed up the patient's vital signs and the effectiveness of pain relief twice daily, or more frequently as required. The rectus sheath catheters were removed 137 hours post-op. The patient had good post-op analgesia with minimal additional analgesic requirements. For the most part, the visual analog score (VAS) for pain remained at 1-3 on a scale of 1 to 10. Haemodynamics remained stable, and surgical recovery was as expected. A minimal opiate requirement after an extensive laparotomy also translates into an early return of intestinal motility. Our experience was encouraging, and we hope to extend this combination of two regional anaesthetic techniques to patients undergoing similar surgeries. Epidural analgesia is denser and offers excellent pain relief for both visceral and somatic pain in the first few days after surgery. As the pain intensity grows weaker, a rectus sheath block and oral analgesics provide almost the same degree of pain relief after the epidural catheter is removed. We found that the background infusion of local anaesthetic down the rectus sheath catheter largely reduced the requirement for other classes of analgesics. We aim to study this further with a larger patient cohort and hope that it may become an established clinical practice that benefits patients everywhere.

Keywords: rectus sheath, epidural infusion, post operative analgesia, elastomeric

Procedia PDF Downloads 134
37 Optical Imaging Based Detection of Solder Paste in Printed Circuit Board Jet-Printing Inspection

Authors: D. Heinemann, S. Schramm, S. Knabner, D. Baumgarten

Abstract:

Purpose: Applying solder paste to printed circuit boards (PCB) with stencils has been the method of choice over the past years. A newer method uses a jet printer to deposit tiny droplets of solder paste through an ejector mechanism onto the board. This allows for more flexible PCB layouts with smaller components. Due to the viscosity of the solder paste, air blisters can be trapped in the cartridge, which can lead to missing solder joints or deviations in the applied solder volume. Therefore, a built-in, real-time inspection of the printing process is needed to minimize uncertainties and increase the efficiency of the process by allowing immediate correction. The objective of the current study is the design of an optimal imaging system and the development of an automatic algorithm for the detection of applied solder joints from the captured optical images. Methods: In a first approach, a camera module connected to a microcomputer and LED strips were employed to capture images of the printed circuit board under four different illuminations (white, red, green, and blue). Subsequently, an improved system including a ring light, an objective lens, and a monochromatic camera was set up to acquire higher-quality images. The obtained images can be divided into three main components: the PCB itself (i.e., the background), the reflections induced by unsoldered positions or screw holes, and the solder joints. Non-uniform illumination is corrected by estimating the background using a morphological opening and subtracting it from the input image. Image sharpening is applied in order to prevent error pixels in the subsequent segmentation. The intensity thresholds which divide the main components are obtained from the multimodal histogram using three probability density functions; determining their intersections delivers proper thresholds for the segmentation. Remaining edge gradients produce small error areas, which are removed by another morphological opening. For quantitative analysis of the segmentation results, the Dice coefficient is used. Results: The obtained PCB images show a significant gradient in all RGB channels, resulting from ambient light. Using the different lightings and color channels, 12 images of a single PCB are available. A visual inspection and the investigation of 27 specific points show that the best differentiation between those points is achieved using red lighting and the green color channel. Estimating two thresholds from the multimodal histogram of the corrected images and using them for segmentation precisely extracts the solder joints. The comparison of the results to manually segmented images yields high sensitivity and specificity values. The overall result delivers a Dice coefficient of 0.89, which varies for single-object segmentations between 0.96 for well-segmented solder joints and 0.25 for single negative outliers. Conclusion: Our results demonstrate that the presented optical imaging system and the developed algorithm can robustly detect solder joints on printed circuit boards. Future work will comprise a modified lighting system which allows for more precise segmentation results using structure analysis.
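
A minimal sketch of the described processing chain, using OpenCV and scikit-image; the kernel sizes and input file are assumptions, and multi-Otsu thresholding stands in for the paper's fit of three probability density functions to the histogram.

```python
import cv2
import numpy as np
from skimage.filters import threshold_multiotsu

img = cv2.imread("pcb_green_channel.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

# 1) Illumination correction: estimate the slowly varying background with a
#    large morphological opening, then subtract it from the input.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (51, 51))
background = cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)
corrected = cv2.subtract(img, background)

# 2) Sharpening to avoid stray error pixels at region borders.
blur = cv2.GaussianBlur(corrected, (0, 0), sigmaX=3)
sharp = cv2.addWeighted(corrected, 1.5, blur, -0.5, 0)

# 3) Two thresholds separating background / reflections / solder joints.
t_low, t_high = threshold_multiotsu(sharp, classes=3)
solder_mask = (sharp > t_high).astype(np.uint8) * 255

# 4) Remove small error areas left by residual edge gradients.
clean = cv2.morphologyEx(solder_mask, cv2.MORPH_OPEN,
                         cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))

# Dice coefficient against a manually segmented reference mask.
def dice(a, b):
    inter = np.logical_and(a > 0, b > 0).sum()
    return 2.0 * inter / ((a > 0).sum() + (b > 0).sum())
```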

Keywords: printed circuit board jet-printing, inspection, segmentation, solder paste detection

Procedia PDF Downloads 336
36 Forming Form, Motivation and Their Biolinguistic Hypothesis: The Case of Consonant Iconicity in Tashelhiyt Amazigh and English

Authors: Noury Bakrim

Abstract:

When dealing with motivation/arbitrariness, the forming form (Forma Formans) and morphodynamics are to be grasped as relevant implications of enunciation/enactment and schematization within the specificity of language as sound/meaning articulation. Thus, the fact that a language is a form does not contradict stasis/dynamic enunciation (reflexivity vs. double articulation). Moreover, some languages exemplify the role of the forming form, uttering, and schematization (roots in Semitic languages, the Chinese case). Beyond the evolutionary biosemiotic process (form/substance bifurcation, the split between realization/representation), the non-isomorphism/asymmetry between linguistic form/norm and linguistic realization (phonetics, for instance) opens up a new horizon, problematizing the role of the brain and the sensorimotor contribution in the continuous forming form. We therefore hypothesize biotization as both the process and the trace co-constructing motivation and the forming form. Henceforth, referring to our findings concerning distribution and motivation patterns within Berber written texts (pulse-based obstruents and nasal-lateral levels in poetry) and oral storytelling (consonant intensity clustering in quantitative and semantic/prosodic motivation), we understand consonant clustering, motivation, and schematization as a complex phenomenon partaking in patterns of oral/written iconic prosody and reflexive metalinguistic representation opening the stable form. We focus our inquiry on Amazigh and English clusters (/spl/, /spr/) and on iconic consonant iteration in [gnunnuy] (to roll/tumble) and [smummuy] (to moan sadly or crankily). For instance, the syllabic structures of /splaeʃ/ and /splaet/ imply an anamorphic representation of a state of the world: splash, impact on aquatic surfaces, versus splat, impact on the ground. The pair has stridency and distribution as distinctive features specifying its phonetic realization (and part of its meaning): /ʃ/ is [+strident] and /t/ is [+distributed] on the vocal tract. Schematization is then a process relating physiology and code as an arthron, a vocal/bodily and vocal/practical shaping of the motor-articulatory system, leading to syntactic/semantic thematization (agent/patient roles in /spl/, /sm/ and other clusters, or the tense uvular /qq/ in initial position in Berber). Furthermore, the productivity of serial syllable sequencing in Berber points out different forms of expressivity. We postulate two components of motivated formalization: i) the process of memory paradigmatization, relating to sequence modeling under sensorimotor/verbal specific categories (production/perception); ii) the process of phonotactic selection, a prosodic unconscious/subconscious distribution by virtue of iconicity. Based on multiple tests, including a questionnaire, phonotactic/visual recognition, and oral/written reproduction, we aim at patterning and conceptualizing consonant schematization and motivation among EFL and Amazigh (Berber) learners and speakers, integrating biolinguistic hypotheses.

Keywords: consonant motivation and prosody, language and order of life, anamorphic representation, represented representation, biotization, sensori-motor and brain representation, form, formalization and schematization

Procedia PDF Downloads 143
35 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study

Authors: Cecile Laval, Harriet Lowe

Abstract:

Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model, which establishes the initial innate strategies used by second language learners to connect the form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learners' cognitive strategies. The experiment was designed using a TOBII Pro-TX300 eye-tracker to measure participants' default strategies when processing French linguistic input and any cognitive changes after receiving the Processing Instruction treatment. Participants were drawn from lower-intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to the instructional treatment. One group received the full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, plus structured input activities) for the primary target linguistic feature (the French past tense imperfective aspect). The second group received the same Processing Instruction treatment except for the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (the French past tense imperfective aspect, the French subjunctive used for the expression of doubt, and the French causative construction with faire) were administered with the eye-tracker. The eye-tracking data showed a positive change in learners' processing of the French target features after instruction, with improvement in the interpretation of all three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature (French past tense imperfective aspect) after treatment. 62.5% of participants made an improvement in the secondary target item (French subjunctive used for the expression of doubt), and 37.5% of participants made an improvement in the cumulative target feature (French causative construction with faire). Statistically, there was no significant difference between the pre-test and post-test scores for the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogeneously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity to study the unconscious processing decisions made during moment-by-moment comprehension. The visual data from the eye-tracking demonstrate changes in participants' processing strategies: gaze plots from the pre- and post-tests display participants' fixation points shifting from content words to the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences for both the primary and secondary target features. This paper will present the research methodology, design, and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of grammar in a second language.
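
The abstract does not detail its gaze analysis; as a generic illustration of how fixation data can quantify the reported shift from content words to verb endings, here is a simple area-of-interest (AOI) dwell-time sketch. The AOI rectangles and fixation tuples are invented.

```python
# Each fixation: (x, y, duration_ms); each AOI: (x0, y0, x1, y1) screen rect.
def dwell_proportions(fixations, aois):
    """Proportion of total fixation time spent in each area of interest."""
    totals = {name: 0.0 for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
    grand = sum(d for _, _, d in fixations)
    return {name: t / grand for name, t in totals.items()}

aois = {"content_word": (100, 300, 400, 360),   # hypothetical screen regions
        "verb_ending": (520, 300, 600, 360)}
pre = [(150, 320, 240), (210, 330, 310), (540, 315, 90)]
post = [(140, 325, 120), (545, 318, 280), (570, 322, 260)]
print("pre: ", dwell_proportions(pre, aois))
print("post:", dwell_proportions(post, aois))  # shift toward the verb ending
```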

Keywords: eye-tracking, language teaching, processing instruction, second language acquisition

Procedia PDF Downloads 279
34 Stroke Prevention in Patients with Atrial Fibrillation and Co-Morbid Physical and Mental Health Problems

Authors: Dina Farran, Mark Ashworth, Fiona Gaughran

Abstract:

Atrial fibrillation (AF), the most prevalent cardiac arrhythmia, is associated with an increased risk of stroke and contributes to heart failure and death. In this project, we aim to improve patient safety by screening for stroke risk among people with AF and co-morbid mental illness. To do so, we started by conducting a systematic review and meta-analysis of the prevalence, management, and outcomes of AF in people with Serious Mental Illness (SMI) versus the general population. We then evaluated oral anticoagulation (OAC) prescription trends in people with AF and co-morbid SMI at King's College Hospital. We also evaluated the association between mental illness severity and OAC prescription in eligible patients at the South London and Maudsley (SLaM) NHS Foundation Trust. Next, we implemented an electronic clinical decision support system (eCDSS), consisting of a visual prompt on patients' electronic Personal Health Records, to screen for AF-related stroke risk in three Mental Health of Older Adults wards at SLaM. Finally, we assessed the feasibility and acceptability of the eCDSS by qualitatively investigating clinicians' perspectives on its potential usefulness (pre-intervention) and, post-intervention, their experiences and views regarding its impact on clinicians and patients. The systematic review showed that people with SMI had low reported rates of AF. AF patients with SMI were less likely to receive OAC than the general population. When receiving warfarin, people with SMI, particularly bipolar disorder, experienced poor anticoagulation control compared to the general population. The meta-analysis showed that SMI was not significantly associated with an increased risk of stroke or major bleeding when adjusting for underlying risk factors. The main finding of the first observational study was that, among AF patients with a high stroke risk, those with co-morbid SMI were less likely than those without SMI to be prescribed any OAC, particularly warfarin; after 2019, there was no significant difference between the two groups. In the second observational study, patients with AF and co-morbid SMI were less likely to be prescribed any OAC compared to those with dementia, substance use disorders, or common mental disorders, adjusting for age, sex, stroke, and bleeding risk scores. Among AF patients with co-morbid SMI, warfarin was less likely to be prescribed to those with alcohol or substance dependency, serious self-injury, hallucinations or delusions, or impairment in activities of daily living. In the intervention, clinicians were asked to confirm the presence of AF, clinically assess stroke and bleeding risks, record the risk scores in clinical notes, and refer patients at high risk of stroke to OAC clinics. Clinicians reported many potential benefits of the eCDSS, including improved clinical effectiveness, better identification of patients at risk, safer and more comprehensive care, consistency in decision making, and time savings. Identified potential risks included rigidity in decision-making, overreliance, reduced critical thinking, false positive recommendations, annoyance, and increased workload. This study presents a unique opportunity to identify, using electronic health records, AF patients with mental illness who are at high risk of severe outcomes. This has the potential to improve health outcomes and, therefore, patients' quality of life.
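
The abstract does not name the risk scores used; assuming the standard CHA2DS2-VASc score for AF-related stroke risk, a scoring function might look like the sketch below. This is illustrative only, not the trust's eCDSS logic.

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc stroke-risk score for atrial fibrillation.
    A common rule of thumb: OAC is considered at >=2 (men) / >=3 (women)."""
    score = 0
    score += 2 if age >= 75 else (1 if age >= 65 else 0)
    score += 1 if female else 0
    score += 1 if chf else 0                # congestive heart failure
    score += 1 if hypertension else 0
    score += 1 if diabetes else 0
    score += 2 if stroke_or_tia else 0      # prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0
    return score

# e.g. a 72-year-old woman with hypertension and a prior TIA:
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=False, stroke_or_tia=True,
                   vascular_disease=False))  # -> 5
```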

Keywords: atrial fibrillation, stroke, mental health conditions, electronic clinical decision support systems

Procedia PDF Downloads 49
33 Green Building Risks: Limits on Environmental and Health Quality Metrics for Contractors

Authors: Erica Cochran Hameen, Bobuchi Ken-Opurum, Mounica Guturu

Abstract:

The United States (U.S.) populace spends the majority of its time indoors, in spaces where building codes and voluntary sustainability standards provide clear Indoor Environmental Quality (IEQ) metrics. The existing sustainable building standards and codes aim to improve IEQ and the health of occupants and to reduce the negative impacts of buildings on the environment. While they address the post-occupancy stage of buildings, there are fewer standards for the pre-occupancy stage, leaving a large labor population in environments that are much less regulated. Construction personnel are often exposed to a variety of uncomfortable and unhealthy elements on construction sites, primarily thermal, visual, acoustic, and air-quality related. Construction site power generators, equipment, and machinery generate noise on average 9 decibels (dBA) above U.S. OSHA regulations, creating uncomfortable noise levels. Research has shown that frequent exposure to high noise levels leads to chronic physiological issues and increases noise-induced stress, yet beyond OSHA no other metric focuses directly on the impacts of noise on contractors' well-being. Research has also associated natural light with higher productivity and attention span, and lower rates of fatigue, in construction workers. However, daylight is not always available, as construction workers often perform tasks in cramped spaces, dark areas, or at nighttime. In these instances the use of artificial light is necessary, yet lighting standards for lengthy tasks and arduous activities are not specified. Additionally, the ambient air, contaminants, and material off-gassing found at construction sites are among the causes of serious health effects in construction workers. Coupled with extreme hot and cold temperatures in different climate zones, health and productivity can be seriously compromised. This research evaluates the impact of existing green building metrics on construction and risk management by analyzing two codes and nine standards, including LEED, WELL, and BREEAM. These metrics were chosen based on their relevance to the U.S. construction industry. The research determined that less than 20% of the sustainability content within the standards and codes (texts) relates to the pre-occupancy building sector. The research also investigated the impact of construction personnel's health and well-being on construction management through two surveys: of project managers and of on-site contractors' perceptions of the effect of their work environment on productivity. To fully understand the risks of limited Environmental and Health Quality (EHQ) metrics for contractors, this research evaluated the connection between EHQ factors, such as inefficient lighting, and construction workers, and investigated the correlation between various on-site coping strategies for comfort and productivity. Outcomes from this research are three-pronged. The first is fostering a discussion about the existing conditions of EHQ elements, i.e., thermal, lighting, ergonomic, acoustic, and air quality, for the construction labor force. The second identifies gaps in sustainability standards and codes during the pre-occupancy stage of building construction, from ground-breaking to substantial completion. The third identifies opportunities for improvements and mitigation strategies to improve EHQ, such as increased monitoring of effects on the productivity and health of contractors and increased inclusion of the pre-occupancy stage in green building standards.

Keywords: construction contractors, health and well-being, environmental quality, risk management

Procedia PDF Downloads 132
32 Sentinel-2 Based Burn Area Severity Assessment Tool in Google Earth Engine

Authors: D. Madhushanka, Y. Liu, H. C. Fernando

Abstract:

Fires are one of the foremost factors of land surface disturbance in diverse ecosystems, causing soil erosion, land-cover changes, and atmospheric effects that affect people's lives and properties. Generally, fire severity is calculated using the Normalized Burn Ratio (NBR) index. This is traditionally performed manually by comparing two images, one obtained before and one after the fire. The dNBR is then calculated as the bitemporal difference of the preprocessed satellite images, and the area is classified as either unburnt (dNBR < 0.1) or burnt (dNBR >= 0.1). Furthermore, Wildfire Severity Assessment (WSA) classifies burnt and unburnt areas using the classification levels proposed by the USGS, comprising seven classes. This procedure generates a burn severity report for an area chosen manually by the user. This study was carried out with the objective of producing an automated tool for the above-mentioned process, namely the World Wildfire Severity Assessment Tool (WWSAT). It is implemented in Google Earth Engine (GEE), a free cloud-computing platform for satellite data processing, with several data catalogs at different resolutions (notably Landsat, Sentinel-2, and MODIS) and planetary-scale analysis capabilities. Sentinel-2 MSI was chosen to support regular burnt-area severity mapping using a medium-spatial-resolution sensor (15 m). The tool uses machine learning classification techniques to identify burnt areas using the NBR and to classify their severity over a user-selected extent and period automatically. Cloud coverage is one of the biggest concerns in fire severity mapping. In WWSAT, based on GEE, we present a fully automatic workflow to aggregate cloud-free Sentinel-2 images for both pre-fire and post-fire image compositing. The parallel processing capabilities and preloaded geospatial datasets of GEE facilitated the production of this tool, which includes a Graphical User Interface (GUI) to make it user-friendly. The advantage of this tool is the ability to assess burn area severity over large extents and extended temporal periods. Two case studies were carried out to demonstrate the performance of the tool. The Blue Mountains National Park forest, affected by the Australian fire season between 2019 and 2020, is used to describe the workflow of WWSAT. At this site, more than 7809 km² of burnt area was detected using Sentinel-2 data, with an error below 6.5% compared with the area detected in the field. Furthermore, 86.77% of the detected area was recognized as fully burnt out, comprising high severity (17.29%), moderate-high severity (19.63%), moderate-low severity (22.35%), and low severity (27.51%). The Arapaho and Roosevelt National Forests, Colorado, USA, affected by the Cameron Peak fire in 2020, were chosen for the second case study. It was found that around 983 km² had burnt out, comprising high severity (2.73%), moderate-high severity (1.57%), moderate-low severity (1.18%), and low severity (5.45%). These spots can also be detected through visual inspection, made possible by the cloud-free images generated by WWSAT. This tool is cost-effective for calculating burnt area, since the satellite images are free and the cost of field surveys is avoided.
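
As an illustration of the underlying computation, the following Google Earth Engine Python sketch composites cloud-filtered pre- and post-fire Sentinel-2 images, derives dNBR, and applies the commonly cited USGS severity thresholds. The region, dates, and cloud filter are placeholders; WWSAT's own compositing, GUI, and classification details are not reproduced here.

```python
import ee
ee.Initialize()

# Placeholder region and fire window; the tool lets users pick these in a GUI.
region = ee.Geometry.Rectangle([150.0, -34.0, 150.8, -33.2])
pre = (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
         .filterBounds(region).filterDate("2019-09-01", "2019-10-15")
         .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20)).median())
post = (ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
          .filterBounds(region).filterDate("2020-02-01", "2020-03-15")
          .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20)).median())

# NBR = (NIR - SWIR) / (NIR + SWIR); for Sentinel-2: B8 (NIR), B12 (SWIR).
nbr_pre = pre.normalizedDifference(["B8", "B12"])
nbr_post = post.normalizedDifference(["B8", "B12"])
dnbr = nbr_pre.subtract(nbr_post).rename("dNBR")   # positive = burnt

# Commonly cited USGS dNBR severity thresholds (unscaled values).
severity = (dnbr.where(dnbr.lt(0.1), 0)      # unburnt
                .where(dnbr.gte(0.1), 1)     # low severity
                .where(dnbr.gte(0.27), 2)    # moderate-low severity
                .where(dnbr.gte(0.44), 3)    # moderate-high severity
                .where(dnbr.gte(0.66), 4))   # high severity
```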

Keywords: burnt area, burnt severity, fires, google earth engine (GEE), sentinel-2

Procedia PDF Downloads 235
31 Celebrity Culture and Social Role of Celebrities in Türkiye during the 1990s: The Case of Türkiye, Newspaper, Radio, Televison (TGRT) Channel

Authors: Yelda Yenel, Orkut Acele

Abstract:

In a media-saturated world, celebrities have become ubiquitous figures, encountered both in public spaces and within the privacy of our homes, seamlessly integrating into daily life. From Alexander the Great to contemporary media personalities, the image of the celebrity has persisted throughout history, manifesting in various forms and contexts. Over time, as the relationship between society and the market evolved, so too did the roles and behaviors of celebrities. These transformations offer insights into the cultural climate, revealing shifts in habits and worldviews. In Türkiye, the emergence of private television channels brought an influx of celebrities into everyday life, making them a pervasive part of daily routines. To understand modern celebrity culture, it is essential to examine the ideological functions of media within political, economic, and social contexts. Within this framework, celebrities serve as both reflections and creators of cultural values and, at times, act as intermediaries, offering insights into the society of their era. Starting its broadcasting life in 1992 with religious films and religious talk programmes, the Türkiye Newspaper, Radio, Television (TGRT) channel later changed its appearance, slogan, and the celebrities it featured in response to the political atmosphere. Celebrities played a critical role in the transformation from the existing slogan 'Peace has come to the screen' to 'Watch and see what will happen'. Celebrities hold significant roles in society, and their images are produced and circulated by various actors, including media organizations and public relations teams. Understanding these dynamics is crucial for analyzing their influence and impact. This study aims to explore Turkish society in the 1990s, focusing on TGRT and its visual and discursive characteristics regarding celebrity figures such as Seda Sayan. The first section examines the historical development of celebrity culture and its transformations, guided by the conceptual framework of celebrity studies. The complex and interconnected image of the celebrity, as introduced by post-structuralist approaches, plays a fundamental role in making sense of the existing relationships. This section traces the existence and functions of celebrities from antiquity to the present day. The second section explores the economic, social, and cultural contexts of 1990s Türkiye, focusing on the media landscape and the visibility that became prominent in the neoliberal era following the 1980s. This section also discusses the political factors underlying TGRT's transformation, such as the 1997 military memorandum. The third section analyzes TGRT as a case study, focusing on its significance as an Islamic television channel and the shifts in its public image, categorized into two distinct periods. The channel's programming, which aligned with Islamic teachings, and the celebrities who featured prominently during these periods became the public face of both TGRT and the broader society. In particular, the transition to a more 'secular' format during TGRT's second phase is analyzed, focusing on changes in celebrity attire and program formats. This study reveals that celebrities are used as indicators of ideology, benefiting from this instrumentalization by enhancing their own fame, and that they reflect the prevailing cultural hegemony in society.

Keywords: celebrity culture, media, neoliberalism, TGRT

Procedia PDF Downloads 30
30 Emotional State and Cognitive Workload during a Flight Simulation: Heart Rate Study

Authors: Damien Mouratille, Antonio R. Hidalgo-Muñoz, Nadine Matton, Yves Rouillard, Mickael Causse, Radouane El Yagoubi

Abstract:

Background: The monitoring of physiological activity related to mental workload (MW) in pilots will be useful for improving aviation safety by anticipating degradation in human performance. The electrocardiogram (ECG) can reveal MW fluctuations due to cognitive workload and/or emotional state, since this measure exhibits autonomic nervous system modulations. Arguably, heart rate (HR) is one of its most intuitive and reliable parameters. It is particularly interesting to analyze the interaction between cognitive requirements and emotion in ecological settings such as a flight simulator. This study aims to explore, by means of HR, the relation between cognitive demands and emotional activation. Presumably, the effects of cognitive and emotional overload are not necessarily cumulative. Methodology: Eight healthy volunteers holding a Private Pilot License were recruited (male; 20.8±3.2 years). The ECG signal was recorded throughout the experiment by placing two electrodes on the clavicle and left pectoral of the participants. HR was computed within 4-minute segments. NASA-TLX and Big Five inventories were used to assess subjective workload and to consider the influence of individual personality differences. The experiment consisted of completing two dual-tasks of approximately 30 minutes' duration in an AL50 flight simulator. Each dual-task required the simultaneous accomplishment of a pre-established flight plan and an additional task based on target stimulus discrimination inserted between Air Traffic Control instructions. This secondary task allowed us to vary the cognitive workload from low (LC) to high (HC) levels by combining auditory and visual numerical stimuli to be responded to according to specific criteria. Regarding the emotional condition, the two dual-tasks were designed to ensure analogous difficulty in terms of the cognitive demands solicited. The former was performed by the pilot alone, i.e., the low arousal (LA) condition. In contrast, the latter generated high arousal (HA), since the pilot was supervised by two evaluators, filmed, and involved in a mock competition with the rest of the participants. Results: Performance on the secondary task showed significantly faster reaction times (RT) for the HA compared to the LA condition (p=.003). Moreover, faster RT were found for LC compared to HC (p < .001). No interaction was found. Concerning the HR measure, despite the lack of main effects, an interaction between emotion and cognition was evidenced (p=.028). Post hoc analysis showed a smaller HR for HA compared to LA only under LC (p=.049). Conclusion: The control of an aircraft is a very complex task involving strong cognitive demands, and it depends on the emotional state of pilots. According to the behavioral data, the experimental setup satisfactorily generated different emotional and cognitive levels. As suggested by the interaction found in the HR measure, these two factors do not seem to have a cumulative impact on the sympathetic nervous system. Apparently, low cognitive workload makes pilots more sensitive to emotional variations. These results hint at the independence of data processing and emotional regulation. Further physiological data are necessary to confirm and disentangle this relation. This procedure may be useful for objectively monitoring pilots' mental workload.
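
The abstract does not describe its HR computation; a minimal sketch of deriving mean HR per 4-minute segment from detected R-peaks is shown below. The peak-detection rule is deliberately crude and the signal synthetic; real ECG requires filtering and a robust detector such as Pan-Tompkins.

```python
import numpy as np
from scipy.signal import find_peaks

def mean_hr_per_segment(ecg, fs, segment_s=240):
    """Mean heart rate (bpm) within consecutive 4-minute ECG segments."""
    peaks, _ = find_peaks(ecg,
                          height=np.percentile(ecg, 95),  # crude R-peak threshold
                          distance=int(0.3 * fs))         # refractory ~300 ms
    t_peaks = peaks / fs
    rates = []
    for start in np.arange(0, t_peaks[-1], segment_s):
        rr = np.diff(t_peaks[(t_peaks >= start) & (t_peaks < start + segment_s)])
        if len(rr):
            rates.append(60.0 / rr.mean())   # bpm from the mean RR interval
    return rates

# e.g. a synthetic 8-minute "ECG" with one spike per second (~60 bpm):
fs = 250
ecg = np.zeros(8 * 60 * fs)
ecg[::fs] = 1.0
print(mean_hr_per_segment(ecg, fs))  # roughly [60.0, 60.0]
```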

Keywords: cognitive demands, emotion, flight simulator, heart rate, mental workload

Procedia PDF Downloads 275
29 Vertebral Artery Dissection Complicating Pregnancy and Puerperium: Case Report and Review of the Literature

Authors: N. Reza Pour, S. Chuah, T. Vo

Abstract:

Background: Vertebral artery dissection (VAD) is a rare complication of pregnancy. It can occur spontaneously or following a traumatic event. The pathogenesis is unclear. Predisposing factors include chronic hypertension, Marfan's syndrome, fibromuscular dysplasia, vasculitis and cystic medial necrosis. Physiological changes of pregnancy have also been proposed as potential mechanisms of injury to the vessel wall. The clinical presentation varies, and it can present as a headache, neck pain, diplopia, a transient ischaemic attack, or an ischaemic stroke. Isolated cases of VAD in pregnancy and the puerperium have been reported in the literature. One case had a posterior circulation stroke as a result of bilateral VAD, and labour was induced at 37 weeks gestation for preeclampsia. Another patient at 38 weeks had severe neck pain that persisted after induction for elevated blood pressure, and arteriography showed right VAD postpartum. A single case of lethal VAD in pregnancy with subsequent massive subarachnoid haemorrhage has been reported, confirmed at autopsy. Case Presentation: We report two cases of vertebral artery dissection in pregnancy. The first patient was a 32-year-old primigravida who presented in the 38th week of pregnancy with the onset of early labour and a blood pressure (BP) of 130/70 on arrival. After 2 hours, the patient developed a severe headache with blurry vision, and BP was 238/120. Despite treatment with an intravenous antihypertensive, she had an eclamptic fit. Magnesium sulfate was started and an emergency Caesarean section was performed under general anaesthesia. On the second day after the operation, she developed left-sided neck pain. Magnetic resonance imaging (MRI) angiography confirmed a short-segment left vertebral artery dissection at the level of C3. The patient was treated with aspirin and remained stable without any neurological deficit. The second patient was a 33-year-old primigravida who was admitted to the hospital at 36 weeks gestation with a BP of 155/105, constant headache and visual disturbances. She was medicated with an oral antihypertensive agent. On day 4, she complained of right-sided neck pain. An MRI angiogram revealed a short-segment dissection of the right vertebral artery at the C2-3 level. The pregnancy was terminated on the same day by emergency Caesarean section, and anticoagulation was started subsequently. Post-operative recovery was complicated by a rectus sheath haematoma requiring evacuation. She was discharged home on aspirin without any neurological sequelae. Conclusion: Because of collateral circulation, unilateral vertebral artery dissections may go unrecognized and may be more common than suspected. The outcome for most patients is benign, reflecting the adequacy of the collateral circulation in young patients. Spontaneous VAD is usually treated with anticoagulation or antiplatelet therapy for a minimum of 3-6 months to prevent future ischaemic events, allowing the dissection to heal on its own. We had two cases of VAD in the context of hypertensive disorders of pregnancy with an acceptable outcome. A high level of vigilance is required, particularly with preeclamptic patients presenting with head or neck pain, to allow an early diagnosis. We hypothesize that early and aggressive management of vertebral artery dissection may prevent further complications.

Keywords: eclampsia, preeclampsia, pregnancy, vertebral artery dissection

Procedia PDF Downloads 278
28 Automated Facial Symmetry Assessment for Orthognathic Surgery: Utilizing 3D Contour Mapping and Hyperdimensional Computing-Based Machine Learning

Authors: Wen-Chung Chiang, Lun-Jou Lo, Hsiu-Hsia Lin

Abstract:

This study aimed to improve the evaluation of facial symmetry, which is crucial for planning and assessing outcomes in orthognathic surgery (OGS). Facial symmetry plays a key role in both aesthetic and functional aspects of OGS, making its accurate evaluation essential for optimal surgical results. To address the limitations of traditional methods, a different approach was developed, combining three-dimensional (3D) facial contour mapping with hyperdimensional (HD) computing to enhance precision and efficiency in symmetry assessments. The study was conducted at Chang Gung Memorial Hospital, where data were collected from 2018 to 2023 using 3D cone beam computed tomography (CBCT), a highly detailed imaging technique. A large and comprehensive dataset was compiled, consisting of 150 normal individuals and 2,800 patients, totaling 5,750 preoperative and postoperative facial images. These data were critical for training a machine learning model designed to analyze and quantify facial symmetry. The machine learning model was trained to process 3D contour data from the CBCT images, with HD computing employed to power the facial symmetry quantification system. This combination of technologies allowed for an objective and detailed analysis of facial features, surpassing the accuracy and reliability of traditional symmetry assessments, which often rely on subjective visual evaluations by clinicians. In addition to developing the system, the researchers conducted a retrospective review of 3D CBCT data from 300 patients who had undergone OGS. The patients’ facial images were analyzed both before and after surgery to assess the clinical utility of the proposed system. The results showed that the facial symmetry algorithm achieved an overall accuracy of 82.5%, indicating its robustness in real-world clinical applications. Postoperative analysis revealed a significant improvement in facial symmetry, with an average score increase of 51%. The mean symmetry score rose from 2.53 preoperatively to 3.89 postoperatively, demonstrating the system's effectiveness in quantifying improvements after OGS. These results underscore the system's potential for providing valuable feedback to surgeons and aiding in the refinement of surgical techniques. The study also led to the development of a web-based system that automates facial symmetry assessment. This system integrates HD computing and 3D contour mapping into a user-friendly platform that allows for rapid and accurate evaluations. Clinicians can easily access this system to perform detailed symmetry assessments, making it a practical tool for clinical settings. Additionally, the system facilitates better communication between clinicians and patients by providing objective, easy-to-understand symmetry scores, which can help patients visualize the expected outcomes of their surgery. In conclusion, this study introduced a valuable and highly effective approach to facial symmetry evaluation in OGS, combining 3D contour mapping, HD computing, and machine learning. The resulting system achieved high accuracy and offers a streamlined, automated solution for clinical use. The development of the web-based platform further enhances its practicality, making it a valuable tool for improving surgical outcomes and patient satisfaction in orthognathic surgery.
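
The abstract does not specify the HD encoding used, so the following is a purely illustrative sketch, under assumed design choices (bipolar hypervectors, position-level binding, cosine similarity), of how a facial contour could be encoded hyperdimensionally and a symmetry score derived by comparing one half of the face with the mirrored other half. All names, dimensions and parameters are hypothetical, not the authors' implementation:

```python
import numpy as np

D = 10_000                                   # hypervector dimensionality
rng = np.random.default_rng(42)

def rand_hv():
    return rng.choice([-1, 1], size=D)       # random bipolar hypervector

N_POS, N_LVL = 64, 16                        # contour samples, depth levels
POS = [rand_hv() for _ in range(N_POS)]      # codebook vector per position
LVL = [rand_hv() for _ in range(N_LVL)]      # codebook vector per depth level

def encode(contour):
    """Bundle position*level bindings into one hypervector for a half-face."""
    q = np.clip((np.asarray(contour) * N_LVL).astype(int), 0, N_LVL - 1)
    return np.sign(sum(POS[i] * LVL[q[i]] for i in range(N_POS)))

def symmetry_score(left, right):
    """Cosine similarity between the left half and the mirrored right half."""
    a, b = encode(left), encode(right[::-1])
    return float(a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

left = rng.random(N_POS)                     # toy normalised contour depths
print(symmetry_score(left, left[::-1]))      # mirror-identical halves -> ~1.0
print(symmetry_score(left, rng.random(N_POS)))  # unrelated contours -> ~0.0
```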

Keywords: facial symmetry, orthognathic surgery, facial contour mapping, hyperdimensional computing

Procedia PDF Downloads 26
27 Academic Achievement in Argentinean College Students: Major Findings in Psychological Assessment

Authors: F. Uriel, M. M. Fernandez Liporace

Abstract:

In the last decade, academic achievement in higher education has become a topic on the agenda in Argentina, given the high rates of adjustment problems, academic failure and dropout, and the low graduation rates in the context of massive classes and traditional teaching methods. Psychological variables, such as perceived social support, academic motivation, and learning styles and strategies, have much to offer, since their measurement by tests allows a proper diagnosis of their influence on academic achievement. Framed within a larger research programme, several studies analysed multiple samples totaling 5,135 students attending Argentinean public universities. The first goal was to identify statistically significant differences in the psychological variables -perceived social support, learning styles, learning strategies, and academic motivation- by age, gender, and degree of academic advance (freshmen versus sophomores). Thus, an inferential group-differences study for each psychological dependent variable was conducted by means of Student's t-tests, given the features of the data distribution. The second goal, examining associations between the four psychological variables on the one hand and academic achievement on the other, was addressed through correlational studies, calculating Pearson's coefficients and employing grades as the quantitative indicator of academic achievement. The positive and significant results obtained led to the formulation of different predictive models of academic achievement, which had to be tested in terms of fit and predictive power. These models took the four psychological variables mentioned above as predictors, using regression equations, examining predictors individually, in pairs, and together, analysing indirect effects as well, and adding the degree of academic advance and gender, which had shown their importance within the first goal's findings. The most relevant results were: first, gender showed no influence on any dependent variable. Second, only good achievers perceived high social support from teachers, and male students were prone to perceive less social support. Third, freshmen exhibited a pragmatic learning style, preferring unstructured environments, the use of examples, and simultaneous-visual processing in learning, whereas sophomores manifested an assimilative learning style, choosing sequential and analytic processing modes. Fourth, despite these preferences, freshmen have to deal with abstract content and sophomores with practical learning situations, owing to the study programs in force. Fifth, no differences in academic motivation were found between freshmen and sophomores; however, the latter employed a greater number of more efficient learning strategies. Sixth, freshmen low achievers lacked intrinsic motivation. Seventh, model testing showed that social support, learning styles and academic motivation influence learning strategies, which in turn affect academic achievement in freshmen, particularly males; only learning styles influence achievement in sophomores of both genders, with direct effects. These findings lead to the conclusion that educational psychologists, education specialists, teachers, and universities must plan urgent and major changes, to be applied in renewed and better study programs, syllabi and classes, as well as tutoring and training systems. Such developments should be targeted at supporting and empowering students in their academic pathways, and therefore at upgrading learning quality, especially in the case of freshmen, male freshmen, and low achievers.
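
As a purely illustrative sketch of the mediation-style path structure described (predictors acting on achievement partly through learning strategies), the following uses synthetic data; the variable names, coefficients and sample size are assumptions for illustration, not the study's data or results:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500                                   # synthetic "students" (illustrative)

# Hypothetical standardised scores; effect sizes below are assumed.
support    = rng.normal(size=n)           # perceived social support
styles     = rng.normal(size=n)           # learning-style score
motivation = rng.normal(size=n)           # academic motivation
# Strategies modelled as a mediator influenced by the other predictors,
# mirroring the indirect effects the abstract describes.
strategies = 0.3*support + 0.2*styles + 0.4*motivation + rng.normal(0, 0.8, n)
grades     = 0.5*strategies + 0.1*motivation + rng.normal(0, 0.9, n)

X = np.column_stack([np.ones(n), support, styles, motivation, strategies])
beta, *_ = np.linalg.lstsq(X, grades, rcond=None)
names = ["intercept", "support", "styles", "motivation", "strategies"]
print(dict(zip(names, beta.round(2))))    # strategies carries the direct effect

r2 = 1 - ((grades - X @ beta)**2).sum() / ((grades - grades.mean())**2).sum()
print("R^2 =", round(r2, 2))
```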

Keywords: academic achievement, academic motivation, coping, learning strategies, learning styles, perceived social support

Procedia PDF Downloads 122
26 Official Game Account Analysis: Factors Influencing Users' Judgments in Limited-Word Posts

Authors: Shanhua Hu

Abstract:

Social media, as a critical means of promoting films, video games, and digital products, has received substantial research attention, but several critical barriers remain: (1) few studies explore the internal and external connections of a product as part of the multimodal context that gives rise to readability and commercial return; (2) there is a lack of multimodal analysis of game publishers' official product accounts and their impact on user behaviors, including purchase intention, social media engagement, and playing time; and (3) there is no standardized, ecologically valid dataset spanning game types that can be used to study the complexity of an official account's postings over a time period. The proposed research tackles these limitations in order to develop a model of readability study that is more ecologically valid, robust, and thorough. To accomplish this objective, this paper provides a more diverse dataset comprising different visual elements and messages collected from the official Twitter accounts of the top 20 best-selling games of 2021. Video game companies target potential users through social media; a popular approach is to set up an official account to maintain exposure. Typically, major game publishers create an official account on Twitter months before the game's release date to post updates on the game's development, announce collaborations, and reveal spoilers. Analysis of tweets from these official Twitter accounts can assist publishers and marketers in identifying how to deploy advertising efficiently and precisely to increase game sales. The purpose of this research is to determine how official game accounts use Twitter to attract new customers, specifically which types of messages are most effective at increasing sales. The dataset includes the number of days between a Twitter post and the actual release date, the readability of the post (Flesch Reading Ease Score, FRES), the number of emojis used, the number of hashtags, the number of followers of the mentioned users, the categorization of the posts (i.e., spoilers, collaborations, promotions), and the number of video views. The timeline of Twitter postings from official accounts is compared with the history of pre-orders and sales figures to determine the potential impact of social media posts. This study aims to determine how the above-mentioned characteristics of official accounts' Twitter postings influence game sales and to examine the possible causes of this influence. The outcome will provide researchers with a list of potential aspects that can influence people's judgments in limited-word posts. With increased average time spent online, users adapt more quickly than before to online information exchange and reading, in features such as word choice, sentence length, and the use of emojis or hashtags. The study of official game account promotion will not only enable publishers to create more effective promotion techniques in the future but also provide ideas for future research on how social media posts with a limited number of words influence consumers' purchasing decisions. Future research can focus on more specific linguistic aspects, such as precise word choice in advertising.
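
Since FRES is the abstract's readability measure, here is a minimal sketch of its standard formula, 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words). The tokenizer and the vowel-group syllable heuristic below are assumptions; the study's actual text-processing pipeline is not specified:

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:   # drop a silent final 'e'
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text):
    """Flesch Reading Ease Score (FRES): higher means easier to read."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) \
                   - 84.6 * (syllables / len(words))

tweet = "Pre-order now. New trailer drops Friday!"
print(round(flesch_reading_ease(tweet), 1))   # short posts score high (easy)
```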

Keywords: engagement, official account, promotion, Twitter, video game

Procedia PDF Downloads 76
25 Reimagining Landscapes: Psychological Responses and Behavioral Shifts in the Aftermath of the Lytton Creek Fire

Authors: Tugba Altin

Abstract:

In an era where the impacts of climate change resonate more pronouncedly than ever, communities globally grapple with events bearing both tangible and intangible ramifications. Situating this within the evolving landscapes of Psychological and Behavioral Sciences, this research probes the profound psychological and behavioral responses evoked by such events. The Lytton Creek Fire of 2021 epitomizes these challenges. While tangible destruction is immediate and evident, the intangible repercussions—emotional distress, disintegration of cultural landscapes, and disruptions in place attachment (PA)—require meticulous exploration. PA, emblematic of the emotional and cognitive affiliations individuals nurture with their environments, emerges as a cornerstone for comprehending how environmental cataclysms influence cultural identity and bonds to land. This study, harmonizing the core tenets of an interpretive phenomenological approach with a hermeneutic framework, underscores the pivotal nature of this attachment. It delves deep into the realm of individuals' experiences post the Lytton Creek Fire, unraveling the intricate dynamics of PA amidst such calamity. The study's methodology deviates from conventional paradigms. Instead of traditional interview techniques, it employs walking audio sessions and photo elicitation methods, granting participants the agency to immerse, re-experience, and vocalize their sentiments in real-time. Such techniques shed light on spatial narratives post-trauma and capture the otherwise elusive emotional nuances, offering a visually rich representation of place-based experiences. Central to this research is the voice of the affected populace, whose lived experiences and testimonies form the nucleus of the inquiry. As they renegotiate their bonds with transformed environments, their narratives reveal the indispensable role of cultural landscapes in forging place-based identities. Such revelations accentuate the necessity of integrating both tangible and intangible trauma facets into community recovery strategies, ensuring they resonate more profoundly with affected individuals. Bridging the domains of environmental psychology and behavioral sciences, this research accentuates the intertwined nature of tangible restoration with the imperative of emotional and cultural recuperation post-environmental disasters. It advocates for adaptation initiatives that are rooted in the lived realities of the affected, emphasizing a holistic approach that recognizes the profundity of human connections to landscapes. This research advocates the interdisciplinary exchange of ideas and strategies in addressing post-disaster community recovery strategies. It not only enriches the climate change discourse by emphasizing the human facets of disasters but also reiterates the significance of an interdisciplinary approach, encompassing psychological and behavioral nuances, for fostering a comprehensive understanding of climate-induced traumas. Such a perspective is indispensable for shaping more informed, empathetic, and effective adaptation strategies.

Keywords: place attachment, community recovery, disaster response, restorative landscapes, sensory response, visual methodologies

Procedia PDF Downloads 59
24 Learning Language through Story: Development of Storytelling Website Project for Amazighe Language Learning

Authors: Siham Boulaknadel

Abstract:

Every culture has a rich history of storytelling in oral, visual, and textual forms. The Amazigh language, like many languages, has its own, which has entertained and informed across centuries and cultures, and its instructional potential continues to serve teachers. According to many researchers, listening to stories draws attention to the sounds of language and helps children develop sensitivity to the way language works. Stories that include repetitive phrases, unique words, and enticing descriptions encourage students to join in actively to repeat, chant, sing, or even retell the story. This kind of practice is important to language learners' oral language development, which is believed to correlate strongly with students' academic success. Today, with the advent of multimedia, digital storytelling can be a practical and powerful learning tool, with the potential to transform traditional learning into an unlimited imaginative environment. This paper reports on a research project on the development of a multimedia storytelling Website, called "Tell me a story", based on traditional Amazigh oral narratives. It is a didactic tool created for the learning of good moral values in an interactive multimedia environment combining on-screen text, graphics and audio in an enticing setting, enabling the positive values of stories to be projected. The Website developed in this study is based on various pedagogical approaches and learning theories deemed suitable for children aged 8 to 9 years. Its design and development were based on a well-researched conceptual framework enabling users to: (1) replay and share the stories at school or at home, and (2) access the Website anytime and anywhere. Furthermore, the system stores students' work and activities, allowing parents or teachers to monitor it and provide online feedback. The Website contains the following main modules. The Storytelling module incorporates a variety of media, such as audio, text and graphics, in presenting the stories, and introduces children to various kinds of traditional Amazigh oral narratives. Its focus is to project the positive values and images of stories using digital storytelling techniques. Besides developing a good moral sense in children through projected positive images and moral values, it also allows children to practice their comprehension and listening skills. The Reading module is based on a multimedia-material approach, which offers the potential to address the challenges of reading instruction. This module can stimulate children and develop reading practice indirectly through the tutoring strategies of scaffolding, self-explanation and hyperlinks it offers. The Word Enhancement module assists children in understanding the story and appreciating the good moral values more efficiently: difficult words are linked to explanations, helping children understand the vocabulary better. In conclusion, we believe that interactive multimedia storytelling is an interesting and exciting tool for learning Amazigh. We plan to address some remaining learning issues, in particular the use of activities to test and evaluate the children on their overall understanding of the stories and words presented in the learning modules.

Keywords: Amazigh language, e-learning, storytelling, language teaching

Procedia PDF Downloads 403
23 High Pressure Thermophysical Properties of Complex Mixtures Relevant to Liquefied Natural Gas (LNG) Processing

Authors: Saif Al Ghafri, Thomas Hughes, Armand Karimi, Kumarini Seneviratne, Jordan Oakley, Michael Johns, Eric F. May

Abstract:

Knowledge of the thermophysical properties of complex mixtures at extreme conditions of pressure and temperature has always been essential to the evolution of the Liquefied Natural Gas (LNG) industry because of the tremendous technical challenges present at all stages of the supply chain, from production to liquefaction to transport. Each stage is designed using predictions of the mixture's properties, such as density, viscosity, surface tension, heat capacity and phase behaviour, as functions of temperature, pressure, and composition. Unfortunately, currently available models lead to equipment over-designs of 15% or more. To achieve better designs that work more effectively and/or over a wider range of conditions, new fundamental property data are essential, both to resolve discrepancies in our current predictive capabilities and to extend them to the higher-pressure conditions characteristic of many new gas fields. Furthermore, innovative experimental techniques are required to measure different thermophysical properties at high pressures and over a wide range of temperatures, including near the mixture's critical points, where gas and liquid become indistinguishable and most existing predictive fluid-property models break down. In this work, we present a wide range of experimental measurements made on different binary and ternary mixtures relevant to LNG processing, with a particular focus on viscosity, surface tension, heat capacity, bubble points and density. For this purpose, customized and specialized apparatus were designed and validated over the temperature range (200 to 423) K at pressures to 35 MPa. The mixtures studied were (CH4 + C3H8), (CH4 + C3H8 + CO2) and (CH4 + C3H8 + C7H16); in the last of these, the heptane content was up to 10 mol%. Viscosity was measured using a vibrating-wire apparatus, while mixture densities were obtained by means of a high-pressure magnetic-suspension densimeter and an isochoric cell apparatus; the latter was also used to determine bubble points. Surface tensions were measured using the capillary rise method in a visual cell, which also enabled the location of the mixture's critical point to be determined from observations of critical opalescence. Mixture heat capacities were measured using a customised high-pressure differential scanning calorimeter (DSC). The combined standard relative uncertainties were less than 0.3% for density, 2% for viscosity, 3% for heat capacity and 3% for surface tension. The extensive experimental data gathered in this work were compared with a variety of advanced engineering models frequently used for predicting the thermophysical properties of mixtures relevant to LNG processing. In many cases the discrepancies between the predictions of different engineering models for these mixtures were large, and the high-quality data allowed erroneous but often widely used models to be identified. The data enable the development of new or improved models to be implemented in process simulation software, so that the fluid properties needed for equipment and process design can be predicted reliably. This in turn will enable reduced capital and operational expenditure by the LNG industry. The current work also aids the community of scientists working to advance theoretical descriptions of fluid properties, by helping to identify deficiencies in those descriptions and calculations.
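
As a worked example of the capillary rise method mentioned above: for a capillary of radius r, liquid-vapour density difference Δρ and observed rise height h, the surface tension under an assumed zero contact angle is γ = Δρ·g·r·h/2. The numbers below are purely illustrative, not the study's measurements:

```python
# Capillary-rise estimate of surface tension (illustrative values only).
g = 9.81          # gravitational acceleration (m/s^2)
r = 0.2e-3        # capillary radius (m), assumed
d_rho = 320.0     # liquid-vapour density difference (kg/m^3), assumed
h = 4.5e-3        # observed rise height (m), assumed

gamma = d_rho * g * r * h / 2.0            # surface tension (N/m)
print(f"surface tension ~ {gamma * 1e3:.2f} mN/m")  # ~1.41 mN/m
```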

Keywords: LNG, thermophysical, viscosity, density, surface tension, heat capacity, bubble points, models

Procedia PDF Downloads 274
22 Improvements and Implementation Solutions to Reduce the Computational Load for Traffic Situational Awareness with Alerts (TSAA)

Authors: Salvatore Luongo, Carlo Luongo

Abstract:

This paper discusses implementation solutions to reduce the computational load of the Traffic Situational Awareness with Alerts (TSAA) application, which is based on Automatic Dependent Surveillance-Broadcast (ADS-B) technology. In 2008, there were 23 mid-air collisions involving general aviation fixed-wing aircraft, 6 of which were fatal, leading to 21 fatalities. These collisions occurred in visual meteorological conditions, indicating the limitations of the see-and-avoid concept for mid-air collision avoidance as defined by the Federal Aviation Administration (FAA). Commercial air transport aircraft are already equipped with a collision avoidance system, TCAS, based on classic transponder technology; this system has dramatically reduced the number of mid-air collisions involving air transport aircraft. In general aviation, the same reduction in mid-air collisions has not occurred, and achieving it is the main objective of the TSAA application. The major difference between the original conflict detection application and TSAA is that conflict detection focuses on preventing loss of separation in en-route environments, whereas TSAA is devoted to reducing the probability of mid-air collision in all phases of flight. The TSAA application increases flight crew traffic situational awareness by providing alerts for traffic detected in conflict with ownship, in support of the see-and-avoid responsibility. Considerable effort was spent on the design process and code generation in order to maximize efficiency and performance in terms of computational load and memory consumption. The TSAA architecture is divided into two high-level systems: the "threats database" and the "conflict detector". The first receives traffic data from the ADS-B device and stores each target's data history. The conflict detector module estimates the trajectories of ownship and the targets in order to detect possible future losses of separation between ownship and each target. Finally, the alerts are verified by additional conflict verification logic, in order to prevent undesirable behavior of the alert flag. To reduce the computational load, a pre-check evaluation module is used. This pre-check is only a computational optimization, so the performance of the conflict detector, in terms of the number of alerts detected, is unchanged. The pre-check module uses analytical trajectory propagation for both the target and ownship. This provides greater accuracy and avoids step-by-step propagation, which would require a greater computational load. Furthermore, the pre-check excludes targets that are certainly not threats, using an analytical and efficient geometrical approach, in order to decrease the computational load of the subsequent modules. This software improvement is not suggested by FAA documents, and it is therefore the main innovation of this work. The efficiency and efficacy of this enhancement were verified using fast-time and real-time simulations and by execution on a real device in several FAA scenarios. The final implementation also permits FAA software certification in compliance with the DO-178B standard. The computational load reduction allows the TSAA application to be installed on devices that host multiple applications and/or have limited memory and computational capabilities.
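
The abstract does not give the pre-check's equations, but the classical analytical form of such a geometric filter is a closest-point-of-approach (CPA) test under straight-line relative motion: with relative position r and relative velocity v, the CPA time is t = -(r·v)/|v|², clamped to the look-ahead horizon. The sketch below illustrates that idea under this assumption; the thresholds, names and values are hypothetical, not the authors':

```python
import numpy as np

def cpa_prefilter(own_pos, own_vel, tgt_pos, tgt_vel,
                  horizon_s=120.0, sep_m=1500.0):
    """Analytical closest-point-of-approach pre-check (illustrative sketch).

    Returns False when a target can be excluded as a threat in closed form,
    without step-by-step trajectory propagation.
    """
    r = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)  # rel. position
    v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)  # rel. velocity
    v2 = v @ v
    t_cpa = 0.0 if v2 < 1e-9 else max(0.0, min(horizon_s, -(r @ v) / v2))
    d_cpa = np.linalg.norm(r + v * t_cpa)     # miss distance at CPA
    return d_cpa < sep_m                      # True -> keep for full detector

own  = ([0.0, 0.0, 0.0],      [60.0, 0.0, 0.0])    # position (m), velocity (m/s)
tgt1 = ([8000.0, 500.0, 0.0], [-60.0, 0.0, 0.0])   # converging head-on
tgt2 = ([8000.0, 9000.0, 0.0], [0.0, 60.0, 0.0])   # diverging
print(cpa_prefilter(*own, *tgt1))   # True  -> passed to the conflict detector
print(cpa_prefilter(*own, *tgt2))   # False -> excluded by the pre-check
```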

Keywords: traffic situation awareness, general aviation, aircraft conflict detection, computational load reduction, implementation solutions, software certification

Procedia PDF Downloads 285
21 Computer-Integrated Surgery of the Human Brain: New Possibilities

Authors: Ugo Galvanetto, Pirto G. Pavan, Mirco Zaccariotto

Abstract:

The discipline of computer-integrated surgery (CIS) will provide equipment able to improve the efficiency of healthcare systems and, more importantly, clinical results. Surgeons and machines will cooperate in new ways that extend surgeons' ability to train, plan and carry out surgery. Patient-specific CIS of the brain requires several steps: (1) Fast generation of brain models. Image recognition techniques equipped with artificial intelligence should differentiate among all brain tissues in MR images and segment them. After that, automatic mesh generation should create the mathematical model of the brain in which the various tissues (white matter, grey matter, cerebrospinal fluid, etc.) are clearly located in the correct positions. (2) Reliable and fast simulation of the surgical process. Computational mechanics will be the crucial aspect of the entire procedure, and new algorithms will be needed to simulate the mechanical behaviour of cutting through cerebral tissues. (3) Real-time provision of visual and haptic feedback. A sophisticated human-machine interface based on ergonomics and psychology will provide the feedback to the surgeon. The present work addresses point 2 in particular. Modelling the cutting of soft tissue in a structure as complex as the human brain is an extremely challenging problem in computational mechanics. The finite element method (FEM), which accurately represents complex geometries and accounts for material and geometrical nonlinearities, is the most widely used computational tool for simulating the mechanical response of soft tissues. However, the main drawback of FEM lies in the mechanics theory on which it is based, classical continuum mechanics, which assumes matter is a continuum with no discontinuities. FEM must resort to complex tools such as pre-defined cohesive zones, external phase-field variables, and demanding remeshing techniques to include discontinuities. All approaches that equip FEM with the capability to describe material separation (interface elements with cohesive zone models, X-FEM, element erosion, phase field) have drawbacks that make them unsuitable for surgery simulation: interface elements require a priori knowledge of crack paths; the use of X-FEM in 3D is cumbersome; element erosion does not conserve mass; and the phase-field approach adopts a diffusive crack model instead of describing the true tissue separation typical of surgical procedures. Modelling discontinuities, so difficult for computational approaches based on classical continuum mechanics, is instead easy for novel computational methods based on peridynamics (PD). PD is a non-local theory of mechanics formulated without spatial derivatives. Its governing equations remain valid at points or surfaces of discontinuity, and it is therefore especially suited to describing crack propagation and fragmentation problems. Moreover, PD does not require any criterion to decide the direction of crack propagation or the conditions for crack branching or coalescence; in PD-based computational methods, cracks develop spontaneously in whatever way is most convenient from an energy point of view. Therefore, in PD computational methods, crack propagation in 3D is as easy as in 2D, a remarkable advantage with respect to all other computational techniques.
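
To illustrate the bond-breaking idea, here is a minimal 1-D sketch, not the authors' solver: in bond-based PD, each node interacts with neighbours within a horizon through bonds, and a bond fails irreversibly once its stretch exceeds a critical value, so damage localises with no remeshing and no predefined crack path. The example imposes a displacement jump at the middle of a bar (standing in for a cut) and reports the resulting damage field; all parameter values are assumed for illustration, and a damage index of 0.5 corresponds to a fully formed crack surface in bond-based PD:

```python
import numpy as np

n, dx = 101, 1e-3                     # nodes and grid spacing (m)
horizon = 3.015 * dx                  # peridynamic horizon (common choice ~3*dx)
s_crit = 0.01                         # critical bond stretch (assumed)

x = np.arange(n) * dx
# Imposed deformation: the right half is displaced, stretching every bond
# that straddles the midpoint.
u = np.where(x > x[n // 2], 5e-5, 0.0)

bonds = [(i, j) for i in range(n) for j in range(i + 1, n)
         if 0 < x[j] - x[i] <= horizon]

damage = np.zeros(n)                  # fraction of broken bonds per node
count = np.zeros(n)
for i, j in bonds:
    xi = x[j] - x[i]
    s = (u[j] - u[i]) / xi            # bond stretch (small-deformation form)
    broken = s > s_crit               # irreversible failure criterion
    for k in (i, j):
        count[k] += 1
        damage[k] += broken

damage /= count
print("peak local damage:", damage.max())          # 0.5 -> free crack surface
print("cracked nodes:", np.where(damage > 0.49)[0])  # localised at midpoint
```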

Keywords: computational mechanics, peridynamics, finite element, biomechanics

Procedia PDF Downloads 80
20 Culture and Health Equity: Unpacking the Sociocultural Determinants of Eye Health for Indigenous Australian Diabetics

Authors: Aryati Yashadhana, Ted Fields Jnr., Wendy Fernando, Kelvin Brown, Godfrey Blitner, Francis Hayes, Ruby Stanley, Brian Donnelly, Bridgette Jerrard, Anthea Burnett, Anthony B. Zwi

Abstract:

Indigenous Australians experience some of the worst health outcomes globally, with life expectancy significantly poorer than that of non-Indigenous Australians. This is largely attributed to preventable diseases such as diabetes (prevalence 39% in Indigenous Australian adults > 55 years), which contributes to a raised risk of diabetic visual impairment and cataract among Indigenous adults. Our study aims to explore the interface between structural and sociocultural determinants and human agency, in order to understand how they affect (1) the accessibility of eye health and chronic disease services and (2) the potential for Indigenous patients to achieve positive clinical eye health outcomes. We used Participatory Action Research methods and aimed to privilege the voices of Indigenous people through community collaboration. Semi-structured interviews (n=82) and patient focus groups (n=8) were conducted by Indigenous Community-Based Researchers (CBRs) with diabetic Indigenous adults (> 40 years) in four remote communities in Australia. Interviews (n=25) and focus groups (n=4) with primary health care clinicians in each community were also conducted. Data were audio recorded, transcribed verbatim, and analysed thematically using grounded theory, comparative analysis and NVivo 10. Preliminary analysis occurred in tandem with data collection to determine theoretical saturation. The principal investigator (AY) led analysis sessions with CBRs, fostering cultural and contextual appropriateness in interpreting responses, knowledge exchange and capacity building. Identified themes were conceptualised into three spheres of influence: structural (health services, government), sociocultural (Indigenous cultural values, distrust of the health system, ongoing effects of colonialism and dispossession) and individual (health beliefs/perceptions, patient phenomenology). Permeating these spheres of influence were three core determinants: economic disadvantage, health literacy/education, and cultural marginalisation. These core determinants affected the accessibility of services and the potential for patients to achieve positive clinical outcomes at every level of care (primary, secondary, tertiary). Our findings highlight the clinical realities of institutionalised and structural inequities, illustrated through the lived experiences of Indigenous patients and primary care clinicians in the four sampled communities. The complex determinants surrounding health inequity for Indigenous Australians are entrenched through a longstanding experience of cultural discrimination and ostracism. Secure and long-term funding of Aboriginal Community Controlled Health Services will be valuable but is insufficient to address issues of inequity. Rather, we recommend working collaboratively with communities to build trust and to identify needs and solutions at the grassroots level, while leveraging community voices to drive change at the systemic/policy level.

Keywords: indigenous, Australia, culture, public health, eye health, diabetes, social determinants of health, sociology, anthropology, health equity, Aboriginal and Torres Strait Islander, primary care

Procedia PDF Downloads 300
19 A Qualitative Exploration of the Sexual and Reproductive Health Practices of Adolescent Mothers from Indigenous Populations in Ratanak Kiri Province, Cambodia

Authors: Bridget J. Kenny, Elizabeth Hoban, Jo Williams

Abstract:

Adolescent pregnancy presents a significant public health challenge for Cambodia. Despite declines in the overall fertility rate, the adolescent fertility rate is increasing. Adolescent pregnancy is particularly problematic in the Northeast provinces of Ratanak Kiri and Mondul Kiri, where 34 percent of girls aged between 15 and 19 have begun childbearing; this is almost three times Cambodia's national average of 12 percent. Language, cultural and geographic barriers have restricted qualitative exploration of the sexual and reproductive health (SRH) challenges that face indigenous adolescents in Northeast Cambodia. The current study sought to address this gap by exploring the SRH practices of adolescent mothers from indigenous populations in Ratanak Kiri Province. Twenty-two adolescent mothers, aged between 15 and 19, were recruited from seven indigenous villages in Ratanak Kiri Province and asked to participate in a combined body-mapping exercise and semi-structured interview. Participants were given a large piece of paper (59.4 x 84.1 cm) with the outline of a female body and asked to draw the female reproductive organs onto the 'body map'. Participants were encouraged to explain what they had drawn, with the purpose of evoking conversation about their reproductive bodies. Adolescent mothers were then invited to participate in a semi-structured interview to further expand on topics of SRH. The qualitative approach offered an excellent avenue to explore the unique SRH challenges that face indigenous adolescents in rural Cambodia. In particular, the use of visual data collection methods reduced the language and cultural barriers that have previously restricted or prevented qualitative exploration of this population group. Thematic analysis yielded six major themes: (1) understanding of the female reproductive body, (2) contraceptive knowledge, (3) contraceptive use, (4) barriers to contraceptive use, (5) sexual practices, (6) contact with healthcare facilities. Participants could name several modern contraceptive methods and knew where they could access family planning services. However, adolescent mothers explained that they gained this knowledge during antenatal care visits, and consequently participants had limited SRH knowledge, including contraceptive awareness, at the time of sexual initiation. Fear of the perceived side effects of modern contraception, including infertility, provided an additional barrier to contraceptive use for indigenous adolescents. Participants did not cite cost or geographic isolation as barriers to accessing SRH services. Child marriage and early sexual initiation were also identified as important factors contributing to the high prevalence of adolescent pregnancy in this population group. The findings support the Ministry of Education, Youth and Sports' (MoEYS) recent introduction of SRH education into the primary and secondary school curriculum but suggest indigenous girls in rural Cambodia require additional sources of SRH information. Results indicate adolescent girls' first point of contact with healthcare facilities occurs after they become pregnant. Promotion of an effective continuum of care by increasing access to healthcare services during the pre-pregnancy period is suggested as a means of providing adolescent girls with an additional avenue to acquire SRH information.

Keywords: adolescent pregnancy, contraceptive use, family planning, sexual and reproductive health

Procedia PDF Downloads 113
18 Implementation of Autologous Adipose Graft from the Abdomen for Complete Fat Pad Loss of the Heel Following a Traumatic Open Fracture Secondary to a Motor Vehicle Accident: A Case Study

Authors: Ahmad Saad, Shuja Abbas, Breanna Marine

Abstract:

Introduction: This study explores the potential applications of autologous pedal fat pad grafting as a minimally invasive therapeutic strategy for addressing pedal fat pad loss. Without adequate shock-absorbing tissue, a patient can experience functional deficits, ulcerations, loss of quality of life, and significant limitations in ambulation. This study details a novel technique involving autologous adipose grafting from the abdomen to enhance plantar fat pad thickness in a patient involved in a severe motor vehicle accident that resulted in total fat pad loss of the heel. Autologous adipose grafting (AAG) was used following adipose allografting in an effort to recreate a normal shock-absorbing surface and allow a return to activities of daily living and painless ambulation. Methods: A 46-year-old male sustained multiple open pedal fractures and necrosis of the heel fat pad after a motorcycle accident, which resulted in complete loss of the calcaneal fat pad. The patient underwent serial debridements, wound VAC therapy and split-thickness skin grafting to accomplish complete closure, despite complete loss of adipose tissue in the area. The patient presented with complaints of pain on ambulation, inability to bear weight on the heel, and recurrent ulcerations, and admitted he had not been ambulating for two years. Clinical examination demonstrated complete loss of the plantar fat pad, with a thin layer of epithelial tissue overlying the calcaneal bone allowing visibility of the osseous contour of the calcaneus. Scar tissue had formed in place of the fat pad, with thickened epithelial tissue extending from the midfoot to the calcaneus. After conservative measures were exhausted, the patient opted for initial management with adipose allograft matrix (AAM) injections. Post-operative X-ray imaging revealed noticeable improvement in calcaneal fat pad thickness. At the 1-year follow-up, the patient was able to ambulate without assistive devices. The fat pad at this point was significantly thicker than it had been pre-operatively, but its thickness had not returned to pre-accident levels. To compare the take of allografted versus autografted adipose tissue, adipose autografting via abdominal liposuction harvesting was deemed suitable. A general surgeon harvested adipose cells from the patient's abdomen via liposuction, and a podiatric surgeon performed the AAG injection into the heel. A total of 15 cc of autologous adipose tissue was injected into the calcaneus. Results: There was a visible increase in calcaneal fat pad thickness both clinically and radiographically. At the 6-week follow-up, imaging revealed retention of the calcaneal fat pad thickness. Three months post-operatively, the patient had returned to activities of daily living with increased quality of life due to an improved ability to ambulate. Discussion: AAG is a novel treatment for pedal fat pad loss. These treatments may be viable and reproducible therapeutic choices for patients suffering from fat pad atrophy, fat pad loss, and/or plantar ulcerations. Both AAM and AAG exhibited similar therapeutic results, providing pain relief on ambulation and allowing the patient to return to his quality of life.

Keywords: podiatry, wound, adipose, allograft, autograft, wound care, limb reconstruction, injection, limb salvage

Procedia PDF Downloads 82
17 Chronic Progressive External Ophthalmoplegia (CPEO)

Authors: Gagandeep Singh Digra, Pawan Kumar, Mandeep Kaur Sidhu

Abstract:

INTRODUCTION: Chronic Progressive External Ophthalmoplegia (CPEO), also known as Progressive External Ophthalmoplegia (PEO), is an eye disorder characterized by a loss of the muscle functions involved in eye and eyelid movement. CPEO can be caused by mutations in mitochondrial DNA. It typically manifests in young adults, with bilateral and progressive ptosis as the most common presentation, but can also present with difficulty swallowing (dysphagia) and general weakness of the skeletal muscles (myopathy), particularly in the neck, arms, or legs. CASE PRESENTATION: This is a case discussion of three cousins who presented to our clinic. A 23-year-old male with a past surgical history (PSH) of ptosis repair 2 years earlier presented with a chief complaint of nasal intonation for 1.5 years associated with difficulty swallowing. The patient also complained of nasal regurgitation of liquids. He denied any headaches, fever, seizures, weakness of the arms or legs, urinary complaints or changes in bowel habits. Physical examination was positive for facial muscle weakness, including an inability to lift the eyebrows (frontalis), inability to close the eyes tightly (orbicularis oculi), bilaterally absent corneal reflexes, difficulty clenching the jaw (masseter), difficulty smiling (zygomaticus major), and inability to elevate the upper lip (zygomaticus minor). A second cousin of the first patient, a 25-year-old male with no past medical history, presented with complaints of nasal intonation for 2 years associated with difficulty swallowing. He denied a history of nasal regurgitation, headaches, fever, seizures, weakness, urinary complaints or changes in bowel habits. Physical examination showed facial muscle weakness of the frontalis, orbicularis oculi, masseter, zygomaticus major and zygomaticus minor muscles, with absent corneal reflexes. A 28-year-old male, a cousin of the first two patients, presented with chief complaints of ptosis and nasal intonation for the last 8 years. He also complained of difficulty swallowing and nasal regurgitation of liquids. His physical examination showed facial muscle weakness, including of the frontalis (inability to lift the eyebrows), orbicularis oculi (inability to close the eyes tightly), zygomaticus major (difficulty smiling), and zygomaticus minor (inability to elevate the upper lip), with bilaterally absent corneal reflexes. Brain MRI and visual fields of all the patients were normal. Differential diagnoses, including Graves' disease, myasthenia gravis and glioma, were ruled out. For financial reasons, muscle biopsy could not be pursued. Pedigree analysis revealed that only males were affected, likely reflecting maternal inheritance, so a clinical diagnosis of CPEO was made. The patients underwent symptomatic management, including surgical correction of ptosis for the third patient. CONCLUSION: Chronic Progressive External Ophthalmoplegia (CPEO), a rare clinical entity, occurs in young adults as a manifestation of mitochondrial myopathy. There are three modes of transmission: maternal transmission associated with mitochondrial point mutations, autosomal recessive, and autosomal dominant. CPEO can sometimes be difficult to diagnose, especially with an asymmetric presentation; it is therefore crucial to keep it in the differential diagnosis to avoid delay in diagnosis.

Keywords: neurology, chronic, progressive, ophthalmoplegia

Procedia PDF Downloads 110
16 Implementation of Building Information Modelling to Monitor, Assess, and Control the Indoor Environmental Quality of Higher Education Buildings

Authors: Mukhtar Maigari

Abstract:

The landscape of Higher Education (HE) institutions, especially following the COVID-19 pandemic, necessitates advanced approaches to managing Indoor Environmental Quality (IEQ), which is crucial for the comfort, health, and productivity of students and staff. This study investigates the application of Building Information Modelling (BIM) as a multifaceted tool for monitoring, assessing, and controlling IEQ in HE buildings, aiming to bridge the gap between traditional management practices and the innovative capabilities of BIM. Central to the study is a comprehensive literature review, which lays the foundation by examining current knowledge and technological advancements in both IEQ and BIM. This review sets the stage for a deeper investigation into the practical application of BIM in IEQ management. The methodology consists of Post-Occupancy Evaluation (POE), which encompasses physical monitoring, questionnaire surveys, and interviews under the umbrella of case studies. The physical data collection focuses on vital IEQ parameters such as temperature, humidity, and CO2 levels, measured using equipment including data loggers to ensure accurate data. Complementing this, questionnaire surveys gather perceptions and satisfaction levels from students, providing valuable insights into the subjective aspects of IEQ. The interview component, targeting facilities management teams, offers an in-depth perspective on IEQ management challenges and strategies. The research then develops a conceptual BIM-based framework, informed by the findings from the case studies and the empirical data. This framework is designed to demonstrate the critical functions necessary for effective IEQ monitoring, assessment, control and automation, with real-time data handling capabilities. The framework leads to the development and testing of a BIM-based prototype tool. This prototype leverages software such as Autodesk Revit with its visual programming tool, Dynamo, together with an Arduino-based sensor network, thereby allowing a real-time flow of IEQ data for monitoring, control and even automation. By harnessing the capabilities of BIM technology, the study presents a forward-thinking approach that aligns with current sustainability and wellness goals, which are particularly vital in the post-COVID-19 era. The integration of BIM in IEQ management promises not only to enhance the health, comfort, and energy efficiency of educational environments but also to transform them into more conducive spaces for teaching and learning. Furthermore, this research could influence the future of HE buildings by prompting universities and government bodies to re-evaluate and improve teaching and learning environments. It demonstrates how the synergy between IEQ and BIM can empower stakeholders to monitor IEQ conditions more effectively and make informed decisions in real time. Moreover, the developed framework has broader applications: it can serve as a tool for other sustainability assessments, such as energy analysis in HE buildings, leveraging measured data synchronized with the BIM model. In conclusion, this study bridges the gap between theoretical research and real-world application by showing in practice how advanced technologies like BIM can be integrated to enhance environmental quality in educational institutions.
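
The abstract pairs Dynamo with an Arduino-based sensor network for real-time IEQ data. Below is a minimal sketch of the data-ingestion side only, assuming a hypothetical Arduino node that streams comma-separated "temperature,humidity,CO2" lines over a serial port (read here with pyserial); the prototype's actual message format, port name and comfort thresholds are not given in the abstract and are assumptions:

```python
import serial  # pyserial; assumes an Arduino streaming "temp,rh,co2\n" lines

# Illustrative comfort bands (assumed, not the study's exact criteria).
LIMITS = {"temp_c": (20.0, 26.0), "rh_pct": (40.0, 60.0), "co2_ppm": (0.0, 1000.0)}

def read_sample(port="/dev/ttyACM0", baud=9600):
    """Read one CSV sample from the sensor node (port name is an assumption)."""
    with serial.Serial(port, baud, timeout=2) as ser:
        fields = ser.readline().decode(errors="ignore").strip().split(",")
        return dict(zip(LIMITS, map(float, fields)))

def check_ieq(sample):
    """Flag each parameter as inside (True) or outside (False) its band.
    A BIM-side script (e.g. a Dynamo Python node) could write these values
    onto room elements for visualisation or automated control."""
    return {k: (lo <= v <= hi)
            for (k, (lo, hi)), v in zip(LIMITS.items(), sample.values())}

if __name__ == "__main__":
    s = read_sample()          # requires the hypothetical hardware attached
    print(s, check_ieq(s))
```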

Keywords: BIM, POE, IEQ, HE buildings

Procedia PDF Downloads 49