Search results for: CCTV Images
249 Systematic Evaluation of Convolutional Neural Network on Land Cover Classification from Remotely Sensed Images
Authors: Eiman Kattan, Hong Wei
Abstract:
In using Convolutional Neural Network (CNN) for classification, there is a set of hyperparameters available for configuration purposes. This study aims to evaluate the impact of a range of parameters in a CNN architecture, i.e., AlexNet, on land cover classification based on four remotely sensed datasets. The evaluation tests the influence of a set of hyperparameters on the classification performance. The parameters concerned are epoch values, batch size, and convolutional filter size against input image size. Thus, a set of experiments was conducted to assess the effectiveness of the selected parameters using two implementation approaches, namely pretrained and fine-tuned. We first explore the number of epochs under several selected batch size values (32, 64, 128 and 200). The impact of the kernel size of convolutional filters (1, 3, 5, 7, 10, 15, 20, 25 and 30) was evaluated against the image sizes under testing (64, 96, 128, 180 and 224), which gave us insight into the relationship between the size of convolutional filters and image size. To generalise the validation, four remote sensing datasets, AID, RSD, UCMerced and RSCCN, which have different land covers and are publicly available, were used in the experiments. These datasets have a wide diversity of input data, such as the number of classes, the amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (NVIDIA DIGITS) was employed in the experiments. It has shown efficiency in both training and testing. The results have shown that increasing the number of epochs leads to a higher accuracy rate, as expected; however, the convergence state is highly related to the dataset. The batch size evaluation showed that a larger batch size slightly decreases the classification accuracy compared to a small batch size. 
For example, selecting the value 32 as the batch size on the RSCCN dataset achieves an accuracy rate of 90.34% at the 11th epoch, while decreasing the epoch value to one makes the accuracy rate drop to 74%. At the other extreme, setting the batch size to 200 decreases the accuracy rate to 86.5% at the 11th epoch, and to 63% when using one epoch only. On the other hand, the choice of kernel size is only loosely related to the dataset; from a practical point of view, a filter size of 20 produces an accuracy rate of 70.4286%. The last experiment, on image size, shows that accuracy improves with larger input images, although this performance gain is computationally expensive. These conclusions open opportunities toward better classification performance in various applications, such as planetary remote sensing. Keywords: CNNs, hyperparameters, remote sensing, land cover, land use
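The sweep described above can be sketched as a simple grid evaluation over the hyperparameters. The snippet below is a minimal editorial illustration only: `train_and_evaluate` is a hypothetical stand-in for a single DIGITS training run, with toy accuracy values that merely mimic the reported trends (more epochs help, larger batches hurt slightly, kernels that are large relative to the image are penalized).

```python
from itertools import product

# Hypothetical stand-in for one training run; returns a validation
# accuracy for the given configuration (placeholder values only).
def train_and_evaluate(batch_size, epochs, kernel_size, image_size):
    base = 0.60 + 0.03 * min(epochs, 11)          # accuracy rises with epochs
    batch_penalty = 0.0002 * batch_size           # larger batches hurt slightly
    kernel_penalty = 0.5 * max(0.0, kernel_size / image_size - 0.1)
    return max(0.0, min(1.0, base - batch_penalty - kernel_penalty))

def grid_search(batch_sizes, epoch_values, kernel_sizes, image_sizes):
    """Evaluate every combination and return the best (config, accuracy)."""
    best = None
    for bs, ep, ks, im in product(batch_sizes, epoch_values,
                                  kernel_sizes, image_sizes):
        acc = train_and_evaluate(bs, ep, ks, im)
        if best is None or acc > best[1]:
            best = ((bs, ep, ks, im), acc)
    return best

config, acc = grid_search([32, 64, 128, 200], [1, 11], [1, 3, 5, 7], [64, 224])
```

Under these toy assumptions the search favors the smallest batch size and the larger epoch count, consistent with the trend the abstract reports.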
Procedia PDF Downloads 169
248 Quality Care from the Perception of the Patient in Ambulatory Cancer Services: A Qualitative Study
Authors: Herlin Vallejo, Jhon Osorio
Abstract:
Quality is a concept that has gained importance in different scenarios over time, especially in the area of health. The nursing staff is one of the actors that contributes most to the care process and to user satisfaction in the evaluation of quality. However, until now, there have been few tools to measure the quality of care in specialized performance scenarios. Patients receiving ambulatory cancer treatments can face various problems, which can increase their level of distress, so improving the quality of outpatient care for cancer patients should be a priority for oncology nursing. The patients' experience of care in these services has been little investigated. The purpose of this study was to understand the perception that patients have of quality care in outpatient chemotherapy services. A qualitative, exploratory, descriptive study was carried out with 9 patients older than 18 years, diagnosed with cancer, who were treated at the Institute of Cancerology in outpatient chemotherapy rooms, had received a minimum of three months of treatment with curative intention, and had given their informed consent. The total number of participants was determined by theoretical saturation, and their selection was by convenience. Unstructured interviews were conducted, recorded and transcribed. The information was analyzed using the technique of content analysis. Three categories emerged that reflect the perception that patients have regarding quality care: patient-centered care, care with love, and effects of care. Patients highlighted situations showing that care is centered on them, incorporating institutional and infrastructural elements of patient-centered care, the qualities of care and, in contrast, what inappropriate care means to them. 
Care with love, as a perception of quality care, means for patients that the nursing staff must have certain qualities; patients perceive caring with love as a family affair, with limits on care with love and on the nurse-patient relationship. Quality care has effects on both the patient and the nursing staff. One of the most relevant effects was the confidence that the patient develops towards the nurse, in addition to transforming unrealistic images about cancer treatment with chemotherapy. On the other hand, quality care generates a commitment to self-care and is a facilitator in the transit through oncological disease and chemotherapeutic treatment, perceived as a healing transit. It is concluded that quality care, from the perception of patients, is a construction that goes beyond structural issues and is related to an institutional culture of quality, reflected in the attitude of the nursing staff and in the acts of care that have positive effects on the experience of chemotherapy and disease. These results contribute to a better understanding of how quality care is built from the perception of patients and open a range of possibilities for the future development of an individualized instrument for evaluating the quality of care from the perception of patients with cancer. Keywords: nursing care, oncology service hospital, quality management, qualitative studies
Procedia PDF Downloads 137
247 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging
Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists, recognized in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 cases out of 1200 undergoing MRm, in which the MRm assessment gave an uncertain classification (non-specific nodules), were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy): 35 were benign and 12 malignant. All MR images were acquired at 1.5T: a first basal T1w sequence and then four T1w acquisitions after the paramagnetic contrast injection. After manual segmentation of the lesions by a radiologist and the extraction of 150 radiomic features (30 features at each of 5 subsequent time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clear malignant histological exam) were added to the dataset in order to allow the ML system to better learn from the data. Results: The NaiveBayes algorithm, working on 79 features selected by the TWIST system, proved to be the best-performing ML system, with a sensitivity of 96%, a specificity of 78% and a global accuracy of 87% (average values of the two training-testing procedures, ab and ba). Within the subset of 47 non-specific nodules, the algorithm predicted the outcome of 45 nodules that the expert radiologist could not classify. 
Conclusion: In this pilot study, we identified a radiomic approach that allows ML systems to perform well in the diagnosis of non-specific nodules at MR mammography. This algorithm could be a great support for the early diagnosis of malignant breast tumors when the radiologist is not able to identify the kind of lesion, and could reduce the need for long follow-up. Clinical Relevance: This machine learning algorithm could be essential in supporting the radiologist in the early diagnosis of non-specific nodules, in order to avoid strenuous follow-up and painful biopsy for the patient. Keywords: breast, machine learning, MRI, radiomics
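The sensitivity, specificity and global accuracy reported above follow directly from confusion-matrix counts. A minimal sketch of that calculation is below; the counts are hypothetical values chosen only to illustrate the arithmetic, not the study's raw numbers.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)                  # true positive rate
    specificity = tn / (tn + fp)                  # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)    # overall correct fraction
    return sensitivity, specificity, accuracy

# Hypothetical counts for a balanced 100-case test set (50 malignant,
# 50 benign), picked only to show how such figures arise.
sens, spec, acc = diagnostic_metrics(tp=48, fn=2, tn=39, fp=11)
```

With these illustrative counts, the three metrics come out to 0.96, 0.78 and 0.87 respectively, the same order as the figures quoted in the abstract.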
Procedia PDF Downloads 267
246 Radiographic Evaluation of Odontogenic Keratocyst: A 14-Year Retrospective Study
Authors: Nor Hidayah Reduwan, Jira Chindasombatjaroen, Suchaya Pornprasersuk-Damrongsri, Sopee Pomsawat
Abstract:
INTRODUCTION: The odontogenic keratocyst (OKC) remains a controversial pathologic entity under the scrutiny of many researchers and maxillofacial surgeons alike. The high recurrence rate and relatively aggressive nature of this lesion demand a meticulous analysis of its radiographic characteristics, leading to the formulation of an accurate diagnosis. OBJECTIVE: This study aims to determine the radiographic characteristics of the odontogenic keratocyst (OKC) using conventional radiographs and cone beam computed tomography (CBCT) images. MATERIALS AND METHODS: Patients histopathologically diagnosed with OKC from 2003 to 2016 by the Oral and Maxillofacial Pathology Department were retrospectively reviewed. Radiographs of these cases were retrieved from the archives of the Department of Oral and Maxillofacial Radiology, Faculty of Dentistry, Mahidol University. The location, shape, border, cortication, locularity, relationship of the lesion to an embedded tooth, displacement of adjacent teeth, root resorption and bony expansion of the lesion were assessed. RESULTS: Radiographs of 91 patients (44 males, 47 females) with a mean age of 31 years (range 10 to 84 years) were analyzed. Five patients were syndromic; hence, a total of 103 OKCs were studied. The most common location was the ramus of the mandible (32%), followed by the posterior maxilla (29%). Most cases presented as a well-defined unilocular radiolucency with a smooth and corticated border. The lesion was associated with an embedded tooth in 48 lesions (47%). Eighty-five percent of the embedded teeth were impacted third molars. Thirty-seven percent of the embedded teeth were entirely encapsulated in the lesion. The lesion attached to the embedded tooth at the cementoenamel junction (CEJ) in 40% of cases and extended to part of the root in 23% of cases. Tooth displacement and root resorption were found in 29% and 6% of cases, respectively. Bony expansion in the bucco-lingual dimension was seen in 63% of cases. 
CONCLUSION: OKCs were predominant in the posterior region of the mandible, with radiographic features of a well-defined unilocular radiolucency with a smooth and corticated margin. The lesions might relate to an embedded tooth by surrounding the entire tooth, attaching at the CEJ level or extending to part of the root. Bony expansion could be found, but tooth displacement and root resorption were not common. These features might help in giving the differential diagnosis. Keywords: cone beam computed tomography, imaging dentistry, odontogenic keratocyst, radiographic features
Procedia PDF Downloads 128
245 Adaptation of Hough Transform Algorithm for Text Document Skew Angle Detection
Authors: Kayode A. Olaniyi, Olabanji F. Omotoye, Adeola A. Ogunleye
Abstract:
Skew detection and correction form an important part of digital document analysis, because uncompensated skew can deteriorate document features and complicate further document image processing steps. Efficient text document analysis and digitization can rarely be achieved when a document is skewed even at a small angle. Once a document has been digitized through the scanning system and binarization has been achieved, document skew correction is required before further image analysis. Research efforts have been put into this area, with algorithms developed to eliminate document skew. Skew angle correction algorithms can be compared based on performance criteria, the most important being accuracy of skew angle detection, range of detectable skew angles, speed of processing the image, computational complexity and, consequently, memory space used. The standard Hough Transform has successfully been implemented for text document skew angle estimation. However, the accuracy of the standard Hough Transform algorithm depends largely on how fine the step size used for the angle is; higher accuracy consequently consumes more time and memory space, especially where the number of pixels is considerably large. Whenever the Hough transform is used, there is always a trade-off between accuracy and speed, so a more efficient solution is needed that optimizes space as well as time. In this paper, an improved Hough transform (HT) technique that optimizes space as well as time to robustly detect document skew is presented. The modified Hough Transform algorithm resolves the contradiction between memory space, running time and accuracy. Our algorithm starts with a first angle estimate, accurate to zero decimal places, obtained using the standard Hough Transform algorithm, achieving minimal running time and space but lacking relative accuracy. 
Then, to increase accuracy, if the estimated angle found using the basic Hough algorithm is x degrees, we rerun the basic algorithm over a narrow range around x degrees with an accuracy of one decimal place. The same process is iterated until the desired level of accuracy is achieved. Our skew estimation and correction procedure for text images is implemented using MATLAB. The memory space estimates and processing times are also tabulated, with skew angles assumed to lie between 0° and 45°. The simulation results, demonstrated in MATLAB, show the high performance of our algorithms, with less computational time and memory space used in detecting document skew for a variety of documents with different levels of complexity. Keywords: Hough transform, skew detection, skew angle, skew correction, text document
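The coarse-to-fine refinement described above can be sketched independently of the Hough accumulator itself. In the sketch below, `score` is a hypothetical stand-in for the accumulator's peak strength at a candidate angle; a real implementation would recompute the Hough transform at each resolution rather than call a closed-form function.

```python
def estimate_skew(score, lo=0.0, hi=45.0, decimals=3):
    """Coarse-to-fine angle search: find the best whole-degree angle
    first, then refine by one decimal place per pass."""
    step = 1.0
    # Coarse pass: integer-degree resolution over the full range.
    best = max((lo + i * step for i in range(int((hi - lo) / step) + 1)),
               key=score)
    # Refinement passes: shrink the step by 10x and search a narrow
    # window around the previous estimate each time.
    for _ in range(decimals):
        step /= 10.0
        candidates = [best + k * step for k in range(-9, 10)]
        candidates = [a for a in candidates if lo <= a <= hi]
        best = max(candidates, key=score)
    return best

# Synthetic scoring function peaked at a known skew angle, standing in
# for the Hough accumulator's response at each candidate angle.
true_angle = 12.345
angle = estimate_skew(lambda a: -abs(a - true_angle))
```

Each pass evaluates only a handful of candidate angles, which is the source of the time and memory savings over a single fine-grained sweep of the whole 0°-45° range.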
Procedia PDF Downloads 159
244 A Model of the Universe without Expansion of Space
Authors: Jia-Chao Wang
Abstract:
A model of the universe that does not invoke space expansion is proposed to explain the observed redshift-distance relation and the cosmic microwave background radiation (CMB). The main hypothesized feature of the model is that photons traveling in space interact with the CMB photon gas. This interaction causes the photons to gradually lose energy through dissipation and, therefore, experience redshift. The interaction also causes some of the photons to be scattered off their track toward an observer and, therefore, results in beam intensity attenuation. As observed, the CMB exists everywhere in space and its photon density is relatively high (about 410 per cm³). The small average energy of the CMB photons (about 6.3×10⁻⁴ eV) can reduce the energies of traveling photons gradually and will not alter their momenta drastically as in, for example, Compton scattering, which would totally blur the images of distant objects. An object moving through a thermalized photon gas, such as the CMB, experiences a drag, because the object sees a blueshifted photon gas along the direction of motion and a redshifted one in the opposite direction. An example of this effect is the observed CMB dipole: the solar system travels at about 368 km/s relative to the CMB (the Local Group at about 600 km/s). In the all-sky map from the COBE satellite, radiation in the direction of motion appears 3.35 mK hotter than the average temperature, 2.725 K, while radiation on the opposite side of the sky is 3.35 mK colder. The pressure of a thermalized photon gas is given by Pγ = Eγ/3 = αT⁴/3, where Eγ is the energy density of the photon gas and α is the radiation constant (α = 4σ/c, with σ the Stefan-Boltzmann constant). The observed CMB dipole therefore implies a pressure difference between the two sides of the earth, resulting in a CMB drag on the earth. By plugging in suitable estimates of the quantities involved, such as the cross section of the earth and the temperatures on the two sides, this drag can be estimated to be tiny. 
But for a photon traveling at the speed of light, 300,000 km/s, the drag can be significant. In the present model, for the dissipation part, it is assumed that a photon traveling from a distant object toward an observer has an effective interaction cross section pushing against the pressure of the CMB photon gas. For the attenuation part, the coefficient of the typical attenuation equation is used as a parameter. The values of these two parameters are determined by fitting the 748 distance modulus (µ) vs. redshift (z) data points compiled from 643 supernova and 105 γ-ray burst observations, with z values up to 8.1. The fit is as good as that obtained from the lambda cold dark matter (ΛCDM) model using online cosmological calculators and Planck 2015 results. The model can be used to interpret Hubble's constant, Olbers' paradox, the origin and blackbody nature of the CMB radiation, the broadening of supernova light curves, and the size of the observable universe. Keywords: CMB as the lowest energy state, model of the universe, origin of CMB in a static universe, photon-CMB photon gas interaction
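The photon-gas pressure formula Pγ = Eγ/3 can be checked numerically with a short editorial sketch. It uses the radiation constant a = 4σ/c and the COBE dipole amplitude of roughly 3.35 mK; the pressure difference computed at the end is an illustration of the drag argument, not the author's fitted calculation.

```python
# Radiation energy density u = a*T^4 and photon-gas pressure P = u/3,
# evaluated at the CMB temperature.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
C = 2.99792458e8         # speed of light, m s^-1
A_RAD = 4.0 * SIGMA / C  # radiation constant, J m^-3 K^-4

def photon_gas_pressure(temperature_k):
    """Pressure (Pa) of a thermalized photon gas at the given temperature."""
    return A_RAD * temperature_k ** 4 / 3.0

p_cmb = photon_gas_pressure(2.725)  # ~1.4e-14 Pa

# Pressure difference across a body moving through the CMB, using the
# dipole-shifted temperatures on its leading and trailing sides.
dT = 3.35e-3  # K, approximate CMB dipole amplitude (COBE)
dp = photon_gas_pressure(2.725 + dT) - photon_gas_pressure(2.725 - dT)
```

The resulting pressure difference is roughly four orders of magnitude below the mean CMB pressure, consistent with the abstract's remark that the drag on a slow-moving body like the earth is tiny.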
Procedia PDF Downloads 134
243 Optical Assessment of Marginal Sealing Performance around Restorations Using Swept-Source Optical Coherence Tomography
Authors: Rima Zakzouk, Yasushi Shimada, Yasunori Sumi, Junji Tagami
Abstract:
Background and purpose: The resin composite has become the main material for the restoration of caries in recent years due to its aesthetic characteristics, especially with the development of adhesive techniques. The quality of adhesion to tooth structures depends on an exchange process between inorganic tooth material and synthetic resin, and on micromechanical retention promoted by resin infiltration into partially demineralized dentin. Optical coherence tomography (OCT) is a noninvasive diagnostic method for obtaining high-resolution cross-sectional images of biological tissue at the micron scale. The aim of this study was to use SS-OCT to evaluate gap formation at the adhesive/tooth interface of a two-step self-etch adhesive applied with or without phosphoric acid pre-etching in different regions of the teeth. Materials and methods: Round tapered cavities (2×2 mm) were prepared in the cervical part of bovine incisors and divided into 2 groups (n=10): in the SE group, the self-etch adhesive (Clearfil SE Bond) was applied directly; in the PA group, the teeth were treated with acid etching before applying the self-etch adhesive. Subsequently, both groups were restored with Estelite Flow Quick flowable composite resin and observed under OCT. Following 5000 thermal cycles, the same section was obtained again for each cavity using OCT at a 1310-nm wavelength. Scanning was repeated after two months to monitor gap progress. The gap length was then measured using image analysis software, and statistical analysis between both groups was done using SPSS software. After that, the cavities were sectioned and observed under a Confocal Laser Scanning Microscope (CLSM) to confirm the OCT results. Results: Gaps formed at the bottom of the cavity were longer than those formed at the margin and the dento-enamel junction in both groups. On the other hand, the pre-etching treatment damaged the DEJ regions, creating longer gaps. 
After 2 months, the results showed significant progression of the gap length at the bottom regions in both groups. In conclusion, the phosphoric acid etching treatment did not reduce the gap length in most regions of the cavity. Significance: The bottom region of the tooth was more prone to gap formation than the margin and DEJ regions, and the DEJ was damaged by the phosphoric acid treatment. Keywords: optical coherence tomography, self-etch adhesives, bottom, dento-enamel junction
Procedia PDF Downloads 227
242 Overcoming Mistrusted Masculinity: Analyzing Muslim Men and Their Aspirations for Fatherhood in Denmark
Authors: Anne Hovgaard Jorgensen
Abstract:
This study investigates how Muslim fathers in Denmark struggle to overcome notions of mistrust from teachers and educators. Starting from school-home cooperation (parent conferences, school-home communication, etc.), the study finds that many Muslim fathers do not feel acknowledged as a resource in the upbringing of their children. To explain these experiences further, the study suggests the notion of ‘mistrusted masculinity’ to grasp the controlling image these fathers meet in various schools and child-care institutions in the Danish welfare state. The paper is based on 9 months of fieldwork in a Danish school, a social housing area and various ‘father groups’ in Denmark. Additionally, 50 interviews were conducted with fathers, children, mothers, schoolteachers, and educators. Using Connell's concepts of ‘hegemonic’ and ‘marginalized’ masculinity as stepping stones, the paper argues that these concepts may paint a too static and dualistic picture of gender. By applying the concepts of ‘emergent masculinity’ and ‘emergent fatherhood’, the paper brings a long-needed discussion of how Muslim men in Denmark struggle to overcome and change the controlling images of them as patriarchal and/or ignorant fathers regarding the upbringing of their children. As such, the paper shows how Muslim fathers are taking action to change this controlling image, e.g. through various ‘father groups’. The paper is inspired by the phenomenological notion of ‘experience’, and in its light, the paper tells the fathers’ stories about the upbringing of their children and their aspirations for fatherhood. These stories shed light on how these fathers take care of their children in everyday life. The study also shows that the controlling image of these fathers has affected how some Muslim fathers actually practice fatherhood. The study shows that fear of family interventions from teachers or social workers, e.g., 
has left some Muslim fathers in a limbo, afraid of scolding their children and confused about ‘what good parenting in Denmark is’. This seems to have led to a more laissez-faire upbringing than these fathers actually wanted. This study is important since anthropologists have generally underexposed the notion of fatherhood and how fathers engage in the upbringing of their children. Moreover, the vast majority of qualitative studies of fatherhood have concerned white middle-class fathers living in nuclear families. In addition, this study is crucial at this very moment due to the major refugee crisis in Denmark and in the Western world in general, a crisis which has resulted in a vast number of scare campaigns against Islam from different nationalistic political parties, reinforcing the negative controlling image of Muslim fathers. Keywords: fatherhood, Muslim fathers, mistrust, education
Procedia PDF Downloads 191
241 Rapid Soil Classification Using Computer Vision with Electrical Resistivity and Soil Strength
Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, P. L. Goh, Grace H. B. Foo, M. L. Leong
Abstract:
This paper presents the evaluation of various soil testing methods, such as the four-probe soil electrical resistivity method and the cone penetration test (CPT), that can complement a newly developed rapid soil classification scheme using computer vision, to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from the local construction industry are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups (“Good Earth” and “Soft Clay”) based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labor-intensive, and thus a rapid classification method is needed at the SGs. Four-probe soil electrical resistivity and CPT were evaluated for their feasibility as suitable additions to the computer vision system, to further develop this innovative non-destructive and instantaneous classification method. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). A previous study found that the ANN model, coupled with the apparent electrical resistivity of soil (ρ), can classify soils into “Good Earth” and “Soft Clay” in less than a minute, with an accuracy of 85% based on selected representative soil images. To further improve the technique, the following three measurements were targeted for addition to the computer vision scheme: ρ, measured using a set of four probes arranged in Wenner’s array; the soil strength, measured using a modified mini cone penetrometer; and w, measured using a set of time-domain reflectometry (TDR) probes. 
Laboratory proof-of-concept was conducted through a series of seven tests with three types of soils: “Good Earth”, “Soft Clay”, and a mix of the two. Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w and CPT measurements can be collectively analyzed to classify soils into “Good Earth” or “Soft Clay”, and are feasible as complementary methods to the computer vision system. Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification
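The GLCM textural parameters at the heart of the computer vision step can be illustrated with a minimal pure-Python sketch (production systems would typically use a library implementation such as scikit-image). The two patches below are hypothetical 4-level quantized images, not data from the study; the banded patch should show higher GLCM contrast than the smooth one.

```python
def glcm(image, offset=(0, 1), levels=4):
    """Normalized Grey Level Co-occurrence Matrix for one pixel offset."""
    dr, dc = offset
    rows, cols = len(image), len(image[0])
    counts = [[0] * levels for _ in range(levels)]
    total = 0
    for r in range(rows):
        for c in range(cols):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < rows and 0 <= c2 < cols:
                counts[image[r][c]][image[r2][c2]] += 1
                total += 1
    return [[v / total for v in row] for row in counts]

def contrast(p):
    """GLCM contrast: sum over i, j of p(i, j) * (i - j)^2."""
    return sum(p[i][j] * (i - j) ** 2
               for i in range(len(p)) for j in range(len(p)))

# Hypothetical 4-level quantized patches: uniform vs. vertically banded.
smooth = [[1, 1, 1, 1]] * 4
banded = [[0, 3, 0, 3]] * 4
```

Feature vectors of such GLCM statistics (contrast, homogeneity, energy, etc.) are what the ANN consumes for the “Good Earth” / “Soft Clay” decision.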
Procedia PDF Downloads 239
240 Holographic Art as an Approach to Enhance Visual Communication in Egyptian Community: Experimental Study
Authors: Diaa Ahmed Mohamed Ahmedien
Abstract:
Nowadays, it cannot be denied that the most important interactive art trends have appeared as a result of significant scientific shifts in the modern sciences, and holographic art is no exception: it is considered one of the major contemporary interactive art trends in the visual arts. The holographic technique emerged from modern physics in the late 1940s, when Dennis Gabor sought to improve the quality of electron microscope images; it later reached Margaret Benyon’s art exhibitions and then passed through many refinements, over more than 70 years, to enhance its quality and its artistic applications, both technically and visually. As a modest extension of these great efforts, this research aimed to enroll a sample of ordinary people in the Egyptian community in a holographic recording program to record their appreciated objects or antiques, and thereby examine their ability to interact with modern techniques in the visual communication arts. The research thus tried to answer three main questions: ‘Can we use analog holographic techniques to unlock new theoretical and practical knowledge in interactive arts for the public in the Egyptian community?’, ‘To what extent can holographic art become familiar to the public and enable them to produce interactive artistic samples?’, and ‘Are there possibilities to build a holographic interactive program for ordinary people that leads them to a better understanding of visual communication in public and an awareness of interactive art trends?’ 
The first part of this research relied on experimental methods: it was conducted in the laser lab at Cairo University, using an Nd:YAG laser (532 nm) and a holographic optical layout, with selected samples of Egyptian people who were asked to record their appreciated objects after learning the recording methods. The second part consisted of discussion panels held to discuss the results and how participants felt about their holographic artistic products, through surveys, questionnaires, note-taking and critiques of the holographic artworks. Our practical experiments and final discussions have led us to say that this experimental research was able to take most participants through a paradigm shift in their visual and conceptual experiences towards more interaction with contemporary visual art trends. It is an attempt to emphasize the role of the mature relationship between art, science and technology, to spread interactive arts in our community through the latest scientific and artistic developments around the world, and to highlight the role of this relationship in our societies, particularly among those who have never been enrolled in practical art programs before. Keywords: Egyptian community, holographic art, laser art, visual art
Procedia PDF Downloads 479
239 The Functions of Spatial Structure in Supporting Socialization in Urban Parks
Authors: Navid Nasrolah Mazandarani, Faezeh Mohammadi Tahrodi, Jr., Norshida Ujang, Richard Jan Pech
Abstract:
Human evolution has designed us to be dependent on social and natural settings, but the design of our modern cities often ignores this fact. It is evident that high-rise buildings dominate most metropolitan city centers. As a result, urban parks are very limited and in many cases are not responsive to our social needs in these urban ‘jungles’. This paper emphasizes the functions of urban morphology in supporting socialization in Lake Garden, one of the main urban parks in Kuala Lumpur, Malaysia. It discusses two relevant theories: first, the concept of users’ experience coined by Kevin Lynch (1960), which states that way-finding is related to the process of forming mental maps of environmental surroundings; second, the concept of social activity coined by Jan Gehl (1987), which holds that urban public spaces can be more attractive when they provide welcoming places in which people can walk around and spend time. Until recently, research on socio-spatial behavior mainly focused on social ties, place attachment and human well-being, with less focus on the spatial dimension of social behavior. This paper examines socio-spatial behavior within the spatial structure of the urban park by exploring the relationship between way-finding and social activity. The urban structures defined by the paths and nodes were analyzed as the fundamental topological structure of space to understand their effects on the pattern of social engagement. The study uses a photo questionnaire survey to inspect the spatial dimension in relation to social activities within paths and nodes. To understand the legibility of the park, spatial cognition was evaluated using sketch maps produced by 30 participants who visited the park. The results of the sketch mapping indicated that a spatial image has a strong interrelation with socio-spatial behavior. Moreover, an integrated spatial structure of the park generated integrated use and social activity. 
It was found that people recognized and remembered the spaces where they engaged in social activities. They could experience the park more thoroughly when they found their way continuously through an integrated park structure; therefore, the benefits of both the perceptual and social dimensions of planning and design occurred simultaneously. The findings can assist urban planners and designers in redeveloping urban parks by considering the social quality of design, which contributes to clear mental images of these places. Keywords: spatial structure, social activities, sketch map, urban park, way-finding
Procedia PDF Downloads 315
238 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps
Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo
Abstract:
With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphics processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser, running on a CPU, issues GL commands, which render the images displayed by the currently running Web app, to the GPU, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes the other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more contended resource. For example, when the GPU work queue is saturated with outstanding commands, lowering the performance level of the CPU does not affect the user experience, because it is already degraded by the retarded execution of GPU commands. Consequently, it is desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach raises two technical challenges: identification of the bottleneck resource, and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which resource is the bottleneck, if one exists. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of user experience, and it is executed purely by the CPU.
The proposed scheme uses a weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measured frames-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%, and consequently, the proposed scheme decreases the performance level of the CPU by one step. In contrast, when its utilization is below 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the GPU. Even for the processing unit that is not on the critical path, an excessive performance drop can occur and may adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level, by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, while worsening FPS by only 1.07% on average.
Keywords: interactive applications, power management, QoS, Web apps, WebGL
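The threshold logic of the abstract above can be sketched in a few lines; the class and method names here are hypothetical, and only the 90%/60% thresholds, the weighted averaging, and the one-step frequency changes come from the text.

```python
# Sketch of the bottleneck-detection logic described above (hypothetical
# names; the 90%/60% thresholds and gradual one-step changes are from the text).

class BottleneckGovernor:
    def __init__(self, alpha=0.3):
        self.alpha = alpha      # smoothing weight for the running average (assumed)
        self.avg_util = None    # weighted average of Window Manager CPU utilization

    def update(self, wm_util):
        """Fold a new Window Manager utilization sample (0-100) into the average."""
        if self.avg_util is None:
            self.avg_util = float(wm_util)
        else:
            self.avg_util = self.alpha * wm_util + (1 - self.alpha) * self.avg_util
        return self.decide()

    def decide(self):
        if self.avg_util > 90.0:
            return "lower CPU one step"   # CPU is on the critical path; GPU waits
        if self.avg_util < 60.0:
            return "lower GPU one step"   # GPU work queue is the bottleneck
        return "hold"                     # no clear bottleneck; keep current levels

gov = BottleneckGovernor()
for sample in (95, 96, 97, 98):           # sustained high WM utilization
    action = gov.update(sample)
print(action)  # → lower CPU one step
```

The exponential weighting damps single-sample spikes, matching the paper's goal of avoiding excessive sensitivity and fluctuation.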
Procedia PDF Downloads 192
237 Informational Habits and Ideology as Predictors for Political Efficacy: A Survey Study of the Brazilian Political Context
Authors: Pedro Cardoso Alves, Ana Lucia Galinkin, José Carlos Ribeiro
Abstract:
Political participation can be a somewhat tricky subject to define, in no small part due to constant changes in the concept, the fruit of efforts to include new forms of participatory behavior that go beyond traditional institutional channels. With the advent of the internet and mobile technologies, defining political participation has become an even more complicated endeavor, given the breadth of politicized behaviors that are expressed through these media, be it in the very organization of social movements, in the propagation of politicized texts, videos and images, or in the micropolitical behaviors that are expressed in daily interaction. In fact, the very frontiers that delimit physical and digital spaces have become ever more diluted due to technological advancements, leading to a hybrid existence that is simultaneously physical and digital, no longer bound, as it once was, by the temporal limitations of classic communications. Moving away from the institutionalized actions of traditional political behavior, an idea of constant and fluid participation, which occurs in our daily lives through conversations, posts, tweets and other digital forms of expression, is discussed. This discussion focuses on the factors that precede more direct forms of political participation, interpreting the relation between informational habits, ideology, and political efficacy. Though some informational habits can be considered political participation by some authors, a distinction is made to establish a logical flow of behaviors leading to participation; that is, one must gather and process information before acting on it.
To reach this objective, a quantitative survey is currently being applied in Brazilian social media, evaluating feelings of political efficacy, social and economic issue-based ideological stances, and informational habits pertaining to collection, fact-checking, and the diversity of sources and ideological positions present in the participant’s political information network. The measure being used for informational habits relies strongly on a mix of information literacy and political sophistication concepts, bringing a more up-to-date understanding of information and knowledge production and processing in contemporary hybrid (physical-digital) environments. Though data is still being collected, preliminary analyses point towards a strong correlation between informational habits and political efficacy, while ideology shows a weaker influence over efficacy. Moreover, social ideology and economic ideology seem to be strongly correlated in the sample; such intermingling between social and economic ideals is generally considered a red flag for political polarization.
Keywords: political efficacy, ideology, information literacy, cyberpolitics
Procedia PDF Downloads 234
236 Biopolymers: A Solution for Replacing Polyethylene in Food Packaging
Authors: Sonia Amariei, Ionut Avramia, Florin Ursachi, Ancuta Chetrariu, Ancuta Petraru
Abstract:
The food industry is one of the major generators of plastic waste, derived from conventional synthetic petroleum-based polymers, which are non-biodegradable and used especially for packaging. After the food is consumed, these packaging materials raise serious environmental concerns, due both to the materials themselves and to the organic residues that adhere to them. Specialists and researchers are therefore concerned with eliminating the problems related to conventional non-biodegradable materials and unnecessary plastic, and replacing them with biodegradable and edible materials, supporting the common effort to protect the environment. Even though environmental and health concerns will cause more consumers to switch to a plant-based diet, most people will continue to add more meat to their diet. The paper presents the possibility of replacing the polyethylene packaging on the surface of trays for meat preparations with biodegradable packaging obtained from biopolymers. During the storage of meat products, deterioration may occur through lipid oxidation and microbial spoilage, as well as through modification of the organoleptic characteristics. For this reason, different compositions of polymer mixtures and film-forming conditions must be studied to choose the best packaging material to achieve food safety. The compositions proposed for packaging are obtained from alginate, agar, starch, and glycerol as plasticizer. The tensile strength, elasticity, modulus of elasticity, thickness, density, microscopic images of the samples, roughness, opacity, humidity, water activity, the amount of water transferred, as well as the rate of water transfer through these packaging materials were analyzed.
A total of 28 samples with various compositions were analyzed, and the results showed that the sample with the highest values for hardness, density, and opacity, as well as the smallest water vapor permeability, of 1.2903E-4 ± 4.79E-6, has a component ratio of alginate:agar:glycerol of 3:1.25:0.75. The water activity of the analyzed films varied between 0.2886 and 0.3428 (aw < 0.6), demonstrating that all the compositions ensure the preservation of the products in the absence of microorganisms. All the determined parameters allow an appreciation of the quality of the packaging films in terms of mechanical resistance, protection against the influence of light, and the transfer of water through the packaging. Acknowledgments: This work was supported by a grant of the Ministry of Research, Innovation, and Digitization, CNCS/CCCDI – UEFISCDI, project number PN-III-P2-2.1-PED-2019-3863, within PNCDI III.
Keywords: meat products, alginate, agar, starch, glycerol
Procedia PDF Downloads 168
235 Visual Aid and Imagery Ramification on Decision Making: An Exploratory Study Applicable in Emergency Situations
Authors: Priyanka Bharti
Abstract:
Decades ago, designs were based on common sense and tradition, but with advances in visualization technology and research, we are now able to comprehend the cognitive processes involved in decoding visual information. However, many aspects of visuals still need intense research to deliver efficient explanations. Visuals are a mode of information representation through images, symbols and graphics. They play an impactful role in decision making by facilitating quick recognition, comprehension, and analysis of a situation. They enhance problem-solving capabilities by enabling the processing of more data without overloading the decision maker. Research suggests that visuals offer an improved learning environment by a factor of 400 compared to textual information. Visual information engages learners at a cognitive level and triggers the imagination, which enables the user to process the information faster (visuals are processed 60,000 times faster in the brain than text). Appropriate information, its visualization, and its presentation are known to aid and intensify the decision-making process for users. However, most literature discusses the role of visual aids in comprehension and decision making during normal conditions alone. Unlike emergencies, in a normal situation (e.g., our day-to-day life) users are neither exposed to stringent time constraints nor face the anxiety of survival, and have sufficient time to evaluate various alternatives before making any decision. An emergency is an unexpected, possibly fatal real-life situation which may inflict serious ramifications on both human life and material possessions unless corrective measures are taken instantly. The situation demands that the exposed user negotiate a dynamic and unstable scenario in the absence or lack of any preparation, yet still take swift and appropriate decisions to save lives or possessions.
But the resulting stress and anxiety restrict cue sampling, decrease vigilance, reduce the capacity of working memory, cause premature closure in evaluating alternative options, and result in task shedding. Limited time, uncertainty, high stakes and vague goals negatively affect the cognitive abilities needed to take appropriate decisions. Moreover, naturalistic decision making by experts has been understood in far more depth than that of the ordinary user. Therefore, in this study, the author aims to understand the role of visual aids in supporting rapid comprehension for taking appropriate decisions during an emergency situation.
Keywords: cognition, visual, decision making, graphics, recognition
Procedia PDF Downloads 268
234 Retrospective Cartography of Tbilisi and Surrounding Area
Authors: Dali Nikolaishvili, Nino Khareba, Mariam Tsitsagi
Abstract:
Tbilisi has been the capital of Georgia since the 5ᵗʰ century. In the historical past, the city's area was covered by forest. Nowadays, the situation has changed dramatically. Dozens of problems are caused by the damage and destruction of green cover. The solution at first glance seems uncomplicated (planting trees and creating green quarters), but given the increasing tendency toward built-up areas, the problem remains unsolved. Finding ways to overcome such obstacles is important even for protecting the health of society. The main aim of the research was the retrospective cartography of the forest area of Tbilisi using GIS technology and remote sensing. Research on the dynamics of forest cover in Tbilisi and its surroundings included the following steps. The first was the assessment of the forest dynamics in Tbilisi and its surroundings; this survey was mainly based on the retrospective mapping method. The next step was studying, comparing and identifying the narrative sources using GIS technology. The last was the analysis of the changes from the 1980s to the present day on the basis of the interpretation of remotely sensed images. After creating a unified cartographic basis, the maps and plans of different periods were linked to this geodatabase. Data about green parks, individual old plants existing in private yards, and respondents' information (according to a questionnaire created in advance) were added to the basic database, as were the general plan of Tbilisi and scientific works. On the basis of an analysis of historical, including cartographic, sources, forest-cover maps for different periods of time were made. In addition, a catalogue of individual green parks (location, area, typical composition, name and so on) was compiled, which was the basis for creating several thematic maps. Areas with a high rate of green-area degradation were identified. Several maps depicting the dynamics of the forest cover of Tbilisi were created and analyzed.
Methods for linking the data of old cartographic sources to the modern basis were also developed, the results of which may be used in the urban planning of Tbilisi. Understanding, perceiving and analyzing the real condition of green cover in Tbilisi and its problems will, in turn, help in taking appropriate measures for the maintenance of ancient plants, developing forests, and properly planning parks, squares, and recreational sites, because a healthy environment is the main condition of human health and implies the rational development of the city.
Keywords: catalogue of green areas, GIS, historical cartography, cartography, remote sensing, Tbilisi
Procedia PDF Downloads 137
233 Cardiothoracic Ratio in Postmortem Computed Tomography: A Tool for the Diagnosis of Cardiomegaly
Authors: Alex Eldo Simon, Abhishek Yadav
Abstract:
This study aimed to evaluate the utility of postmortem computed tomography (CT) and heart weight measurements in the assessment of cardiomegaly in cases of sudden death of cardiac origin by comparing the results of these two diagnostic methods. The study retrospectively analyzed postmortem computed tomography (PMCT) data from 54 cases of sudden natural death and compared the findings with those of the autopsy. The study involved measuring the cardiothoracic ratio (CTR) from coronal CT images and determining the actual cardiac weight by weighing the heart during the autopsy. The inclusion criteria for the study were cases of sudden death suspected to be caused by cardiac pathology, while exclusion criteria included death due to unnatural causes such as trauma or poisoning, diagnosed natural causes of death related to organs other than the heart, and cases of decomposition. Sensitivity, specificity, and diagnostic accuracy were calculated, and to evaluate the accuracy of using the CTR to detect an enlarged heart, the study generated receiver operating characteristic (ROC) curves. The CTR is a radiological tool used to assess cardiomegaly by measuring the maximum cardiac diameter in relation to the maximum transverse diameter of the chest wall. The clinically used CTR criterion has been modified from 0.50 to 0.57 for use in postmortem settings, where abnormalities can be detected by comparing CTR values to this threshold. A CTR value of 0.57 or higher is suggestive of hypertrophy but not conclusive. Similarly, heart weight is measured during the traditional autopsy, and a cardiac weight greater than 450 grams is defined as hypertrophy. Of the 54 cases evaluated, 22 (40.7%) had a CTR above 0.50 and up to 0.57, and 12 cases (22.2%) had a CTR greater than 0.57, which was defined as hypertrophy. The mean CTR was calculated as 0.52 ± 0.06.
Among the 54 cases evaluated, the weight of the heart was measured, and the mean was calculated as 369.4 ± 99.9 grams. Of the 54 cases, 12 were found to have hypertrophy as defined by PMCT, while only 9 cases were identified with hypertrophy at traditional autopsy. The sensitivity of the hypertrophy test was found to be 55.56% (95% CI: 26.66, 81.12), the specificity was 84.44% (95% CI: 71.22, 92.25), and the diagnostic accuracy was 79.63% (95% CI: 67.1, 88.23). A limitation of the study was the low sample size of only 54 cases, which may limit the generalizability of the findings. The comparison of the cardiothoracic ratio with heart weight in this study suggests that PMCT may serve as a screening tool for medico-legal autopsies when performed by forensic pathologists. However, it should be noted that the low sensitivity of the test (55.56%) may limit its diagnostic accuracy, and therefore further studies with larger sample sizes and more diverse populations are needed to validate these findings.
Keywords: PMCT, virtopsy, CTR, cardiothoracic ratio
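The reported diagnostic metrics follow from a standard 2×2 confusion table. The cell counts below are inferred from the reported rates (9 autopsy-positive, 12 PMCT-positive, n = 54) and are illustrative, not taken from the paper's data tables:

```python
# Diagnostic metrics for PMCT-defined hypertrophy (CTR >= 0.57) against
# autopsy heart weight (> 450 g) as the reference standard. The confusion
# counts are inferred from the abstract's rates and are illustrative.

tp, fn = 5, 4    # autopsy-positive cases: detected / missed by PMCT
fp, tn = 7, 38   # autopsy-negative cases: false alarms / correct negatives

sensitivity = tp / (tp + fn)                 # 5/9   ≈ 55.56%
specificity = tn / (tn + fp)                 # 38/45 ≈ 84.44%
accuracy = (tp + tn) / (tp + fn + fp + tn)   # 43/54 ≈ 79.63%

print(f"sensitivity {sensitivity:.2%}, specificity {specificity:.2%}, "
      f"accuracy {accuracy:.2%}")
```

Note that these counts are consistent with the 12 PMCT-positive cases (tp + fp = 12) and the total of 54.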
Procedia PDF Downloads 81
232 Diselenide-Linked Redox Stimuli-Responsive Methoxy Poly(Ethylene Glycol)-b-Poly(Lactide-Co-Glycolide) Micelles for the Delivery of Doxorubicin in Cancer Cells
Authors: Yihenew Simegniew Birhan, Hsieh Chih Tsai
Abstract:
The recent advancements in synthetic chemistry and nanotechnology have fostered the development of different nanocarriers for enhanced intracellular delivery of pharmaceutical agents to tumor cells. Polymeric micelles (PMs), characterized by small size, appreciable drug loading content (DLC), better accumulation in tumor tissue via the enhanced permeability and retention (EPR) effect, and the ability to avoid detection and subsequent clearance by the mononuclear phagocyte (MNP) system, are convenient for improving the poor solubility, slow absorption and non-selective biodistribution of payloads embedded in their hydrophobic cores and, hence, enhance the therapeutic efficacy of chemotherapeutic agents. Recently, redox-responsive polymeric micelles have gained significant attention for the delivery and controlled release of anticancer drugs in tumor cells. In this study, we synthesized a redox-responsive diselenide-bond-containing amphiphilic polymer, Bi(mPEG-PLGA)-Se₂, from mPEG-PLGA and 3,3'-diselanediyldipropanoic acid (DSeDPA) using DCC/DMAP as coupling agents. The successful synthesis of the copolymers was verified by different spectroscopic techniques. Above the critical micelle concentration, the amphiphilic copolymer Bi(mPEG-PLGA)-Se₂ self-assembled into stable micelles. The DLS data indicated that the hydrodynamic diameter of the micelles (123.9 ± 0.85 nm) was suitable for extravasation into tumor cells through the EPR effect. The DLC and encapsulation efficiency (EE) of the DOX-loaded micelles were found to be 6.61 wt% and 54.9%, respectively. The DOX-loaded micelles showed an initial burst release followed by a sustained release trend, in which 73.94% and 69.54% of the encapsulated DOX was released upon treatment with 6 mM GSH and 0.1% H₂O₂, respectively. The biocompatible nature of the Bi(mPEG-PLGA)-Se₂ copolymer was confirmed by the cell viability study.
In addition, the DOX-loaded micelles exhibited significant inhibition against HeLa cells (44.46%) at a maximum dose of 7.5 µg/mL. Fluorescence microscope images of HeLa cells treated with 3 µg/mL (equivalent DOX concentration) revealed efficient internalization and accumulation of DOX-loaded Bi(mPEG-PLGA)-Se₂ micelles in the cytosol of cancer cells. In conclusion, the intelligent, biocompatible, and redox stimuli-responsive behavior of the Bi(mPEG-PLGA)-Se₂ copolymer marks the potential of diselenide-linked mPEG-PLGA micelles for the delivery and on-demand release of chemotherapeutic agents in cancer cells.
Keywords: anticancer drug delivery, diselenide bond, polymeric micelles, redox-responsive
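The reported DLC and EE figures follow from the standard definitions of these two quantities. The masses below are hypothetical, chosen only so the outputs land near the reported 6.61 wt% and 54.9%:

```python
# Standard definitions of drug loading content (DLC) and encapsulation
# efficiency (EE). All input masses are hypothetical, for illustration only.

def dlc_wt_percent(m_drug_loaded, m_micelles):
    """DLC = mass of loaded drug / total mass of drug-loaded micelles, in wt%."""
    return 100.0 * m_drug_loaded / m_micelles

def ee_percent(m_drug_loaded, m_drug_fed):
    """EE = mass of loaded drug / mass of drug initially fed, in %."""
    return 100.0 * m_drug_loaded / m_drug_fed

m_fed = 2.0        # mg DOX fed to the preparation (hypothetical)
m_loaded = 1.098   # mg DOX actually encapsulated (hypothetical)
m_micelles = 16.6  # mg of drug-loaded micelles recovered (hypothetical)

print(f"DLC = {dlc_wt_percent(m_loaded, m_micelles):.2f} wt%")
print(f"EE  = {ee_percent(m_loaded, m_fed):.1f}%")
```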
Procedia PDF Downloads 110
231 Nondestructive Inspection of Reagents under High Attenuated Cardboard Box Using Injection-Seeded THz-Wave Parametric Generator
Authors: Shin Yoneda, Mikiya Kato, Kosuke Murate, Kodo Kawase
Abstract:
In recent years, there have been numerous attempts to smuggle narcotic drugs and chemicals by concealing them in international mail. Combatting this requires a non-destructive technique that can identify such illicit substances in mail. Terahertz (THz) waves can pass through a wide variety of materials, and many chemicals show specific frequency-dependent absorption, known as a spectral fingerprint, in the THz range. It is therefore reasonable to investigate non-destructive mail inspection techniques that use THz waves. In this work, we attempted to identify reagents under highly attenuating shielding materials using an injection-seeded THz-wave parametric generator (is-TPG). Our THz spectroscopic imaging system using is-TPG consisted of two nonlinear crystals for emission and detection of THz waves. A micro-chip Nd:YAG laser and a continuous-wave tunable external cavity diode laser were used as the pump and seed sources, respectively. The pump beam and seed beam were injected into the LiNbO₃ crystal, satisfying the noncollinear phase-matching condition, in order to generate a high-power THz wave. The emitted THz wave was irradiated onto the sample, which was raster-scanned by the x-z stage while changing the frequencies, and we obtained multispectral images. The transmitted THz wave was then focused onto another crystal for detection and up-converted to a near-infrared detection beam based on nonlinear optical parametric effects, wherein the detection beam intensity was measured using an infrared pyroelectric detector. It was difficult to identify reagents in a cardboard box because of high noise levels. In this work, we introduce improvements for noise reduction and image clarification, so that the intensity of the near-infrared detection beam is correctly converted to the intensity of the THz wave. A Gaussian spatial filter is also introduced for a clearer THz image.
Through these improvements and improved analysis methods, we succeeded in identifying reagents hidden in a 42-mm-thick cardboard box filled with several obstacles, which attenuate the signal by 56 dB at 1.3 THz. Using this system, THz spectroscopic imaging was possible for saccharides, and it may also be applied to cases where illicit drugs are hidden in the box and multiple reagents are mixed together. Moreover, THz spectroscopic imaging can be achieved through even thicker obstacles by introducing an NIR detector with higher sensitivity.
Keywords: nondestructive inspection, principal component analysis, terahertz parametric source, THz spectroscopic imaging
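The Gaussian spatial filtering step mentioned above can be sketched with a minimal separable filter. The synthetic image, kernel radius, and sigma below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Minimal sketch of Gaussian spatial filtering applied to a raster-scanned
# THz transmittance image (synthetic data; sigma and kernel size are assumed).

def gaussian_kernel_1d(sigma, radius):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()                      # normalize so the mean is preserved

def gaussian_smooth(image, sigma=1.0):
    """Separable 2-D Gaussian filter: convolve rows, then columns."""
    radius = int(3 * sigma)
    k = gaussian_kernel_1d(sigma, radius)
    padded = np.pad(image, radius, mode="edge")   # replicate edges before filtering
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode="valid"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, k, mode="valid"), 0, out)
    return out

rng = np.random.default_rng(0)
noisy = np.ones((32, 32)) + 0.3 * rng.standard_normal((32, 32))
smooth = gaussian_smooth(noisy, sigma=1.5)
print(noisy.std(), smooth.std())   # the noise standard deviation drops
```

In practice a library routine such as `scipy.ndimage.gaussian_filter` would be used; the explicit version above just shows what the filter does to the per-pixel noise.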
Procedia PDF Downloads 177
230 Gray’s Anatomy for Students: First South Asia Edition Highlights
Authors: Raveendranath Veeramani, Sunil Jonathan Holla, Parkash Chand, Sunil Chumber
Abstract:
Gray’s Anatomy for Students has been a well-appreciated book among undergraduate students of anatomy in Asia. However, the current curricular requirements of anatomy require a more focused and organized approach. The editors of the first South Asia edition of Gray’s Anatomy for Students hereby highlight the modifications and importance of this edition. There is an emphasis on active learning by making the clinical relevance of anatomy explicit. Learning anatomy in context has been fostered by the association between anatomists and clinicians, in keeping with the emerging integrated curriculum of the 21st century. The language has been simplified to aid students who have studied in the vernacular. The original illustrations have been retained, and a few illustrations have been added. More figure numbers are mentioned in the text to encourage students to refer to the illustrations while learning. The text has been made more student-friendly by adding generalizations, classifications and summaries. There are useful review materials at the beginning of the chapters, which include digital resources for self-study. There are updates on imaging techniques to encourage students to appreciate the importance of essential knowledge of the relevant anatomy for interpreting images; due emphasis has also been laid on dissection. Additional importance has been given to the cranial nerves by describing their relevant details with several additional illustrations and flowcharts. This new edition includes innovative features such as set inductions, outlines for subchapters and flowcharts to facilitate learning. Set inductions are mostly clinical scenarios that create interest in the need to study anatomy for the healthcare professions. The outlines are a modern multimodal facilitating approach to various topics that empowers students to explore content and direct their learning, and they include learning objectives and material for review.
The components of the outline encourage the student to be aware of the need to create solutions to clinical problems. The outlines help students direct their learning to recall facts, demonstrate and analyze relationships, use reason to explain concepts, appreciate the significance of structures and their relationships, and apply anatomical knowledge. The 'structures to be identified in a dissection' are given as Levels I, II and III, which represent 'must know', 'desirable to know' and 'nice to know' content, respectively. The flowcharts have been added to give an overview of the course of a structure, to recapitulate important details about structures, and as an aid to recall. A great effort has been made to balance the need for content that enables students to understand concepts with providing the basic material for the current condensed curriculum.
Keywords: Gray's Anatomy, South Asia, human anatomy, students' anatomy
Procedia PDF Downloads 201
229 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are the major geo-environmental problem of the Himalaya because of high ridges, steep slopes, deep valleys, and a complex system of streams. They are mainly triggered by rainfall and earthquakes, and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, situated in the Lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8, and Cartosat DEM data. This paper presents the use of a weighted overlay method in LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the process of terrain factor extraction. Lithological features, LULC, drainage patterns, lineaments, and structural features were extracted using digital image processing techniques. Colour, tone, topography, and stream drainage pattern from the imagery were used to analyse geological features. The slope map, aspect map, and relative relief were created using Cartosat DEM data. The DEM data were also used for the detailed drainage analysis, which includes TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the several causative factors is obtained from experience.
In this method, the influence factor is multiplied by the corresponding rating of a particular class, the result is reclassified, and the LHZ map is prepared. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
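The weighted overlay computation can be sketched as follows; the factor rasters, weights, ratings, and zone breakpoints below are illustrative assumptions, not the values used in the study:

```python
import numpy as np

# Sketch of a weighted overlay for the landslide hazard index (LHI):
# each causative-factor raster is reclassified into ratings, multiplied by
# its influence weight, and summed. All numbers here are illustrative.

factors = {                      # rating rasters (1 = low, 5 = high susceptibility)
    "slope":     np.array([[1, 3], [4, 5]]),
    "rainfall":  np.array([[2, 2], [3, 5]]),
    "lithology": np.array([[1, 4], [2, 3]]),
}
weights = {"slope": 0.5, "rainfall": 0.3, "lithology": 0.2}   # sum to 1

# LHI = sum over factors of (weight * rating), cell by cell
lhi = sum(weights[name] * raster for name, raster in factors.items())

# Reclassify the continuous index into hazard zones for the LHZ map.
zones = np.digitize(lhi, bins=[2.0, 3.5])   # 0 = low, 1 = moderate, 2 = high
print(lhi)
print(zones)
```

In a GIS, the same operation is typically run on full co-registered rasters with the tool's weighted-overlay function; the numpy form just makes the cell-by-cell arithmetic explicit.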
Procedia PDF Downloads 133
228 Land-Use Transitions and Its Implications on Food Production Systems in Rural Landscape of Southwestern Ghana
Authors: Evelyn Asante Yeboah, Kwabena O. Asubonteng, Justice Camillus Mensah, Christine Furst
Abstract:
Smallholder-dominated mosaic landscapes in rural Africa are relevant for food production, biodiversity conservation, and climate regulation. Land-use transitions threaten the multifunctionality of such landscapes, especially the production capacity of arable lands, resulting in food security challenges. Using land-cover maps derived from maximum likelihood classification of Landsat satellite images for the years 2002, 2015, and 2020, post-classification change detection, landscape metrics, and key informant interviews, the study assessed the implications of rubber plantation expansion and oil business development on the food production capacity of the Ahanta West District, Ghana. The analysis reveals that settlement and rubber areas expanded by 5.82% and 10.33% of the landscape area, respectively, between 2002 and 2020. This increase translates into over twice their initial sizes (a 144% change in settlement and a 101% change in rubber). Rubber plantation spread dominates the north and southwestern areas, whereas settlement is widespread in the eastern parts of the landscape. Rubber and settlement expanded at the expense of cropland, palm, and shrubland. Land-use transitions between cropland, palm, and shrubland targeted each other, but the net loss in shrubland was higher (-17.27%). Isolation, subdivision, connectedness, and patch adjacency indices showed patch consolidation in the landscape configuration from 2002 to 2015 and patch fragmentation from 2015 to 2020. The study also found patches with consistently increasing connectivity in settlement areas, indicating the influence of oil-discovery developments, and fragmentation tendencies in rubber, shrubland, cropland, and palm, indicating, respectively, the springing up of smaller rubber farms, the disappearance of shrubland, and the splitting up of cropland and palm areas.
The results revealed a trend in land-use transitions in favor of smallholder rubber plantation expansion and oil-discovery developments, which suggests serious implications for food production systems and poses a risk to food security and the landscape's multifunctional character. To ensure sustainability in land use, this paper recommends the enforcement of the legislative instruments governing spatial planning and land use in Ghana, as embedded in the 2016 Land Use and Spatial Planning Act.
Keywords: food production systems, food security, Ghana’s west coast, land-use transitions, multifunctional rural landscapes
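The abstract expresses change in two ways: expansion as a share of the whole landscape, and expansion relative to a class's initial extent. The two are related by simple arithmetic, and the 2002 shares derived below follow from the reported figures (they are illustrative back-calculations, not values stated in the study):

```python
# Relating the two change measures reported above: expansion as a percentage
# of the landscape vs. expansion as a percentage of a class's 2002 extent.

def initial_share(expansion_share, relative_change):
    """Implied initial class share (% of landscape), given its expansion as a
    % of the landscape and the same expansion as a % of its initial extent."""
    return 100.0 * expansion_share / relative_change

# Settlement grew by 5.82% of the landscape, a 144% increase on its 2002 extent.
settlement_2002 = initial_share(5.82, 144.0)
# Rubber grew by 10.33% of the landscape, a 101% increase on its 2002 extent.
rubber_2002 = initial_share(10.33, 101.0)

print(f"implied 2002 shares: settlement {settlement_2002:.2f}%, "
      f"rubber {rubber_2002:.2f}%")
```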
Procedia PDF Downloads 145
227 Adjustment of the Level of Vibrational Force on Targeted Teeth
Authors: Amin Akbari, Dongcai Wang, Huiru Li, Xiaoping Du, Jie Chen
Abstract:
The effect of vibrational force (VF) on accelerating orthodontic tooth movement depends on the level of stimulation delivered to the tooth in terms of peak load (PL), which requires contact between the tooth and the VF device. A personalized device ensures this contact, but the resulting PL distribution on the teeth is unknown. Furthermore, it is unclear whether the PL on particular teeth can be adjusted to prescribed values. The objective of this study was to investigate the efficacy of a personalized VF device in controlling the level of stimulation on two teeth, the mandibular canines and 2nd molars. A 3-D finite element (FE) model of human dentition, including teeth, PDL, and alveolar bone, was created from the cone beam computed tomography images of an anonymous subject. The VF was applied to the teeth through a VF device consisting of a mouthpiece with the engraved tooth profile of the subject and a VF source that applied a 0.3 N force at a frequency of 30 Hz. The dentition and mouthpiece were meshed using 10-node tetrahedral elements. Interface elements were created at the interfaces between the teeth and the mouthpiece. The upper and lower teeth bite on the mouthpiece to receive the vibration. The depth of the engraved individual tooth profile could be adjusted by adding a layer of material as an interference or removing a layer of material as a clearance to change the PL on the tooth. The interference increases the PL while the clearance decreases it. Five mouthpiece design cases were simulated: a mouthpiece without interference/clearance; mouthpieces with bilateral interferences on both mandibular canines and 2nd molars with magnitudes of 0.1, 0.15, and 0.2 mm, respectively; and a mouthpiece with bilateral 0.3-mm clearances on the four teeth. The force distributions on the entire dentition were then compared across these adjustments.
The PL distribution on the teeth is uneven when there is no interference or clearance. Among all teeth, the anterior segment receives the highest level of PL. Adding 0.1, 0.15, and 0.2-mm interferences to the canines and 2nd molars bilaterally increases the PL on the canines by 10, 62, and 73 percent and on the 2nd molars by 14, 55, and 87 percent, respectively. Adding clearances to the canines and 2nd molars by removing the contacts between these teeth and the mouthpiece results in zero PL on them. Moreover, introducing interference to the mandibular canines and 2nd molars redistributes the PL across the entire dentition: the share of the PL on the anterior teeth is reduced. The use of the personalized mouthpiece ensures contact of the teeth with the mouthpiece so that all teeth can be stimulated. However, the PL distribution is uneven. Adding interference between a tooth and the mouthpiece increases the PL while introducing clearance decreases it. As a result, the PL is redistributed. This study confirms that the level of VF stimulation on an individual tooth can be adjusted to a prescribed value.
Keywords: finite element method, orthodontic treatment, stress analysis, tooth movement, vibrational force
Procedia PDF Downloads 224
226 Pre-Implementation of Total Body Irradiation Using Volumetric Modulated Arc Therapy: Full Body Anthropomorphic Phantom Development
Authors: Susana Gonçalves, Joana Lencart, Anabela Gregório Dias
Abstract:
Introduction: In combination with chemotherapy, Total Body Irradiation (TBI) is most often used as part of the conditioning regimen prior to allogeneic hematopoietic stem cell transplantation. Conventional TBI techniques have long application times and non-conformal beam delivery, with no ability to individually spare organs at risk. Our institution intends to adopt Volumetric Modulated Arc Therapy (VMAT) techniques to increase the homogeneity of the delivered radiation. As a first approach, a dosimetric plan was performed on a computed tomography (CT) scan of a Rando Alderson anthropomorphic phantom (head and torso), using a set of six arcs distributed along the phantom. However, a full body anthropomorphic phantom is essential to carry out technique validation and implementation. Our aim is to define the physical and chemical characteristics and the ideal manufacturing procedure of upper and lower limbs for our anthropomorphic phantom, in order to later validate TBI using VMAT. Materials and Methods: To study the best fit between our phantom and the limbs, a CT scan of the Rando Alderson anthropomorphic phantom was acquired. The CT was performed on GE Healthcare equipment (model Optima CT580 W), with a slice thickness of 2.5 mm. This CT was also used to assess the electronic density of soft tissue and bone through Hounsfield unit (HU) analysis. Results: CT images were analyzed and measurements were made for the ideal upper and lower limbs. Upper limbs should be built with the following dimensions: 43 cm length and 7 cm diameter (next to the shoulder section). Lower limbs should be built with the following dimensions: 79 cm length and 16.5 cm diameter (next to the thigh section). As expected, soft tissue and bone have very different electronic densities. This is important when choosing and analyzing different materials to best represent soft tissue and bone characteristics. The approximate HU values for soft tissue and bone should be 35 HU and 250 HU, respectively.
Conclusion: At the moment, several compounds are being developed based on different types of resins and additives in order to control and mimic the various constituent densities of the tissues. Concurrently, several manufacturing techniques are being explored to make it possible to produce the upper and lower limbs in a simple and inexpensive way, in order to finally carry out a systematic and appropriate study of total body irradiation. This preliminary study was a good starting point to demonstrate the feasibility of TBI with VMAT.
Keywords: TBI, VMAT, anthropomorphic phantom, tissue equivalent materials
Procedia PDF Downloads 80
225 Zn-, Mg- and Ni-Al-NO₃ Layered Double Hydroxides Intercalated by Nitrate Anions for Treatment of Textile Wastewater
Authors: Fatima Zahra Mahjoubi, Abderrahim Khalidi, Mohamed Abdennouri, Omar Cherkaoui, Noureddine Barka
Abstract:
Industrial effluents are one of the major causes of environmental pollution, especially effluents discharged from dyestuff manufacturing, plastic, and papermaking industries. These effluents can give rise to certain hazards and environmental problems due to their highly colored suspended organic solids. Dye effluents are not only aesthetic pollutants; coloration of water by the dyes may also affect photochemical activity in aquatic systems by reducing light penetration. It has also been reported that several commonly used dyes are carcinogenic and mutagenic for aquatic organisms. Therefore, removing dyes from effluents is of significant importance. Many adsorbent materials have been prepared for the removal of dyes from wastewater, including anionic clays, or layered double hydroxides. The zinc/aluminium (Zn-AlNO₃), magnesium/aluminium (Mg-AlNO₃) and nickel/aluminium (Ni-AlNO₃) layered double hydroxides (LDHs) were successfully synthesized via the coprecipitation method. Samples were characterized by XRD, FTIR, TGA/DTA, TEM and pHPZC analysis. XRD patterns showed a basal spacing increase in the order of Zn-AlNO₃ (8.85 Å) > Mg-AlNO₃ (7.95 Å) > Ni-AlNO₃ (7.82 Å). FTIR spectra confirmed the presence of nitrate anions in the LDH interlayer. The TEM images indicated that Zn-AlNO₃ presents circular-shaped particles with an average particle size of approximately 30 to 40 nm. Small plates assigned to sheets with hexagonal form were observed in the case of Mg-AlNO₃. Ni-AlNO₃ displays nanostructured spheres between 5 and 10 nm in diameter. The LDHs were used as adsorbents for the removal of methyl orange (MO) as a model dye and for the treatment of an effluent generated by a textile factory. Adsorption experiments for MO were carried out as a function of solution pH, contact time and initial dye concentration. Maximum adsorption occurred at acidic solution pH. Kinetic data were tested using pseudo-first-order and pseudo-second-order kinetic models.
The best fit was obtained with the pseudo-second-order kinetic model. Equilibrium data were correlated to the Langmuir and Freundlich isotherm models. The best conditions for color and COD removal from the textile effluent sample were obtained at lower pH values. Total color removal was obtained with the Mg-AlNO₃ and Ni-AlNO₃ LDHs. Reduction of COD to the limits authorized by Moroccan standards was obtained with a 0.5 g/l LDH dose.
Keywords: chemical oxygen demand, color removal, layered double hydroxides, textile wastewater treatment
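The pseudo-second-order model that fit the kinetic data is usually estimated from its linearized form, t/qt = 1/(k₂qe²) + t/qe, so that a straight-line fit of t/qt against t yields qe from the slope and k₂ from the intercept. The following is a minimal sketch on synthetic uptake data; the rate constants are illustrative, not the study's values:

```python
def pso_fit(t, q):
    """Fit pseudo-second-order kinetics q_t = k2*qe^2*t / (1 + k2*qe*t)
    via the linearized form t/q = 1/(k2*qe^2) + t/qe."""
    y = [ti / qi for ti, qi in zip(t, q)]
    n = len(t)
    mt, my = sum(t) / n, sum(y) / n
    slope = (sum((ti - mt) * (yi - my) for ti, yi in zip(t, y))
             / sum((ti - mt) ** 2 for ti in t))
    intercept = my - slope * mt
    qe = 1.0 / slope                 # equilibrium uptake (e.g. mg/g)
    k2 = slope ** 2 / intercept      # rate constant (e.g. g/(mg*min))
    return qe, k2

# Synthetic uptake data generated with qe = 2.0 and k2 = 0.5
t = [1.0, 2.0, 4.0, 9.0, 19.0]
q = [2 * ti / (1 + ti) for ti in t]
qe, k2 = pso_fit(t, q)   # recovers qe ≈ 2.0, k2 ≈ 0.5
```

With real batch-adsorption data, the quality of this linear fit (R²) is what is compared against the pseudo-first-order alternative to declare the better model.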
Procedia PDF Downloads 354
224 Nude Cosmetic Water-Rich Compositions for Skin Care and Consumer Emotions
Authors: Emmanuelle Merat, Arnaud Aubert, Sophie Cambos, Francis Vial, Patrick Beau
Abstract:
Consumers are sensitive to many stimuli when applying a cream: brand, packaging and, indeed, the formulation composition itself. Many studies have demonstrated the influence of stimuli such as brand, packaging, formula color and odor (e.g. in make-up applications). These parameters influence the perceived quality of the product. The objective of this work is to further investigate the relationship between nude skincare basic compositions with different textures and consumer experience. A tentative final step is to connect consumer feelings with key ingredients in the compositions. A new approach was developed to better understand touch-related subjective experience in consumers based on a combination of methods: sensory analysis with ten experts, preference mapping with one hundred female consumers, and emotional assessments with thirty consumers (verbal and non-verbal, through prosody and gesture monitoring). Finally, a methodology based on a ‘sensorial trip’ (after olfactory, haptic and musical stimuli) was tested on the most interesting textures with 10 consumers. The results showed more or less impact depending on the compositions and also on key ingredients. Three types of formulation particularly attracted the consumers: an aqueous gel, an oil-in-water emulsion, and a patented gel-in-oil formulation type. Regarding these three formulas, the preferences were revealed through both sensory and emotion tests. One was recognized as the most innovative in the consumer sensory test, whereas the two other formulas were discriminated in the emotion evaluation. Positive emotions were highlighted especially in the prosody criteria. The non-verbal analysis, which corresponds to the physical parameters of the voice, showed high pitch and amplitude values linked to positive emotions. Verbatims, the verbal content of responses (i.e., ideas, concepts, mental images), confirmed this first conclusion.
On the formulas selected for their generation of positive emotions, the ‘sensorial trip’ provided complementary information to characterize each emotional profile. In the second step, dedicated to better understanding the power of the ingredients, two types of ingredients demonstrated an obvious effect on consumer preference: rheology modifiers and emollients. In conclusion, nude cosmetic compositions with well-chosen textures and ingredients can positively stimulate consumer emotions, contributing to capturing their preference. For a complete achievement of the study, a global approach (Asia, America territories...) should be developed.
Keywords: sensory, emotion, cosmetic formulations, ingredients' influence
Procedia PDF Downloads 179
223 Call-Back Laterality and Bilaterality: Possible Screening Mammography Quality Metrics
Authors: Samson Munn, Virginia H. Kim, Huija Chen, Sean Maldonado, Michelle Kim, Paul Koscheski, Babak N. Kalantari, Gregory Eckel, Albert Lee
Abstract:
In terms of screening mammography quality, neither the proportion of reports advising call-back imaging that should be bilateral versus unilateral, nor how far unilateral call-backs may appropriately diverge from a 50–50 left-right split, is known. Many factors may affect detection laterality: display arrangement, reflections preferentially striking one display location, hanging protocols, seating positions with respect to others and displays, visual field cuts, health, etc. The bilateral call-back fraction may reflect radiologist experience (not in our data) or confidence level. Thus, the laterality and bilaterality of call-backs advised in screening mammography reports could be worthy quality metrics. Here, the laterality data did not reveal a concern until drilling down to individuals. Bilateral screening mammogram report recommendations by five breast imaging attending radiologists at Harbor-UCLA Medical Center (Torrance, California) for 9/1/15–8/31/16 and 9/1/16–8/31/17 were retrospectively reviewed. Recommended call-backs for bilateral versus unilateral, and for left versus right, findings were counted. The chi-square (χ²) statistic was applied. Year 1: of 2,665 bilateral screening mammograms, reports of 556 (20.9%) recommended call-back, of which 99 (17.8% of the 556) were for bilateral findings. Of the 457 unilateral recommendations, 222 (48.6%) regarded the left breast. Year 2: of 2,106 bilateral screening mammograms, reports of 439 (20.8%) recommended call-back, of which 65 (14.8% of the 439) were for bilateral findings. Of the 374 unilateral recommendations, 182 (48.7%) regarded the left breast. Individual ranges of call-backs that were bilateral were 13.2–23.3%, 10.2–22.5%, and 13.6–17.9% for year(s) 1, 2, and 1+2, respectively; these ranges were unrelated to experience level; the two-year mean was 15.8% (SD = 1.9%). The lowest χ² p value of the group's sidedness disparities for years 1, 2, and 1+2 was > 0.4.
Regarding four individual radiologists, the lowest p value was 0.42. However, the fifth radiologist disfavored the left, with p values of 0.21, 0.19, and 0.07, respectively; that radiologist had the greatest number of years of experience. There was a concerning 93% likelihood that the bias against left breast findings evidenced by one of our radiologists was not random. Notably, very soon after the period under review, he retired, presented with leukemia, and died. We call for research to be done, particularly by large departments with many radiologists, on two possible new quality metrics in screening mammography: laterality and bilaterality. (Images, patient outcomes, report validity, and radiologist psychological confidence levels were not assessed. No intervention nor subsequent data collection was conducted. This uncomplicated collection of data and simple appraisal were not designed, nor had there been any intention to develop or contribute, to generalizable knowledge (per U.S. DHHS 45 CFR, part 46).)
Keywords: mammography, screening mammography, quality, quality metrics, laterality
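The χ² goodness-of-fit test applied here compares observed left/right call-back counts against a 50–50 expectation. A minimal sketch, using the group's year 1 counts from the abstract (222 left of 457 unilateral call-backs):

```python
import math

def laterality_chi2(left, right):
    """Goodness-of-fit of unilateral call-backs against a 50-50
    left-right split (chi-square, 1 degree of freedom)."""
    total = left + right
    expected = total / 2.0
    chi2 = ((left - expected) ** 2 / expected
            + (right - expected) ** 2 / expected)
    # Survival function of chi-square with 1 df: P(X >= x) = erfc(sqrt(x/2))
    p = math.erfc(math.sqrt(chi2 / 2.0))
    return chi2, p

# Year 1 group counts: 222 of 457 unilateral call-backs regarded the left
chi2, p = laterality_chi2(222, 457 - 222)
# chi2 ≈ 0.37, p ≈ 0.54 -- consistent with the reported group p > 0.4
```

The same computation per radiologist and per period would reproduce the individual p values discussed above, which is where the single outlying reader became visible.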
Procedia PDF Downloads 162
222 DTI Connectome Changes in the Acute Phase of Aneurysmal Subarachnoid Hemorrhage Improve Outcome Classification
Authors: Sarah E. Nelson, Casey Weiner, Alexander Sigmon, Jun Hua, Haris I. Sair, Jose I. Suarez, Robert D. Stevens
Abstract:
Graph-theoretical information from structural connectomes indicated significant connectivity changes and improved acute prognostication in a Random Forest (RF) model of aneurysmal subarachnoid hemorrhage (aSAH), a condition that can lead to significant morbidity and mortality and whose management has traditionally been hampered by poor methods of outcome prediction. This study's hypothesis was that structural connectivity changes occur in canonical brain networks of acute aSAH patients, and that these changes are associated with functional outcome at six months. In a prospective cohort of patients admitted to a single institution for management of acute aSAH, patients underwent diffusion tensor imaging (DTI) as part of a multimodal MRI scan. A weighted undirected structural connectome was created from each patient's images using Constant Solid Angle (CSA) tractography, with 176 regions of interest (ROIs) defined by the Johns Hopkins Eve atlas. ROIs were sorted into four networks: Default Mode Network, Executive Control Network, Salience Network, and Whole Brain. The resulting nodes and edges were characterized using graph-theoretic features, including Node Strength (NS), Betweenness Centrality (BC), Network Degree (ND), and Connectedness (C). Clinical features (including demographics and the World Federation of Neurosurgical Societies scale) and graph features were used separately and in combination to train RF and Logistic Regression classifiers to predict two outcomes: dichotomized modified Rankin Score (mRS) at discharge and at six months after discharge (favorable outcome mRS 0-2, unfavorable outcome mRS 3-6). A total of 56 aSAH patients underwent DTI a median of 7 (IQR = 8.5) days after admission. The best performing model (RF), combining clinical and DTI graph features, had a mean Area Under the Receiver Operator Characteristic Curve (AUROC) of 0.88 ± 0.00 and Area Under the Precision Recall Curve (AUPRC) of 0.95 ± 0.00 over 500 trials.
The combined model performed better than the clinical model alone (AUROC 0.81 ± 0.01, AUPRC 0.91 ± 0.00). The highest-ranked graph features for prediction were NS, BC, and ND. These results indicate reorganization of the connectome early after aSAH. The performance of clinical prognostic models was increased significantly by the inclusion of DTI-derived graph connectivity metrics. This methodology could significantly improve prognostication of aSAH.
Keywords: connectomics, diffusion tensor imaging, graph theory, machine learning, subarachnoid hemorrhage
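Of the graph-theoretic features listed, node strength (NS) and network degree (ND) are straightforward to compute from a weighted adjacency matrix. The following is a minimal sketch on a hypothetical 4-ROI connectome (the connectomes in the study had 176 ROIs, and betweenness centrality requires a shortest-path computation omitted here):

```python
def node_strength(adj):
    """Node strength: sum of incident edge weights in a weighted,
    undirected connectome given as an adjacency matrix."""
    return [sum(row) for row in adj]

def node_degree(adj):
    """Node degree: count of nonzero connections per node."""
    return [sum(1 for w in row if w > 0) for row in adj]

# Toy 4-ROI weighted connectome (symmetric, zero diagonal)
adj = [
    [0.0, 2.0, 1.0, 0.0],
    [2.0, 0.0, 3.0, 0.5],
    [1.0, 3.0, 0.0, 0.0],
    [0.0, 0.5, 0.0, 0.0],
]
ns = node_strength(adj)  # [3.0, 5.5, 4.0, 0.5]
nd = node_degree(adj)    # [2, 3, 2, 1]
```

Per-node values like these, computed within each canonical network, are the kind of features that would then be concatenated with clinical variables to train the RF classifier.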
Procedia PDF Downloads 189
221 Expanding the Atelier: Design Lead Academic Project Using Immersive User-Generated Mobile Images and Augmented Reality
Authors: David Sinfield, Thomas Cochrane, Marcos Steagall
Abstract:
While there is much hype around the potential and development of mobile virtual reality (VR), the two key critical success factors are the ease of the user experience and the development of a simple user-generated content ecosystem. Educational technology history is littered with the debris of over-hyped revolutionary new technologies that failed to gain mainstream adoption or were quickly superseded. Examples include 3D television, interactive CD-ROMs, Second Life, and Google Glass. However, we argue that this is the result of curriculum design that substitutes new technologies into pre-existing pedagogical strategies focused on teacher-delivered content, rather than exploring new pedagogical strategies that enable student-determined learning, or heutagogy. Visual communication design-based learning, such as graphic design, illustration, photography and design process, is heavily based on the traditional classroom environment, whereby student interaction takes place both at peer level and through teacher-based feedback. This makes for a healthy creative learning environment, but it does raise other issues in terms of student-to-teacher learning ratios and reduced contact time. Such issues arise when students are away from the classroom and cannot interact with their peers and teachers, and thus we see a decline in the creative work of students. Using AR and VR as a means of stimulating students to think beyond the limitations of the studio-based classroom, this paper discusses the outcomes of a student project considering the virtual classroom and the techniques involved. The Atelier learning environment is especially suited to the visual communication model, as it deals with the creative processing of ideas that need to be shared in a collaborative manner.
This has proven to be a successful model over the years in the traditional form of design education, but it has more recently seen a shift in thinking as we move into a more digital model of learning and away from the classical classroom structure. This study focuses on the outcomes of a student design project that employed Augmented Reality and Virtual Reality technologies in order to expand the dimensions of the classroom beyond its physical limits. Augmented Reality, when integrated into the learning experience, can improve the learning motivation and engagement of students. This paper outlines some of the processes used and the findings from the semester-long project that took place.
Keywords: augmented reality, blogging, design in community, enhanced learning and teaching, graphic design, new technologies, virtual reality, visual communications
Procedia PDF Downloads 238
220 Fine-Scale Modeling the Influencing Factors of Multi-Time Dimensions of Transit Ridership at Station Level: The Study of Guangzhou City
Authors: Dijiang Lyu, Shaoying Li, Zhangzhi Tan, Zhifeng Wu, Feng Gao
Abstract:
China is currently experiencing some of the most rapid urban rail transit expansions in the world. The purpose of this study is to finely model the factors influencing transit ridership at multiple time dimensions within transit stations' pedestrian catchment areas (PCAs) in Guangzhou, China. This study was based on multi-source spatial data, including smart card data, high spatial resolution images, points of interest (POIs), real-estate online data and building height data. Eight multiple linear regression models using the backward stepwise method and a Geographic Information System (GIS) were created at station level. According to the Chinese code for classification of urban land use and planning standards of development land, residential land use was divided into three categories: first-level (e.g. villas), second-level (e.g. communities) and third-level (e.g. urban villages). The study concluded that: (1) four factors (CBD dummy, number of feeder bus routes, number of entrances or exits, and years of station operation) were positively correlated with transit ridership, whereas the areas of green and water land use were negatively correlated. (2) The area of education land use and the second-level and third-level residential land use were highly connected to the average morning peak boarding and evening peak alighting ridership, while the area of commercial land use and the average building height were significantly positively associated with the average morning peak alighting and evening peak boarding ridership. (3) The area of second-level residential land use was only weakly correlated with ridership in the other regression models, because private car ownership is still high in Guangzhou: some residents living in the communities around the stations commute by transit at peak times, but others are much more willing to drive their own cars at non-peak times.
The area of third-level residential land use, such as urban villages, was highly positively correlated with ridership in all models, indicating that residents of third-level residential land use are the main passenger source of the Guangzhou Metro. (4) The diversity of land use was found to have a significant impact on passenger flow on weekends, but was unrelated to weekday ridership. The findings can be useful for station planning, management and policymaking.
Keywords: fine-scale modeling, Guangzhou city, multi-time dimensions, multi-source spatial data, transit ridership
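The station-level multiple linear regression at the core of the method can be sketched as ordinary least squares on synthetic data. The predictor names below (feeder bus routes, station entrances) echo the study's factors but the numbers are invented, and the backward stepwise variable-selection step is omitted for brevity:

```python
def ols_fit(X, y):
    """Ordinary least squares for y ~ X with an intercept, solving the
    normal equations (X'X)beta = X'y by Gaussian elimination."""
    X = [[1.0] + list(row) for row in X]    # prepend intercept column
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for i in range(k):                      # elimination with pivoting
        p = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[p], b[i], b[p] = A[p], A[i], b[p], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * ai for a, ai in zip(A[r], A[i])]
            b[r] -= f * b[i]
    beta = [0.0] * k
    for i in reversed(range(k)):            # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta  # [intercept, coef_1, coef_2, ...]

# Hypothetical station-level predictors: feeder bus routes and entrances
feeder = [2, 4, 6, 3, 5, 8, 1, 7]
entr   = [4, 2, 6, 8, 3, 5, 7, 1]
# Synthetic ridership generated as 500 + 200*feeder + 150*entrances
y = [500 + 200 * f + 150 * e for f, e in zip(feeder, entr)]
beta = ols_fit(list(zip(feeder, entr)), y)
# beta ≈ [500.0, 200.0, 150.0]
```

In a backward stepwise workflow, a full model over all candidate land-use and accessibility variables would be fit first, then the least significant predictor dropped and the model refit, repeating until all remaining predictors are significant.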
Procedia PDF Downloads 142